Big Data Developer - Spark, Python or Scala

Company: Computer Enterprises, Inc.
Location: Philadelphia, Pennsylvania, United States
Type: Full-time
Posted: 11.FEB.2021
Big Data Engineer

Location: Philadelphia, PA 19103

***Open to Remote - Candidates will need to work EST hours***

Group Summary:

This is a position within the Mobile Strategic Development organization. This group is responsible for all data management and data exchanges with all parties interfacing with our Mobile Group. One of its major goals is to harmonize the data ingestion and consumption layers and provide a single version of truth that all operational systems and reporting can consume.

Position Summary:

The Strategic Development Team is looking for a knowledgeable, self-driven Big Data developer who takes responsibility and ownership in providing software solutions and contributing to the overall success of the team and the Mobile Team as a whole. The Data Management team currently manages the development and deployment of time-sensitive data with strict data delivery SLAs. The individual in this position will manage an exciting multi-technology stack focused on data management and data services, driving development, delivery, and operational efficiency through best practices, industry standards, and high-quality engineering.


  • The individual will collaborate and partner with technical and business teams to develop and deliver solutions in line with strategies, requirements, and the roadmap.
  • The individual will help support all project-related architecture, design, development, and deployment of data-oriented integration across platforms and projects in a matrix organization.
  • The individual will provide software solutions to technically challenging business requirements (complex transformations, complex class hierarchies, high data volumes…).
  • The ideal candidate will have extensive experience dealing with high-volume, high-velocity, and high-variety data, plus an understanding of technical and functional designs for ETL, Big Data tools and technologies, microservices, Big Data platforms, SQL and NoSQL databases, data warehousing, database designs (normalized, dimensional, vault), and reporting. This job plays a key role in providing data for reporting and operational systems projects.
  • This position will be responsible for developing, maintaining, testing, tuning, deploying, and operating software solutions on various platforms and technologies.
  • The Engineer will work closely with the rest of the development team and DevOps teams to ensure the development and deployment of highly available and resilient applications.
  • This position will also work closely with business, development leads, architects, and other development teams in an agile manner to quickly realize business value.

Top Requirements:

  • Spark
  • Python or Scala
  • Object Oriented Programming


  • 5-8 years of IT experience with object-oriented programming concepts, including OOP development using Java, Python, or Scala.
  • Expert in Spark and Python.
  • Some working knowledge of Pentaho.
  • Must be able to create class definitions and multi-level class hierarchies.
  • Expert in Big Data development using Java, Scala, or Python on Hadoop, Spark, Structured Streaming, and Kafka or Kafka-like messaging systems.
  • Able to stand up RESTful data services using microservice frameworks such as Flask or Django in Python, or Spring in Java.
  • Expert in NoSQL and relational data modeling, object-relational mapping (ORM), and physical design/tuning using Python.
  • Able to call APIs.
  • Able to manipulate XML and JSON (parsing and creation).
  • Experience in building secure applications
  • Experience in the use of Maven, sbt for build and Jenkins for deployment pipelines
  • Experience in using industry standard IDEs like IntelliJ, Eclipse, PyCharm
  • Experience interfacing with the Comcast security model and Codebig using Spring Boot.
  • Experience with RDBMS concepts such as writing queries, functions, triggers, stored procedures, and PL/SQL packages.
  • Experience working with version control using Git and GitHub.
  • Experience with Linux operating systems.
  • Experience modifying and compiling Scala and Spark applications.
  • Excellent skills in Shell Scripting
  • Knowledge of and hands-on experience with Log4j or similar logging tools/methods.
  • AWS solutions using EC2, S3, RDS, and other cloud components such as Databricks - a plus.
  • Experience in building and maintaining micro services in Cloud Foundry - a plus
  • Experience in software delivery using agile methodologies - Scrum, TDD, and BDD.
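For a flavor of the OOP and JSON-handling skills listed above, here is a minimal, self-contained Python sketch (all class and field names are hypothetical, not part of the role) showing a multi-level class hierarchy whose instances serialize to and from JSON:

```python
import json
from abc import ABC, abstractmethod

# Hypothetical three-level hierarchy: Record -> UsageRecord -> RoamingUsageRecord.
class Record(ABC):
    def __init__(self, record_id: str):
        self.record_id = record_id

    @abstractmethod
    def to_dict(self) -> dict:
        """Return a JSON-serializable representation of the record."""

class UsageRecord(Record):
    def __init__(self, record_id: str, megabytes: float):
        super().__init__(record_id)
        self.megabytes = megabytes

    def to_dict(self) -> dict:
        return {"id": self.record_id, "mb": self.megabytes}

class RoamingUsageRecord(UsageRecord):
    def __init__(self, record_id: str, megabytes: float, country: str):
        super().__init__(record_id, megabytes)
        self.country = country

    def to_dict(self) -> dict:
        # Extend the parent's serialization rather than duplicating it.
        d = super().to_dict()
        d["country"] = self.country
        return d

# Create a record and round-trip it through JSON.
rec = RoamingUsageRecord("r-1", 512.0, "CA")
payload = json.dumps(rec.to_dict())
print(json.loads(payload)["country"])  # prints "CA"
```

Each subclass builds on its parent's `to_dict`, so new fields are added at exactly one level of the hierarchy.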

Soft Skills:

  • Skill in navigating a large organization to accomplish results required.
  • Ability to initiate and follow through on complex projects of both short- and long-term duration required.
  • Excellent organizational and time management skills required.
  • Excellent analytical and problem-solving skills required.
  • Experience in teleconferencing and presenting over web-based tools.
  • Excellent verbal and written communication skills.

Apply Now

