The Hadoop Developer is responsible for designing, developing, testing, tuning, and building a large-scale data processing system for data ingestion and data products that allow Hilton to improve the quality, velocity, and monetization of our data assets for both operational applications and analytical needs. This position requires strong experience in software engineering and in developing solutions within the Hadoop ecosystem. The developer will hand-code ETL in Scala to extract, transform, and load data between zones in the data lake (Raw Zone to Enterprise Zone to Business Zone). The Raw Zone is loaded primarily via HDFS, the Enterprise Zone feeds a large variety of target systems, and the Business Zone is built primarily with Tableau and MicroStrategy.
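The zone-to-zone ETL described above might look like the following minimal Scala sketch. The record type, field names, and pipe-delimited Raw Zone format are purely illustrative assumptions; a production job would typically run on Spark against HDFS paths rather than in-memory sequences.

```scala
// Hypothetical record type for a Raw Zone feed (field names are assumptions).
case class Reservation(confNum: String, hotelCode: String, nights: Int)

object RawToEnterprise {
  // Parse one pipe-delimited raw line; malformed rows are dropped
  // rather than failing the whole job.
  def parseLine(line: String): Option[Reservation] =
    line.split('|') match {
      case Array(conf, hotel, nights) =>
        nights.trim.toIntOption.map(n => Reservation(conf.trim, hotel.trim, n))
      case _ => None
    }

  // Transform a raw extract into clean Enterprise Zone records.
  def transform(rawLines: Seq[String]): Seq[Reservation] =
    rawLines.flatMap(parseLine)
}
```

The same parse-and-filter shape carries over directly to a Spark `RDD` or `Dataset`, which is where hand-coded Scala ETL usually lives in a Hadoop data lake.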
• Works with BAs, end users, and architects to define and process requirements, build code efficiently, and collaborate with the rest of the team on effective solutions.
• Has strong analytical SQL experience working with dimensional modeling.
• Researches, develops, and modifies ETL processes and jobs according to requirements.
• Troubleshoots and develops on Hadoop technologies including HDFS, Hive, Pig, Flume, HBase, Spark, and Impala, and performs Hadoop ETL development via hand coding in Scala or Java.
• Knowledge of and experience with any Azure Data Platform components: Azure Data Lake, Data Factory, Data Management Gateway, Azure Storage options, DocumentDB, Data Lake Analytics, Stream Analytics, Event Hubs, and Azure SQL.
• Translates, loads, and presents disparate data sets in multiple formats from multiple sources, including JSON, Avro, text files, Kafka queues, and log data.
• Implements quality logical and physical ETL designs optimized to meet the operational performance requirements of our multiple solutions and products, including the implementation of sound architecture, design, and development standards.
• Designs the optimal performance strategy and manages technical metadata across all ETL jobs.
• Responsible for building solutions involving large data sets using SQL methodologies and data integration tools.
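The multi-format ingestion bullet above can be sketched in Scala as normalizing disparate sources into one record type. The formats shown here (delimited text and key=value log lines) and the field names are assumptions for illustration; real JSON or Avro input would go through a proper parser (e.g. Jackson or Avro's datum readers) rather than hand-rolled string handling.

```scala
// Common target record for the Enterprise Zone (fields are illustrative).
case class Event(user: String, action: String)

object Normalize {
  // "alice,login" style delimited text extract
  def fromCsv(line: String): Option[Event] =
    line.split(',') match {
      case Array(u, a) => Some(Event(u.trim, a.trim))
      case _           => None
    }

  // "user=alice action=login" style log line
  def fromLog(line: String): Option[Event] = {
    val kv = line.split(' ').flatMap { tok =>
      tok.split('=') match {
        case Array(k, v) => Some(k -> v)
        case _           => None
      }
    }.toMap
    for (u <- kv.get("user"); a <- kv.get("action")) yield Event(u, a)
  }
}
```

Whatever the source format, mapping everything onto one typed record early keeps downstream zone-to-zone transforms format-agnostic.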
Top Skills Details:
1) Ability to hand-code ETL in Scala - this will be a significant portion of the role
2) Roughly 5 years of Big Data development experience, including Python (used to support the data science team)
3) Great attitude - ability to manage complex decisions within the code
Additional Skills & Qualifications:
We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information, or any characteristic protected by law.