Are you an experienced, passionate pioneer in technology: a solutions builder and roll-up-your-sleeves technologist who wants a collaborative, think-tank environment where you can share new ideas with colleagues every day, without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center. We are breaking the mold of the typical Delivery Center. Our US Delivery Centers have been growing since 2014, with significant continued growth on the horizon. Interested? Read more about the opportunity below.

Responsibilities

- Function as an integrator between business needs and technology solutions, helping to create technology solutions that meet clients' requirements.
- Develop and test solutions that align with clients' systems strategy, requirements, and design, and support system implementation.
- Manage the data pipeline process from acquisition through ingestion, storage, and provisioning of data to the point of impact, modernizing and enabling new capabilities.
- Facilitate data integration in traditional and Hadoop environments by assessing clients' enterprise IT environments.
- Guide clients to a future-state IT environment that supports their long-term business goals.
- Enhance business drivers through enterprise-scale applications that enable the visualization, consumption, and monetization of both structured and unstructured data.

The Team

Deloitte Consulting's Analytics & Cognitive offering leverages the power of analytics, robotics, and cognitive technologies to uncover hidden relationships in vast troves of data, create and manage large-scale organizational intelligence, and generate insights that catalyze growth and efficiencies.

Qualifications

Required

- Strong technical expertise in most of the following:
  - Hadoop (Cloudera distribution)
  - Spark with Scala or Python programming
  - Hive tuning, bucketing, partitioning, UDFs, and UDAFs
  - NoSQL databases such as HBase, MongoDB, or Cassandra
- Experience working with Kafka, Spark Streaming, Sqoop, Oozie, Airflow, Control-M, Presto, NoSQL, and SQL
- Expert-level use of Jenkins and GitHub preferred
- Knowledge of the financial/insurance domain
- 6+ years of professional work experience
- Strong technical skills, including an understanding of software development principles
- Hands-on programming experience
- Must live a commutable distance from one of the following cities: Atlanta, GA; Austin, TX; Boston, MA; Charlotte, NC; Chicago, IL; Cincinnati, OH; Cleveland, OH; Dallas, TX; Detroit, MI; Gilbert, AZ; Houston, TX; Indianapolis, IN; Kansas City, MO; Lake Mary, FL; Los Angeles, CA; Mechanicsburg, PA; Miami, FL; McLean, VA; Minneapolis, MN; Nashville, TN; Orange County, CA; Philadelphia, PA; Phoenix, AZ; Pittsburgh, PA; Rosslyn, VA; Sacramento, CA; St. Louis, MO; San Diego, CA; Seattle, WA; Tallahassee, FL; Tampa, FL; or be willing to relocate to one of the following USDC locations: Gilbert, AZ; Lake Mary, FL; Mechanicsburg, PA
- Limited immigration sponsorship may be available
- Ability to travel up to 15% (while 15% travel is a requirement of the role, due to COVID-19 non-essential travel has been suspended until further notice)

Preferred

- 3+ years' experience working with the Big Data ecosystem, including tools such as Hadoop, Spark, MapReduce, Sqoop, HBase, Hive, and Impala
- Proficiency in one or more modern programming languages such as Python or Scala
- Experience with data lake and data hub implementations
- Knowledge of the AWS or Azure platforms
- Knowledge of techniques for designing Hadoop-based file layouts optimized to meet business needs
- Ability to translate business requirements into logical and physical file structure designs
- Ability to build and test solutions in an agile delivery manner
- Ability to articulate the reasons behind design choices
- Bachelor of Science in Computer Science, Engineering, or MIS, or equivalent experience
- Any big data certification is a plus