About the Team
The team is chartered with developing mission-critical products for our cloud offering using cutting-edge technologies. The team is building a robust data-warehouse infrastructure that drives a variety of business intelligence use cases for our customers. This project involves developing data solutions that provide intelligent insights to end users and other business stakeholders while honoring security guidelines around data privacy and collaboration.
We are looking for a software engineer to work on a workflow engine at Intralinks. The project brings business value through the execution of predictable, repeatable tasks at scale. It has high visibility across the business, with a singular focus on customers, serving a wide range of compliance and regulatory data use cases. You will join us to develop and maintain step-based and DAG workflows. You will write Python code to extract, process, compress, and transfer data to and from S3. You will also develop a RESTful service that lets customers and operators track, analyze, and manage workflow tasks.
Required Qualifications and Experience:
- Python, Kubernetes
- Bachelor's degree (computer science or equivalent)
- SQL databases and security protocols (HTTPS, OAuth 2.0)
- 2+ years of development experience building data solutions in Python
- Experience with deployment automation and infrastructure
- Experience with RESTful API services and data storage
- Experience with public clouds like AWS or Google Cloud
- Experience with Git for source control
- Experience writing Python code and working with Kubernetes
- Experience with workflow engines like Argo, Airflow
- Experience with testing and unit-testing frameworks and CI tools (e.g., Jenkins)
- Experience with DAGs (directed acyclic graphs)
- Machine learning experience