The project is a technology-driven, diversified principal trading company. The firm trades its own capital at its own risk, using a wide range of asset classes, instruments and strategies in financial markets around the world.
What you’ll be working on:
- Build an efficient, scalable data warehouse for analyzing public and proprietary data feeds across a number of data providers and data sets;
- Design and build data pipelines in Python to integrate data sets from various sources;
- Monitor ETL pipelines & data sets, ensuring availability & integrity of our data;
- Transform data for further analysis.
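To give a flavor of the kind of pipeline work described above, here is a minimal, hedged sketch of an extract-transform-load step in Python. The feed contents, field names, and derived column are all invented for illustration; a real pipeline would pull from S3 or a vendor API rather than an in-memory string.

```python
import csv
import io
import json

# Hypothetical raw feed (invented data; in practice this would be
# fetched from S3, a vendor API, or a distributed file system).
RAW_FEED = """symbol,price,volume
AAPL,189.5,1200
MSFT,411.2,800
"""

def extract(raw: str) -> list:
    """Parse the raw CSV feed into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Cast numeric fields and add a derived notional column."""
    out = []
    for row in rows:
        price = float(row["price"])
        volume = int(row["volume"])
        out.append({
            "symbol": row["symbol"],
            "price": price,
            "volume": volume,
            "notional": price * volume,  # derived field for analysis
        })
    return out

def load(rows: list) -> str:
    """Serialize to JSON lines, a stand-in for a warehouse write."""
    return "\n".join(json.dumps(r) for r in rows)

records = transform(extract(RAW_FEED))
print(load(records))
```

The extract/transform/load split keeps each stage independently testable and monitorable, which is what makes availability and integrity checks on a pipeline tractable.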
Requirements:
- Python development experience;
- Familiarity with NoSQL databases and distributed file systems;
- Experience working with Amazon Web Services (S3 and EC2);
- Knowledge of the Linux environment.
Bonus points if you have:
- Proficiency with containerization and container orchestration tools such as Kubernetes in a Linux environment.