As part of our team, the Data Engineer will be responsible for preprocessing, cleaning, and storing financial datasets: market data, fundamentals, and alternative data. The role also includes writing ETL and data-processing pipelines used by our engineering and research teams.
Responsibilities:
- Processing and storing high-volume financial datasets.
- Creating ETL pipelines using AWS/internal tools.
- Preprocessing data for our research and development team.
- Contributing to the team responsible for real-time trading implementation.
Hard Requirements:
- 2+ years of experience with Python.
- Good understanding of analytics, data pipelines, and data transformation.
- Proficiency with pandas and NumPy.
- Proficiency with SQL and columnar databases.
- Experience in Linux development (bash, cron, jq).
- Experience with AWS or another cloud computing platform (DigitalOcean, GCP, Azure).
- Experience with Apache Spark/Airflow.
- Hands-on experience working with big data.
- Knowledge of statistics.
- Fluent English (at least Upper-Intermediate level).
Soft Requirements:
- Experience in the financial field.
- Experience with Apache Kafka/RedPanda.