Requirements:
— 2+ years of experience as a Data Engineer
— Proficient with SQL and RDBMSs, preferably PostgreSQL or MariaDB
— Professional experience using Python for data processing purposes
— Experience with Airflow, e.g. creating DAGs, designing and monitoring data pipelines
— Strong data munging / processing skills
— Understanding of Data Warehouse design principles
— Hands-on experience in designing ETL/ELT data pipelines and exposing processed data to end users
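As a rough illustration of the data munging work mentioned above, here is a minimal sketch of cleaning raw records before loading them into a warehouse table. All field names and values are hypothetical, not taken from the actual project:

```python
import csv
import io

# Hypothetical raw export: inconsistent whitespace, mixed-case currency
# codes, and a record with a missing bet amount.
RAW_CSV = """player_id,bet_amount,currency,placed_at
 42, 10.50 ,usd,2024-01-15T12:00:00
43,,EUR,2024-01-15T12:01:00
44,7.25,eur,2024-01-15T12:02:00
"""

def clean_rows(raw_text):
    """Parse CSV text, trim whitespace, normalize currency codes,
    and drop rows with a missing bet amount."""
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw_text)):
        amount = row["bet_amount"].strip()
        if not amount:
            continue  # skip incomplete records
        cleaned.append({
            "player_id": int(row["player_id"].strip()),
            "bet_amount": float(amount),
            "currency": row["currency"].strip().upper(),
            "placed_at": row["placed_at"].strip(),
        })
    return cleaned

rows = clean_rows(RAW_CSV)
print(rows)
```

In a real pipeline this step would typically run inside an Airflow task, with the cleaned rows written to PostgreSQL rather than printed.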
Will be a plus:
— Experience with ElasticSearch or other NoSQL data stores
— Experience with streaming-based, near real-time Data Warehouses; we are mostly interested in Apache Kafka, KSQL, Spark, or similar
— Familiarity with CI/CD using Jenkins pipelines or similar tools
— Experience with data synchronization tools/processes between heterogeneous data sources (SQL, NoSQL, DMS, Debezium, ElasticSearch, etc.)
— Degree in Data Science, Statistics, Applied Math, Econometrics, Computer Science, or other related fields
We offer:
— Competitive compensation depending on experience and skills
— Long-term employment
— Career growth opportunities
— Paid sick leave and regular vacations
— English classes with a native speaker
— Health insurance
— Free lunches
— Free fruit and sweets
— Relax zone with PlayStation and TV
— Comfortable office near Dorohozhychi metro station
Duties:
— Architect, design, and implement ETL pipelines for multiple data sources, ingesting massive amounts of data daily
— Develop and maintain new projects and tools around the data warehouse
— Optimize, improve, and refactor existing applications as needed
— Collaborate with software engineers to capture, format, and prepare data for various purposes
— Develop and maintain the APIs necessary to access the data
Project description:
iGaming platform — a microservice-based, event-driven, reactive system delivering high-performance, resilient solutions to iGaming operators.