We are seeking a Data Engineer to join a Swedish project that specializes in automating data monetization operations and implementing successful data strategies.
Duties and Responsibilities:
— Architect and implement effective and robust data solutions
— Make clever engineering and product decisions based on data analysis and collaboration
Requirements:
— 3+ years of experience in data engineering with Python
— Master’s or Bachelor’s Degree in a quantitative field (Math, Statistics, Computer Science, Engineering, Data Science, Operations Research, etc.).
— Strong data modeling skills
— 1+ years of experience working with Snowflake and dbt
— Strong experience in writing complex SQL
— Strong experience in writing ETL / ELT data pipelines
— Experience with data flow automation tools like Airflow / Prefect / Dagster
— Experience with CI/CD processes (GitLab CI/CD, GitHub Actions, Jenkins)
— Experience with at least one cloud data engineering stack (AWS / GCP / Azure) and a track record of implementing effective solutions on it
— Good understanding of distributed systems
— Experience with data quality processes, checks, and validations, including defining and measuring data quality metrics
Will be a huge plus:
— Experience with Snowpark API
— Experience with big data processing frameworks (PySpark, Dask, etc.)
— Experience with Kubernetes / Docker
Skills and abilities:
— Teamwork and results orientation
— Self-management skills
We offer:
— Remote-first work model;
— Engagement as a private entrepreneur / B2B contractor;
— Paid vacation (18 working days per year);
— Paid sick leave (10 working days per year);
— Paid medical insurance after the probation period.