Hi!
Here is something you should know about GoInteractive. We are an American company that builds teams of developers in Ukraine for our Western partners.
Unlike outsourcing, where tasks come from many different clients, here you will become a full member of the partner's team and work exclusively on their product.
We are looking for an experienced Data Engineer to join one of our enterprise clients’ teams.
Responsibilities:
Design, develop, and maintain ETL/ELT pipelines for ingesting, transforming, and loading data from multiple sources (operational systems, APIs, files, etc.).
Collaborate closely with business users and domain experts (Finance, Operations, Procurement, Marketing, etc.) to understand business processes and translate them into data models.
Implement robust data architectures supporting Data Warehouse and Data Lakehouse environments.
Develop data transformation logic using SQL, Python, and Spark.
Ensure data quality, consistency, and performance through monitoring, documentation, and automation.
Work with BI, Data Science, and Data Governance teams to support analytics, reporting, and machine learning initiatives.
Continuously improve data engineering best practices and development standards (CI/CD, DataOps).
Requirements:
3+ years of experience as a Data Engineer or in a similar data-focused role.
Proven experience with Databricks, Apache Spark, or equivalent distributed processing platforms.
Strong SQL skills and hands-on experience in building scalable data pipelines (ETL/ELT).
Proficiency in Python.
Experience with relational and non-relational databases.
Strong analytical and problem-solving skills with attention to detail.
Excellent communication skills and the ability to collaborate effectively with non-technical stakeholders.
Good understanding of business processes (Finance, Supply Chain, Operations, etc.) and their data flows.
English fluency is required.
Would be a plus:
Experience with Databricks Lakehouse, including Delta Lake, Unity Catalog, and Workflows.
Familiarity with Airflow, Data Factory, or similar orchestration tools.
Understanding of Data Governance, Metadata Management, and Data Modeling (Star Schema, Data Vault).
Cloud certifications (e.g., Databricks Certified Data Engineer, Azure/AWS/GCP Data Engineer) are a strong advantage.
We offer:
Long-term employment with competitive compensation based on experience
Possibility to work remotely
An open, transparent, and fun work culture
A multinational team and collaborative work environment
Continuous knowledge sharing with engaged co-workers