Empeek is an IT outsourcing company that develops software for clients in the US healthcare sector. With a talented team of professionals, we build cutting-edge IT products in areas such as telemedicine, the Internet of Things, digitalization of healthcare business processes, and more.
March 10, 2025

Senior Data Engineer (vacancy inactive)

Lviv, remote

Description: The project focuses on providing creative and comprehensive travel solutions for athletes, teams, coaches, parents, universities, and fans.

Requirements:

  • 4+ years of experience as a Data Engineer.
  • Strong proficiency in Python and SQL.
  • Strong expertise in data integration techniques and ETL (Extract, Transform, Load) processes.
  • Hands-on experience with cloud platforms such as AWS, GCP, or Azure.
  • Proficiency in deploying and managing data infrastructure on cloud environments.
  • Expertise in big data technologies such as Hadoop, Spark, Hive.
  • Experience with data warehousing solutions like Amazon Redshift, Google BigQuery, or Snowflake.
  • Familiarity with data pipeline and workflow management tools such as Apache Airflow or Luigi (a short Airflow sketch follows this list).
  • Knowledge of data modeling, data architecture, and database design principles.
  • Understanding of distributed systems, data storage, and data processing concepts.
  • Understanding of data quality principles and experience implementing data validation checks.
  • Knowledge of data governance frameworks and best practices.
  • At least Upper-Intermediate English; fluent Ukrainian.
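
Below is a minimal, hypothetical sketch of the kind of ETL pipeline work these requirements describe, written as an Apache Airflow TaskFlow DAG (Airflow 2.x assumed). The DAG name, schedule, and sample data are illustrative stand-ins, not project specifics.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def daily_bookings_etl():
    @task
    def extract() -> list[dict]:
        # Pull raw records from a source system (stubbed here for illustration).
        return [{"booking_id": 1, "amount": 250.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Simple data-quality check: drop rows with non-positive amounts.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # A real pipeline would write to a warehouse such as Redshift,
        # BigQuery, or Snowflake; here we only report the row count.
        print(f"loading {len(rows)} rows into the warehouse")

    load(transform(extract()))


daily_bookings_etl()
```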

Preferred Qualifications:

  • Experience with real-time data processing and streaming technologies like Apache Flink or Kafka Streams.
  • Familiarity with data visualization tools such as Tableau, PowerBI, or Looker.
  • Knowledge of machine learning concepts and experience with ML frameworks like TensorFlow or PyTorch.
  • Experience with dbt (Data Build Tool) for data transformation and modeling (highly preferred).
  • Experience with any Python-based orchestration tool such as Prefect or Dagster (a short Prefect sketch follows this list).
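
For comparison, here is the same hypothetical extract-transform-load flow expressed with Prefect, one of the Python-based orchestration tools mentioned above; the flow name, retry count, and data are again illustrative assumptions rather than project details.

```python
from prefect import flow, task


@task(retries=2)
def extract() -> list[dict]:
    # Retries help with flaky upstream APIs; the data is a stand-in.
    return [{"booking_id": 1, "amount": 250.0}]


@task
def transform(rows: list[dict]) -> list[dict]:
    # Same validation step as the Airflow sketch above.
    return [r for r in rows if r["amount"] > 0]


@task
def load(rows: list[dict]) -> None:
    # A real flow would write to the warehouse or push to downstream providers.
    print(f"loading {len(rows)} rows")


@flow(name="daily-bookings-etl")
def daily_bookings_etl() -> None:
    load(transform(extract()))


if __name__ == "__main__":
    daily_bookings_etl()
```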

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines for both batch and real-time processing.
  • Develop and optimize a cloud-native data platform for machine learning and analytics.
  • Integrate internal and external applications via bi-directional data pipelines and APIs.
  • Ingest data into our data warehouse and distribute it to various providers (ESPs, internal applications, adtech platforms).
  • Manage the underlying infrastructure for data applications.
  • Ensure data reliability, security, and governance across platforms.
  • Advocate for new technologies and frameworks to improve data accessibility for business users.
  • Collaborate with data scientists and analysts to ensure efficient access to clean and structured data.

We offer:

  • A large, stable enterprise project with a professional team.
  • Friendly and supportive work environment.
  • Competitive salary and benefits package.
  • Room for personal and professional growth.
  • Zero bureaucracy.
  • 18 business days of paid vacation plus compensation for public holidays.
  • Company insurance fund.
  • Coverage of all professional studies.
  • Coverage of sick leave, sports activities, and English language courses.