For 20 years, GlobalLogic has provided consulting and developed and built great digital products and software in the very heart of Silicon Valley. We guarantee you a friendly and inclusive environment with plenty of challenges, where you will learn and grow every day.
August 8, 2022

Principal Data Engineer (IRC139545) (vacancy inactive)

Kyiv, Kharkiv, Lviv, Mykolaiv, remote

GlobalLogic is inviting an experienced Principal Data Engineer to join our engineering team.

The Client is the largest cloud computing and virtualization technology company.

Our team is working on the development of a sophisticated Analytics as a Service (AaaS) solution.

As part of the team, you will help build the components for data ingestion, validation/verification, multiple storage zones, processing, and analytics on top of them. The final solution can serve teams of varying maturity and size, from smaller teams looking for the basics to large teams interested in advanced APIs and tooling.
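
For illustration only, not part of the role description: a minimal PySpark sketch, assuming a Databricks-style environment with Delta Lake available, of the ingestion, validation, multi-zone storage, and analytics flow outlined above. All paths, names, and the schema are hypothetical and not taken from the actual project.

# Hypothetical illustration only: schema, paths, and names are assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType, DoubleType

spark = SparkSession.builder.appName("aaas-ingestion-sketch").getOrCreate()

# Assumed schema for raw events landing in the data lake.
raw_schema = StructType([
    StructField("event_id", StringType(), False),
    StructField("event_time", TimestampType(), True),
    StructField("team_id", StringType(), True),
    StructField("metric_value", DoubleType(), True),
])

# 1. Ingestion: read raw JSON from an assumed landing zone.
raw = spark.read.schema(raw_schema).json("/mnt/landing/events/")

# 2. Validation/verification: split records into valid and rejected sets.
is_valid = F.col("event_id").isNotNull() & F.col("event_time").isNotNull()
valid = raw.filter(is_valid)
rejected = raw.filter(~is_valid)

# 3. Storage zones: persist valid raw data to a bronze zone, deduplicated
#    data to a silver zone, and rejects to quarantine (all Delta tables).
valid.write.format("delta").mode("append").save("/mnt/bronze/events")
(valid.dropDuplicates(["event_id"])
      .write.format("delta").mode("append").save("/mnt/silver/events"))
rejected.write.format("delta").mode("append").save("/mnt/quarantine/events")

# 4. Analytics on top: a simple aggregate that downstream teams could query.
daily = (
    spark.read.format("delta").load("/mnt/silver/events")
    .groupBy(F.to_date("event_time").alias("event_date"), "team_id")
    .agg(F.count("*").alias("events"), F.avg("metric_value").alias("avg_metric"))
)
daily.write.format("delta").mode("overwrite").save("/mnt/gold/daily_team_metrics")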

Requirements:

  • 4+ years building conceptual, logical, and/or physical database designs, or the pipelines required to build them
  • 2+ years of hands-on Python experience (especially building wheel (.whl) packages and packaging code)
  • Expert-level knowledge of using SQL to write complex, highly optimized queries across large volumes of data.
  • 2+ years of Kimball (Dimensional Modeling) Expertise
  • 2+ years of Spark experience (especially Databricks Spark and Delta Lake)
  • Experience with source control (git) on the command line
  • Strong hands-on experience implementing big-data solutions in the Azure Data Lake ecosystem (Azure Data Lake, Databricks).
  • Ability to work independently and provide guidance to junior data engineers.

Nice to have:

  • Experience with big data pipelines or DAG Tools (dbt, Data Factory, Airflow, or similar)
  • Experience with Kafka or other streaming technology (or a willingness to learn)
  • Experience with database deployment pipelines (e.g., dacpacs or similar technology)
  • Experience with one or more unit testing or data quality frameworks
  • Experience with MLflow or other MLOps pipeline technologies

Preferences:

  • Python
  • Azure Data Factory

Responsibilities:

  • Take a major part in requirements definition, architecture, and design;
  • Architect subsystems and understand the impact across products;
  • Design and implement core data flow components according to project standards;
  • Communicate regularly with the client and teams;
  • Ensure project delivery in a timely manner;
  • Lead engineering efficiency efforts and improve operational efficiency;
  • Handle daily project activities (daily standups, technical advisory, reporting, etc.);
  • Collaborate with the Dev Leads and Product Manager to ensure that the requirements are met;
  • Manage daily project activities.

We offer:

  • Interesting and challenging work in a large and dynamically developing company
  • Exciting projects involving the newest technologies
  • Professional development opportunities
  • Excellent compensation and benefits package
