January 20, 2021

Strong Middle/Senior Big Data Engineer (vacancy inactive)

Kyiv, Kharkiv, Lviv, Odesa, Ivano-Frankivsk

Required skills

At least 2 years of data engineering experience using Spark/Scala (see the sketch after this list);
Proficiency in SQL;
Hands-on experience with Scala/Java;
Working experience with AWS Cloud;
Understanding of the principles of Massively Parallel Processing (MPP);
BI tools knowledge;
Understanding of data modeling and data warehousing concepts;
Excellent communication and interpersonal skills.
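
The items above center on Spark with Scala, SQL and AWS. Below is a minimal, hypothetical sketch of how they typically come together; the bucket path, view name and columns are assumptions, not details from the posting.

import org.apache.spark.sql.SparkSession

object SqlOnS3Example {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sql-on-s3-example")
      .getOrCreate()

    // Hypothetical Parquet dataset on AWS S3 (any columnar source would do).
    spark.read.parquet("s3://example-bucket/raw/events/")
      .createOrReplaceTempView("events")

    // Plain SQL executed on a distributed, MPP-style engine.
    val topPages = spark.sql(
      """SELECT page, COUNT(*) AS views
        |FROM events
        |GROUP BY page
        |ORDER BY views DESC
        |LIMIT 10""".stripMargin)

    topPages.show()
    spark.stop()
  }
}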

Nice to have

ETL development background;
Column-based data warehouses: Amazon Redshift / Snowflake / Google Bigtable / Teradata / CosmosDB;
Process orchestration software: Apache Airflow / Prefect / Dagster;
Serverless computation frameworks.

We offer

Besides such basics as a competitive salary and a comfortable, motivating work environment, here at Intellias we offer:

For your professional growth —
Innovative projects with advanced technologies;
Individual approach to professional and career growth (Personal Development Plan);
Regular educational events with leading industry experts;
English courses.

For your comfort —
Flexible working hours;
Spacious office with lots of meeting rooms;
Relocation program;
Kids’ room with a professional babysitter (offices in Lviv & Kyiv).

For your health —
3 health packages to choose from: medical insurance, sports attendance, or a mix of both;
Annual vitaminization program;
Annual vaccination and ophthalmologist check-up.

For your leisure —
Corporate celebrations and fun activities;
Beauty parlor (offices in Lviv & Kyiv).

Responsibilities

Developing data pipelines and data marts on Spark and Scala (see the sketch after this list);
Creating new high-load data processing services;
Architecting, designing, coding and maintaining components for aggregating billions of DB records;
Managing the cloud-based data & analytics platform;
Deploying updates and fixes and assisting technical support;
Working directly with the business teams to rapidly prototype analytics solutions based upon business requirements;
Exploring and validating data from various sources and generating new reports based on the investigated data;
Gathering, analyzing and documenting requirements for data processing and reporting;
Developing data-processing pipelines, operationally monitoring key metrics and determining the causes of deviations from expected values.
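
As referenced in the first item, here is a minimal sketch of a Spark/Scala data-mart pipeline of the kind these responsibilities describe; the dataset, column names and S3 paths are illustrative assumptions rather than project specifics.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, countDistinct, sum, to_date}

object DailyRevenueMart {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-revenue-mart")
      .getOrCreate()

    // Hypothetical raw order events landed on S3 as Parquet.
    val orders = spark.read.parquet("s3://example-bucket/raw/orders/")

    // Aggregate billions of records into a compact daily data mart.
    val dailyRevenue = orders
      .groupBy(to_date(col("order_ts")).as("order_date"), col("country"))
      .agg(
        sum("amount").as("total_revenue"),
        countDistinct("customer_id").as("unique_customers"))

    // Partition by date so BI tools and downstream jobs can prune efficiently.
    dailyRevenue.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-bucket/marts/daily_revenue/")

    spark.stop()
  }
}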

About the project

Our customer is a niche engineering company based in Leuven, Belgium, specializing in building data lakes using Scala, Spark, AWS, Airflow and a bit of Python. Their customers are large enterprises in the Benelux region, in telecom, eCommerce, energy and other industries.
