GlobalLogic is a leader in digital product engineering services. We help our clients design and build innovative products, platforms, and digital solutions for the modern world.
March 5, 2021

Data Engineer (IRC110060) (vacancy inactive)

Kyiv

Required skills

PhD, Master's, or Bachelor's degree in Computer Science, Math, Physics, Engineering, Statistics, or another technical field;
More than 4 years of experience in IT;
More than 4 years of experience in Data Engineering;
Good understanding of major programming languages such as Java, Scala, and Python;
Familiarity with defining Data Governance concepts (incl. data lineage, data dictionary);
Strong knowledge of data modeling and query optimization on different storage solutions such as RDBMS and document stores;

Commercial experience with:
Building and optimizing big data pipelines, architectures, and data sets;
Performing root cause analysis on internal and external data processes;
Distributed systems, Big Data technologies, Streaming technologies (e.g. Hadoop, Spark, Kafka, Data Lakes);
A variety of database and data warehouse technologies (including SQL Server, PostgreSQL, Redshift, and Cassandra);
CI/CD;

Experience with different delivery and development processes and practices:

Troubleshooting, profiling, and debugging;
Agile and Scrum software processes and technologies;
Code Review process;
Refactoring process;
Upper-Intermediate English.

Desirable

Certifications in:
Cloud Computing Platforms (e.g. AWS, GCP, Azure);
Commercial experience with:
Cloud Computing Platforms (e.g. AWS, GCP, OpenShift);
Orchestration and containerization (Kubernetes, Docker);
Graph databases, time-series databases, data warehouses;
Creation of software architecture and design of complex solutions;

Would be a plus

Docker, Kubernetes, Java, Python, Scala

We offer

Interesting and challenging work in a large and dynamically developing company
Exciting projects involving the newest technologies
Professional development opportunities
Excellent compensation and benefits package, performance bonus program
Modern and comfortable office facilities

Responsibilities

Develop and maintain optimal data pipeline architecture;
Identify ways to improve data reliability, efficiency, and quality;
Prepare data for predictive and prescriptive modeling; build predictive models and machine-learning algorithms;
Create data tools for data science team members;
Take part in the decision-making process in design, solution development, and code review;
Work in an international, distributed team;
Communicate with PMs, POs, developers, and other colleagues and stakeholders;
Deliver the product roadmap and plans; create estimations.

About the project

The client is an international product company whose goal is to redefine the legacy approach to Privileged Access Management by delivering multi-cloud-architected solutions that enable digital transformation at scale. The client establishes a root of trust and then grants least-privilege access just in time, based on verification of who is requesting access, the context of the request, and the risk of the access environment.

The client’s products centralize and orchestrate fragmented identities, improve audit and compliance visibility, and reduce risk, complexity, and costs for the modern, hybrid enterprise. Over half of the Fortune 100, along with the world’s largest financial institutions, intelligence agencies, and critical infrastructure companies, trust this company to stop the leading cause of breaches: privileged credential abuse.
