22 November 2024
Senior Big Data Engineer (#2447)
Kyiv, Lviv, Dnipro, Vinnytsia, Ivano-Frankivsk, Ternopil, remote
N-iX is looking for a Senior Big Data Engineer to join our team!
Our client is a leading provider of technical services, delivering both standard and custom intranet- and internet-based software and application systems. Due to increasing demand for in-house digital projects, the client is seeking to outsource certain development tasks to strategic partners.
The client is looking to build strategic long-term relationships with leading development partners to accelerate business growth through high-quality and cost-efficient software development. The selected partner(s) will support the development of robust and scalable Consumer and Enterprise applications.
Responsibilities:
- Design and develop data pipelines and ETL processes using tools such as Apache Airflow, NiFi, and dbt.
- Implement and manage large-scale, distributed data processing frameworks.
- Collaborate closely with data scientists to ensure data availability and quality for machine learning models.
- Optimize the performance of data storage solutions, including relational/SQL (PostgreSQL, Oracle) and NoSQL (MongoDB, Redis) databases.
- Work with streaming and change data capture (CDC) technologies such as Kafka and Debezium to enable real-time data processing.
- Participate in all stages of the SDLC, from conceptualization to deployment, with a focus on CI/CD practices using Docker, Kubernetes, and GitLab.
- Set up and manage data warehousing solutions (PostgreSQL, Oracle, ClickHouse) to support analytical requirements.
- Provide technical expertise and best practices for data engineering solutions, including data governance and security.
Requirements:
- Proven experience (5+ years) as a Big Data Engineer or similar role.
- Strong expertise in ETL tools: Apache Airflow, NiFi, Huawei ETL, ODI (Oracle Data Integrator), and dbt.
- Experience with streaming and change data capture tools: Kafka, Debezium.
- Proficiency in SQL and NoSQL databases, including PostgreSQL, Oracle, MongoDB, and Redis.
- Experience with data warehousing solutions such as ClickHouse, PostgreSQL, and Oracle.
- Familiarity with cloud-based data processing environments and containerization tools such as Docker and Kubernetes.
- Excellent problem-solving and analytical skills, with attention to detail.
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- Upper-Intermediate level of English.
Nice to Have:
- Experience with private cloud setups.
- Familiarity with data science and machine learning frameworks such as TensorFlow and PyTorch.
- Knowledge of data visualization tools: QlikView, Tableau, Apache Superset, Grafana.
We offer:
- Flexible working format: remote, office-based, or hybrid
- A competitive salary and good compensation package
- Personalized career growth
- Professional development tools (mentorship program, tech talks and training sessions, centers of excellence, and more)
- Active tech communities with regular knowledge sharing
- Education reimbursement
- Memorable anniversary presents
- Corporate events and team-building activities
- Other location-specific benefits