We, Raiffeisen Bank Aval (Raiffeisen Bank Ukraine), are a Ukrainian bank. We have been creating and developing our country's banking system #Together_with_Ukraine for 31 years, since the first steps of Independence. We have been, and remain, one of the country's largest banks and a reliable partner for millions of Ukrainians.
10 September 2021

Team Lead Data Engineer (vacancy inactive)

Kyiv

Skills and competences

Must have

  • Experience in Data Engineering / DB Development / ETL Development (3+ years)
  • Hands-on experience with CDC, transactional and high-load financial/payment systems, and payment processing
  • Expert knowledge of Python, SQL, and PL/SQL
  • Practical experience with RDBMS (Oracle, PostgreSQL)
  • Practical experience with NoSQL databases
  • Experience with the Big Data stack (Hadoop, HBase, Kafka, Spark)
  • Hands-on experience with Kafka and Kafka Streams (see the sketch after this list)
  • Experience in Java (version 8+ required; ideally 13)
  • Practical experience across various environments with:
  • Linux OS (RHEL, CentOS, Debian, etc.);
  • K8s cluster: Docker (+ registry); Prometheus, Grafana; EFK stack (Elasticsearch, Fluentd, Kibana); Jaeger; Redis as a microservices’ sidecar; Helm; Node exporter; Kafka exporter; PostgreSQL exporter; Docker image for OpenJDK; Calico; Ingress NGINX Controller for Kubernetes; PostgreSQL with Patroni (HA); Kafka cluster (manager);
  • CI/CD server (Jenkins, GitLab, Nexus, SonarQube); infrastructure provisioning (Terraform, Terragrunt);
  • Component orchestration (Ansible); security (HashiCorp Vault)
  • Other frameworks (Swagger, Spring Boot, etc.)
  • Understanding of Agile/Scrum
  • Good verbal and written communication skills (English intermediate+)
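
Purely as an illustration of the Kafka Streams and Java skills named above, here is a minimal sketch of a stream-processing topology for payment events. The topic names (payment-events, validated-payments), the broker address, and the trivial filter are hypothetical placeholders, not part of the actual CM service.

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class PaymentStreamSketch {

    public static void main(String[] args) {
        // Basic Streams configuration; the broker address is a placeholder.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-stream-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw payment events, assumed here to be JSON strings keyed by payment id.
        KStream<String, String> payments = builder.stream("payment-events");

        // Forward only events that carry an "amount" field; this trivial check
        // stands in for real validation / enrichment logic.
        payments
            .filter((paymentId, json) -> json != null && json.contains("\"amount\""))
            .to("validated-payments");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the topology cleanly on JVM shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}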

Nice to have

  • Experience with Apache NiFi
  • Knowledge of Jenkins and CI/CD principles; experience in pipeline development
  • Experience with AWS (S3, Glue, Athena, EMR, Redshift)
  • Knowledge of and experience in Data Science / Machine Learning
  • Experience with processing large amounts of data
  • Deep understanding of microservices architectural principles
  • Pulsar, Debezium (CDC), Flink (streaming)
  • OLAP DB experience (Greenplum, ClickHouse, etc.)
  • Airflow (pipelines)
  • Power BI / Superset (visualization)
  • Analytics and design skills
  • Experience with Jira and Confluence

Responsibilities

  • design and development of data processing pipelines for Payments solutions
  • building cloud-native deliveries for an on-premises Kubernetes cluster or AWS deployment
  • interacting with stakeholders for requirements elicitation
  • research and prototyping of promising tools/approaches/practices with further implementation
  • managing knowledge base / technical documentation for developed solutions
  • participating in Enterprise Data Platform design and development

Expectations

  • passion for data
  • good team player
  • fast learner, adaptable to a changing environment
  • result-oriented and proactive
  • problem-solving skills

Project info

We develop automated and flexibly adjustable microservices-based systems. One of them, the CM service, will:

  • provide integration of the Wall Street system with the new CM service
  • ensure straight-through processing (STP) and eliminate/minimize manual work
  • increase transaction processing capacity and speed up processing time
  • allow for quick adaptation of new products (e.g. derivatives)
  • allow for automated real-time position keeping, mark-to-market, revaluation of portfolios and limit control

The CM service will be onboarded onto a platform for developers and QA that follows SRE practices. The platform must provide Kubernetes, observability, CI/CD, secret management, traffic management, etc. We want to build a hybrid cloud, with a focus on open source.

Our stack:

  • OS: Linux (RHEL-based)
  • Orchestration: Kubernetes
  • Development stack: Java-based microservices (Spring Boot and others)
  • Observability: Prometheus, Grafana, Zabbix, AppDynamics, EFK, Opsgenie, Jaeger
  • CI/CD: GitLab CI, Jenkins (imperative pipelines)
  • IaC and config management: Ansible, Terraform
  • Message broker: Kafka, IBM MQ
  • Database: PostgreSQL, Oracle DB (plus integration with other data sources)
  • NoSQL: Redis, MongoDB
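
As a rough sketch of how a few pieces of this stack fit together, below is a minimal Spring Boot microservice that consumes from a Kafka topic via spring-kafka. The class name, topic name ("payments"), and consumer group are hypothetical; it assumes the spring-boot-starter and spring-kafka dependencies and a broker address configured via spring.kafka.bootstrap-servers.

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;

// Minimal Spring Boot microservice; auto-configuration wires the Kafka
// consumer from application properties.
@SpringBootApplication
public class PaymentListenerApplication {

    public static void main(String[] args) {
        SpringApplication.run(PaymentListenerApplication.class, args);
    }

    // Consume raw payment messages from a hypothetical "payments" topic.
    @KafkaListener(topics = "payments", groupId = "payment-listener-sketch")
    public void onPayment(String message) {
        // A real service would validate the event, persist it to PostgreSQL,
        // and expose metrics to the observability stack listed above.
        System.out.println("received payment event: " + message);
    }
}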
