Raif is part of the Raiffeisen Bank International AG (RBI) banking group, Austria. Our bank is a reliable business partner in the Ukrainian banking market; we strive to become the bank with the most recommended financial services and to create a positive customer experience.
10 September 2021

Team Lead Data Engineer (vacancy inactive)

Kyiv

Skills and competences

Must have

  • Experience in Data Engineering / DB development / ETL development (3+ years)
  • Hands-on experience with CDC, transactional and high-load financial/payment systems, and payments processing
  • Expert knowledge of Python, SQL, PL/SQL
  • Practical experience with RDBMS (Oracle, PostgreSQL)
  • Practical experience with NoSQL databases
  • Experience with the Big Data stack (Hadoop, HBase, Kafka, Spark)
  • Hands-on experience with Kafka and Kafka Streams
  • Experience in Java (ver. 8+ required; ideally 13)
  • Practical experience across various environments with:
      • Linux OS (RHEL, CentOS, Debian, etc.)
      • K8s cluster: Docker (+ registry); Prometheus, Grafana; EFK stack (ElasticSearch, FluentD, Kibana); Jaeger; Redis as a microservices’ sidecar; Helm; Node exporter; Kafka exporter; PostgreSQL exporter; Docker image for OpenJDK; Calico; Ingress NGINX Controller for Kubernetes; PostgreSQL with Patroni (HA); Kafka cluster (manager)
      • CI/CD server (Jenkins, GitLab, Nexus, SonarQube); infrastructure provisioning (Terraform, Terragrunt)
      • Component orchestration (Ansible); security (HashiCorp Vault)
      • Other frameworks (Swagger, Spring Boot, etc.)
  • Understanding of Agile/Scrum
  • Good communication skills (English intermediate+), verbal and written
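
To give a flavor of the CDC work listed above, here is a minimal Python sketch of applying Debezium-style change events to an in-memory replica of a row store. The event shape (`before`/`after`/`op` fields) follows Debezium conventions; the payment fields and values are invented for illustration.

```python
import json

def apply_cdc_event(raw: str, table: dict) -> dict:
    """Apply one Debezium-style change event to an in-memory row store.

    `table` maps primary key -> row dict; op codes follow Debezium
    conventions: 'c' = create, 'u' = update, 'd' = delete.
    """
    event = json.loads(raw)
    # The key comes from the new row image, or the old one on delete
    key = event["after"]["id"] if event["after"] else event["before"]["id"]
    if event["op"] in ("c", "u"):
        table[key] = event["after"]   # upsert the new row image
    elif event["op"] == "d":
        table.pop(key, None)          # remove the deleted row
    return table

# Example: a payment row is created, then updated
payments = {}
apply_cdc_event('{"op": "c", "before": null, "after": {"id": 1, "amount": 100}}', payments)
apply_cdc_event('{"op": "u", "before": {"id": 1, "amount": 100}, "after": {"id": 1, "amount": 250}}', payments)
print(payments[1]["amount"])  # 250
```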

Nice to have

  • Experience with Apache NiFi
  • Knowledge of Jenkins and CI/CD principles, experience in pipeline development
  • Experience with AWS (S3, Glue, Athena, EMR, Redshift)
  • Knowledge of and experience in Data Science / Machine Learning
  • Experience with processing large amounts of data
  • Deep understanding of microservices architectural principles
  • Pulsar, Debezium (CDC), Flink (streaming)
  • Experience with OLAP DBs (Greenplum, ClickHouse, etc.)
  • Airflow (pipelines)
  • Power BI / Superset (visualization)
  • Analytics and design skills
  • Experience with Jira and Confluence
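
As a hint at the streaming work referenced above (Flink, Kafka Streams), here is a pure-Python sketch of a tumbling-window aggregation, the basic building block those engines provide. Event timestamps, amounts, and the window size are illustrative assumptions.

```python
from collections import defaultdict

def tumbling_window_sums(events, window_ms):
    """Group (timestamp_ms, amount) events into fixed, non-overlapping
    windows and sum amounts per window -- the idea behind a tumbling window."""
    sums = defaultdict(int)
    for ts, amount in events:
        window_start = (ts // window_ms) * window_ms  # align to window boundary
        sums[window_start] += amount
    return dict(sums)

# Three payments; a 60-second (60_000 ms) tumbling window
events = [(5_000, 100), (30_000, 50), (65_000, 200)]
print(tumbling_window_sums(events, 60_000))  # {0: 150, 60000: 200}
```

In a real engine the same logic runs incrementally over an unbounded stream, with watermarks handling late events; this batch version only shows the window assignment.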

Responsibilities

  • Design and development of data processing pipelines for Payments solutions
  • Building cloud-native deliveries for on-premise Kubernetes cluster or AWS deployment
  • Interacting with stakeholders for requirements elicitation
  • Research and prototyping of promising tools/approaches/practices with further implementation
  • Managing the knowledge base / technical documentation for developed solutions
  • Participating in Enterprise Data Platform design and development

Expectations

  • Passion for data
  • Good team player
  • Fast learner, adaptable to a changing environment
  • Result-oriented and proactive
  • Problem-solving skills

Project info

We develop automated, flexibly adjustable microservices-based systems. One of them, the CM service, will:

  • integrate the Wall Street system with the new CM service
  • ensure STP (straight-through processing) and eliminate/minimize manual work
  • increase transaction processing capacity and speed up processing time
  • allow for quick adaptation of new products (e.g. derivatives)
  • allow for automated real-time position keeping, mark-to-market, revaluation of portfolios and limit control
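
The mark-to-market and limit-control items above can be sketched in a few lines of Python. This is only an illustration of the concept; the instruments, positions, prices, and the loss limit are all invented, and a production system would of course work against live market data.

```python
def mark_to_market(positions, market_prices):
    """Revalue each position at the current market price and report
    unrealized P&L per instrument: quantity * (market - book price)."""
    pnl = {}
    for instrument, (qty, book_price) in positions.items():
        pnl[instrument] = qty * (market_prices[instrument] - book_price)
    return pnl

def breaches_limit(pnl, loss_limit):
    """Simple limit control: flag if total unrealized loss exceeds the limit."""
    return sum(pnl.values()) < -loss_limit

# Hypothetical book: 10 units of a bond, 1000 units of an FX position
positions = {"BOND_A": (10, 99.5), "FX_EURUSD": (1_000, 1.10)}
prices = {"BOND_A": 101.0, "FX_EURUSD": 1.08}
pnl = mark_to_market(positions, prices)
print({k: round(v, 2) for k, v in pnl.items()})  # {'BOND_A': 15.0, 'FX_EURUSD': -20.0}
```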

The CM service is to be onboarded onto a platform for developers and QA, with SRE practices. The platform must provide Kubernetes, observability, CI/CD, secret management, traffic management, etc. We want to build a hybrid cloud, with a focus on open source.

Our stack:

• OS: Linux (RHEL based)

• Orchestration: Kubernetes

• Development stack: Java-based microservices (Spring Boot, among others)

• Observability: Prometheus, Grafana, Zabbix, Appdynamics, EFK, Opsgenie, Jaeger

• CI/CD: GitLab CI, Jenkins (imperative pipelines)

• IaC and Config management: Ansible, Terraform

• Message broker: Kafka, IBM MQ

• Database: Postgres, Oracle DB (plus integration with other data sources)

• NoSQL: Redis, MongoDB