Since October 2005, Raiffeisen Bank Aval has been part of the Austrian banking group Raiffeisen Bank International.
April 20, 2021

DevOps Engineer (Big Data)


Required skills

What we need:
— experience in Data Engineering / DB Development / ETL-ELT Development (3+ years)
— expert knowledge of Python, SQL, PL/SQL
— good knowledge of RDBMS (Oracle, PostgreSQL, SAP IQ)
— experience in Big Data stack (Hadoop, HBase, Kafka, Spark, Nifi, Superset)
— experience in AWS (EC2, S3, EBS, EKS, Lambda, Athena, SNS/SQS)
— experience in Kubernetes, Docker, Linux, UNIX
— experience in Git (GitLab), Bitbucket, Ansible, Bash, Nexus
— experience in maintaining and executing build scripts to automate development and production builds
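The Python/SQL ETL-ELT experience asked for above can be illustrated with a minimal sketch. All table and column names here are hypothetical examples, not part of the actual role; an in-memory SQLite database stands in for the RDBMS (Oracle, PostgreSQL, SAP IQ) named in the posting.

```python
# Minimal ETL sketch: extract raw rows, transform them, load into a database.
# Table/column names ("customers", "spend") are illustrative assumptions only.
import sqlite3

def run_etl(raw_rows):
    conn = sqlite3.connect(":memory:")  # stand-in for a production RDBMS
    conn.execute(
        "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, spend REAL)"
    )
    # Transform: normalize names, filter out rows with missing/negative spend
    cleaned = [
        (r["id"], r["name"].strip().title(), float(r["spend"]))
        for r in raw_rows
        if r.get("spend") is not None and float(r["spend"]) >= 0
    ]
    conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", cleaned)
    # Post-load aggregate check, the kind of validation an ELT job might run
    total = conn.execute("SELECT SUM(spend) FROM customers").fetchone()[0]
    conn.close()
    return len(cleaned), total
```

A call such as `run_etl([{"id": 1, "name": " alice ", "spend": "10.5"}])` returns the loaded row count together with the aggregated spend.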

IT skills:
— practical experience in building and supporting IT systems
— automation of installation and maintenance routines, troubleshooting, and incident management (logs; monitoring KPIs, triggers, and alerts)
— analyzing the needs and proposals of business units in order to improve services
— control and enforcement of on-time service implementation and delivery
— experience with network protocols: SMTP, SNMP, ICMP, TCP/IP, FTP, TELNET, NIS, LDAP, UDP
— good communication skills (English intermediate+), both verbal and written

Would be a plus:
— experience in Apache NiFi, Pulsar, Groovy
— knowledge of Jenkins and CI/CD principles, experience in pipelines development
— knowledge and experience in Data Science / Machine Learning
— experience with processing large amounts of data
— analytics and design skills
— experience in Jira and Confluence


What we offer:
— join a large international company that offers opportunities for professional and personal growth
— involvement in challenging, large-scale, and diverse projects that have a real impact on customers
— knowledge sharing with colleagues abroad within a strong IT community spanning 14 Raiffeisen Group banks
— competitive salary
— official employment, 28 days of paid vacation
— in-company events and involvement into social projects


What to do:
— design and development of data processing pipelines for ML solutions
— building cloud native deliveries for on-premise Kubernetes cluster or AWS deployment
— interacting with stakeholders for requirements elicitation
— research and prototyping of promising tools/approaches/practices with further implementation
— managing knowledge base / technical documentation for developed solutions
— participating in Enterprise Data Platform design and development
— deep cooperation with product teams (POs, analysts, architects, devs, testers, IT delivery staff, etc.)
— SW architecture design and maintenance
— release, delivery, change management, support, incident management, and troubleshooting
— planning and control; improvement and development
— preparation and approval of technical requirements and specifications
— API specifications development, maintenance, and approval
— ensuring quality testing (QAT, integration)
— providing quality, on-time SW delivery (installation and configuration) and support
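The first responsibility above, designing data processing pipelines for ML solutions, can be sketched as a chain of composable stages. The stage names and the derived feature below are hypothetical examples, not the team's actual pipeline.

```python
# Sketch of a staged data pipeline of the kind that might feed an ML model.
# Stage names and the "spend_per_visit" feature are illustrative assumptions.
from functools import reduce

def dedupe(records):
    # Drop repeated records by id, keeping the first occurrence
    seen, out = set(), []
    for r in records:
        if r["id"] not in seen:
            seen.add(r["id"])
            out.append(r)
    return out

def add_features(records):
    # Derive a simple per-record feature from raw fields
    return [
        {**r, "spend_per_visit": r["spend"] / max(r["visits"], 1)}
        for r in records
    ]

def build_pipeline(*stages):
    # Compose stages left to right into a single callable
    return lambda data: reduce(lambda d, stage: stage(d), stages, data)

pipeline = build_pipeline(dedupe, add_features)
```

In a production setting each stage would typically map onto a Spark transformation or a NiFi processor from the stack listed above; the composition idea stays the same.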

About the project

We want to develop ML- and AI-based event triggers that identify the next best product and/or service for our customers.
The team will be responsible for platform solution design, development, and ongoing support, with an ambitious plan to grow and onboard new teams and business hypotheses onto this platform.