• University education (Bachelor’s degree)
• University education (Master’s degree)
• Mathematical insight, data analysis, business intelligence, predictive analytics, machine learning, big data, Hadoop, Spark, Impala, Kafka
• Language skills: English — advanced (C1)
• UNIX/Linux — advanced
• SQL — advanced
• Spark / PySpark / Python
• S3 — Cloud Storage (HDFS and EMRFS)
• EC2 — Compute Service in the Cloud
• Linux and basic system administration
• Bash script development
• Batch data processing
• At least one relational database: PostgreSQL, MySQL, Oracle, MS SQL
• Git/SVN
• Data modeling
• Data Streaming and Kafka
• Glue — Fully-managed ETL Service
• Redshift & Relational Database Service (RDS)
• Kinesis — End-to-end Service for Collection, Processing and Analytics of Streaming Data
• Athena — Interactive SQL Service for S3
• Databricks (basic understanding)
• Python programming language
We are looking for people who share our belief that working with data and people is fun and a “mission”, not just a “job”.
The job offers opportunities to participate in a wide variety of international projects in Western Europe.
You can work remotely or relocate to Bratislava, Slovakia.
• You will be responsible for designing and implementing automated tools for collecting and transferring data from multiple source systems to the AWS cloud platform. Our primary clients are international companies in banking, telecommunications, consumer financing, automotive, and other industries.
• You will work on the design and development of large-scale data processing and analysis projects and consult with customers.