During 2018–2023, the Rakuten Group welcomed numerous talented Ukrainian professionals into the family; some of them joined the Company as whole teams (Slice, Forma Pro).
June 14, 2021

Data Engineer (Python) (vacancy inactive)

Kyiv, Odesa

Required skills

Rakuten Advertising is looking for a Data Engineer to join our Data Engineering team.
You will be part of a diverse, flexible, and collaborative environment where you will be able to apply and develop your skills and knowledge working with modern tools and technologies (Snowflake Data Warehouse, Databricks, DBT, Looker). As a Data Engineer, you will be at the cutting edge of the alternative data space, helping develop our data infrastructure so that our Analysts can glean valuable insights from a large and diverse collection of data sets.

· The ideal candidate has strong Python, SQL, and ETL development experience.
· Minimum of three years of experience working with data and programming in Python
· Able to design efficient and scalable cloud architectures
· Experience writing complex SQL Queries
· Knowledge of ETL patterns, including testing and maintenance
· Familiar with relational / non-relational database approaches and knowing which to apply where and when
· Ability to think holistically about uses of data, designing for ease of data access
· Ability to practice disciplined engineering (testing, code reviews, and writing readable code)
· Ability to work in a small team with a startup mentality
· Experience working with distributed teams situated globally in different geographies
· Excellent analytical, communication, and interpersonal skills
· Ability to work well under pressure, prioritize work, and stay well organized; relish tackling new challenges, paying attention to detail, and, ultimately, growing professionally
· Ability to take ownership of the deliverables

Nice to have

· Experience with data processing frameworks such as Apache Spark
· Familiarity with batch data pipelining frameworks such as Apache Airflow
· Understanding of event-driven and stream-based processing patterns and systems such as Spark Streaming, Kafka, or Kinesis
· Web scraping experience
· Bachelor’s degree in Computer Science, Information Science or related technical or quantitative discipline

We offer

· English lessons
· Remote work from anywhere in the world for 3 months per year
· Only 4 working hours every Friday during the summer
· Bonuses for education and professional development
· Access to training platforms
· Flexible working hours
· Comfortable workplace (remote work during quarantine)
· A variety of knowledge-sharing and training opportunities
· Xbox, PlayStation, a library, and table tennis in the office
· Medical insurance, including dental care and COVID-19 coverage
· Sports compensation
· 20 days paid vacation, paid sick leave, maternity leave

Responsibilities

· Maintaining clean and consistent access to all our data sources
· Providing a solid foundation for calculating key business metrics
· Developing ETL pipelines to automate data ingestion, data transformation processes (such as extracting, cleansing, standardizing, mapping data), and ensuring client delivery
· Maintaining data infrastructure to keep up with the product roadmap
· Understanding data lineage and governance for a variety of data sources
· Communicating updates and changes to the broader data team, as well as contributing to and maintaining data-related documentation
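As a rough illustration of the kind of ETL work described above, the sketch below shows a minimal extract–transform–load flow in plain Python. It is purely illustrative: the data, field names, and functions are hypothetical, and a real pipeline here would read from production sources and load into a warehouse such as Snowflake.

```python
# Minimal, hypothetical ETL sketch: extract raw records, standardize
# fields (trim whitespace, normalize case, cast types), and load them.
# All names and data are illustrative, not part of the actual stack.

def extract():
    # In practice this would read from an API, a file drop, or a database.
    return [
        {"user": " Alice ", "amount": "10.50"},
        {"user": "BOB", "amount": "3.25"},
    ]

def transform(records):
    # Cleanse and standardize each record before loading.
    return [
        {"user": r["user"].strip().lower(), "amount": float(r["amount"])}
        for r in records
    ]

def load(rows):
    # A real pipeline would write to the warehouse; here we just count rows.
    return len(rows)

loaded = load(transform(extract()))
print(loaded)  # 2
```

In production, each of these steps would typically be a task in an orchestrator such as Apache Airflow, with the transformation layer expressed in DBT models.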
