Rakuten Advertising is looking for a Data Engineer to join our Data Engineering team.
The ideal candidate has strong Python, SQL, and ETL development experience.
· Minimum of three years of experience working with data and programming in Python
· Able to design efficient and scalable cloud architectures
· Experience writing complex SQL queries
· Knowledge of ETL patterns, including testing and maintenance
· Familiarity with relational and non-relational database approaches, and knowing when to apply each
· Ability to think holistically about uses of data, designing for ease of data access
· Ability to practice disciplined engineering (testing, code reviews, and writing readable code)
· Comfortable working in a small team with a startup mentality
· Experience working with distributed teams situated globally in different geographies
· Excellent analytical, communication, and interpersonal skills
· Ability to work well under pressure, prioritize work, and stay organized; relishes tackling new challenges, paying attention to detail, and growing professionally
· Ability to take ownership of deliverables
· Experience with data processing frameworks such as Apache Spark
· Familiar with batch data pipelining frameworks such as Apache Airflow
· Understanding of event-driven and stream-based processing patterns and systems such as Spark Streaming, Kafka, or Kinesis
· Web scraping experience
· Bachelor’s degree in Computer Science, Information Science, or a related technical or quantitative discipline
You will be part of a diverse, flexible, and collaborative environment where you can apply and develop your skills and knowledge with modern tools and technologies (Snowflake Data Warehouse, Databricks, DBT, Looker). As a Data Engineer, you will work at the cutting edge of the alternative data space, helping develop our data infrastructure so our Analysts can glean valuable insights from a large and diverse collection of data sets.
· Maintaining clean and consistent access to all our data sources
· Providing a solid foundation for calculating key business metrics
· Developing ETL pipelines to automate data ingestion, data transformation processes (such as extracting, cleansing, standardizing, mapping data), and ensuring client delivery
· Maintaining data infrastructure to keep up with the product roadmap
· Understanding data lineage and governance for a variety of data sources
· Communicating updates and changes to the broader data team, and contributing to and maintaining data-related documentation