Requirements:
— 2+ years of Data Engineering experience
— experience building ETL/ELT pipelines
— experience designing and organizing a DWH
— excellent knowledge of SQL/NoSQL (building and optimizing complex queries)
— deep knowledge and practical experience with Python
— experience with Spark, Hadoop
— experience with data warehouse solutions (ClickHouse, Redshift, BigQuery, etc.)
— experience with tools for data preprocessing and cleaning
— experience with database administration
— understanding of machine learning pipelines
— knowledge of deep learning frameworks (PyTorch, TensorFlow, etc.)
— work experience in PDL (payday loan) companies / banks
— knowledge of English
We offer:
— Competitive pay (depends on candidate level)
— Comfortable office in the city center
— Possibility to influence the development of the project
— Paid vacation (20 working days) and sick leave (5 days)
— English courses
Functions and area of responsibility:
— collecting and preparing data for analysts and data scientists
— building data processing pipelines
— creating correct relationships between data
— data preprocessing and transformation
— ensuring high processing speed
— designing and implementing an optimal data warehouse (DWH) for various use cases, including reporting and machine learning
— feature engineering (search for new data sources and patterns)
— development and maintenance of metadata, data catalogs, and documentation for internal business clients
CashBerry is a team of professionals in the financial sector building an online platform for credit services. We are looking for a Data Engineer to join our Risk department.