About the project:
Our partner, a well-known international credit insurance company, is looking for a senior Big Data (Hadoop) lead engineer to deliver end-to-end ingestion and processing of newly ingested data sources into a data hub built on an Azure Blob Storage data layer.

Responsibilities:
As the lead of the data hub development team, you will be responsible for:
Data ingestion and processing, and mapping of ingested data
Identifying and reporting data quality issues through data quality rules
Changing existing ingestions, processing, transformations, and data quality rules to resolve issues or meet evolving requirements
Communicating with the project and infrastructure teams
Planning the work and reporting outcomes to the project lead

Requirements:
5+ years of experience in multi-technology projects
4+ recent years in data collection/processing/publishing projects
3+ years of project experience in corporate environments in insurance or comparable sectors
3+ recent years in agile/Scrum projects
IT and data architecture/modelling
Knowledge of big data file systems and file and field formats
Data and software pipeline automation

We offer:
Competitive salary
Professional and friendly team
Flexible work schedule
Social package
Possibility of business travel abroad
Technologies:
Hadoop 3.0+
Oracle GoldenGate 12c/18c+
Oracle, SQL Server, Informix, PostgreSQL (RDBMS)
Hive
Spark
Python 3.6+
Azure cloud
Ranger and Kerberos