● Experience managing a team of data engineers.
● Business and value-focused, stakeholder management.
● Strong SQL and data-modeling skills, with optimization knowledge for different analytics workloads and scenarios.
● Experience designing ETL/ELT pipelines in batch and near real-time, including ETL tools that implement a DAG, e.g. Airflow or Luigi.
● Professional knowledge of Python, PEP 8, and coding quality standards.
● A keen interest in new technologies in the big-data space and an open mindset for bringing innovation.
● Experience with MPP Databases and distributed systems.
● Familiarity with AWS services and concepts (VPC, IAM).
● Experience with Spark and Spark Streaming or similar solutions like Apache Flink or Apache Beam.
● Experience with distributed SQL engines such as Presto or Impala.
● Experience with pub/sub and streaming systems such as RabbitMQ, AWS Kinesis, or Kafka.
● Experience using Docker.
● Experience using Snowplow is advantageous.
● Knowledge of Kubernetes and Terraform is advantageous.
● Experience with Golang is advantageous.
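To illustrate the DAG concept behind orchestrators like Airflow and Luigi mentioned above, here is a minimal, hypothetical sketch (not actual Airflow code) of running pipeline tasks in dependency order with Python's standard-library `graphlib`; the task names and dependency graph are invented for illustration only.

```python
from graphlib import TopologicalSorter

# Hypothetical ETL pipeline: each task maps to the set of tasks
# that must complete before it (its predecessors).
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"extract"},
    "load": {"transform", "validate"},
}

def run_pipeline(dag):
    """Execute task names in an order that respects dependencies."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")  # a real orchestrator would dispatch work here
    return order

order = run_pipeline(dag)
```

Orchestrators such as Airflow build on this same idea, adding scheduling, retries, and monitoring on top of the dependency graph.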
• Competitive salary;
• Corporate and social events;
• Regular assessment and salary review;
• Free English classes;
• Opportunity for self-realization;
• Friendly team and enjoyable working environment;
• Flexible working schedule;
• Comfortable open-plan office in Downtown;
• 21 days of annual leave;
• 5 paid sick days;
• Participation in conferences and seminars;
• Friday team-building events.
● Own and manage the team’s backlog.
● Develop and maintain Data Pipeline Applications.
● Be responsible for and own the event tracker system (Snowplow Analytics).
● Keep the architecture clear and introduce appropriate innovations.
● Help and support the B.I. team with Data Warehouse questions and architecture.
● Work closely with our Data Science team, providing insights and support to create the best architecture for our data products.
● Ensure good knowledge sharing, and support junior members with onboarding and problem-solving.
● Evaluate and negotiate OKRs each cycle and deliver the outcomes.
We are looking for an experienced Data Engineering Lead to work in Dubai for 6 months.
You will direct and manage a growing team of qualified Data Engineers (4+). You will coach, mentor, and develop them, and you will be responsible for the team’s backlog and for prioritizing the team’s tasks and projects, while carefully considering the company’s strategic business priorities and cross-team data requirements.
These must be managed with engineering rigor and careful consideration of security, privacy, and regulatory requirements.
The strategic use of data supports both the ever-improving user-experience for our real-estate customers and the high expectations of real-estate professionals (brokers) in our global markets for data tools to improve the efficiency and profitability of their work.
This is a hands-on role, where you will be able to make use of your senior Engineering skills, working on extending and improving our ecosystem of Data Pipelines together with our Data Science, B.I and Backend Engineers to ensure that data is consistently flowing and that it is collected, stored and delivered in a form that allows the generation of insights and data products.
You will also be responsible for continuously maintaining and improving our Data Warehouse, Data Lake, data-ingestion (ETL/ELT) procedures, and internal and external reporting apps (APIs), ensuring that growth in the volume and variety of data doesn’t compromise existing service commitments.
You will work closely with Backend Engineers to ensure that the data engineering team delivers reliable, performant services that enable widespread consumption of data, and that it works with the most appropriate technologies in the market, such as Presto, Spark, Kubernetes, and Airflow, to keep our data tech stack market-leading.