PURPOSE OF THE JOB:
We are looking for a Data Engineer to build robust, extensible, and scalable data and BI solutions for one of the world’s largest social media platforms, which ingests several petabytes of data daily. These solutions may combine one or more of the following: source-of-truth datasets, daily/hourly pipelines, dashboards, visualization tools, and alerting.
We are building a team for an existing client. The team will work mainly from our office in Vancouver, with periodic business trips to the client’s site in San Francisco. You will contribute as part of a self-organized R&D team working in a challenging, innovative environment.
MAIN TASKS AND RESPONSIBILITIES:
• Understand the data landscape of complex products
• Design, architect, and implement new source of truth datasets, in partnership with analytics and business teams
• Build the required data and reporting pipelines using internal ETL tools
• Develop dashboards, web-based visualizations (using Tableau or other), and web-based tools (node.js or flask apps)
• Ensure that assigned areas are delivered on time and to the required quality standards
• Provide estimates, agree on task durations with the manager, and contribute to the project plan for the assigned area
• Identify area-level risks, and provide and implement mitigation plans
• Report to the Team Lead or Project Manager on area readiness and quality, and raise red flags in crisis situations
• Make yourself available for visits to the client location when required
EDUCATION, SKILLS AND EXPERIENCE:
• University degree in Computer Science, Mathematics, or a related field
• 3+ years of experience in data engineering or business intelligence/analytics
• Experience designing, architecting, and maintaining data warehouses that seamlessly stitch together data from production databases and clickstream events
• Hands-on experience with Hive/Spark SQL query development and optimization, and with building workflows (preferably using Airflow)
• Hands-on experience building data pipelines in a programming language such as Python
• Hands-on experience with building and maintaining Tableau dashboards and/or Jupyter reports
• Working understanding of Hadoop, HDFS, and big data analytics
• Strong communication, collaboration and interpersonal skills
• Ability to learn quickly
• Results-oriented approach
• Experience working in an Agile environment
WE OFFER:
• Friendly and highly professional teams
• Flexible working hours with no overtime
• Regular performance reviews
• Internal training
• Comfortable office facilities (kitchens, gym, sports activities, yoga, lounge rooms, coffee machines, etc.)
• Christmas holidays (December 31 to January 7) and state holidays
• Fully paid English classes (twice per week) with in-house teachers and native speakers
• Premium medical insurance (medication, massage, an in-office doctor, etc.)
• Paid sick leave
• Life insurance
• 20 working days of annual paid vacation
• Incentive payments for major life events (marriage, childbirth)
• Corporate events (corporate parties and sports competitions)
And much more!
Please send your CV or contact us with more questions!