Exadel is an international IT company headquartered in the USA. We design software solutions, deliver digital platforms, and have been creating unique products for Fortune 500 customers for 25+ years. With 30+ offices across the US, Europe, the Caucasus, and Asia, Exadel addresses the most complex engineering problems with innovative solutions.
January 31, 2022

Python Developer with Data Analysis Experience (id 35/483) + Welcome Bonus $3000 (vacancy inactive)

Kyiv, Kharkiv, Lviv, Odesa, Vinnytsia, remote

Requirements:
Proficiency in Python
3+ years of experience building, maintaining, and supporting complex data flows with structured and unstructured data
Experience working with distributed applications
Ability to use SQL for data profiling and data validation (see the sketch after this list)
English level — Intermediate
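
As a rough illustration of the SQL profiling and validation requirement, here is a minimal PySpark sketch; the "claims" data, columns, and checks are invented for the example and are not taken from the project.

```python
# Minimal sketch: SQL-based data profiling and validation with PySpark.
# The "claims" table here is a tiny in-memory example; real tables would
# come from the Hive metastore or the data lake.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("profiling-sketch").getOrCreate()

spark.createDataFrame(
    [(1, 120.0), (2, None), (3, 310.5)],
    "claim_id INT, total_amount DOUBLE",
).createOrReplaceTempView("claims")

# Profile: row count, null rate, and key cardinality.
spark.sql("""
    SELECT COUNT(*)                                              AS row_count,
           SUM(CASE WHEN total_amount IS NULL THEN 1 ELSE 0 END) AS null_amounts,
           COUNT(DISTINCT claim_id)                              AS distinct_claim_ids
    FROM claims
""").show()

# Validate: the key column must be unique.
dupes = spark.sql(
    "SELECT claim_id FROM claims GROUP BY claim_id HAVING COUNT(*) > 1"
)
assert dupes.count() == 0, "duplicate claim_id values found"
```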

Nice to have:
PySpark
Hands-on experience working with HDFS, Hive, or Sqoop
Understanding of AWS ecosystem and services such as EMR and S3
Familiarity with Apache Kafka and Apache Airflow
Experience in Unix commands and scripting
Experience and understanding of Continuous Integration and Continuous Delivery (CI/CD)
Understanding of performance tuning in distributed computing environments (such as a Hadoop cluster or EMR)

Responsibilities:
Build end-to-end data flows from sources to fully curated and enhanced data sets. This includes locating and analyzing source data; creating data flows to extract, profile, and store ingested data; defining and building data cleansing and imputation; mapping to a common data model; transforming data to satisfy business rules and statistical computations; and validating data content (see the sketch after this list)
Modify, maintain, and support existing data pipelines to provide business continuity and fulfill product enhancement requests
Provide technical expertise to diagnose errors from production support teams
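
As a rough illustration of the kind of end-to-end flow described above, here is a hedged PySpark sketch covering extract, cleanse/impute, map to a common model, validate, and publish; the S3 paths, column names, and business rule are invented for the example, not taken from the project.

```python
# Sketch of one ingest-to-curated step. All paths, columns, and rules are
# hypothetical placeholders, not actual project values.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curation-sketch").getOrCreate()

# Extract: read raw source data (location is illustrative).
raw = spark.read.json("s3://example-bucket/raw/estimates/")

# Cleanse and impute: drop records without a key, default missing amounts to 0.
cleansed = (
    raw.filter(F.col("estimate_id").isNotNull())
       .fillna({"total_amount": 0.0})
)

# Map to a common data model and apply a simple business rule.
curated = (
    cleansed.select(
        F.col("estimate_id").alias("id"),
        F.to_date("created_at").alias("created_date"),
        F.col("total_amount").cast("double").alias("amount_usd"),
    )
    .withColumn("is_high_value", F.col("amount_usd") > 10000)
)

# Validate content before publishing the curated data set.
assert curated.filter(F.col("id").isNull()).count() == 0

curated.write.mode("overwrite").parquet("s3://example-bucket/curated/estimates/")
```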

Company offers:
20 working days of vacation and up to 20 working days of sick leave per year
Full payment of taxes
English courses
Flexible work schedule
Friendly environment
Medical insurance
Opportunity for career growth

About the Customer:
The customer is a leading provider of vehicle lifecycle solutions, enabling the companies that build, insure, repair, and replace vehicles to power the next generation of transportation. The company delivers advanced mobile, artificial intelligence, and connected car technologies through its platform, connecting a vibrant network of 350+ insurance companies, 24,000+ repair facilities, OEMs, hundreds of parts suppliers, and dozens of third-party data and service providers.

The customer’s collective set of solutions informs decision-making, enhances productivity, and helps clients deliver faster and better experiences for end consumers. The customer’s company was ranked #17 in the Top 100 Digital Companies in Chicago in 2020 by Built in Chicago, an online community for digital technology entrepreneurs in Chicago, and was named one of Forbes’ best mid-sized companies to work for in 2019 — an important accolade and retention tool for the 2,600+ full-time company employees (alongside 350 dedicated contractors).

The company’s corporate headquarters is in downtown Chicago in the historic Merchandise Mart — a certified LEED (Leadership in Energy and Environmental Design) building that is also known to be a technology hub within the broader metro.

About the Project:
The customer has been working on an analytics platform since 2018. The platform runs on Hadoop and the Hortonworks Data Platform, and the customer is planning to move it to Amazon EMR in 2021. The customer has a variety of products, and the data from all of them flows into a single data lake on this analytics platform, which also allows the customer to do next-generation analytics on the amassed data.

Architecture:
Hortonworks is the current vendor and will be replaced by Amazon EMR. Tableau is going to be the BI vendor; MicroStrategy is currently in use and will be phased out by early 2023.

All data is sent to the data lake, where the customer can do industry reporting. This data is also used by a data science team to build new products and an AI model.

We will be moving to real-time streaming using Kafka and S3. We are doing a POC of Dremio and Presto for the query engine.
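
As a hedged sketch of what the Kafka-to-S3 streaming piece could look like with PySpark Structured Streaming (the broker address, topic name, and S3 paths are placeholders, and the actual design may differ):

```python
# Sketch: stream events from Kafka and land them on S3 as Parquet.
# Requires the spark-sql-kafka connector on the classpath; the broker,
# topic, and paths below are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-to-s3-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "vehicle-events")
         .load()
)

# Kafka delivers key/value as binary; cast the payload for downstream parsing.
payload = events.selectExpr("CAST(value AS STRING) AS json_payload", "timestamp")

query = (
    payload.writeStream.format("parquet")
           .option("path", "s3://example-bucket/streaming/vehicle-events/")
           .option("checkpointLocation", "s3://example-bucket/checkpoints/vehicle-events/")
           .start()
)
query.awaitTermination()
```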

We’re migrating to version 2.0 using Amazon EMR and S3, and the query engine work is bucketed under the 2.0 project.

Project Advantages:
Cross product analytics
Analytics for every new product the customer has; the analytics team’s products are how the customer sells product value to clients
Quarterly Business Review meetings use data to explain how the customer’s products help clients in their business
You’ll get to work with a cross-functional team
You will learn the customer’s company business

Project Tech Stack:
Technologies used are all open source: Hadoop, Hive, PySpark, Airflow, Kafka
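
For a sense of how these pieces are typically wired together, below is a minimal Airflow DAG sketch that schedules a PySpark job; the DAG id, schedule, connection, and script path are illustrative only, not project values.

```python
# Minimal Airflow DAG sketch: schedule a daily PySpark ingestion job.
# DAG id, schedule, connection, and application path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_ingest_sketch",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = SparkSubmitOperator(
        task_id="ingest_estimates",
        application="/opt/jobs/ingest_estimates.py",  # hypothetical PySpark script
        conn_id="spark_default",
    )
```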

Project Stage:
Active Development
