ISsoft is an international IT company founded in 2004 as a subsidiary of Coherent Solutions Inc., fueled by 2000+ professionals from Ukraine, Poland, Lithuania, Moldova, Bulgaria, Romania, Belarus, and the USA. The ISsoft development center in Lviv opened in 2020 and welcomed IT professionals from Ukraine.
May 4, 2021

Big Data Developer (Media Measurement sphere) (the vacancy is no longer active)

Lviv

Company Background

Our client is a company that pioneers the future of cross-platform media measurement, arming organizations with the insights they need to make decisions with confidence. Central to this aim are its people, who work together to simplify the complex on behalf of clients and partners.

It is a trusted partner for planning, transacting, and evaluating media across platforms. With a data footprint that combines digital, linear TV, over-the-top, and theatrical viewership intelligence with advanced audience insights, its platform allows media buyers and sellers to quantify multiscreen audience behavior and make business decisions with confidence.

Project Description

You’ll be responsible for building the next-generation data warehouse and for migrating the data and processes from the existing one (which will be deprecated later this year). This data serves a broad range of clients and products, including industry-leading ad agencies and national television networks. As a member of this fast-moving team, you’ll have a large impact on the evolution and adoption of the data processing as well as on the success of the business. It’s worth mentioning that the company processes and stores dozens of petabytes of data coming from TV and the web, and its current infrastructure handles more than 15 billion requests per day.

The existing data warehouse is organized around a Greenplum database, with jobs and procedures implemented in a mixed set of tools and programming languages. The goal is to migrate it to AWS/Snowflake with a new warehouse design, improved usability, and performance optimizations. Jobs should be reviewed, optimized, and reimplemented using Spark (additional tools to be defined).
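
To make the shape of such a migration job concrete, here is a minimal, hypothetical PySpark sketch (not the project’s actual code): it reads one table from the legacy Greenplum warehouse over JDBC (Greenplum speaks the PostgreSQL wire protocol) and stages it as partitioned Parquet on S3, from where Snowflake can ingest it. The host, database, table, credentials, partition column, and bucket below are all assumptions for illustration.

from pyspark.sql import SparkSession

# Spark session for a single table-migration job (hypothetical app name).
spark = (
    SparkSession.builder
    .appName("greenplum-to-s3-migration")
    .getOrCreate()
)

# Read a source table from the existing Greenplum warehouse over JDBC.
# Connection details and the table name are assumptions for this sketch.
source_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://greenplum-host:5432/dwh")
    .option("dbtable", "public.viewership_events")
    .option("user", "etl_user")
    .option("password", "***")
    .option("fetchsize", "10000")
    .load()
)

# Stage the data as date-partitioned Parquet on S3 (assumed column and bucket);
# Snowflake can then load the files with COPY INTO or expose them as an external table.
(
    source_df
    .repartition("event_date")
    .write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://dwh-staging/viewership_events/")
)

spark.stop()

Staging to Parquet on S3 keeps the Spark and Snowflake sides decoupled, which matches the posting’s stack (Spark, AWS, Parquet); the actual jobs, schemas, and orchestration are defined by the project.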

What You’ll Do

• Work within an agile team to develop new ETL processes and DW design;
• Recommend and implement creative solutions for performance improvement;
• Increase scalability and maintainability to support rapid usage growth;
• Collaborate openly with stakeholders and clients to continuously improve the data usage experience;

Technologies:

SQL
ETL
Java/Scala/Python
Apache Spark
AWS
Apache Parquet

Job Requirements

• Experience with SQL-like databases and data warehouses;
• Experience with DW design and with establishing ETL (ELT) processes would be beneficial for a successful migration and for getting users to switch to the new system;
• Experience with Java/Scala/Python would be a benefit for the future design of the DW;
• Strong communication skills (written and verbal) along with a track record of success delivering large software projects;
• Demonstrated knowledge of commonly used software engineering concepts, practices, and procedures;
