We are the provider of next-generation secure and scalable communication services. Our award-winning open source VPN protocol has established itself as a de facto standard in the open source networking space, with over 50 million downloads since inception.
March 20, 2021

DevOps Data Engineer (vacancy inactive)

Kyiv, Lviv, remote

Required skills

OpenVPN Inc is seeking a DevOps Data Engineer who loves the numbers and meets the following requirements:

— 5+ years of experience in a DevOps/DBA/DBE role
— Extensive experience in SQL (PostgreSQL, PostgreSQL, & more PostgreSQL, and maybe some MySQL)
— Working knowledge of JSON and PostgreSQL's JSONB features
— Strong operational experience in Linux/Unix environments and scripting languages such as Shell, Perl, and Python
— Strong troubleshooting skills
— At least upper-intermediate level of English, both spoken and written
— When you talk in your sleep, it should be in PostgreSQL-compatible SELECT statements.
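To give a flavor of the JSONB requirement above, here is a hedged sketch of the kind of query this role might write, mirrored in plain Python so it can be checked locally. The table and column names (`sessions`, `payload`) and the record shapes are purely illustrative, not taken from OpenVPN's actual schema.

```python
import json

# Illustrative PostgreSQL query: extract fields from a JSONB column and
# filter with the @> containment operator. Schema names are assumptions.
JSONB_QUERY = """
SELECT payload->>'client_id' AS client_id,
       (payload->'stats'->>'bytes_sent')::bigint AS bytes_sent
FROM sessions
WHERE payload @> '{"status": "connected"}';
"""

def extract(rows):
    """The same extraction and filter, expressed in plain Python."""
    out = []
    for raw in rows:
        payload = json.loads(raw)
        if payload.get("status") == "connected":
            out.append((payload["client_id"], int(payload["stats"]["bytes_sent"])))
    return out

rows = [
    '{"client_id": "a1", "status": "connected", "stats": {"bytes_sent": "4096"}}',
    '{"client_id": "b2", "status": "disconnected", "stats": {"bytes_sent": "0"}}',
]
print(extract(rows))  # → [('a1', 4096)]
```

The `->>` operator returns text (hence the `::bigint` cast), while `@>` tests JSON containment; both are standard PostgreSQL JSONB operators.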

Nice to have

— Experience with horizontal database scaling (Citus Data, TimescaleDB)
— Familiarity with PostGIS
— Experience with NoSQL databases (MongoDB)
— Experience with Airflow, Kettle, and Pentaho PDI/PDR

We offer

Our philosophy is that we are a small, close-knit team, and we care deeply about you:
— Competitive salary;
— Great new office space;
— Flexible working schedule, remote work possible;
— Working directly with colleagues from Silicon Valley and around the world;
— Team trips, certification and events compensation, medical insurance, sports, etc.;
— Last but not least, we are really fun to work with!

Responsibilities

— Maintain roughly 100 custom data pipelines written in Perl, PHP, Python, and Java
— Create new data pipelines and transformations
— Assist with data migrations and database component software updates
— Assist with cloud and bare-metal infrastructure buildout
— Build out new services such as data presentation systems (Metabase, Pentaho Data Reporter, Tableau, etc.)
— Work with the BI team to turn raw data into insights as efficiently as possible
— Help troubleshoot BI tool issues such as failing jobs and reporting-layer slowness
— Work with other teams to create ETLs
— Set up integration environments and provision databases for new projects
— Develop the DevOps process using CI/CD tools
— Repair failing components as needed, and be available when needed
— Be given only a mallet and some small wrenches with which to work
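The pipeline and ETL duties above can be sketched as a minimal extract-transform-load step in Python. Everything here is an assumption for illustration: the CSV layout, the `host`/`bytes` field names, and the in-memory `sink` stand in for whatever sources and targets the real pipelines use.

```python
import csv
import io

def extract(raw_csv):
    """Extract: parse CSV text into a list of dicts (source format assumed)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: convert bytes to MiB and silently drop malformed rows."""
    out = []
    for r in rows:
        try:
            out.append({"host": r["host"], "mib": round(int(r["bytes"]) / 2**20, 2)})
        except (KeyError, ValueError):
            continue  # a real pipeline would log or quarantine bad rows
    return out

def load(rows, sink):
    """Load: append to a target (a plain list here, a database in practice)."""
    sink.extend(rows)

sink = []
raw = "host,bytes\nvpn-1,1048576\nvpn-2,notanumber\n"
load(transform(extract(raw)), sink)
print(sink)  # → [{'host': 'vpn-1', 'mib': 1.0}]
```

Keeping the three stages as separate functions is what lets orchestrators like Airflow (mentioned in the nice-to-haves) schedule, retry, and monitor each step independently.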

About the project

This position involves:

— Caring for our large horizontally scaled database clusters, processes, and custom data pipelines.
— Creating queries and ETLs and improving poorly written ones.
— Optimizing the clusters to run as efficiently as possible, maintaining data consistency.
— Helping data producers and consumers to move their data efficiently and produce the results they want.
— Contributing to the planning of future growth.
— Managing revision control and migrating data forward through new versions of software and architectures.
— Implementing our technologies on cloud environments (AWS, Azure, GCP) and on bare metal.
— Dreaming about the numbers and ways to improve the systems when asleep.
— Implementing solutions that are good enough now, while planning for better solutions in the future.
— Building tools that make others within the company more productive.
