We are looking for a Data Engineer (Middle or Senior) to join our customer’s team.
Our client is the leading marketplace for investing in single-family rental homes that cash flow day one. Their mission is to make ownership of investment real estate radically accessible, cost-effective and simple.
Our client has been recognized as a great workplace by Glassdoor and Great Place to Work and was recently named to the Forbes Fintech 50 and the Red Herring 100 lists of most innovative companies.
Product — a platform that lets everyone from first-time investors to global asset managers evaluate, purchase, and own residential investment properties with confidence from anywhere in the world.
If you thrive in a team environment, are willing to pitch in wherever needed to help the team succeed, are passionate about data and excited about empowering users with data to drive decision making, this is the place for you.
Experience / Skills:
Must have:
3+ years of professional experience in writing production Python code, shell scripts, and complex SQL
3+ years of professional experience building robust data pipelines beyond simple API pulls
1+ years of experience building and deploying data-related infrastructure (messaging, storage, compute, transformation, execution via Docker, and/or CI/CD pipelines across dev/stage/prod environments)
Experience in data warehousing and dimensional data modeling
Experience with Airflow (or similar tools), AWS, Azure, and DBT is a plus
BS or MS in a technical field: computer science, engineering or similar
Good English
Strong communication and interpersonal skills
We offer:
Excellent career opportunities
Competitive remuneration based on qualifications and contribution
Flexible working schedule, vacations, and paid sick leave
Good working environment
Great team spirit
Perfect office location in the very center of Kyiv
Responsibilities:
Design, implement, and deploy scalable, fault-tolerant pipelines that ingest and refine large, diverse datasets (structured, semi-structured, and unstructured) into simplified, accessible data models in production
Build departmental data marts to support analytics across the company
Collaborate with cross-functional teams to understand data flows and processes and design the best possible solutions
Provide quality data solutions in a timely manner and be responsible for data governance and integrity while meeting objectives and maintaining SLAs
Build tools and fundamental data sets that encourage self-service
Improve and maintain the data infrastructure