AgileEngine is a privately held company established in 2010 and HQed in the Washington DC area. We rank among the fastest-growing US companies on the Inc 5000 list and the top-3 software developers in DC on Clutch.
July 1, 2025

Data Infrastructure Engineer ID35383

Lviv, Buenos Aires (Argentina), Kraków (Poland), Mexico City (Mexico), São Paulo (Brazil), remote



We create award-winning custom software solutions that help companies across 15+ industries change the lives of millions.

If you like a challenging environment where you’re working with the best and are encouraged to learn and experiment every day, there’s no better place — guaranteed! :)

What you will do
● Architect, build, and maintain modern and robust real-time and batch data analytics pipelines;
● Develop and maintain declarative data models and transformations;
● Implement data ingestion integrations for streaming and traditional sources such as Postgres, Kafka, and DynamoDB;
● Deploy and configure BI tooling for data analysis;
● Work closely with product, finance, legal, and compliance teams to build dashboards and reports to support business operations, regulatory obligations, and customer needs;
● Establish, communicate, and enforce data governance policies;
● Document and share best practices regarding schema management, data integrity, availability, and security;
● Protect and limit access to sensitive data by implementing a secure permissioning model and establishing data masking and tokenization processes;
● Identify and communicate data platform needs, including additional tooling and staffing;
● Work with cross-functional teams to define requirements, plan projects, and execute on the plan;
● At least 4 hours of overlap with EST is required.

Must haves
● 5+ years of engineering and data analytics experience;
● Strong SQL and Python/Scala skills for complex data analysis;
● Experience with modern data pipeline and warehouse tools such as Databricks, Spark, or AWS Glue;
● Proficiency with declarative data modeling and transformation tools such as dbt;
● Experience configuring and maintaining data orchestration platforms such as Airflow;
● Experience with infrastructure-as-code tools (e.g., Terraform);
● Upper-intermediate English level.

Nice to haves
● Familiarity with real-time data streaming (e.g., Kafka, Spark);
● Background working with cloud-based data lakes and secure data practices;
● Hands-on experience building automation tooling and pipelines using Python, Go, or TypeScript;
● Familiarity with container orchestration (e.g., Kubernetes);
● Prior experience managing external data vendors;
● Exposure to Web3 / Crypto data systems;
● Background working cross-functionally with compliance, legal, and finance teams;
● Experience driving company-wide data governance or permissioning frameworks;
● Strong bias for simplicity, speed, and avoiding overengineering;
● Ability to work autonomously and drive projects end-to-end.



The benefits of joining us
Professional growth: Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps
Competitive compensation: We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities
A selection of exciting projects: Join projects with modern solutions development and top-tier clients that include Fortune 500 enterprises and leading product brands
Flextime: Tailor your schedule for an optimal work-life balance by choosing to work from home or from the office, whatever makes you the happiest and most productive
