We are looking for talented people who are passionate about data, AI, and innovation!
Remote — US Startup — B2B Contract
About Company
At Prosperous Process AI, we’re transforming how enterprises manage procurement and supply chains with our AI-powered platform. Founded in 2023 and based in Denver, Colorado, our mission is to simplify and optimize the source-to-procure process, helping businesses save time and reduce costs.
Our platform leverages advanced AI and machine learning to automate procurement workflows, deliver real-time insights, and enhance decision-making. Key features include:
- Generative Analytics — Providing actionable insights to uncover cost-saving opportunities.
- Demand Forecasting — Predicting future demand to optimize inventory and procurement strategies.
- Automated Supplier Negotiations — Streamlining the negotiation process for better terms and pricing.
- Real-Time Market Insights — Offering visibility into material costs and availability.
With over 600 software integrations, Prosperous Process AI provides a comprehensive view of business operations, making it especially valuable for industries like construction, manufacturing, real estate, and aerospace. Our platform helps businesses save up to 30% on supply chain costs by identifying inefficiencies and enabling smarter, data-driven decisions—without disrupting existing processes.
Role
We’re looking for a talented Data Engineer to design, build, and maintain the data infrastructure powering our AI-driven platform. You’ll be responsible for developing scalable data pipelines, integrating large datasets, and enabling real-time data processing to drive actionable insights and improve supply chain performance.
Key Skills & Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 7+ years of software engineering experience, with 4+ years working with Python.
- Strong experience with data modeling, ETL processes, and pipeline development.
- Proficiency in AWS, including designing architecture and building infrastructure from scratch.
- Experience with large-scale datasets and building data-driven solutions.
- Familiarity with TypeScript and optimization tools (e.g., Gurobi, CPLEX) is a plus.
- Strong problem-solving and analytical skills.
- Good communication skills in English (Upper-Intermediate or higher) are a must.
Key Responsibilities
- Design and Build Data Pipelines: Develop and maintain scalable ETL pipelines to process and integrate large datasets from various sources.
- Optimize Data Infrastructure: Design data models and storage solutions to support real-time data processing and machine learning applications.
- Data Integration: Connect data from multiple platforms (600+ integrations) to enable comprehensive insights across the business.
- Performance Monitoring: Monitor data pipeline performance and troubleshoot issues to ensure high availability and accuracy.
- Support Predictive Analytics: Collaborate with Data Scientists to provide clean, structured data for machine learning and forecasting models.
- Automation: Automate data workflows to enhance efficiency and reduce manual intervention.
- Security and Compliance: Ensure data integrity and security, following best practices for handling sensitive business information.
Details
- Tech Stack: React, Python, Django, Postgres, AWS, OpenAI
- Team: Global, distributed across the US and Europe; 5-person team
- Customers: 30+ clients across the construction, manufacturing, real estate, and aerospace industries
- Remote role; some working-hours overlap with the team is required for synchronization
- Contract: B2B, long-term partnership
Benefits
- Salary: Up to $10,000 gross, depending on experience
- PTO: 20 days vacation, 10 sick days, 11 public holidays
- Equipment: Provided
Interview Stages
- Prescreening Call (30 min)
- Intro Call with the team (30 min)
- Tech Interview (1 h)
Interested? Let’s connect and discuss the details!