Our client is building an AI-driven learning platform that transforms technical documentation (PDFs, presentations, manuals) into interactive courses with automated video generation and AI tutoring. The platform serves the pharmaceutical, manufacturing, and R&D sectors, where subject matter experts need to convert complex technical content into training materials without relying on traditional learning and development teams.
We are looking for a Senior Full-Stack Engineer with strong backend expertise to own core backend services and agentic AI workflow integration. You’ll architect and implement event-driven systems that orchestrate multi-stage AI pipelines (document extraction, content structuring, video generation), integrate external AI services, and build reliable job processing with checkpointing and error recovery.
This role requires strong full-stack development skills with a deep focus on the backend, hands-on experience in building or integrating agent-based workflows, and familiarity with LLM-based systems. You’ll work with a team to ship a working proof-of-concept in 2 months, followed by MVP expansion over 4 months.
Project stack:
- Backend: NestJS, Node.js, TypeScript
- Database: PostgreSQL (Supabase), Row-Level Security (RLS)
- AI/LLM: OpenAI API, LangChain/LangGraph, Arize AI/Phoenix
- Job Queues: Redis/BullMQ or AWS SQS
- Storage: AWS S3 with signed URLs
- Email: Resend or SendGrid
- Error Monitoring: Sentry
- Analytics: Mixpanel, Amplitude, or PostHog
- Infrastructure: AWS (Lambda, ECS Fargate, CloudFront), Docker, Terraform/Ansible
- CI/CD: GitHub Actions
- Observability: Sentry, CloudWatch
Team composition: Architect, Software Engineer, Designer, QA Engineer, DevOps Engineer, BA, Delivery Manager
Project highlights:
- Cutting-edge agentic AI application combining multiple AI services into cohesive workflows
- Real-world impact, helping technical experts create accessible learning content
- Opportunity to work with OpenAI, Anthropic, and specialized AI services (document extraction, video generation)
- Greenfield project with modern tech stack and architectural freedom
What you’ll work on:
- Agentic workflow integration: Design and implement multi-stage workflows (document extraction → course structuring → content generation → video creation) using LangChain/LangGraph or similar orchestration frameworks
- Backend API development: Build NestJS REST APIs with PostgreSQL, implementing authentication, RBAC, and row-level security for multi-tenant data isolation
- External service integrations: Integrate Landing AI (document extraction from PDF, DOCX, PPTX, images), OpenAI and Anthropic (LLM generation), Synthesia (video generation), Arize AI (LLM evaluation/tracing), and Resend/SendGrid (transactional emails)
- Job processing: Implement asynchronous task queues using Redis/BullMQ or AWS SQS with idempotent processing, DLQ patterns, and state management
- Data architecture: Design database schemas with strong referential integrity, audit trails, and citation tracking for generated content
- Frontend collaboration: Contribute to Next.js/React development when needed to support full-stack feature delivery (working knowledge required, not primary focus)
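The checkpointed multi-stage workflow mentioned above can be sketched in a few lines of TypeScript. This is an illustrative outline only, not the platform's actual API: the stage names, `Checkpoint` shape, and handler signatures are assumptions, and in production the checkpoint would be persisted (e.g. to PostgreSQL or a LangGraph checkpointer) rather than held in memory.

```typescript
// Illustrative stages of the pipeline; the real names may differ.
type Stage = "extract" | "structure" | "generate" | "video";

const STAGES: Stage[] = ["extract", "structure", "generate", "video"];

interface Checkpoint {
  completed: Stage[];
  artifacts: Record<string, unknown>;
}

// Each stage handler receives the artifacts produced so far and returns its output.
type StageFn = (artifacts: Record<string, unknown>) => Promise<unknown>;

async function runPipeline(
  handlers: Record<Stage, StageFn>,
  checkpoint: Checkpoint = { completed: [], artifacts: {} }
): Promise<Checkpoint> {
  for (const stage of STAGES) {
    // Skip stages already recorded in the checkpoint, so a crashed run
    // resumes where it left off instead of repeating completed work.
    if (checkpoint.completed.includes(stage)) continue;
    checkpoint.artifacts[stage] = await handlers[stage](checkpoint.artifacts);
    checkpoint.completed.push(stage);
    // A real implementation would persist the checkpoint here, after every stage.
  }
  return checkpoint;
}
```

Resuming simply means passing back the last persisted checkpoint; only the missing stages run again.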
Scope of tasks and ownership:
- Build backend services and APIs that orchestrate multi-step workflows with checkpointing and error recovery.
- Design and implement agentic systems for course generation, chatbot functionality, and content processing.
- Operationalize prompts and LLM workflows designed by client-side AI and instructional design experts.
- Integrate third-party APIs (LLM providers, video generation) with proper rate limiting, retry logic, and cost monitoring.
- Implement secure authentication flows, RBAC policies, and tenant isolation using Supabase Auth and RLS.
- Write database migrations, design efficient schemas, and optimize queries for performance.
- Contribute to frontend development when needed to support full-stack features.
- Integrate application-level monitoring (Sentry) and analytics tools (Mixpanel/Amplitude/PostHog) with backend services.
- Participate in architecture decisions, code reviews, and sprint planning.
- Collaborate with the solution architect on technical design decisions.
- Work with DevOps engineer on deployment requirements and observability tooling.
- Partner with a frontend-focused engineer on API contracts, data models, and full-stack feature integration.
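The "proper rate limiting and retry logic" called for above usually boils down to a small, reusable wrapper. Here is a minimal sketch of retry with exponential backoff and jitter; the option names, limits, and the caller-supplied retryable-error check are assumptions for illustration, not a prescribed implementation.

```typescript
interface RetryOptions {
  maxAttempts: number; // total attempts, including the first call
  baseDelayMs: number; // delay before the second attempt
}

async function withRetry<T>(
  call: () => Promise<T>,
  isRetryable: (err: unknown) => boolean,
  { maxAttempts, baseDelayMs }: RetryOptions
): Promise<T> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await call();
    } catch (err) {
      // Give up on non-retryable errors (e.g. 4xx validation failures)
      // or once the final attempt has been used.
      if (!isRetryable(err) || attempt >= maxAttempts) throw err;
      // Exponential backoff with jitter: base, 2x, 4x, ... scaled by a random factor.
      const delay = baseDelayMs * 2 ** (attempt - 1) * (0.5 + Math.random());
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

In practice this wrapper would sit around LLM and video-generation API calls, with `isRetryable` keyed to rate-limit (429) and transient server (5xx) responses.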
What you’ll need:
- 5+ years of full-stack development experience with a strong backend focus (Node.js, TypeScript).
- Experience using AI-powered productivity tools (Cursor, VS Code with Copilot, or similar AI-enhanced IDEs) and LLMs for research and problem-solving (ChatGPT, Claude, etc.).
- Experience building or integrating agentic workflows or LLM-powered applications.
- Hands-on experience with LangChain, LangGraph, or similar orchestration frameworks.
- Familiarity with OpenAI and/or Anthropic APIs, prompt engineering, and structured output validation.
- Working knowledge of React and Next.js (able to contribute to full-stack features when needed).
- Solid understanding of PostgreSQL (schema design, migrations, query optimization).
- Experience with event-driven architectures, job queues (Redis/BullMQ, AWS SQS), and asynchronous processing.
- Familiarity with NestJS or similar backend frameworks (Express, Fastify).
- Experience with authentication, authorization, and multi-tenant data isolation patterns.
- Strong problem-solving skills and ability to work autonomously with minimal oversight.
- Clear written and spoken English for async collaboration and documentation.
Nice to have:
- Experience with Supabase (Auth, RLS, Realtime).
- Familiarity with LLM observability tools (Arize AI, Phoenix, LangSmith).
- Experience designing systems for multi-LLM provider support.
- Experience with document processing pipelines or OCR integration.
- Knowledge of video generation APIs (Synthesia or similar).
- Familiarity with error monitoring (Sentry) and product analytics tools (Mixpanel, Amplitude, PostHog).
- Understanding of prompt engineering principles and best practices.
- Experience with evaluation frameworks for LLM-generated content.
- Background in EdTech, content generation, or learning platforms.
- Familiarity with Tailwind CSS and modern UI development patterns.
- GitHub-based workflows and conventional commits.
Our benefits:
- No micromanagement
- Freedom to engage in decision-making and implementation
- Ability to work in a team of professionals (80% of specialists are middle-level or above)
- Participation in the development of high-quality products
- Direct communication with clients on a partnership level
- Professional development opportunities ($600 education budget, well-managed processes, communities, internal library)
- Health insurance
- $600 extra for health care, sports, or mental health
- 20 paid working days off and 10 days of sick leave
- Opportunity to work remotely
- Heartfelt team-building activities and corporate events
Join us and be among those who care!