Data & AI Engineer

- Build and deploy AI systems powered by LLMs and external/internal data sources
- Design and maintain RAG pipelines using embeddings, vector databases, and knowledge stores (see the sketch after this list)
- Develop AI agents and tools that automate tasks, support users, or enhance creativity
- Evaluate model performance across criteria like hallucination, usefulness, and latency
- Extend and improve our internal product LatitudeGPT, the AI interface layer for European businesses
- Work across open-source and hosted models (OpenAI, Claude, Mistral, Llama2/3, etc.)
- Integrate LangChain, Open WebUI, LiteLLM, N8N, and other orchestrators into custom solutions
- Deploy applications to cloud environments (primarily AWS using ECS, Lambda, and S3)
- Collaborate closely with technical and non-technical stakeholders across our team and client orgs
- Build analytical tools and data infrastructure that make data reliably available for analysis
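
To give a flavour of the RAG pipeline work listed above, here is a minimal sketch of a retrieval step: embed a handful of documents, rank them by cosine similarity against a question, and ground the model's answer in the top match. It assumes the OpenAI Python SDK with an API key in the environment; the model names are illustrative, and the in-memory NumPy store is a stand-in for a vector database such as Qdrant or Weaviate, not a reflection of Latitude's actual stack.

```python
# Minimal RAG sketch: embed, retrieve by cosine similarity, answer with context.
# Assumes openai>=1.0 and OPENAI_API_KEY set in the environment.
import numpy as np
from openai import OpenAI

client = OpenAI()

documents = [
    "LatitudeGPT is an internal AI interface for company data.",
    "The Marketing Control Room combines data, models, and insights.",
]

def embed(texts: list[str]) -> np.ndarray:
    """Embed a batch of texts and return an (n, d) array of vectors."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in resp.data])

doc_vectors = embed(documents)

def answer(question: str, top_k: int = 1) -> str:
    """Retrieve the most similar documents and ground the LLM answer in them."""
    q_vec = embed([question])[0]
    # Cosine similarity between the question and each document vector.
    scores = doc_vectors @ q_vec / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q_vec)
    )
    context = "\n".join(documents[i] for i in scores.argsort()[::-1][:top_k])
    chat = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return chat.choices[0].message.content

print(answer("What is LatitudeGPT?"))
```
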
This is where we start
- 2+ years of experience in backend, data engineering, or ML roles, with at least 1-2 years working directly with LLMs or AI applications
- Hands-on experience with cloud deployment (AWS preferred), including containers or serverless services
- Experience setting up data pipelines (ETL/ELT)
- Strong Python development skills, especially in frameworks like FastAPI, LangChain, or equivalent
- Experience building production-grade LLM pipelines (RAG, embeddings, vector stores like Qdrant or Weaviate)
- Familiarity with agentic AI tools (LangGraph, CrewAI, N8N, etc.) and workflow orchestration
- Understanding of LLM behavior evaluation and guardrail implementation (or strong motivation to learn and lead here)
- Comfort working independently and switching between client work and internal product development
- Startup-style mindset: resourceful, curious, iterative, and willing to wear multiple hats
Nice to Have
- Some basic frontend experience (React or Next.js)
- Familiarity with infrastructure-as-code tools (e.g., Terraform, Pulumi, AWS CDK)
- Experience integrating with third-party APIs, enterprise systems, or secure environments
- Prior work on internal tooling, chat interfaces, or workflow automation
- Contributions to open-source LLM/AI projects, or a personal GitHub with interesting experiments
Working conditions
- Work on both internal product development and high-impact client projects
- Help shape the core technical stack behind next-generation AI tooling
- Collaborate directly with co-founders and engineers
- Flexible hybrid setup: work from our Amsterdam HQ, located in the Amsterdam AI Hub
- A fast-moving, high-learning environment with real responsibility and room to grow
Working at Latitude Amsterdam
Latitude is a Data & AI consulting firm building AI Agents & Interfaces for Europe’s leading organisations. We help organisations go beyond just using ChatGPT: our goal is to infuse AI into the core workflows, systems, and decisions that define their business.
Depending on the use case, this can mean building a retrieval-augmented knowledge system, designing agent workflows to automate internal tasks, or prototyping new AI-first products. The right AI tool depends on the business need, and we make sure it's done right, end-to-end. We also integrate classic analytics solutions, from the right data infrastructure to machine learning models in production.
We also offer our own internal products:
- LatitudeGPT, a secure, company-specific AI interface that lets teams interact with internal data, systems, and processes through natural language while maintaining full control over privacy, infrastructure, and costs.
- The Marketing Control Room, a central, causal-AI/ML-driven intelligence hub that brings together data, models, and insights to help marketing teams make faster, smarter decisions across channels and campaigns, fully integrated and always actionable.
Erich Salomonstraat 1
1018 SC Amsterdam
The Netherlands