# Run Odoo + AI Locally, Then Scale to Kubernetes

*Dev Guide · Odoo + AI*

A practical setup using Letta, pgvector, Skaffold, and Minikube, built from a real production repo.

**Stack:** Odoo 18 · Letta AI · Kubernetes · pgvector · Skaffold · Minikube

Integrating AI into Odoo is a real advantage: smarter search, automated workflows, natural-language queries over your business data. But most tutorials are either too abstract or throw Kubernetes at you from day one. This post walks through the exact dev setup I use: start local, iterate fast, then scale to Kubernetes when it matters.

## What We're Building

Three services, wired together inside a Kubernetes namespace:

    Odoo 18 ERP (port 8069)
            ↕
    PostgreSQL + pgvector — odoo DB + letta DB (port 5432)
            ↕
    ...
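In this architecture, pgvector is what lets PostgreSQL store embeddings next to the business data and answer nearest-neighbor queries. The query code comes later; as a minimal sketch of the metric pgvector's `<=>` operator uses (cosine distance), with toy record names and hand-written vectors standing in for real model embeddings:

```python
import math

def cosine_distance(a, b):
    """Same metric as pgvector's <=> operator: 1 - cosine similarity."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

# Toy "embeddings" for three records; real ones would come from an embedding model.
records = {
    "invoice_ACME": [0.9, 0.1, 0.0],
    "invoice_Beta": [0.1, 0.9, 0.1],
    "po_ACME":      [0.8, 0.2, 0.1],
}

def nearest(query, k=2):
    """Rough equivalent of: SELECT name FROM docs ORDER BY embedding <=> %s LIMIT k."""
    return sorted(records, key=lambda name: cosine_distance(query, records[name]))[:k]
```

Inside PostgreSQL the same lookup is a single `ORDER BY embedding <=> query LIMIT k` clause, which is why keeping embeddings in the same database as the Odoo records is so convenient.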
## Cost-Saving, Quality-Optimizing AI for Production ERP Intelligence

Small open-source LLMs are often dismissed as "not production-ready." That's simply not true. When used correctly, they can be incredibly effective, especially for tasks like document validation and attachment analysis, while saving significant costs compared to large proprietary models.

In my case, I'm leveraging open-source models to:

- Validate structured and unstructured documents
- Analyze Odoo data
- Provide controlled, task-specific intelligence

Are they production-ready? Yes, if you treat them like any other serious system:

Testing → Review → Production

No shortcuts.

From an infrastructure perspective, running these models on a GPU-backed server is essential; CPU-only setups won't deliver the performance needed for real-world use.

On top of that, I've built a flexible MCP-based architecture that supports:

- Multiple users
- Multiple Odoo instances
- Custom Odoo models

This makes the solution highly adaptable...
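The validation logic itself isn't shown in this post, but a cheap deterministic pre-check typically runs before any document reaches the LLM, so the model only handles the ambiguous cases. As a minimal sketch of that idea (the field names and rules below are hypothetical, not the repo's actual schema):

```python
# Hypothetical required fields for an incoming invoice document.
REQUIRED_FIELDS = {"partner_id", "amount_total", "invoice_date"}

def prevalidate(document: dict) -> list[str]:
    """Deterministic checks that run before the LLM sees the document.

    Returns a list of error strings; an empty list means the document
    passes the cheap checks and can be handed to the model.
    """
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - document.keys())]
    amount = document.get("amount_total")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("amount_total must be non-negative")
    return errors
```

Keeping rule-based checks in front of the model is one of the reasons small LLMs hold up here: they are only asked the questions that rules can't answer.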