# PROJECT_CONTEXT.md

## Project Overview
LocalFoodAI is a locally hosted AI assistant that provides complete nutritional information about foods and can generate menu proposals from user specifications. It runs entirely on a local Ubuntu 24.04 VM (8 vCPU, 30 GB RAM, no GPU); no user data leaves the server. The backend is written in Python.
## Tech Stack
- Operating System: Ubuntu 24.04 (VM)
- Backend: Python 3.11+
- Database: SQLite (local, no cloud)
- Local LLM: Qwen 3.5 9B (quantized via Ollama, Q4_K_M or equivalent)
  - CPU-only compatible
  - Fits in 30 GB RAM with quantization
  - Instruction-following tuned
  - Open-source license (compatible with student projects)
- Local Web Search Tool: SearXNG (fully local, anonymous)
- Version Control: Git via Gogs on git.btshub.lu
- CI / Deployment: Antigravity Agent Manager handles task execution
- LLM Hosting: Ollama local instance, no cloud APIs
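As an illustration of the CPU-only local setup, the backend can reach the Ollama instance over its default HTTP API on `localhost:11434`. This is a minimal sketch using only the standard library; the model tag is a placeholder for whichever quantized Qwen build is actually pulled locally:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_payload(prompt: str, model: str = "qwen-local") -> dict:
    # "qwen-local" is a placeholder tag; substitute the tag of the
    # quantized model actually pulled into the local Ollama instance.
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_llm(prompt: str) -> str:
    # Sends the prompt to the local Ollama server; nothing leaves the VM.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `stream` set to `False`, Ollama returns the full completion in a single JSON object, which keeps the backend logic simple for short nutrition queries.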
## Rules & Constraints
- No external APIs or cloud services for computation or data fetching
- All data and computation must remain on the local VM
- All commits must be traceable to a Taiga Task ID
- Antigravity must read this file before starting any task to avoid hallucinating cloud-based solutions
- Model and backend selection must fit VM constraints (CPU-only, RAM limit)
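In practice, the "no external APIs" rule means web lookups go through the local SearXNG instance listed in the tech stack. The sketch below assumes SearXNG is listening on port 8888 and that JSON output (`format=json`) is enabled in its `settings.yml`; adjust both to your instance:

```python
import json
import urllib.request
from urllib.parse import urlencode

SEARXNG_URL = "http://localhost:8888"  # assumed local SearXNG address


def build_search_url(query: str, base: str = SEARXNG_URL) -> str:
    # format=json must be allowed in SearXNG's settings.yml
    return f"{base}/search?{urlencode({'q': query, 'format': 'json'})}"


def local_search(query: str) -> list:
    # All traffic stays on the VM: the request goes to the local
    # SearXNG instance, which anonymizes any outbound queries itself.
    with urllib.request.urlopen(build_search_url(query)) as resp:
        return json.loads(resp.read()).get("results", [])
```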
## Best Practices
- Use quantized models for CPU efficiency
- Verify all AI-generated Python or database logic before approving commits
- Test database queries and prompt logic locally before integrating
- Attach all artifacts (Implementation Plans, task lists, browser recordings) to the corresponding Taiga task
- Always include the Taiga task ID (`TG-` prefix) in commit messages
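Database logic can be smoke-tested against an in-memory SQLite database before it touches the real one, which satisfies the "test locally before integrating" practice at zero cost. The `foods` table below is hypothetical; the real schema may differ:

```python
import sqlite3


def create_schema(conn: sqlite3.Connection) -> None:
    # Hypothetical minimal nutrition table for illustration only
    conn.execute(
        """CREATE TABLE IF NOT EXISTS foods (
               id INTEGER PRIMARY KEY,
               name TEXT NOT NULL UNIQUE,
               kcal_per_100g REAL NOT NULL
           )"""
    )


def kcal_for(conn: sqlite3.Connection, name: str):
    # Parameterized query: same shape as production code, safe from injection
    row = conn.execute(
        "SELECT kcal_per_100g FROM foods WHERE name = ?", (name,)
    ).fetchone()
    return row[0] if row else None


# Smoke test with an in-memory database -- nothing is written to disk
conn = sqlite3.connect(":memory:")
create_schema(conn)
conn.execute(
    "INSERT INTO foods (name, kcal_per_100g) VALUES (?, ?)", ("apple", 52.0)
)
assert kcal_for(conn, "apple") == 52.0
assert kcal_for(conn, "unknown") is None
```

Because `:memory:` databases vanish when the connection closes, this kind of check can run in CI on the VM without polluting the real SQLite file.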