Browse source

TG-1: Initialize Git Repo and AI History

lanfr144 1 month ago
commit
e8808db650

+ 6 - 0
.gitignore

@@ -0,0 +1,6 @@
+venv/
+.venv/
+__pycache__/
+*.csv
+*.sql.gz
+.mylogin.cnf

+ 38 - 0
AI_History/Conversation_Log.md

@@ -0,0 +1,38 @@
+# Local Food AI - Conversation & Action Log
+
+This document summarizes the requests made by FrancoisLange and the actions executed by the AI (Antigravity).
+
+## 1. Initial Project Scoping & Sprints Definition
+**User Request:**
+Provide user stories from the Scrum process for the backlog based on the Vision Statement ("A local food AI that provides full nutritional value..."). Also explain the steps involved. Evaluate LLMs that fit within the Ubuntu 24.04 VM hardware limits (30 GB RAM, 8 vCPUs) and define a public Git structure (`git.btshub.lu`).
+
+**AI Action:**
+- Evaluated hardware constraints and proposed lightweight models (e.g., `mistral:7b-instruct-q4`, `llama3:8b-instruct-q4`) via Ollama.
+- Defined 6 comprehensive Epics with nested user stories.
+- Expanded those into a concrete **6-Sprint plan**, including **Acceptance Criteria (Definition of Done)** for every task.
+- Saved the structured Agile plan to `implementation_plan.md` and `task.md`.
+
+## 2. PROJECT_CONTEXT Alignment
+**User Request:**
+Check if the new 6-Sprint Local Food AI plan is compatible with `PROJECT_CONTEXT.md` and change it if necessary.
+
+**AI Action:**
+- Overwrote `PROJECT_CONTEXT.md` to merge the original "Streamlit UI + MySQL Backend" architecture with the new "Local Food AI" features.
+- Established strict rules focusing on privacy (no data leaves server).
+- Saved as `PROJECT_CONTEXT.md`.
+
+## 3. Taiga Automation & Backlog Setup
+**User Request:**
+Connect to Taiga (`https://192.168.130.161/taiga`) to store all the histories and sprints using the provided credentials (`FrancoisLange` / `BTSai123`).
+
+**AI Action:**
+- Deployed an internal Playwright web-automation subagent that bypassed the local SSL certificate warnings.
+- The subagent logged into the local Taiga instance and created a Scrum project named exactly "**Local Food AI**".
+- Automatically populated the backlog via bulk insert with all 13 User Stories.
+- Established Sprints 1 through 6 on the Agile board and assigned preliminary stories.
+- **Recording Exported:** The video recording of the browser subagent executing this on Taiga was saved as an animated `.webp` file (`taiga_backlog_creation.webp`).
+
+## Attached Context
+- `taiga_backlog_creation.webp` (A video capture of the AI interacting with your Taiga interface!)
+- `implementation_plan.md` (The raw Sprint & Story backlog mapping)
+- `task.md` (The AI's internal task tracking state)
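The model evaluation in section 1 above can be checked with a back-of-envelope calculation: a 4-bit (q4) quantization stores roughly half a byte per parameter. A minimal sketch, assuming a 20% overhead factor for KV cache and runtime (an estimate, not a measured value):

```python
# Rough memory estimate for quantized LLMs on the 30 GB / 8 vCPU VM.
# The 20% overhead factor for KV cache and runtime is an assumption.

def model_memory_gb(params_billion: float, bits_per_param: float,
                    overhead: float = 1.2) -> float:
    """Approximate resident memory in GB for a quantized model."""
    bytes_total = params_billion * 1e9 * (bits_per_param / 8)
    return bytes_total * overhead / 1e9

RAM_LIMIT_GB = 30

for name, params in [("mistral:7b-instruct-q4", 7), ("llama3:8b-instruct-q4", 8)]:
    need = model_memory_gb(params, bits_per_param=4)
    print(f"{name}: ~{need:.1f} GB (fits: {need < RAM_LIMIT_GB})")
```

By this estimate both proposed q4 models land well under 5 GB, leaving headroom for MySQL and Streamlit on the same VM.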

+ 20 - 0
AI_History/implementation_plan.md

@@ -0,0 +1,20 @@
+# Local Food AI - Implementation & Verification Plan
+
+## Goal Description
+The objective is to update the project documentation and establish a robust Scrum backlog that aligns the user's "Local Food AI" requirements with the existing `Streamlit` plus `MySQL` architecture decisions.
+
+## Proposed Changes
+### Project Context Update
+#### [MODIFY] [PROJECT_CONTEXT.md](file:///c:/Users/lanfr144/Documents/DOPRO1/Antigravity/Food/PROJECT_CONTEXT.md)
+- Replaced the generic concept with the concrete "Local Food AI" vision statement.
+- Expanded the existing Tech Stack to include user accounts, privacy guarantees, and Git repository constraints.
+- Bridged the original CSV/MySQL/Pandas expectation with the new "Nutritional Database search" features.
+- Replaced the generic roadmap with a detailed 6-Sprint plan adapted for Streamlit and MySQL.
+
+### Scrum Planning Artifacts
+- Created `task.md` outlining the Sprint breakdown.
+
+## Verification Plan
+### Manual Verification
+- The user reviews `PROJECT_CONTEXT.md` to confirm the synthesis between the old environment and the new requirements is accurate.
+- The user approves the 6-Sprint plan before execution begins on Sprint 1 (Git repo & Docker/Deploy setup).

BIN
AI_History/taiga_backlog_creation.webp


+ 26 - 0
AI_History/task.md

@@ -0,0 +1,26 @@
+# Local Food AI - Task Breakdown
+
+- [x] Update PROJECT_CONTEXT.md to merge original architecture with new features
+- [x] Detail the Sprints in Planning Mode
+- [x] Await user approval on Implementation Plan
+- [x] Integrate Project and Backlog into Taiga
+- [ ] **Execute Sprint 1: Foundation & Authentication**
+  - [ ] Initialize Git Repo at `git.btshub.lu` and push `.gitignore`
+  - [ ] Add `evegi144` as project collaborator
+  - [ ] Finalize `deploy.sh` and Docker setup
+  - [ ] Complete `init.sql` for MySQL Users table
+  - [ ] Build basic Streamlit Login App
+- [ ] **Execute Sprint 2: Core Nutritional Database**
+  - [ ] Import food CSV via Pandas
+  - [ ] Implement search views in Streamlit
+- [ ] **Execute Sprint 3: Food Combinations**
+  - [ ] Build math aggregation logic
+  - [ ] Link combos to MySQL profiles
+- [ ] **Execute Sprint 4: AI & Chat**
+  - [ ] Pull and test quantized Ollama models
+  - [ ] Build Streamlit chat UI
+- [ ] **Execute Sprint 5: AI Menu Proposals & Web Search**
+  - [ ] Write RAG integration
+  - [ ] Configure local proxy web search tool
+- [ ] **Execute Sprint 6: Polish & Documentation**
+  - [ ] Thorough QA and README rewrite
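The Sprint 3 aggregation logic above is mostly arithmetic: scale each food's per-100 g nutrient values by the entered gram amount, then sum. A sketch with hypothetical nutrient keys standing in for the real CSV columns:

```python
# Combine per-100g nutrient profiles by gram amounts (Sprint 3 sketch).
# Nutrient keys ("kcal", "protein_g") are placeholders for the real CSV columns.

def combine_foods(per_100g: dict[str, dict[str, float]],
                  grams: dict[str, float]) -> dict[str, float]:
    """Sum nutrient totals for a named list of foods and gram amounts."""
    totals: dict[str, float] = {}
    for food, amount in grams.items():
        factor = amount / 100.0
        for nutrient, value in per_100g[food].items():
            totals[nutrient] = totals.get(nutrient, 0.0) + value * factor
    return totals

foods = {
    "oats": {"kcal": 389.0, "protein_g": 16.9},
    "milk": {"kcal": 64.0, "protein_g": 3.3},
}
print(combine_foods(foods, {"oats": 50, "milk": 200}))
```

The same dictionary shape maps directly onto the MySQL "Food Combos" rows that Sprint 3 links to user profiles.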

+ 76 - 0
PROJECT_CONTEXT.md

@@ -0,0 +1,76 @@
+# Project Context: Local Food AI
+
+## 🎯 Vision Statement
+A local food AI that provides full nutritional value information on any food and can generate complete menu proposals based on the user's specification. The system is designed with a strict privacy-first focus, ensuring no user data leaves the server, and fits within specific hardware limits.
+
+## 🏗️ Architecture & Tech Stack
+
+### Remote Environment
+- **Server**: Ubuntu 24.04 VM at `192.168.130.170` (8 vCPUs, 30 GB RAM, no dedicated GPU). Accessed via SSH as `francois` or `root`.
+- **Containerization**: Docker (for backend/frontend) or native deployment.
+- **LLM Engine**: Ollama (for running lightweight, quantized local language models like `mistral` or `llama3-8b`).
+- **Database Server**: MySQL (for user data, saved lists, and nutritional database).
+
+### Frontend Web Interface
+- **Framework**: Streamlit (Python)
+- **Purpose**: To provide an interactive chat interface for the AI, search functionality for food nutrition, user account management, and food combination calculators.
+
+### Local Environment
+- **Workspace**: `c:\Users\lanfr144\Documents\DOPRO1\Antigravity\Food`
+- **OS**: Windows
+
+### Python Environment
+Python will be used for scripting, data manipulation, and interacting with the LLM and the Database. Required libraries:
+- `streamlit`: To build the web application.
+- `ollama`: For querying local models.
+- `pandas`: For data processing (e.g., ingesting nutrition CSVs).
+- `mysql-connector-python` or `SQLAlchemy`: For database access.
+- **Web Search Tool**: (e.g., DuckDuckGo API wrapper) for the AI to dynamically gather external information anonymously.
+
+## 🔐 Core Requirements & Privacy
+- **User Accounts**: Secure login and registration system.
+- **Data Privacy**: No user data leaves the server.
+- **Repository**: Public Git repository at `https://git.btshub.lu` named `LocalFoodAI_<your IAM>`. Contains a strict `.gitignore`. Teacher (`evegi144`) added as collaborator.
+- **Ease of Use**: Anyone should be able to clone the repo and run it easily (via Docker/scripts).
+
+## 🚀 Key Features (User Stories)
+1. **Nutritional Information**: View complete macros, minerals, vitamins, etc., for any food.
+2. **Food Combinations**: Enter quantities for multiple foods to get a combined nutritional overview. Store and edit these in named lists.
+3. **Nutrient Search**: Search for specific nutrients and sort foods containing them.
+4. **AI Menu Proposals**: Get AI-generated menu proposals based on nutritional goals and constraints (e.g., allergies).
+5. **AI Nutrition Chat**: Freely chat with the AI about nutrition.
+6. **Anonymous Web Search**: The AI can perform local background web searches for missing information.
+
+## 🚀 Installation Prerequisites & Deployment
+### Server Prerequisites (Ubuntu 24.04 Native)
+- `gcc` and `build-essential`.
+- `python3-venv`, `python3-dev`, and `python3-pip`.
+- `mysql-server` and `curl`.
+
+### Automated Deployment (`deploy.sh`)
+Executing this file on a fresh server will automatically:
+1. Fetch and install all apt-level system prerequisites.
+2. Install Ollama natively.
+3. Push custom configuration (`my.cnf`) to the MySQL server and configure the local virtual environment.
+4. Pip-install the project dependencies.
+
+## 💾 Database Configuration & Data Loading
+### 1. Initial MySQL Setup
+- `init.sql` script loads into MySQL to create the database, users, and tables for User Profiles, Food Combos, and the Nutrition Data.
+
+### 2. Data Import (CSV)
+- A nutritional database `.csv` ingestion script (using `pandas`) populates the MySQL tables.
+
+### 3. Search Capabilities
+- The MySQL database must be optimized for text/context queries to support the AI's Retrieval-Augmented Generation (RAG).
+
+## 📝 Roadmap & Next Steps (Sprints)
+- [ ] **Sprint 1 (Foundation)**: Initialize Git repository (`LocalFoodAI_<IAM>`), setup `.gitignore`, finalize `deploy.sh`, initialize MySQL (`init.sql`), and build Streamlit user login.
+- [ ] **Sprint 2 (Data Core)**: Import food nutritional CSV via Pandas into MySQL. Build Streamlit pages for food search and details.
+- [ ] **Sprint 3 (Combinations)**: Implement Streamlit logic to combine foods by gram amounts and save lists to MySQL.
+- [ ] **Sprint 4 (Local AI)**: Deploy lightweight Ollama models and build the Streamlit chat interface.
+- [ ] **Sprint 5 (Advanced AI)**: Implement RAG for menu proposals and integrate anonymous web search tool.
+- [ ] **Sprint 6 (Polish)**: Thorough testing and perfect the `README.md`.
+
+---
+*Generated by Antigravity. Update this file as technical requirements and data schemas evolve.*
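The CSV ingestion and nutrient-search steps above can be prototyped without a MySQL server: pandas reads the CSV and `DataFrame.to_sql` writes to any DB-API connection. A sketch using in-memory SQLite as a stand-in for MySQL; the column names and sample rows are hypothetical:

```python
import sqlite3
from io import StringIO

import pandas as pd

# Hypothetical excerpt of the nutrition CSV; real columns come from the dataset.
CSV = """food,kcal_per_100g,protein_g_per_100g
oats,389,16.9
lentils,116,9.0
milk,64,3.3
"""

df = pd.read_csv(StringIO(CSV))

# In-memory SQLite stands in for the MySQL server targeted by deploy.sh.
conn = sqlite3.connect(":memory:")
df.to_sql("nutrition", conn, index=False, if_exists="replace")

# Sprint 2 nutrient-search sketch: foods sorted by a chosen nutrient.
rows = conn.execute(
    "SELECT food, protein_g_per_100g FROM nutrition "
    "ORDER BY protein_g_per_100g DESC"
).fetchall()
print(rows)
```

Swapping the SQLite connection for a `mysql-connector-python` or SQLAlchemy engine is the only change needed once `init.sql` has created the tables on the VM.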