
TG-9: Basic chat interface implementation

FerRo988 committed 1 week ago
Commit ce4d0cc3e4
6 changed files with 638 additions and 0 deletions
  1. +32  -0  PROJECT_CONTEXT.md
  2. +72  -0  main.py
  3. +3   -0  requirements.txt
  4. +50  -0  static/index.html
  5. +192 -0  static/script.js
  6. +289 -0  static/style.css

+ 32 - 0
PROJECT_CONTEXT.md

@@ -0,0 +1,32 @@
+# PROJECT_CONTEXT.md
+
+## Project Overview
+LocalFoodAI is a fully local food assistant that provides detailed nutritional information about foods and generates menu proposals based on user specifications. It runs entirely on a local Ubuntu 24.04 VM (8 vCPU, 30 GB RAM, no GPU). No user data leaves the server. The backend is Python-based.
+
+## Tech Stack
+- **Operating System:** Ubuntu 24.04 (VM)
+- **Backend:** Python 3.11+
+- **Database:** SQLite (local, no cloud)
+- **Local LLM:** Llama 3.1 8B (quantized via Ollama, Q4_K_M or equivalent)
+  - CPU-only compatible
+  - Fits in 30 GB RAM with quantization
+  - Instruction-following tuned
+  - Open-source license (compatible with student projects)
+- **Local Web Search Tool:** SearXNG (fully local, anonymous)
+- **Version Control:** Git via Gogs on git.btshub.lu
+- **CI / Deployment:** Antigravity Agent Manager handles task execution
+- **LLM Hosting:** Ollama local instance, no cloud APIs
+
+## Rules & Constraints
+- **No external APIs or cloud services** for computation or data fetching
+- **All data and computation must remain on the local VM**
+- **All commits must be traceable to a Taiga Task ID**
+- Antigravity must **read this file before starting any task** to avoid hallucinating cloud-based solutions
+- Model and backend selection must fit **VM constraints** (CPU-only, RAM limit)
+
+## Best Practices
+- Use quantized models for CPU efficiency
+- Verify all AI-generated Python or database logic before approving commits
+- Test database queries and prompt logic locally before integrating
+- Attach all artifacts (Implementation Plans, task lists, browser recordings) to the corresponding Taiga task
+- Always include the TG-<ID> prefix in commit messages
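The TG-&lt;ID&gt; prefix rule above is easy to automate. A hypothetical `commit-msg` hook sketch (the helper name and regex are illustrative, not part of this commit):

```python
import re

# Illustrative check for the "TG-<ID> prefix" rule from PROJECT_CONTEXT.md.
# Assumes messages shaped like "TG-9: Basic chat interface implementation".
TASK_PREFIX = re.compile(r"^TG-\d+: ")

def has_task_prefix(commit_message: str) -> bool:
    """Return True if the message starts with a Taiga task ID prefix."""
    return bool(TASK_PREFIX.match(commit_message))

print(has_task_prefix("TG-9: Basic chat interface implementation"))  # True
print(has_task_prefix("fix typo"))  # False
```

Wired into `.git/hooks/commit-msg`, this would reject untraceable commits before they reach Gogs.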

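The quantization constraint stated in PROJECT_CONTEXT.md can be sanity-checked with back-of-envelope arithmetic (illustrative only; real memory use adds KV cache and runtime overhead):

```python
# Rough weight-memory estimate for a quantized model. Q4_K_M averages
# roughly 4.5 bits per weight (an approximation; it varies by layer mix).
def approx_weight_size_gib(n_params: float, bits_per_weight: float) -> float:
    """Approximate in-RAM weight size in GiB: params * bits / 8 bytes."""
    return n_params * bits_per_weight / 8 / 1024**3

size = approx_weight_size_gib(8e9, 4.5)
print(f"~{size:.1f} GiB of weights")  # comfortably within the VM's 30 GB RAM
```

An 8B model at ~4.5 bits per weight needs on the order of 4 GiB for weights, which is why CPU-only inference fits this VM.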
+ 72 - 0
main.py

@@ -0,0 +1,72 @@
+import json
+import logging
+import httpx
+from fastapi import FastAPI, HTTPException
+from fastapi.responses import HTMLResponse, StreamingResponse
+from fastapi.staticfiles import StaticFiles
+from pydantic import BaseModel
+from typing import List
+
+logging.basicConfig(level=logging.INFO)
+logger = logging.getLogger(__name__)
+
+app = FastAPI(title="LocalFoodAI Chat")
+
+OLLAMA_URL = "http://localhost:11434/api/chat"
+MODEL_NAME = "llama3.1:8b"
+
+# Mount static files to serve the frontend
+app.mount("/static", StaticFiles(directory="static"), name="static")
+
+class ChatMessage(BaseModel):
+    role: str
+    content: str
+
+class ChatRequest(BaseModel):
+    messages: List[ChatMessage]
+
+@app.get("/", response_class=HTMLResponse)
+async def read_root():
+    """Serve the chat interface HTML"""
+    try:
+        with open("static/index.html", "r", encoding="utf-8") as f:
+            return HTMLResponse(content=f.read())
+    except FileNotFoundError:
+        return HTMLResponse(content="<h1>Welcome to LocalFoodAI</h1><p>static/index.html not found. Please create the frontend.</p>")
+
+@app.post("/chat")
+async def chat_endpoint(request: ChatRequest):
+    """Proxy chat requests to the local Ollama instance with streaming support"""
+    payload = {
+        "model": MODEL_NAME,
+        "messages": [msg.model_dump() for msg in request.messages],
+        "stream": True  # Enable streaming for a better UI experience
+    }
+    
+    async def generate_response():
+        try:
+            async with httpx.AsyncClient() as client:
+                async with client.stream("POST", OLLAMA_URL, json=payload, timeout=120.0) as response:
+                    if response.status_code != 200:
+                        error_detail = await response.aread()
+                        logger.error(f"Error communicating with Ollama: {error_detail}")
+                        yield f"data: {json.dumps({'error': 'Error communicating with local LLM.'})}\n\n"
+                        return
+
+                    async for line in response.aiter_lines():
+                        if not line:
+                            continue
+                        try:
+                            data = json.loads(line)
+                        except json.JSONDecodeError:
+                            logger.warning(f"Skipping malformed stream line: {line!r}")
+                            continue
+                        if "message" in data and "content" in data["message"]:
+                            content = data["message"]["content"]
+                            yield f"data: {json.dumps({'content': content})}\n\n"
+                        if data.get("done"):
+                            break
+        except Exception as e:
+            logger.error(f"Unexpected error during stream: {e}")
+            yield f"data: {json.dumps({'error': str(e)})}\n\n"
+
+    return StreamingResponse(generate_response(), media_type="text/event-stream")
+
+if __name__ == "__main__":
+    import uvicorn
+    uvicorn.run("main:app", host="127.0.0.1", port=8000, reload=True)
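The endpoint frames each JSON chunk as an SSE `data:` event terminated by a blank line. That framing can be exercised in isolation; a sketch with hypothetical helpers (mirroring, not importing, the app code) that also shows why a consumer should buffer across reads:

```python
import json

def to_sse(obj: dict) -> str:
    """Frame a dict the way the /chat endpoint does: 'data: <json>\\n\\n'."""
    return f"data: {json.dumps(obj)}\n\n"

def parse_sse(buffer: str) -> tuple[list[dict], str]:
    """Split complete frames off the buffer; return (events, unparsed remainder)."""
    frames = buffer.split("\n\n")
    remainder = frames.pop()  # the last piece may be a partial frame
    events = [json.loads(f[len("data: "):]) for f in frames if f.startswith("data: ")]
    return events, remainder

stream = to_sse({"content": "Hel"}) + to_sse({"content": "lo"})
first, rest = parse_sse(stream[:15])        # a read that cuts a frame in half
second, _ = parse_sse(rest + stream[15:])   # buffering recovers both events
print(first, [e["content"] for e in second])  # [] ['Hel', 'lo']
```

Any client of this endpoint should carry the unparsed remainder into the next read rather than parse each network chunk independently.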

+ 3 - 0
requirements.txt

@@ -0,0 +1,3 @@
+fastapi>=0.100.0
+uvicorn>=0.23.0
+httpx>=0.24.0

+ 50 - 0
static/index.html

@@ -0,0 +1,50 @@
+<!DOCTYPE html>
+<html lang="en">
+<head>
+    <meta charset="UTF-8">
+    <meta name="viewport" content="width=device-width, initial-scale=1.0">
+    <title>LocalFoodAI Chat</title>
+    <meta name="description" content="LocalFoodAI Assistant for Nutritional Information and Menu Proposals">
+    <!-- No external font CDN: the local-only rule in PROJECT_CONTEXT.md forbids outside requests; 'Inter' falls back to the system font stack defined in style.css -->
+    <link rel="stylesheet" href="/static/style.css">
+</head>
+<body>
+    <div class="app-container">
+        <header class="chat-header">
+            <div class="brand">
+                <div class="logo">🍳</div>
+                <div>
+                    <h1>LocalFoodAI</h1>
+                    <span class="status-indicator"></span><span class="status-text">Local LLM Ready</span>
+                </div>
+            </div>
+            <div class="actions">
+                <button id="clear-chat" title="Clear Chat">
+                    <svg width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2"><path d="M3 6h18M19 6v14a2 2 0 01-2 2H7a2 2 0 01-2-2V6m3 0V4a2 2 0 012-2h4a2 2 0 012 2v2"></path></svg>
+                </button>
+            </div>
+        </header>
+        
+        <main class="chat-container" id="chat-container">
+            <div class="message system">
+                <div class="avatar">🤖</div>
+                <div class="message-content">
+                    <p>Hello! I am LocalFoodAI, your completely local nutrition and menu assistant. How can I help you today?</p>
+                </div>
+            </div>
+        </main>
+
+        <footer class="chat-input-area">
+            <form id="chat-form" class="input-form">
+                <textarea id="user-input" placeholder="Ask about recipes, nutrition, menus..." rows="1" required></textarea>
+                <button type="submit" id="send-btn" class="send-btn" aria-label="Send message">
+                    <svg width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"><line x1="22" y1="2" x2="11" y2="13"></line><polygon points="22 2 15 22 11 13 2 9 22 2"></polygon></svg>
+                </button>
+            </form>
+            <div class="footer-note">Powered by Llama 3.1 8B running locally on Ubuntu 24.04 via Ollama</div>
+        </footer>
+    </div>
+    
+    <script src="/static/script.js"></script>
+</body>
+</html>

+ 192 - 0
static/script.js

@@ -0,0 +1,192 @@
+document.addEventListener('DOMContentLoaded', () => {
+    const chatForm = document.getElementById('chat-form');
+    const userInput = document.getElementById('user-input');
+    const chatContainer = document.getElementById('chat-container');
+    const sendBtn = document.getElementById('send-btn');
+    const clearChatBtn = document.getElementById('clear-chat');
+
+    let chatHistory = []; // Store conversation history
+
+    // Auto-resize textarea
+    userInput.addEventListener('input', function() {
+        this.style.height = 'auto';
+        this.style.height = (this.scrollHeight > 150 ? 150 : this.scrollHeight) + 'px';
+        sendBtn.disabled = this.value.trim() === '';
+    });
+
+    // Handle Enter key (Shift+Enter for new line)
+    userInput.addEventListener('keydown', function(e) {
+        if (e.key === 'Enter' && !e.shiftKey) {
+            e.preventDefault();
+            if (!sendBtn.disabled) {
+                chatForm.requestSubmit();
+            }
+        }
+    });
+
+    clearChatBtn.addEventListener('click', () => {
+        if (confirm('Are you sure you want to clear the chat history?')) {
+            chatHistory = [];
+            chatContainer.innerHTML = '';
+            addMessage('system', 'Hello! I am LocalFoodAI, your completely local nutrition and menu assistant. How can I help you today?');
+        }
+    });
+
+    chatForm.addEventListener('submit', async (e) => {
+        e.preventDefault();
+        const message = userInput.value.trim();
+        if (!message) return;
+
+        // Reset input
+        userInput.value = '';
+        userInput.style.height = 'auto';
+        sendBtn.disabled = true;
+
+        // Add user message to UI
+        addMessage('user', message);
+        chatHistory.push({ role: 'user', content: message });
+
+        // Add loading indicator
+        const loadingId = addTypingIndicator();
+
+        try {
+            // Fetch response from backend
+            const response = await fetch('/chat', {
+                method: 'POST',
+                headers: { 'Content-Type': 'application/json' },
+                body: JSON.stringify({ messages: chatHistory })
+            });
+
+            if (!response.ok) {
+                throw new Error(`HTTP error! status: ${response.status}`);
+            }
+
+            // Remove loading indicator
+            removeElement(loadingId);
+
+            // Create new bot message container
+            const botMessageId = 'msg-' + Date.now();
+            const botContentEl = addMessage('system', '', botMessageId);
+
+            let botFullResponse = '';
+
+            // Handle Server-Sent Events (Streaming)
+            const reader = response.body.getReader();
+            const decoder = new TextDecoder('utf-8');
+            let done = false;
+            let buffer = '';
+
+            while (!done) {
+                const { value, done: readerDone } = await reader.read();
+                done = readerDone;
+                if (value) {
+                    buffer += decoder.decode(value, { stream: true });
+                    // SSE messages are separated by a blank line; a message can be
+                    // split across reads, so keep any trailing partial in the buffer
+                    const lines = buffer.split('\n\n');
+                    buffer = lines.pop();
+                    for (const line of lines) {
+                        if (line.startsWith('data: ')) {
+                            const dataStr = line.substring(6);
+                            if (dataStr.trim() === '') continue;
+                            try {
+                                const data = JSON.parse(dataStr);
+                                if (data.error) {
+                                    botContentEl.innerHTML += `<br><span style="color:#f85149">Error: ${formatText(data.error)}</span>`;
+                                } else if (data.content !== undefined) {
+                                    botFullResponse += data.content;
+                                    // Basic text to HTML conversion
+                                    botContentEl.innerHTML = formatText(botFullResponse);
+                                    chatContainer.scrollTop = chatContainer.scrollHeight;
+                                }
+                            } catch (err) {
+                                console.error('Error parsing SSE data:', err, dataStr);
+                            }
+                        }
+                    }
+                }
+            }
+
+            // Save bot response to history once complete
+            chatHistory.push({ role: 'assistant', content: botFullResponse });
+
+        } catch (error) {
+            console.error('Chat error:', error);
+            removeElement(loadingId);
+            addMessage('system', 'Sorry, there was an error communicating with the local LLM. Make sure the server and Ollama are running.');
+        } finally {
+            sendBtn.disabled = false;
+            userInput.focus();
+        }
+    });
+
+    function addMessage(role, content, id = null) {
+        const msgDiv = document.createElement('div');
+        msgDiv.className = `message ${role}`;
+        if (id) msgDiv.id = id;
+
+        const avatarDiv = document.createElement('div');
+        avatarDiv.className = 'avatar';
+        avatarDiv.textContent = role === 'user' ? '👤' : '🤖';
+
+        const contentDiv = document.createElement('div');
+        contentDiv.className = 'message-content';
+        contentDiv.innerHTML = formatText(content);
+
+        msgDiv.appendChild(avatarDiv);
+        msgDiv.appendChild(contentDiv);
+        chatContainer.appendChild(msgDiv);
+        chatContainer.scrollTop = chatContainer.scrollHeight;
+
+        return contentDiv;
+    }
+
+    function addTypingIndicator() {
+        const id = 'typing-' + Date.now();
+        const msgDiv = document.createElement('div');
+        msgDiv.className = 'message system';
+        msgDiv.id = id;
+
+        const avatarDiv = document.createElement('div');
+        avatarDiv.className = 'avatar';
+        avatarDiv.textContent = '🤖';
+
+        const contentDiv = document.createElement('div');
+        contentDiv.className = 'message-content typing-indicator';
+        contentDiv.innerHTML = `
+            <div class="typing-dot"></div>
+            <div class="typing-dot"></div>
+            <div class="typing-dot"></div>
+        `;
+
+        msgDiv.appendChild(avatarDiv);
+        msgDiv.appendChild(contentDiv);
+        chatContainer.appendChild(msgDiv);
+        chatContainer.scrollTop = chatContainer.scrollHeight;
+
+        return id;
+    }
+
+    function removeElement(id) {
+        const el = document.getElementById(id);
+        if (el) el.remove();
+    }
+
+    function formatText(text) {
+        if (!text) return '';
+        // Very basic markdown parsing for bold, italics, code, and newlines
+        let formatted = text
+            .replace(/&/g, "&amp;")
+            .replace(/</g, "&lt;")
+            .replace(/>/g, "&gt;")
+            .replace(/\n/g, "<br>")
+            .replace(/\*\*(.*?)\*\*/g, "<strong>$1</strong>") // bold
+            .replace(/\*(.*?)\*/g, "<em>$1</em>") // italic
+            .replace(/`(.*?)`/g, "<code style='background:rgba(255,255,255,0.1);padding:2px 4px;border-radius:4px'>$1</code>"); // inline code
+        return formatted;
+    }
+
+    // Initialize state
+    sendBtn.disabled = true;
+});
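The escape-before-format order in `formatText` is what keeps model output from injecting HTML into the page. A hedged Python twin of the same idea (helper name and tag choices are illustrative, not part of the app):

```python
import html
import re

def format_text(text: str) -> str:
    """Escape HTML first, then apply the frontend's tiny markdown subset."""
    out = html.escape(text, quote=False)  # &, <, > become entities before any tags are added
    out = out.replace("\n", "<br>")
    out = re.sub(r"\*\*(.*?)\*\*", r"<strong>\1</strong>", out)  # bold before italic
    out = re.sub(r"\*(.*?)\*", r"<em>\1</em>", out)
    out = re.sub(r"`(.*?)`", r"<code>\1</code>", out)
    return out

print(format_text("**Tip:** avoid raw <script> in `innerHTML`"))
```

Reversing the order (formatting before escaping) would escape the generated tags themselves, while skipping the escape entirely would let model output execute as markup.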

+ 289 - 0
static/style.css

@@ -0,0 +1,289 @@
+:root {
+    --bg-color: #0d1117;
+    --panel-bg: rgba(22, 27, 34, 0.7);
+    --border-color: rgba(48, 54, 61, 0.8);
+    --text-main: #c9d1d9;
+    --text-muted: #8b949e;
+    --primary-gradient: linear-gradient(135deg, #2ea043 0%, #238636 100%);
+    --primary-color: #2ea043;
+    --user-msg-bg: linear-gradient(135deg, #1f6feb 0%, #164e63 100%);
+    --bot-msg-bg: rgba(33, 38, 45, 0.8);
+    --glass-blur: blur(16px);
+}
+
+* {
+    box-sizing: border-box;
+    margin: 0;
+    padding: 0;
+}
+
+body {
+    font-family: 'Inter', -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Helvetica, Arial, sans-serif;
+    background-color: var(--bg-color);
+    background-image: radial-gradient(circle at top right, rgba(46, 160, 67, 0.15), transparent 400px),
+                      radial-gradient(circle at bottom left, rgba(31, 111, 235, 0.1), transparent 400px);
+    color: var(--text-main);
+    display: flex;
+    justify-content: center;
+    align-items: center;
+    height: 100vh;
+    padding: 20px;
+    overflow: hidden;
+}
+
+.app-container {
+    width: 100%;
+    max-width: 900px;
+    height: 90vh;
+    background: var(--panel-bg);
+    backdrop-filter: var(--glass-blur);
+    -webkit-backdrop-filter: var(--glass-blur);
+    border: 1px solid var(--border-color);
+    border-radius: 20px;
+    display: flex;
+    flex-direction: column;
+    box-shadow: 0 25px 50px -12px rgba(0, 0, 0, 0.5);
+    overflow: hidden;
+}
+
+.chat-header {
+    padding: 16px 24px;
+    border-bottom: 1px solid var(--border-color);
+    display: flex;
+    justify-content: space-between;
+    align-items: center;
+    background: rgba(13, 17, 23, 0.6);
+}
+
+.brand {
+    display: flex;
+    align-items: center;
+    gap: 12px;
+}
+
+.logo {
+    font-size: 28px;
+    background: var(--primary-gradient);
+    -webkit-background-clip: text;
+    background-clip: text;
+}
+
+h1 {
+    font-size: 1.1rem;
+    font-weight: 600;
+    color: #f0f6fc;
+}
+
+.status-indicator {
+    display: inline-block;
+    width: 8px;
+    height: 8px;
+    background-color: #2ea043;
+    border-radius: 50%;
+    margin-right: 6px;
+    box-shadow: 0 0 10px #2ea043;
+}
+
+.status-text {
+    font-size: 0.75rem;
+    color: var(--text-muted);
+}
+
+#clear-chat {
+    background: none;
+    border: none;
+    color: var(--text-muted);
+    cursor: pointer;
+    transition: color 0.2s ease;
+    padding: 8px;
+    border-radius: 8px;
+}
+
+#clear-chat:hover {
+    color: #f85149;
+    background: rgba(248, 81, 73, 0.1);
+}
+
+.chat-container {
+    flex: 1;
+    padding: 24px;
+    overflow-y: auto;
+    display: flex;
+    flex-direction: column;
+    gap: 20px;
+    scroll-behavior: smooth;
+}
+
+.chat-container::-webkit-scrollbar {
+    width: 6px;
+}
+.chat-container::-webkit-scrollbar-thumb {
+    background: var(--border-color);
+    border-radius: 10px;
+}
+
+.message {
+    display: flex;
+    gap: 16px;
+    max-width: 85%;
+    animation: fadeIn 0.3s ease forwards;
+    opacity: 0;
+    transform: translateY(10px);
+}
+
+@keyframes fadeIn {
+    to {
+        opacity: 1;
+        transform: translateY(0);
+    }
+}
+
+.message.user {
+    align-self: flex-end;
+    flex-direction: row-reverse;
+}
+
+.avatar {
+    width: 36px;
+    height: 36px;
+    border-radius: 10px;
+    display: flex;
+    justify-content: center;
+    align-items: center;
+    font-size: 20px;
+    flex-shrink: 0;
+    background: rgba(255, 255, 255, 0.1);
+}
+
+.message.user .avatar {
+    background: var(--user-msg-bg);
+}
+
+.message.system .avatar {
+    background: var(--bot-msg-bg);
+    border: 1px solid var(--border-color);
+}
+
+.message-content {
+    padding: 14px 18px;
+    border-radius: 18px;
+    line-height: 1.6;
+    font-size: 0.95rem;
+    white-space: pre-wrap;
+    word-break: break-word;
+}
+
+.message.user .message-content {
+    background: var(--user-msg-bg);
+    color: #fff;
+    border-top-right-radius: 4px;
+}
+
+.message.system .message-content {
+    background: var(--bot-msg-bg);
+    border: 1px solid var(--border-color);
+    border-top-left-radius: 4px;
+    box-shadow: 0 4px 12px rgba(0,0,0,0.1);
+}
+
+.chat-input-area {
+    padding: 20px 24px;
+    background: rgba(13, 17, 23, 0.8);
+    border-top: 1px solid var(--border-color);
+}
+
+.input-form {
+    display: flex;
+    gap: 12px;
+    align-items: flex-end;
+    background: var(--bg-color);
+    border: 1px solid var(--border-color);
+    border-radius: 16px;
+    padding: 8px 16px;
+    transition: border-color 0.2s, box-shadow 0.2s;
+}
+
+.input-form:focus-within {
+    border-color: rgba(46, 160, 67, 0.5);
+    box-shadow: 0 0 0 2px rgba(46, 160, 67, 0.1);
+}
+
+textarea {
+    flex: 1;
+    background: none;
+    border: none;
+    color: var(--text-main);
+    font-family: inherit;
+    font-size: 1rem;
+    resize: none;
+    max-height: 150px;
+    padding: 10px 0;
+    outline: none;
+}
+
+textarea::placeholder {
+    color: var(--text-muted);
+}
+
+.send-btn {
+    background: var(--primary-color);
+    border: none;
+    border-radius: 12px;
+    width: 40px;
+    height: 40px;
+    display: flex;
+    justify-content: center;
+    align-items: center;
+    color: #fff;
+    cursor: pointer;
+    transition: transform 0.2s, background 0.2s;
+    flex-shrink: 0;
+    margin-bottom: 2px;
+}
+
+.send-btn:hover {
+    background: #3fb950;
+    transform: scale(1.05);
+}
+
+.send-btn:active {
+    transform: scale(0.95);
+}
+
+.send-btn:disabled {
+    background: var(--border-color);
+    color: var(--text-muted);
+    cursor: not-allowed;
+    transform: none;
+}
+
+.footer-note {
+    text-align: center;
+    font-size: 0.7rem;
+    color: var(--text-muted);
+    margin-top: 12px;
+}
+
+.typing-indicator {
+    display: flex;
+    gap: 4px;
+    padding: 4px 8px;
+    align-items: center;
+}
+
+.typing-dot {
+    width: 6px;
+    height: 6px;
+    background: var(--text-muted);
+    border-radius: 50%;
+    animation: typing 1.4s infinite ease-in-out both;
+}
+
+.typing-dot:nth-child(1) { animation-delay: -0.32s; }
+.typing-dot:nth-child(2) { animation-delay: -0.16s; }
+.typing-dot:nth-child(3) { animation-delay: 0s; }
+
+@keyframes typing {
+    0%, 80%, 100% { transform: scale(0); }
+    40% { transform: scale(1); }
+}