# taiga_wiki_260508.py

import requests
import urllib3

# The Taiga host uses a self-signed certificate, so TLS verification is
# disabled on every request and the resulting warnings are silenced.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

base_url = 'https://192.168.130.161/taiga/api/v1'

# Authenticate and build the bearer-token headers used by all later calls.
auth = requests.post(f'{base_url}/auth', json={'type': 'normal', 'username': 'FrancoisLange', 'password': 'BTSai123'}, verify=False).json()
headers = {'Authorization': f'Bearer {auth["auth_token"]}', 'Content-Type': 'application/json'}

proj_id = 21
slug = '260508-daily'
content = """# Daily Scrum 26.05.08
## What was done yesterday?
- Addressed application crashes caused by missing columns (`search_limit`) and tables (`products_core`).
- Discovered that the DB drop destroyed the entire schema temporarily until the offline ingestion recreated it, causing UI crashes.
- Implemented `ON DUPLICATE KEY UPDATE` consolidation logic to fix the duplication explosion that degraded search performance.
## What is the plan for today?
- Ensure the initialization SQL officially defines all vertical partitions (`products_core`, `products_macros`, etc.) so the DB structure exists safely before the offline ingestion completes.
- Lock down LLM model initialization into the Docker network `command` argument to strictly decouple the 1.3GB model download from the Streamlit UI loading phase.
- Finalize Food Scale standardizations (`xl`, `l`, `m`, `s`) in `unit_converter.py`.
## Blockers
- **Data Race Condition**: The LLM model stream download crashed the AI interface when requested immediately on app startup. Fixed by detaching the download to the container orchestrator.
- **Table Missing Error**: Streamlit `app.py` querying `products_core` before Python `to_sql` had a chance to create it. Fixed by explicitly declaring schemas in `init.sql`.
"""
payload = {"project": proj_id, "slug": slug, "content": content}

# Try to create the page first; 201 means it did not exist yet.
res = requests.post(f'{base_url}/wiki', json=payload, headers=headers, verify=False)
if res.status_code == 201:
    print("Created 260508-daily page!")
else:
    # The page likely exists already: look it up by project and slug, then
    # update it via PUT, passing its current version number so Taiga's
    # optimistic-locking check accepts the edit.
    check_req = requests.get(f'{base_url}/wiki?project={proj_id}&slug={slug}', headers=headers, verify=False).json()
    if len(check_req) > 0:
        page_id = check_req[0]['id']
        version = check_req[0]['version']
        res2 = requests.put(f'{base_url}/wiki/{page_id}', json={"project": proj_id, "slug": slug, "content": content, "version": version}, headers=headers, verify=False)
        res2.raise_for_status()
        print("Updated 260508-daily page!")
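The create-then-update flow above could be factored into a reusable helper. The sketch below is illustrative, not part of the Taiga API: `build_wiki_payload` and `upsert_wiki_page` are hypothetical names, and the HTTP calls mirror the POST/GET/PUT sequence used in the script.

```python
import requests


def build_wiki_payload(project_id, slug, content):
    # Illustrative helper: bundles the fields the wiki endpoints expect.
    return {"project": project_id, "slug": slug, "content": content}


def upsert_wiki_page(base_url, headers, project_id, slug, content):
    # Sketch of the create-or-update flow: POST first; if the page already
    # exists, fetch it and PUT with its current version number, which
    # Taiga requires to accept an edit.
    payload = build_wiki_payload(project_id, slug, content)
    res = requests.post(f"{base_url}/wiki", json=payload,
                        headers=headers, verify=False)
    if res.status_code == 201:
        return res.json()
    pages = requests.get(f"{base_url}/wiki?project={project_id}&slug={slug}",
                         headers=headers, verify=False).json()
    if pages:
        payload["version"] = pages[0]["version"]
        res2 = requests.put(f"{base_url}/wiki/{pages[0]['id']}", json=payload,
                            headers=headers, verify=False)
        res2.raise_for_status()
        return res2.json()
    res.raise_for_status()
```

Keeping the payload construction in its own function makes the slug, project id, and content easy to test without any network access.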