🧠 Project: AI Console (Open-WebUI)
Objective: Deploy a centralized, private web interface for interacting with the Federation's local LLMs.
🏗️ Architecture
- UI Host: holodeck-lab (192.168.1.11, LXC)
- Inference Engine: starfleet-compute (192.168.1.35, physical Mac Mini)
- Model: Mistral-Nemo (12B)
- Standard: UI and inference are decoupled. Heavy lifting (Ollama) stays on physical hardware; web traffic stays in the lab environment.
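A quick way to confirm the decoupled link works is to query the Ollama model list from the Holodeck side. This sketch assumes Ollama's default port (11434) and that the service is bound to the LAN interface rather than localhost only:

```shell
# Run from holodeck-lab (192.168.1.11) to verify it can reach the
# Ollama API on starfleet-compute. Port 11434 is Ollama's default.
# Note: Ollama must be configured to listen on the LAN
# (e.g. OLLAMA_HOST=0.0.0.0) for this to succeed.
curl -s http://192.168.1.35:11434/api/tags
```

A JSON list of installed models (including Mistral-Nemo) indicates the inference engine is reachable from the lab environment.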
🛠️ Hardware Requirements (Holodeck)
- RAM: 4GB (Upgraded from 2GB baseline)
- Disk: 10GB (For Docker images and UI cache)
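If the Holodeck LXC runs under Proxmox (an assumption; adjust for your hypervisor), the resource bump above can be applied with `pct`. The container ID 111 below is hypothetical; substitute holodeck-lab's actual CTID:

```shell
# Run on the Proxmox host. CTID 111 is a placeholder for holodeck-lab.
pct set 111 --memory 4096      # raise RAM from the 2GB baseline to 4GB
pct resize 111 rootfs 10G      # grow the root disk to 10GB
pct config 111                 # confirm the new values took effect
```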
🚀 Deployment Steps
- Upgrade Holodeck LXC resources.
- Deploy the open-webui container on Holodeck.
- Link to the Starfleet Ollama API via internal IP.
- Update Master Dashboard.
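The deploy-and-link steps can be sketched as a single Docker run on holodeck-lab. This assumes the standard `ghcr.io/open-webui/open-webui` image (which serves on port 8080 internally) and Ollama's default port; the host port 3000 is an arbitrary choice:

```shell
# Run on holodeck-lab. OLLAMA_BASE_URL points the UI at the
# inference engine on starfleet-compute.
docker run -d \
  --name open-webui \
  --restart unless-stopped \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://192.168.1.35:11434 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

The UI would then be reachable inside the lab at http://192.168.1.11:3000, ready to be added to the Master Dashboard.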
🔒 Security
- Internal network access only (VPN required).
- Not exposed via public-facing Caddy records during the initial phase.