Opta Help
Comprehensive documentation for the Opta Local private AI stack: CLI, daemon, LMX inference, Local Web, and more.
The Opta Local Stack
Four components form a layered local-inference pipeline:
opta chat / opta tui / opta do      # Opta CLI
              │
opta daemon (127.0.0.1:9999)        # Session orchestration
              │  HTTP v3 REST + WebSocket
Opta LMX (192.168.188.11:1234)      # MLX inference
              │  OpenAI-compatible API
Opta Local Web (localhost:3004)     # Dashboard + Chat
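Because Opta LMX exposes an OpenAI-compatible API on 192.168.188.11:1234, a standard chat-completions request should work against it directly. A minimal sketch, assuming the server implements the usual /v1/chat/completions schema; the model name here is a placeholder, not something documented above:

```python
import json
from urllib import request

# Opta LMX endpoint from the stack diagram; path assumes the OpenAI schema.
LMX_URL = "http://192.168.188.11:1234/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "local-model") -> request.Request:
    """Build a chat-completions request for the LMX server.

    "local-model" is a placeholder; an OpenAI-compatible server usually
    lists real model names under /v1/models.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }).encode("utf-8")
    return request.Request(
        LMX_URL, data=body, headers={"Content-Type": "application/json"}
    )

# To actually send (requires Opta LMX to be running):
# with request.urlopen(build_chat_request("Hello")) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

The same request shape should work from Opta Local Web or any other OpenAI-compatible client pointed at the LMX address.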