Opta Help

Operating documentation for deploying, governing, and scaling Opta across local and hybrid runtime environments.

Canonical activation model: establish a runtime policy (local LMX or cloud), activate Opta AI, then execute through Opta CLI or Opta Code, with the daemon governing operational state throughout.
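The ordering above (runtime policy first, activation second, execution last) can be sketched as a small state model. This is an illustration only, not the Opta API: the `Session` class, its method names, and the output format are all assumptions made for the sketch; only the concepts (local LMX vs. cloud runtime, activation before execution) come from this page.

```python
from enum import Enum, auto


class Runtime(Enum):
    """Runtime policy: local LMX runtime or cloud models (hypothetical names)."""
    LOCAL_LMX = auto()
    CLOUD = auto()


class Session:
    """Illustrative sketch of the activation order: policy -> activate -> execute."""

    def __init__(self) -> None:
        self.runtime: Runtime | None = None
        self.active = False

    def set_runtime(self, runtime: Runtime) -> None:
        # Step 1: establish the runtime policy.
        self.runtime = runtime

    def activate(self) -> None:
        # Step 2: activation is only valid once a runtime policy exists.
        if self.runtime is None:
            raise RuntimeError("set a runtime policy before activating Opta AI")
        self.active = True

    def execute(self, command: str) -> str:
        # Step 3: execution requires an activated session.
        if not self.active:
            raise RuntimeError("activate Opta AI before executing")
        return f"[{self.runtime.name}] {command}"


session = Session()
session.set_runtime(Runtime.LOCAL_LMX)
session.activate()
print(session.execute("opta do 'summarize logs'"))  # [LOCAL_LMX] opta do 'summarize logs'
```

The point of the sketch is the enforced ordering: skipping either of the first two steps raises an error instead of silently executing.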

Docs pages now support platform-aware rendering. Switch between macOS and Windows views to see the exact commands, paths, and lifecycle controls for your environment.

The Opta Local Stack

Runtime strategy feeds Opta AI; execution is then orchestrated through the daemon, CLI, and desktop operator surfaces.

Runtime source: Cloud models / Opta LMX local runtime
Optimizer core: Opta AI
Activation + session orchestration: opta daemon (127.0.0.1:<port>), exposing HTTP v3 REST + WebSocket streaming
Opta CLI surface: opta chat / opta tui / opta do
Opta Code surface: Opta Code Desktop (Tauri desktop + daemon bridge)
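Since the daemon listens on 127.0.0.1:<port> and exposes HTTP REST plus WebSocket streaming, a client targets it like any local HTTP service. A minimal sketch follows; the port value, the endpoint path (`api/v3/chat`), and the payload shape are assumptions invented for illustration, not documented Opta endpoints. The request is built but deliberately not sent.

```python
import json
import urllib.request

DAEMON_HOST = "127.0.0.1"   # the daemon binds to loopback only
DAEMON_PORT = 1234          # placeholder: substitute the daemon's actual <port>


def daemon_url(path: str, host: str = DAEMON_HOST, port: int = DAEMON_PORT) -> str:
    """Compose a URL for the daemon's local REST surface (path is an assumption)."""
    return f"http://{host}:{port}/{path.lstrip('/')}"


def build_request(path: str, payload: dict) -> urllib.request.Request:
    """Build (but do not send) a JSON POST against the daemon."""
    return urllib.request.Request(
        daemon_url(path),
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_request("api/v3/chat", {"input": "hello"})  # hypothetical endpoint
print(req.full_url)  # http://127.0.0.1:1234/api/v3/chat
```

Because the daemon is loopback-only, this sketch never needs TLS or remote auth; streaming responses would instead use the WebSocket side of the same surface.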