Opta Help
Operating documentation for deploying, governing, and scaling Opta across local and hybrid runtime environments.
The canonical activation model: establish a runtime policy (local LMX or cloud), activate Opta AI, then execute through the Opta CLI or Opta Code while the daemon governs operational state.
Docs pages support platform-aware rendering: switch between the macOS and Windows views to see the exact commands, paths, and lifecycle controls for your environment.
The Opta Local Stack
The runtime strategy feeds Opta AI; execution is then orchestrated through the daemon, the CLI, and the desktop operator surfaces.