Support FAQ
Quick answers for the most common setup, connectivity, and runtime issues in the Opta Local stack.
Setup & Connectivity
Why does `opta chat` fail to connect?
Verify daemon health first, then LMX reachability: confirm the daemon is listening on 127.0.0.1:9999, then confirm LMX answers on your configured LAN host.
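A quick way to run both checks is a plain TCP probe. This is a minimal sketch: the daemon port 9999 comes from the answer above, while the LMX host and port below are placeholders for whatever your deployment actually uses.

```python
import socket

def can_connect(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Daemon listens on loopback; the LMX host/port are illustrative placeholders.
print("daemon reachable:", can_connect("127.0.0.1", 9999))
print("LMX reachable:", can_connect("lmx.lan.example", 8080))
```

If the daemon probe fails, fix that first; a healthy daemon with an unreachable LMX points to LAN or LMX-side configuration instead.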
Do I install CLI, LMX, and Code separately?
No. Distribution is managed through Opta Init Manager; treat Init as the canonical lifecycle entrypoint for installing and updating local stack components.
Can I run everything on one machine?
Yes, but production setups typically keep LMX on a high-memory host and run CLI/Code from workstation clients.
Sessions & Models
Where are sessions managed?
Session orchestration is owned by the daemon. CLI and Code Desktop are clients of the same session/control plane.
Why is a model available in LMX but not in my client?
A client only sees models after the daemon and LMX have synced. Re-check the daemon's model operations and LMX's health and model-list endpoints.
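When the two inventories disagree, a simple set difference pinpoints which models LMX serves but the daemon hasn't picked up. The model names and list shapes below are illustrative, not the actual API payloads:

```python
# Hypothetical inventories as returned by LMX's model-list endpoint and
# the daemon's model operations — names are examples only.
lmx_models = {"qwen2.5-7b", "llama-3.1-8b", "mistral-7b"}
daemon_models = {"qwen2.5-7b", "llama-3.1-8b"}

# Models LMX serves that the daemon hasn't synced yet.
missing_from_daemon = sorted(lmx_models - daemon_models)
print(missing_from_daemon)
```

Any name in the difference is a sync gap on the daemon side rather than an LMX availability problem.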
Security & Permissions
Are prompts and responses sent to cloud services?
Local inference flows remain local by default. Cloud providers are optional and only used when explicitly configured.
Why am I being asked to approve tool calls?
Tool permission prompts are enforced by daemon policy and guardrail mode. Review your current permission profile in CLI configuration.
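As a hypothetical sketch of what a permission profile in CLI configuration might look like — the key names and values here are illustrative assumptions, not the actual Opta schema:

```toml
# Illustrative permission-profile fragment (key names are assumptions).
[permissions]
profile = "guarded"             # guardrail mode driving tool-call prompts
auto_approve = ["read_file"]    # tools exempt from interactive approval
prompt_on = ["shell", "network"]  # tools that always require approval
```

A stricter profile prompts more often; loosening it trades prompts for broader automatic tool access.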
Web Surfaces
What's the difference between Help and Learn?
Help is reference documentation. Learn is guide-driven onboarding and workflow training.
Where do I check current incidents and release state?
Use Opta Status at status.optalocal.com for live service and feature-state visibility.