Privacy
Opta is built on a local-first privacy architecture. All inference runs on your hardware, session data stays on your disk, there is no telemetry, and no data leaves your network without explicit consent.
Local-First Privacy
Privacy in Opta is not a policy promise -- it is an architectural guarantee. The system is designed so that data physically cannot leave your local network unless you explicitly configure a cloud integration.
There is no cloud backend processing your requests. There is no analytics service tracking your usage. There is no data retention policy because there is no external party retaining your data. The AI runs on your machine, the data stays on your machine, and the network traffic stays on your network.
Inference on Your Hardware
The LMX inference server runs on your Apple Silicon Mac, using MLX for optimized Metal GPU inference. Your prompts are processed locally and model outputs are generated locally. At no point during inference does data leave your machine.
This means:
- Proprietary source code sent to the AI never leaves your network
- Sensitive business documents processed by the AI stay on your hardware
- Personal conversations with the AI are stored only on your local filesystem
- No third party can read, log, or train on your interactions
No Cloud Without Opt-In (S04)
Rule S04 (Strict tier) requires that no component of the Opta stack send data to any cloud service without explicit user opt-in. This applies to:
- Model downloads (require user-initiated action)
- Cloud API fallback (disabled by default, must be configured)
- Telemetry (does not exist -- there is no telemetry code)
- Error reporting (errors are logged locally, never phoned home)
- Usage analytics (not implemented)
If you configure a Cloudflare Tunnel for remote access, that is an explicit opt-in action. If you add an Anthropic API key for cloud inference fallback, that is an explicit opt-in action. The default configuration makes zero outbound connections to any external service.
Session Data on Disk
All session data -- conversation history, tool call records, model responses, browser automation recordings -- is stored on your local filesystem. The daemon writes session data to ~/.config/opta/daemon/sessions/ as JSON files.
You have full control over this data:
- Read it with standard file tools
- Back it up with your existing backup solution
- Delete it when you no longer need it
- Encrypt it with disk-level encryption (FileVault on macOS)
There is no cloud sync for session data by default. Sessions exist only where the daemon writes them.
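Because sessions are plain JSON files on disk, you can inspect them with a few lines of code. A minimal sketch in Python (the per-file schema is not specified here; open a real session file to see the actual fields):

```python
import json
from pathlib import Path

def load_sessions(session_dir: Path) -> list[dict]:
    """Load every session JSON file the daemon has written to disk."""
    sessions = []
    for path in sorted(session_dir.glob("*.json")):
        with path.open() as f:
            sessions.append(json.load(f))
    return sessions

# Default location used by the daemon:
# load_sessions(Path.home() / ".config/opta/daemon/sessions")
```

Because the files are ordinary JSON, the same approach works for backups, audits, or bulk deletion with standard tools.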
No Telemetry
Opta contains no telemetry code. There are no analytics SDKs, no usage tracking, no crash reporters that phone home, and no feature flags fetched from external servers.
This is a deliberate design decision, not an oversight. The Opta codebase does not import Sentry, Amplitude, Mixpanel, Google Analytics, or any other analytics or error reporting service.
Token Safety (S02)
Rule S02 (Strict tier) requires that authentication tokens, API keys, and other secrets never be written to log files or displayed in output. This prevents accidental exposure of credentials in debug logs, terminal output, or session recordings.
The daemon sanitizes all log output to redact tokens before writing. If a log message would contain a Bearer token, API key, or password, the secret is replaced with a placeholder before the message is persisted.
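As an illustration of this style of redaction, here is a small Python sketch. The patterns and the `[REDACTED]` placeholder are assumptions for the example, not the daemon's actual rules:

```python
import re

# Illustrative patterns; the daemon's real rules may cover more cases.
SECRET_PATTERNS = [
    re.compile(r"(Bearer\s+)[A-Za-z0-9._\-]+"),                      # Authorization headers
    re.compile(r"((?:api[_-]?key|token|password)\s*[=:]\s*)\S+",     # key=value style secrets
                re.IGNORECASE),
]

def redact(message: str) -> str:
    """Replace any secret matched by SECRET_PATTERNS with a placeholder."""
    for pattern in SECRET_PATTERNS:
        message = pattern.sub(r"\g<1>[REDACTED]", message)
    return message
```

Applying the sanitizer at the logging layer, rather than at each call site, means a single missed call site cannot leak a credential.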
LAN Isolation
The Opta stack operates entirely within your local area network. The daemon binds to localhost (127.0.0.1), so it is not even accessible from other machines on your LAN. The LMX server binds to a LAN address but is not exposed to the internet.
Network communication within the stack:
- CLI to Daemon -- localhost only (127.0.0.1:9999)
- Daemon to LMX -- LAN only (192.168.188.11:1234)
- Web Dashboard to LMX -- LAN only or via Cloudflare Tunnel (explicit opt-in)
No component initiates connections to external IP addresses or domains unless explicitly configured to do so.
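You can verify these bindings yourself. A small Python sketch, using the addresses from the list above (the LAN address on your network will differ):

```python
import socket

def is_listening(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP service at host:port accepts connections."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Expected on a default install:
#   is_listening("127.0.0.1", 9999)       daemon answers on loopback
#   is_listening("192.168.188.11", 1234)  LMX answers on the LAN
# The daemon should NOT answer on the machine's LAN address, only on 127.0.0.1.
```

A probe like this confirms reachability from a given vantage point; tools such as `lsof` or `netstat` can additionally show which interface each process is bound to.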