Developer Guide
The Opta Local stack exposes multiple integration points for developers who want to build on top of the platform -- from the daemon HTTP API and LMX inference endpoints to WebSocket event streaming and MCP server extensions.
Building on the Opta Stack
Opta is designed as an open integration platform. Each layer of the stack exposes well-documented APIs that you can use independently or together:
- Build custom UIs that connect to the daemon
- Write automation scripts that use the LMX inference API
- Create MCP servers that give the AI new tools
- Stream real-time events from the daemon WebSocket
- Integrate session data into your own workflows
Integration Points
Daemon HTTP API
The daemon exposes a REST API on 127.0.0.1:9999 for session management, turn submission, and daemon control. All requests require Bearer token authentication.
```shell
curl -s http://127.0.0.1:9999/v3/sessions \
  -H "Authorization: Bearer <token>" | jq
```

The full HTTP API is documented in the Daemon HTTP API reference. Key endpoints include session CRUD, turn submission, health checks, and daemon lifecycle control.
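The same Bearer-token pattern applies to the write endpoints. The sketch below creates a session from TypeScript; the request body shape (`{ title }`) is an assumption, so check the Daemon HTTP API reference for the exact schema:

```typescript
// Minimal sketch of calling the daemon API from TypeScript.
// The POST body shape ({ title }) is an assumption -- see the
// Daemon HTTP API reference for the real schema.
const BASE = "http://127.0.0.1:9999";

function authHeaders(token: string): Record<string, string> {
  return {
    Authorization: `Bearer ${token}`,
    "Content-Type": "application/json",
  };
}

async function createSession(token: string, title: string): Promise<unknown> {
  const res = await fetch(`${BASE}/v3/sessions`, {
    method: "POST",
    headers: authHeaders(token),
    body: JSON.stringify({ title }),
  });
  if (!res.ok) throw new Error(`daemon returned ${res.status}`);
  return res.json();
}
```

Every request goes through `authHeaders`, so a bad or missing token fails uniformly with a 401 rather than silently varying per endpoint.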
LMX OpenAI API
The LMX inference server exposes an OpenAI-compatible API at 192.168.188.11:1234. Any tool or library that works with the OpenAI API can connect to LMX by changing the base URL.
```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://192.168.188.11:1234/v1",
  apiKey: "not-needed", // LMX does not require an API key for inference
});

const response = await client.chat.completions.create({
  model: "qwen3-72b",
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
});

for await (const chunk of response) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```

Point base_url / baseURL to your LMX address. No API key is needed for inference endpoints.

WebSocket Events
The daemon provides a WebSocket endpoint for real-time event streaming. Events include turn progress, tool calls, completion notifications, and error reports. The WebSocket connection supports cursor-based reconnection to avoid re-delivery of events.
```typescript
const ws = new WebSocket(
  "ws://127.0.0.1:9999/v3/events?token=<token>"
);

ws.onmessage = (event) => {
  const envelope = JSON.parse(event.data);
  switch (envelope.event) {
    case "turn.token":
      // streaming token
      break;
    case "turn.done":
      // turn completed; envelope.stats has token count, speed, etc.
      break;
    case "turn.error":
      // inference or tool error
      break;
  }
};
```

Full WebSocket protocol documentation is in the WebSocket Events reference.
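To take advantage of cursor-based reconnection, remember the position of the last event you processed and pass it back when reopening the socket. In the sketch below, the `cursor` query parameter and the `envelope.cursor` field are assumed names; the actual protocol fields are in the WebSocket Events reference:

```typescript
// Reconnection sketch. The `cursor` query parameter and `envelope.cursor`
// field are assumptions -- check the WebSocket Events reference.
function buildEventsUrl(token: string, cursor?: string): string {
  const url = new URL("ws://127.0.0.1:9999/v3/events");
  url.searchParams.set("token", token);
  if (cursor) url.searchParams.set("cursor", cursor);
  return url.toString();
}

function connect(token: string, lastCursor?: string): WebSocket {
  const ws = new WebSocket(buildEventsUrl(token, lastCursor));
  let cursor = lastCursor;

  ws.onmessage = (event) => {
    const envelope = JSON.parse(String(event.data));
    cursor = envelope.cursor ?? cursor; // track the last delivered position
  };
  ws.onclose = () => {
    // Resume from the last cursor so already-seen events are not re-delivered.
    setTimeout(() => connect(token, cursor), 1_000);
  };
  return ws;
}
```

Tracking the cursor client-side means a dropped connection costs at most the events in flight, not a full replay of the session.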
@opta/daemon-client
The @opta/daemon-client TypeScript package provides a typed client for the daemon HTTP API and WebSocket events. It handles authentication, reconnection, and event deserialization.
See the Daemon Client SDK page for usage details.
Extending with MCP
The Model Context Protocol (MCP) allows you to give the AI new tools by writing MCP servers. The daemon can connect to multiple MCP servers simultaneously, making their tools available to the model during sessions.
See the MCP Integration page for details on listing, adding, testing, and removing MCP servers.