# First Session
Now that the CLI is installed and connected to LMX, it is time to run your first AI session. This page covers interactive chat, autonomous task execution, permission handling, and session management.
## Start a Chat
The simplest way to interact with your local AI is the `opta chat` command. This opens an interactive chat session in your terminal.
### Launch interactive chat
This starts the daemon (if not already running) and opens a new chat session.
```bash
opta chat
```

### Type your first prompt
Once the session is active, type a message and press Enter.
```
> Explain the difference between let and const in TypeScript.

In TypeScript (and JavaScript), `let` and `const` are both block-scoped
variable declarations, but they differ in mutability:

- **const** declares a variable that cannot be reassigned after initialization.
  The binding is immutable, though object properties can still be modified.
- **let** declares a variable that can be reassigned. Use it when the value
  needs to change during execution.

Best practice: default to `const` and only use `let` when reassignment
is genuinely needed.
```

### Continue the conversation
The session maintains full context. Follow-up messages reference the entire conversation history.
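The "full context" behavior can be pictured as a message list that grows with every turn. The following Python sketch is purely illustrative — `build_prompt` and the message shape are hypothetical, not the CLI's actual internals:

```python
# Hypothetical sketch of how a chat session might accumulate context.
# This page does not document the CLI's internals; the point is only that
# every new turn carries the entire prior conversation with it.

def build_prompt(history, new_message):
    """Return the message list sent to the model for this turn."""
    return history + [{"role": "user", "content": new_message}]

history = []

# Turn 1: one user message goes out, one assistant message comes back.
turn1 = build_prompt(history, "Explain let vs const.")
history = turn1 + [{"role": "assistant", "content": "const cannot be reassigned..."}]

# Turn 2 includes everything from turn 1, so follow-up questions like
# "give me an example" resolve against the earlier answer.
turn2 = build_prompt(history, "Give me an example of when let is necessary.")

print(len(turn2))  # 3 messages: user, assistant, user
```

This is also why long sessions consume more prompt tokens per turn: the history is resent each time.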
```
> Give me an example of when let is necessary.

A common case is loop counters:

for (let i = 0; i < items.length; i++) {
  // 'i' must be reassigned each iteration
  process(items[i]);
}

You cannot use `const` here because the value of `i` changes on each loop.
```

### Exit the session
Press `Ctrl+C` or type `/exit` to end the chat session.

```
/exit
```

For a one-off question without an interactive session, use the `--once` flag:

```bash
opta chat --once "What is the capital of Australia?"
```

## Understanding Streaming
Responses stream token-by-token as the model generates them. You will see text appearing incrementally in your terminal rather than waiting for the complete response.
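Incremental rendering is the standard pattern of writing and flushing each token as it arrives instead of buffering the whole response. The following Python sketch simulates it with a hardcoded token list (a real client would read tokens from the daemon; this is not the CLI's code):

```python
import sys
import time

def render_stream(tokens, delay=0.0):
    """Print tokens as they 'arrive', without waiting for the full response."""
    for tok in tokens:
        sys.stdout.write(tok)
        sys.stdout.flush()   # show each token immediately, not line-buffered
        time.sleep(delay)    # stands in for generation latency
    sys.stdout.write("\n")

# Simulated token stream; pass a small delay to see the incremental effect.
render_stream(["Block", "-scoped", " declarations", " differ", " in", " mutability."])
```

The explicit `flush()` is what makes the output appear token by token even when stdout is block-buffered (e.g. when piped).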
During streaming, the CLI shows:
- Token output -- the response text, rendered as it arrives
- Thinking indicators -- for reasoning models, a spinner or thinking block shows the model's internal reasoning before the final response
- Turn statistics -- after each response, the CLI displays token count, generation speed (tokens/sec), and elapsed time
```
--- Turn complete ---
Tokens: 147 (prompt: 52, completion: 95)
Speed: 41.2 tok/s
Time: 2.3s
```

## Do Mode
While `opta chat` is conversational, `opta do` is action-oriented. It tells the AI to complete a specific task using available tools -- file operations, shell commands, code analysis, and more.
```bash
opta do "Create a TypeScript function that validates email addresses using a regex, with unit tests"
```

In do mode, the AI will:
- Analyze the task and plan the approach
- Use tools to read existing files, create new files, and run commands
- Ask for permission before potentially destructive operations
- Report the results when the task is complete
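The four steps above form a plan/act/report loop. The sketch below is a hypothetical illustration of that loop -- none of the function names come from the real CLI, and the permission check is reduced to a single callback:

```python
# Hypothetical sketch of a do-mode task loop: plan, act (with permission
# checks before destructive steps), then report. Illustrative only.

def run_task(task, plan_fn, tools, ask_permission, report):
    steps = plan_fn(task)                     # 1. analyze the task and plan
    results = []
    for step in steps:
        tool = tools[step["tool"]]
        if step.get("destructive") and not ask_permission(step):
            results.append((step["tool"], "denied"))
            continue                          # denied: record and move on
        results.append((step["tool"], tool(step["args"])))  # 2. use tools
    report(results)                           # 4. report when complete
    return results

# Toy usage: one safe read and one destructive write that gets approved.
steps = [
    {"tool": "read_file", "args": "src/app.ts", "destructive": False},
    {"tool": "write_file", "args": "src/new.ts", "destructive": True},
]
results = run_task(
    "demo",
    plan_fn=lambda task: steps,
    tools={"read_file": lambda a: f"read {a}", "write_file": lambda a: f"wrote {a}"},
    ask_permission=lambda step: True,
    report=lambda r: None,
)
print(results)
```

The key property mirrored here is step 3: destructive steps never execute without an explicit approval decision.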
## Permission Prompts
When the AI wants to perform an action in do mode, the CLI prompts you for approval. This is the permission system -- it ensures no tool runs without your explicit consent.
```
Tool: write_file
Path: src/utils/validate-email.ts
Content: [142 lines]

Allow this action? [y]es / [n]o / [a]lways for this tool:
```

Your options at each permission prompt:
- y (yes) -- approve this single invocation
- n (no) -- deny this invocation; the AI will try an alternative approach
- a (always) -- approve all future invocations of this tool for the current session
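The `y` / `n` / `a` semantics amount to a small gate with a per-session approval cache. This Python sketch is a hypothetical model of those rules, not the CLI's implementation:

```python
# Hypothetical sketch of the permission rules described above:
# safe read-only tools auto-approve, 'a' caches approval for the
# session, and 'y'/'n' apply to a single invocation.

SAFE_TOOLS = {"read_file", "list_directory"}

class PermissionGate:
    def __init__(self, prompt):
        self.prompt = prompt          # callable returning 'y', 'n', or 'a'
        self.always = set()           # tools approved for the whole session

    def allow(self, tool):
        if tool in SAFE_TOOLS:
            return True               # read-only tools never prompt
        if tool in self.always:
            return True               # previously answered 'a'
        answer = self.prompt(tool)
        if answer == "a":
            self.always.add(tool)
            return True
        return answer == "y"

# Usage: answer 'a' once, then write_file stops prompting.
answers = iter(["a"])
gate = PermissionGate(lambda tool: next(answers))
print(gate.allow("read_file"))   # True, no prompt
print(gate.allow("write_file"))  # True, consumed the single 'a'
print(gate.allow("write_file"))  # True, served from the session cache
```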
Read-only tools (such as `read_file` and `list_directory`) are classified as safe and auto-approve without prompting. Write operations, shell commands, and destructive actions always require explicit approval.

## Managing Sessions
Every chat and do interaction creates a session. Sessions store the full conversation history, tool invocations, and metadata. You can list, resume, and manage sessions after they end.
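Judging from the fields the CLI displays (ID, mode, created, turns, title, tokens), a stored session can be pictured as a record like the one below. This dataclass is an assumed shape for illustration only, not the actual storage format:

```python
from dataclasses import dataclass, field
from typing import List

# ASSUMED shape of a stored session, mirroring the fields shown by
# `opta sessions list` and `opta sessions show`. Illustrative only.

@dataclass
class Session:
    id: str
    mode: str                          # "chat" or "do"
    created: str
    title: str
    turns: List[dict] = field(default_factory=list)

    def add_turn(self, prompt, response):
        self.turns.append({"prompt": prompt, "response": response})

s = Session(id="a1b2c3d4", mode="chat", created="2026-03-01 10:15:00",
            title="TypeScript let vs const")
s.add_turn("Explain let vs const.", "const cannot be reassigned...")
print(len(s.turns))  # 1
```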
### List recent sessions
```bash
opta sessions list
```

```
ID        Mode  Created              Turns  Title
a1b2c3d4  chat  2026-03-01 10:15:00  8      TypeScript let vs const
e5f6g7h8  do    2026-03-01 10:22:00  3      Email validation function
i9j0k1l2  chat  2026-02-28 16:40:00  12     React hook patterns
```
### Resume a previous session
Continue a conversation from where you left off. The full context is restored.
```bash
opta chat --resume a1b2c3d4
```

### View session details
Inspect metadata, token usage, and tool call history for a session.
```bash
opta sessions show a1b2c3d4
```

```
Session: a1b2c3d4
Mode:    chat
Created: 2026-03-01 10:15:00
Turns:   8
Tokens:  1,247 (prompt) + 892 (completion)
Tools:   0 invocations
Title:   TypeScript let vs const
```
## Exporting Sessions
Sessions can be exported in multiple formats for sharing, archiving, or processing:
```bash
opta sessions export a1b2c3d4 --format markdown --output session.md
```

Supported export formats:
- `markdown` -- human-readable Markdown document
- `json` -- full session data including metadata and tool calls
- `text` -- plain text transcript
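A JSON export is the natural choice when you want to post-process a session. The snippet below converts a session to a plain transcript -- note that the JSON structure shown is an assumption for illustration, since this page does not document the export schema:

```python
import json

# ASSUMED example shape for an exported session; the actual JSON schema
# is not specified here. The point is that a JSON export is easy to
# transform into other formats with a few lines of code.

exported = json.loads("""
{
  "id": "a1b2c3d4",
  "mode": "chat",
  "turns": [
    {"prompt": "Explain let vs const.", "response": "const cannot be reassigned..."}
  ]
}
""")

def to_transcript(session):
    """Render an exported session as a '>'-prefixed plain-text transcript."""
    lines = []
    for turn in session["turns"]:
        lines.append("> " + turn["prompt"])
        lines.append(turn["response"])
    return "\n".join(lines)

print(to_transcript(exported))
```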
```bash
opta sessions export a1b2c3d4 --format json --output session.json
```

## Tips
Several slash commands are available inside a chat session:

- `/model` -- switch the active model
- `/session` -- view current session info
- `/debug` -- toggle debug output
- `/help` -- list all available slash commands
To start a chat with a specific model:

```bash
opta chat --model qwen3-30b-a3b
```

You are now ready to use the full Opta Local stack. The next section covers the CLI reference in detail, including all available commands, configuration options, and slash commands.