First Session
Now that the CLI is installed and connected to LMX, it is time to run your first AI session. This page covers interactive chat, autonomous task execution, permission handling, and session management.
Start a Chat
The simplest way to interact with your local AI is the opta chat command. This opens an interactive chat session in your terminal.
Launch interactive chat
This starts the daemon (if not already running) and opens a new chat session.
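A minimal invocation, assuming the `opta` binary is on your PATH as described in the installation steps:

```shell
opta chat
```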
Type your first prompt
Once the session is active, type a message and press Enter.
Continue the conversation
The session maintains full context. Follow-up messages reference the entire conversation history.
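An illustrative exchange showing how context carries across turns (the prompts and responses below are examples, not actual model output):

```
> Explain what a reverse proxy does.
A reverse proxy sits in front of one or more backend servers and forwards
client requests to them ...

> How does that differ from a forward proxy?
Unlike a reverse proxy, a forward proxy acts on behalf of clients ...
```

The second prompt never names "reverse proxy", yet the model resolves "that" from the conversation history.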
Exit the session
Press Ctrl+C or type /exit to end the chat session.
Understanding Streaming
Responses stream token by token as the model generates them: text appears in your terminal incrementally rather than arriving all at once when generation finishes.
During streaming, the CLI shows:
- Token output -- the response text, rendered as it arrives
- Thinking indicators -- for reasoning models, a spinner or thinking block shows the model's internal reasoning before the final response
- Turn statistics -- after each response, the CLI displays token count, generation speed (tokens/sec), and elapsed time
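The statistics line might look like the following (this is an illustrative format only; the exact layout depends on your CLI version):

```
428 tokens · 41.3 tok/s · 10.4s
```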
Do Mode
While opta chat is conversational, opta do is action-oriented. It tells the AI to complete a specific task using available tools -- file operations, shell commands, code analysis, and more.
In do mode, the AI will:
- Analyze the task and plan the approach
- Use tools to read existing files, create new files, and run commands
- Ask for permission before potentially destructive operations
- Report the results when the task is complete
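For example, a task invocation might look like this (the task string is free-form; the steps above describe how the AI acts on it):

```shell
opta do "create a .gitignore suitable for a Python project"
```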
Permission Prompts
When the AI wants to perform an action in do mode, the CLI prompts you for approval. This is the permission system -- it ensures no tool runs without your explicit consent.
Your options at each permission prompt:
- y (yes) -- approve this single invocation
- n (no) -- deny this invocation; the AI will try an alternative approach
- a (always) -- approve all future invocations of this tool for the current session
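A permission prompt might look like the following (the wording, tool names, and key labels here are illustrative and may differ in your version):

```
The AI wants to run: shell_command
  rm -rf build/
Allow? [y/n/a]
```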
Managing Sessions
Every chat and do interaction creates a session. Sessions store the full conversation history, tool invocations, and metadata. You can list, resume, and manage sessions after they end.
List recent sessions
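One plausible form of this command (the exact subcommand name is an assumption and may differ in your install):

```shell
opta sessions
```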
Resume a previous session
Continue a conversation from where you left off. The full context is restored.
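A sketch of resuming, assuming a `resume` subcommand that takes a session ID from the session list (both are assumptions about the command surface):

```shell
opta resume <session-id>
```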
View session details
Inspect metadata, token usage, and tool call history for a session.
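A hypothetical inspection command, assuming a `show` subcommand under `sessions` (verify the exact name against your CLI's help output):

```shell
opta sessions show <session-id>
```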
Exporting Sessions
Sessions can be exported in several formats for sharing, archiving, or downstream processing:
- markdown -- human-readable Markdown document
- json -- full session data including metadata and tool calls
- text -- plain text transcript
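An export might look like the following sketch, assuming an `export` subcommand and a `--format` flag (both are assumptions; check your CLI's help for the real names):

```shell
opta sessions export <session-id> --format json > session.json
```

Redirecting to a file is useful for the `json` format, since the full session data includes metadata and tool calls.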
Next Steps
You are now ready to use the full Opta Local stack. The next section covers the CLI reference in detail, including all available commands, configuration options, and slash commands.