Chat Interface
The Chat interface is the central hub of interaction in ChatFrame, designed to provide a familiar, efficient conversation experience with Large Language Models (LLMs).
Starting a Conversation
- Select Provider/Model: Use the dropdown menu to choose your desired LLM provider and model (e.g., OpenAI's GPT-4, Anthropic's Claude 3.5).
- Enter Prompt: Type your question or instruction into the input field.
- Send: Press Enter or click the send button to initiate the conversation.
Key Features in Chat
Context Management
ChatFrame automatically manages the conversation history, sending previous turns as context to the LLM to maintain coherence.
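Conceptually, maintaining coherence amounts to resending the prior turns with each new request. A minimal sketch of that idea (the `ConversationContext` class and its methods are illustrative, not ChatFrame's actual API):

```python
# Minimal sketch of chat context management: each new request
# carries the accumulated history so the model sees prior turns.
# Class and method names are illustrative, not ChatFrame's API.

class ConversationContext:
    def __init__(self, system_prompt: str = "You are a helpful assistant."):
        self.messages = [{"role": "system", "content": system_prompt}]

    def add_user_turn(self, text: str) -> list[dict]:
        """Append the user's message and return the full payload to send."""
        self.messages.append({"role": "user", "content": text})
        return self.messages

    def add_assistant_turn(self, text: str) -> None:
        """Record the model's reply so the next request includes it."""
        self.messages.append({"role": "assistant", "content": text})

ctx = ConversationContext()
payload = ctx.add_user_turn("What is RAG?")
ctx.add_assistant_turn("RAG augments prompts with retrieved documents.")
payload = ctx.add_user_turn("Give an example.")
# The second request now carries four messages: system, user,
# assistant, and the follow-up user turn.
```

Most providers accept exactly this kind of role-tagged message list, which is why the pattern generalizes across models.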
Local RAG Integration
If you have indexed local files (see file-indexing.md), the chat automatically augments the LLM's response with relevant information from those files. Retrieval runs seamlessly in the background; no extra steps are required.
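That augmentation step is essentially retrieve-then-prepend: the most relevant chunks from the index are stitched into the prompt before it reaches the model. A toy sketch of the pattern (the keyword-overlap scoring here is a stand-in; real local indexing typically uses vector embeddings):

```python
# Toy retrieve-then-augment step. Real local indexing would use
# vector embeddings; naive keyword overlap stands in for clarity.
import re

def retrieve(query: str, indexed_chunks: list[str], top_k: int = 2) -> list[str]:
    """Rank indexed text chunks by word overlap with the query."""
    q_words = set(re.findall(r"\w+", query.lower()))
    return sorted(
        indexed_chunks,
        key=lambda c: len(q_words & set(re.findall(r"\w+", c.lower()))),
        reverse=True,
    )[:top_k]

def augment_prompt(query: str, indexed_chunks: list[str]) -> str:
    """Prepend retrieved context so the LLM can ground its answer."""
    context = "\n".join(retrieve(query, indexed_chunks))
    return f"Context:\n{context}\n\nQuestion: {query}"

chunks = [
    "The build script lives in scripts/build.sh.",
    "Release notes are kept in CHANGELOG.md.",
    "Lunch is at noon.",
]
print(augment_prompt("Where is the build script?", chunks))
```

The model then answers the question using the prepended context rather than relying solely on its training data.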
Artifacts
When you request the AI to generate code, diagrams, or visual content, the result will often appear as an Artifact in a dedicated panel, allowing for live rendering and iteration (see artifacts.md).
MCP Tools
If an MCP Server is connected and active, the LLM will be able to invoke the exposed tools (e.g., database queries, web search) during the conversation to provide more accurate and context-aware responses (see mcp-overview.md).
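Behind the scenes, tool use is a simple loop: the model emits a tool call, the client executes it against the connected server, and the result is fed back as another message for the model's next turn. An illustrative sketch (the tool registry and dispatch below are hypothetical stand-ins, not the MCP SDK):

```python
# Illustrative tool-invocation loop. The tool registry and the
# example tools are hypothetical stand-ins, not real MCP types.

TOOLS = {
    "web_search": lambda query: f"Top result for {query!r}",
    "db_query": lambda sql: f"3 rows returned for {sql!r}",
}

def handle_tool_call(name: str, argument: str) -> str:
    """Dispatch a model-requested tool call to the registered tool."""
    if name not in TOOLS:
        return f"error: unknown tool {name!r}"
    return TOOLS[name](argument)

# The model asks for a search; the client runs it and returns the
# result as a tool message that joins the conversation history.
result = handle_tool_call("web_search", "ChatFrame MCP")
tool_message = {"role": "tool", "name": "web_search", "content": result}
```

Because the tool result re-enters the context, the model can cite fresh, external data in its final answer.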
Performance
The application is engineered for performance: the interface stays smooth and responsive even when streaming responses from high-speed providers such as Groq.
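Streaming keeps the UI responsive because tokens are rendered as they arrive rather than after the whole reply completes. A schematic sketch, with a generator standing in for a real provider's streaming API:

```python
# Schematic token streaming: append each chunk as it arrives
# instead of blocking until the full response is ready. The
# generator stands in for a real provider's streaming API.
from typing import Iterator

def fake_token_stream() -> Iterator[str]:
    """Yield a reply a few tokens at a time, like a streaming provider."""
    for token in ["Stream", "ing ", "keeps ", "the ", "UI ", "responsive."]:
        yield token

def render_stream(stream: Iterator[str]) -> str:
    """Append each token to the visible text as soon as it arrives."""
    visible = ""
    for token in stream:
        visible += token  # in a real UI this would update the chat bubble
    return visible

print(render_stream(fake_token_stream()))
# → Streaming keeps the UI responsive.
```

The per-token update is what makes fast providers feel instantaneous: the first words appear within milliseconds, even if the full reply takes longer.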