# Getting Started
Welcome to ChatFrame, your personal AI workspace! This guide will help you get set up and start using the core features of the application.
## 1. Installation
First, make sure ChatFrame is installed and launches on your system.
## 2. Configure Your Providers (BYOK)
ChatFrame uses a Bring Your Own Key (BYOK) model. To start chatting, you must add your API keys for your preferred Large Language Model (LLM) providers.
- Launch ChatFrame.
- Navigate to the Providers tab (usually found in the settings or sidebar).
- Select your desired provider (e.g., OpenAI, Anthropic, Groq).
- Paste your API key into the designated field.
- Tip: The documentation table in `llms.md` lists direct links to each provider's key portal to help you find your keys.
- Repeat this process for all the providers you wish to use.
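Under the hood, a BYOK client attaches your stored key to each request in whatever form that provider's API expects. The sketch below is illustrative (not ChatFrame's actual code), but the header names match the public OpenAI, Anthropic, and Groq APIs:

```python
# Sketch: how a BYOK client might attach a stored key per provider.
# Header formats match the public OpenAI, Anthropic, and Groq APIs;
# the function itself is a hypothetical illustration.

def auth_headers(provider: str, api_key: str) -> dict:
    """Return the HTTP headers each provider expects for authentication."""
    if provider in ("openai", "groq"):  # both use Bearer tokens
        return {"Authorization": f"Bearer {api_key}"}
    if provider == "anthropic":  # Anthropic uses a custom header
        return {"x-api-key": api_key, "anthropic-version": "2023-06-01"}
    raise ValueError(f"unknown provider: {provider}")

print(auth_headers("anthropic", "sk-test"))
```

This is also why each key only needs to be entered once per provider: the same header format covers every model that provider serves.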
## 3. Start a Chat
Once your providers are configured:
- Navigate to the Chat interface.
- Select the LLM model you want to use from the dropdown menu.
- Enter your prompt and begin your conversation.
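The steps above boil down to a simple request shape: most providers expose an OpenAI-style chat-completions format, where each turn appends a message to the conversation history. A minimal sketch (the model name is a placeholder, not a recommendation):

```python
# Sketch: the JSON body a single chat turn typically produces for an
# OpenAI-style chat-completions endpoint. Model name is a placeholder.
import json

def build_chat_request(model: str, history: list, prompt: str) -> dict:
    """Append the new user prompt to the history and form a request body."""
    return {
        "model": model,
        "messages": history + [{"role": "user", "content": prompt}],
    }

body = build_chat_request("gpt-4o-mini", [], "Summarize this repo's README.")
print(json.dumps(body, indent=2))
```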
## 4. Leverage Local RAG
To chat with your own files:
- Navigate to the File Indexing or Projects section.
- Add your PDF, text, or code files to the local index.
- The application will perform vector indexing locally (your data never leaves your machine).
- In your chat, you can now reference the indexed files, and the AI will use this context for Retrieval-Augmented Generation (RAG).
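The flow above can be sketched in a few lines: chunks are embedded as vectors, a query retrieves the most similar chunk, and that chunk is prepended to the prompt as context. This toy version uses bag-of-words cosine similarity instead of learned embeddings, but the pipeline is the same, and everything runs on-device:

```python
# Minimal local-RAG sketch: embed chunks as bag-of-words vectors,
# rank by cosine similarity, prepend the best chunk to the prompt.
# Real indexers use learned embeddings; the flow is identical.
import math
from collections import Counter

def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

chunks = [
    "ChatFrame stores API keys locally on your machine.",
    "Vector indexing runs on-device for PDF, text, and code files.",
]
index = [(chunk, embed(chunk)) for chunk in chunks]

def retrieve(query: str) -> str:
    """Return the chunk most similar to the query."""
    return max(index, key=lambda pair: cosine(embed(query), pair[1]))[0]

context = retrieve("how does file indexing work?")
prompt = f"Context: {context}\n\nQuestion: how does file indexing work?"
print(context)
```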
## 5. Explore Advanced Features
- Artifacts: Try asking the AI to generate a React component or a Mermaid diagram to see the Artifact feature in action.
- MCP Servers: If you need to connect your AI to external databases or internal APIs, explore setting up an MCP Server (see `mcp-overview.md`).
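For context, MCP (Model Context Protocol) servers speak JSON-RPC 2.0, and a client opens each connection with an `initialize` handshake. The sketch below shows roughly what that first message looks like; the field values are illustrative placeholders, and `mcp-overview.md` covers how servers are actually configured:

```python
# Sketch of the JSON-RPC 2.0 "initialize" request an MCP client sends
# when connecting to a server. Values are illustrative placeholders.
import json

init_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # a published MCP revision
        "capabilities": {},
        "clientInfo": {"name": "chatframe", "version": "0.0.0"},  # placeholder
    },
}
print(json.dumps(init_request))
```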