Frequently Asked Questions (FAQ)
General
What is ChatFrame?
ChatFrame is a personal AI workspace—a single, polished desktop application for macOS and Windows that lets you chat with multiple Large Language Model (LLM) providers (OpenAI, Anthropic, Groq, etc.) from one interface.
What do I get exactly?
You receive a desktop application with:
- A unified interface for multiple LLM providers (bring your own API keys, BYOK).
- Local RAG (retrieval-augmented generation) over your own PDF, text, and code files; parsing and indexing happen entirely on your machine.
- Support for the Model Context Protocol (MCP) to plug in custom tools and databases.
- Artifact generation for rendering HTML, React, Mermaid, and SVG.
Is ChatFrame open source?
No. ChatFrame is closed-source and built on Tauri and the Vercel AI SDK.
Data and Privacy
Is my data ever sent to third-party services?
Only the prompts and context you explicitly send to an LLM provider are transmitted. All file parsing, vector indexing, and RAG operations happen locally on your computer. Your files never leave your machine.
How do I add my API keys?
Launch ChatFrame, open the Providers tab in settings, and paste the keys for the services you use. Your keys are stored locally and securely.
Technical
What is MCP and why should I care?
MCP (Model Context Protocol) is an open standard that lets language models call external tools—databases, web search, custom scripts, etc.—in a secure, standardized way. In practice, it means your chatbot can query a Postgres database or call your internal APIs.
Do I need to install Node.js or Python to use MCP servers?
Only if you choose to run MCP servers that require them (e.g., the Postgres MCP server needs Node.js). ChatFrame does not bundle runtimes; you control your own environment.
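As an illustration, MCP clients are typically pointed at servers through a small JSON configuration that specifies the command to launch. The snippet below is a sketch using the community reference Postgres server; the exact configuration format ChatFrame expects, and the placeholder connection string, are assumptions—check the in-app MCP settings for the supported fields.

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://localhost/mydb"
      ]
    }
  }
}
```

Here `npx` (which requires Node.js) downloads and launches the server on demand, which is why a runtime is needed only for the servers you actually enable.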
Updates
How do updates work?
Updates download automatically in the background. When a new version is ready, an "Install" button appears in the app; one click applies the update.