OpenAI Compatible Providers
ChatFrame supports OpenAI Compatible APIs, which lets you integrate any LLM service that adheres to the OpenAI API specification and expands your choice of models beyond the natively supported providers.
What is OpenAI Compatibility?
An LLM provider is considered "OpenAI Compatible" if its API endpoints, request formats (e.g., for chat completions), and response formats mirror those of the official OpenAI API. This standardization allows ChatFrame to communicate with various services using a single, proven protocol.
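To illustrate the shared protocol, the sketch below sends a chat completions request to a hypothetical compatible endpoint using the official openai Python SDK. The base URL, API key, and model name are placeholders, not values taken from ChatFrame's configuration.

```python
# A minimal sketch, assuming the openai Python SDK (v1+) and a hypothetical
# OpenAI-compatible provider; base_url, api_key, and model are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example-provider.com/v1",  # provider's OpenAI-compatible endpoint
    api_key="YOUR_PROVIDER_API_KEY",                 # key issued by that provider, not by OpenAI
)

# The request and response shapes match the OpenAI chat completions API,
# which is what lets a single client talk to many compatible providers.
response = client.chat.completions.create(
    model="provider-model-name",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```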
Supported Compatible Providers
Several major providers are compatible with the OpenAI API specification and can be configured as custom providers in ChatFrame:
- Cerebras (See cerebras.md)
- Qwen
- Zhipu (See zhipu.md)
- OpenRouter (See openrouter.md)
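For reference, the base URLs below are the commonly documented OpenAI-compatible endpoints for these providers. They are assumptions, not part of ChatFrame itself; confirm the current values in each provider's documentation (or the linked pages above) before configuring them.

```python
# Commonly documented OpenAI-compatible base URLs (assumptions; verify against
# each provider's current documentation before use).
OPENAI_COMPATIBLE_ENDPOINTS = {
    "cerebras":   "https://api.cerebras.ai/v1",
    "qwen":       "https://dashscope.aliyuncs.com/compatible-mode/v1",
    "zhipu":      "https://open.bigmodel.cn/api/paas/v4",
    "openrouter": "https://openrouter.ai/api/v1",
}
```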
Configuration Steps
To configure an OpenAI Compatible provider:
- Obtain API Key and Endpoint: Get the API key and the base API URL (endpoint) from your chosen provider.
- Navigate to Custom Providers: In ChatFrame, go to the Providers settings tab and select Custom Providers.
- Enter Details:
  - API Endpoint: Enter the base URL for the provider's API (e.g., https://api.cerebras.ai/v1).
  - API Key: Paste your secret API key.
- Model Discovery: ChatFrame will attempt to connect to the endpoint and discover the available models, which you can then select for your chats.
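Model discovery against a compatible endpoint typically relies on the standard OpenAI-style models endpoint. The sketch below illustrates that protocol with the openai Python SDK; it is an assumption about how discovery can work against such an endpoint, not a description of ChatFrame's internal implementation.

```python
# A minimal sketch of the model-discovery step, assuming the provider exposes
# the standard OpenAI-style GET /models endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.cerebras.ai/v1",  # example endpoint from the steps above
    api_key="YOUR_PROVIDER_API_KEY",
)

for model in client.models.list():
    print(model.id)  # the model IDs you would then select from in ChatFrame
```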
Custom Providers
For any other LLM service that claims OpenAI compatibility, you can use the Custom Providers configuration to integrate it into ChatFrame.