Custom Providers

ChatFrame's Custom Providers feature allows you to integrate any Large Language Model (LLM) service that adheres to the OpenAI API specification, providing maximum flexibility for your AI workflow.

When to Use Custom Providers

You should use the Custom Providers setting when:

  1. Your preferred LLM service is not natively listed in ChatFrame's main provider list.
  2. Your LLM service exposes an API compatible with the OpenAI API specification.
  3. You are running a local LLM server (e.g., via Ollama or a similar tool) that exposes an OpenAI-compatible endpoint.
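"OpenAI-compatible" means the service accepts the same request shape as OpenAI's Chat Completions API, so a client only needs a different base URL and key. The sketch below builds such a request without sending it; the `http://localhost:11434/v1` URL matches Ollama's default OpenAI-compatible endpoint, and the model name `llama3` is just an illustrative assumption:

```python
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style POST /chat/completions request (not yet sent)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            # Standard OpenAI-style auth; many local servers accept any key.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


# Aimed at Ollama's default local endpoint (hypothetical model name).
req = build_chat_request("http://localhost:11434/v1", "ollama", "llama3", "Hello!")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

Because only the base URL changes, the same request works against a hosted provider or a local server, which is exactly what makes these services usable as custom providers.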

Configuration Steps

The configuration process requires two key pieces of information from your custom provider: the API Endpoint and the API Key.

  1. Obtain Credentials: Consult the documentation of your custom provider to find:
    • Base API URL (Endpoint): The URL to which API requests are sent (e.g., https://api.mycustomllm.com/v1).
    • API Key/Token: The secret key required for authentication.
  2. Navigate in ChatFrame: Go to the Providers settings tab and select the Custom Providers option.
  3. Enter Details:
    • Input the Base API URL.
    • Input the API Key.
  4. Save and Connect: Save the settings. ChatFrame will attempt to connect to the endpoint and retrieve the list of available models.
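You can mimic step 4's model retrieval yourself: an OpenAI-compatible provider lists its models at GET {base URL}/models and returns them in a standard JSON shape. The sketch below parses that shape; the sample response body and model IDs are placeholders, not real provider output:

```python
import json


def parse_model_ids(models_response: str) -> list[str]:
    """Extract model IDs from an OpenAI-style GET /models response body."""
    payload = json.loads(models_response)
    # The spec returns {"object": "list", "data": [{"id": ...}, ...]}.
    return [entry["id"] for entry in payload.get("data", [])]


# A response body in the shape the OpenAI API defines for GET /models.
sample = '{"object": "list", "data": [{"id": "my-model-a"}, {"id": "my-model-b"}]}'
print(parse_model_ids(sample))  # ['my-model-a', 'my-model-b']
```

If this endpoint returns an empty list or an authentication error, the base URL or API key is usually the culprit, which is the first thing to re-check when the connection step fails.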

Once successfully configured, the models from your custom provider will appear in the model selection dropdown in your chat interface.