# Cloudflare
| Feature | Available | 
|---|---|
| Tools | No | 
| Multimodal | No | 
You may use Cloudflare Workers AI to run your own models with serverless inference.
You will need a Cloudflare account; from there, get your account ID and an API token for Workers AI.
You can either specify them in your `.env.local` via the `CLOUDFLARE_ACCOUNT_ID` and `CLOUDFLARE_API_TOKEN` variables, or set them directly in the endpoint config.
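For example, your `.env.local` could contain the following, with the placeholder values replaced by your own credentials:

```ini
CLOUDFLARE_ACCOUNT_ID=your-account-id
CLOUDFLARE_API_TOKEN=your-api-token
```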
You can find the list of models available on Cloudflare [here](https://developers.cloudflare.com/workers-ai/models/).
```ini
MODELS=`[
  {
    "name": "nousresearch/hermes-2-pro-mistral-7b",
    "tokenizer": "nousresearch/hermes-2-pro-mistral-7b",
    "parameters": {
      "stop": ["<|im_end|>"]
    },
    "endpoints": [
      {
        "type": "cloudflare"
        <!-- optionally specify these
        "accountId": "your-account-id",
        "authToken": "your-api-token"
        -->
      }
    ]
  }
]`
```
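If you want to confirm the credentials and model name before wiring them into Chat UI, a standalone request against the Workers AI REST API is a quick sanity check. The sketch below assumes the standard `accounts/{account_id}/ai/run/{model}` endpoint and the `@hf/nousresearch/hermes-2-pro-mistral-7b` model slug; adjust the slug to whatever appears in Cloudflare's model listing.

```ts
// Minimal sanity check for the Workers AI credentials, run outside Chat UI.
// Assumes Node 18+ (built-in fetch) and the variables from .env.local being set.
const accountId = process.env.CLOUDFLARE_ACCOUNT_ID;
const apiToken = process.env.CLOUDFLARE_API_TOKEN;

const res = await fetch(
  `https://api.cloudflare.com/client/v4/accounts/${accountId}/ai/run/@hf/nousresearch/hermes-2-pro-mistral-7b`,
  {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      messages: [{ role: "user", content: "Say hello in one sentence." }],
    }),
  },
);

// A successful call returns JSON with `success: true` and the generated text under `result`.
console.log(await res.json());
```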