
OpenClaw

OpenClaw is a personal AI assistant. It answers you on the channels you already use, such as WhatsApp, Telegram, Slack, and Microsoft Teams. FloTorch's OpenAI-compatible API can be used natively in OpenClaw to interact with your AI resources. All FloTorch controls, such as Guardrails, Load Balancing, Dynamic Query Routing, and Model Routing, work out of the box in OpenClaw.

To use OpenClaw with FloTorch, you need to create a FloTorch API Key and use it in the OpenClaw configuration.

  1. Create a FloTorch API Key in the FloTorch Console (Create API Key).
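Before editing any configuration, you can sanity-check the new key directly against the gateway. This is a sketch: the base URL is the one used in the provider configuration below, and the /models path is the standard OpenAI-compatible model-listing endpoint, which the FloTorch gateway is assumed to expose.

```shell
# List the models visible to your key via the OpenAI-compatible API.
# Assumes the gateway exposes the standard /models listing endpoint.
curl -s https://gateway.flotorch.cloud/openai/v1/models \
  -H "Authorization: Bearer <your-flotorch-api-key>"
```

A JSON response listing models confirms the key is valid; an authentication error means the key was copied incorrectly.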

OpenClaw supports multiple providers. You can configure FloTorch as a provider in OpenClaw by following the steps below. This needs to be configured directly in the OpenClaw config file; the default configuration file is located at ~/.openclaw/openclaw.json.

Take a backup of the existing configuration file, just in case you want to revert to the original configuration.

Terminal window
cp ~/.openclaw/openclaw.json ~/.openclaw/openclaw.json.bkup

Open the configuration file in a text editor.

Terminal window
nano ~/.openclaw/openclaw.json

Configure Flotorch as a provider in OpenClaw


Find the models section in the configuration file. Inside the models section, you will find the list of providers under the providers key. Add flotorch as a provider.

It should look something like the example below. Replace <your-flotorch-api-key> with the API key you created in the FloTorch Console.

"models": {
"mode": "merge",
"providers": {
"flotorch": {
"baseUrl": "https://gateway.flotorch.cloud/openai/v1",
"api": "openai-completions",
"apiKey": "<your-flotorch-api-key>",
"models": []
},
}
// other providers
}
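If you prefer to script the change, the provider block can be inserted programmatically. This is a minimal sketch, assuming the config file is plain JSON (the real openclaw.json may allow comments, which Python's json module would reject); the key names and URL come from the snippet above.

```python
import json
import tempfile
from pathlib import Path

def add_flotorch_provider(config_path, api_key):
    """Insert the flotorch provider block into an OpenClaw-style JSON config.

    Assumes the file is strict JSON; a config containing // comments
    would need a JSONC-aware parser instead.
    """
    path = Path(config_path)
    config = json.loads(path.read_text())
    models = config.setdefault("models", {"mode": "merge", "providers": {}})
    providers = models.setdefault("providers", {})
    providers["flotorch"] = {
        "baseUrl": "https://gateway.flotorch.cloud/openai/v1",
        "api": "openai-completions",
        "apiKey": api_key,
        "models": [],
    }
    path.write_text(json.dumps(config, indent=2))
    return config

# Example against a throwaway file, not your real ~/.openclaw/openclaw.json:
tmp = Path(tempfile.mkdtemp()) / "openclaw.json"
tmp.write_text('{"models": {"mode": "merge", "providers": {}}}')
updated = add_flotorch_provider(tmp, "<your-flotorch-api-key>")
```

Running it against a copy of your config first (as backed up above) is the safer workflow.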

Models are usually configured in the agents section of the configuration file. Inside the agents section, you will find the list of agents; you can add a FloTorch model to an agent by adding it to the models key. If you have not already created a FloTorch model, you can create one by clicking the Create Model button in the FloTorch Console (Create Model). Copy the name of the model; it usually looks like flotorch/<model-name>.

Once you configure the model in the OpenClaw configuration file, it should look something like the example below.

Replace <your-flotorch-model-name> with the model name you copied from the FloTorch Console. Since that name already starts with flotorch/, the filled-in reference ends up looking like flotorch/flotorch/<model-name>: the first flotorch is the OpenClaw provider name, and the rest is the model name from the console.

"agents": {
"defaults": {
"model": {
"primary": "flotorch/<your-flotorch-model-name>"
},
"models": {
"flotorch/flotorch/<your-flotorch-model-name>": {}
},
}
// other agents and their configurations
}
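The double flotorch prefix is easy to misread, so here is the naming convention spelled out. This sketch assumes OpenClaw resolves model references as <provider>/<model-name>, splitting on the first slash only; that is an inference from the naming shown above, not OpenClaw's actual resolver.

```python
def split_model_id(model_id: str):
    """Split an OpenClaw-style model reference into (provider, model name).

    Only the first '/' separates the provider, so a FloTorch model whose
    console name itself starts with 'flotorch/' keeps that prefix in the
    model-name half. Assumption based on the convention shown above.
    """
    provider, _, name = model_id.partition("/")
    return provider, name

provider, name = split_model_id("flotorch/flotorch/my-model")
# provider -> "flotorch"           (the OpenClaw provider)
# name     -> "flotorch/my-model"  (the model name from the FloTorch Console)
```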

Restart OpenClaw to apply the changes.

Terminal window
openclaw gateway restart
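If you want to verify the gateway wiring independently of OpenClaw, you can call the chat completions endpoint directly with the same base URL and key used in the configuration above. A sketch, assuming the gateway follows the standard OpenAI chat completions request shape and that the model field takes the model name exactly as shown in the FloTorch Console (without OpenClaw's provider prefix):

```shell
# Send a test message straight to the FloTorch gateway.
curl -s https://gateway.flotorch.cloud/openai/v1/chat/completions \
  -H "Authorization: Bearer <your-flotorch-api-key>" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "<your-flotorch-model-name>",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```

If this returns a completion but OpenClaw does not, the problem is in the OpenClaw configuration rather than the key or model.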

You can now switch models, apply guardrails, and monitor usage and costs in the FloTorch Console, without worrying about the underlying OpenClaw configuration.

In the current version of OpenClaw (2026.2.3), there is a known issue with custom OpenAI integrations: you may occasionally see the message Cannot read properties of undefined (reading '0') in a reply. This will be resolved once OpenClaw fixes the issue - OpenClaw Issue.