Agents config
This page covers the required setup before using Budibase Agents and Agent Chat.
Before building an Agent or enabling Agent Chat, you need at least one configured model in Workspace Settings > Connections > AI models.
For a broader AI setup walkthrough, see Quickstart: Budibase AI.
Before you start
Make sure you have:
- Access to Workspace Settings
- Credentials for your chosen model provider
- At least one model you can run chat/agents with
Configure your first model in AI models
- Open your Workspace Settings
- Go to Connections > AI models
- Click Connect
- Choose a provider and fill out the required credentials
- Select or type a model
- Add a display name and save
Once saved, the model becomes available in Agent model selection.
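If a model misbehaves after saving, it helps to rule out credential or model-name problems independently of Budibase. A minimal sketch for an OpenAI-compatible chat endpoint, using only the Python standard library (the base URL, key, and model name below are placeholders, and other providers use different request shapes):

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Assemble a minimal OpenAI-style chat completion request."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return url, headers, body

def send(url, headers, body):
    """POST the request and return the parsed JSON response."""
    req = urllib.request.Request(
        url, data=json.dumps(body).encode(), headers=headers, method="POST"
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

url, headers, body = build_chat_request(
    "https://api.openai.com/v1", "YOUR_API_KEY", "gpt-4o-mini", "Say hello"
)
# send(url, headers, body)  # uncomment with real credentials
```

If this request succeeds outside Budibase but the same values fail inside it, the problem is with the saved configuration rather than the provider.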
Supported providers
Budibase supports multiple model providers:
| Provider | Summary |
|---|---|
| Budibase AI | Budibase-managed provider |
| Anthropic | Connect directly to Claude models |
| Google | Connect directly to Gemini models |
| Mistral | Connect directly to Mistral models |
| OpenAI | Connect directly to OpenAI models |
| OpenRouter | Connect to a broad catalog of hosted model APIs |
| Groq | Connect to high-speed hosted model APIs |
Custom provider setup
If your provider is not listed, use Connect to a custom provider.
Depending on the provider, you may need values such as:
- API base URL
- API key
- Access key and secret key
- Default model name
Use the exact values your provider expects: incorrect values may still save successfully, but the connection will fail at runtime.
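Because a configuration can save and still fail at runtime, it is worth sanity-checking the values before entering them. A sketch with hypothetical field names (the actual labels in the Budibase form may differ, and the URL and model are placeholders):

```python
# Hypothetical custom-provider settings; replace with your provider's values.
custom_provider = {
    "base_url": "https://llm.example.com/v1",  # include any required path prefix
    "api_key": "YOUR_API_KEY",
    "default_model": "my-chat-model",          # must match the provider's exact model ID
}

# Catch the most common mistakes before they save-but-fail at runtime.
assert custom_provider["base_url"].startswith("https://")
assert custom_provider["api_key"], "API key must not be empty"
assert custom_provider["default_model"], "model name must not be empty"
```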
Validate your provider setup
Before attaching a model to an Agent:
- Confirm the provider shows as enabled in Connections > AI models
- Confirm your target model name is available and correctly spelled
- Run a simple test from an Agent with a short prompt
- Save the successful configuration as your default team setup
Selecting models in Agents
After configuring a model in AI models, attach it to each Agent:
- Open Agents
- Create a new Agent or open an existing one
- Click Connect AI Model
- Select the configured provider and model
Agent Chat readiness checklist
Before enabling Agent Chat, confirm:
- Your Agent has a connected AI model
- The selected model supports chat/completion prompts
- The Agent is saved and can answer a simple test prompt
Next, configure chat deployment in Agent Chat.
Troubleshooting
If model connection or Agent tests fail:
- Re-check API credentials and model ID
- Verify any provider-specific base URL value
- Confirm outbound internet access is allowed from your environment
- Re-run a simple prompt test after saving changes
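For providers that expose an OpenAI-compatible API, listing the available models is a quick way to confirm both the credentials and the exact model ID in one call. A standard-library sketch (the base URL and key are placeholders; providers without an OpenAI-compatible `/models` endpoint need a different check):

```python
import json
import urllib.request

def models_url(base_url: str) -> str:
    """URL of the OpenAI-compatible model-listing endpoint."""
    return f"{base_url.rstrip('/')}/models"

def list_models(base_url: str, api_key: str):
    """Fetch available model IDs from the provider."""
    req = urllib.request.Request(
        models_url(base_url),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]

# Uncomment with real credentials; the ID you configured in Budibase
# must appear in this list exactly, including casing:
# print(list_models("https://api.openai.com/v1", "YOUR_API_KEY"))
```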
If you are self-hosting and using Budibase AI, ensure your network allows the required outbound traffic. See Self-hosted AI features.