Configure LLM Providers and Models
18 Nov 2018 · 1 minute to read
This guide provides step-by-step instructions for administrators to configure Large Language Model (LLM) providers and manage models within Code Studio.
Note: Only admins can configure LLM providers and models.
Prerequisites
- An active Code Studio account.
- An OpenRouter API key - follow the detailed instructions here to obtain an OpenRouter API key and a list of recommended free models for integration.
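Before adding a key in Code Studio, it can be useful to check that it is accepted by the provider. The sketch below builds an authenticated request against OpenRouter's public model-listing endpoint (`https://openrouter.ai/api/v1/models`); sending it returns a JSON model list for a valid key and an HTTP 401 for a bad one. This is a minimal illustration, not part of Code Studio itself.

```python
import urllib.request

OPENROUTER_MODELS_URL = "https://openrouter.ai/api/v1/models"  # public model-listing endpoint

def build_models_request(api_key: str) -> urllib.request.Request:
    """Build an authenticated request against the OpenRouter models endpoint.

    Sending this request with urllib.request.urlopen returns a JSON model
    list when the key is valid; a 401 response indicates a bad key.
    """
    return urllib.request.Request(
        OPENROUTER_MODELS_URL,
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = build_models_request("sk-or-...")  # placeholder key, not a real credential
print(req.full_url)
print(req.get_header("Authorization"))
```

Swap the placeholder for your real key and call `urllib.request.urlopen(req)` to perform the actual check.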
1. Adding an LLM Provider
To integrate an LLM provider into Code Studio:
Steps:
- Navigate to the LLM keys page under the BYOK section.
- Click on “Add LLM Key”.
- In the dialog box:
- Provider Name: Enter the name of the LLM provider (e.g., OpenAI, Anthropic).
- API Key: Paste the API key provided by the LLM service.
- Click “Add” to save the provider.
✅ Once added, the provider will be available for model selection.
2. Adding BYOK Models
To add a model under a configured provider:
Steps:
- Go to the Models page under the BYOK section.
- Use the search bar to look for a specific model.
- Click “Add Model”.
- In the form:
- Provider: Select from the list of configured providers.
- Model: Choose from the dropdown list of models available under the selected provider. Each model is listed with its input and output costs.
- Modes: Select one or more modes in which the model should be used. This is a multi-select field, allowing you to choose multiple modes for a single model.
- Click “Add” to include the model in your BYOK model list.
- Click Manage Default Settings to customize which AI models are used for key IDE functionalities like chat, edit, apply, and autocomplete.
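The steps above amount to: register a model under a provider, enable it for one or more modes, then map IDE functionalities to default models. The sketch below illustrates that relationship; all names (the `BYOKModel` record, the mode set, the cost units, and the example model ID) are assumptions for illustration, not Code Studio's actual data model.

```python
from dataclasses import dataclass, field

# Hypothetical mode names, taken from the functionalities listed above.
MODES = {"chat", "edit", "apply", "autocomplete"}

@dataclass
class BYOKModel:
    provider: str
    model: str
    input_cost: float                       # assumed unit: cost per token batch
    output_cost: float
    modes: set = field(default_factory=set)  # multi-select: one model, several modes

def set_default(defaults: dict, mode: str, model: BYOKModel) -> None:
    """Assign a model as the default for one IDE functionality."""
    if mode not in MODES:
        raise ValueError(f"unknown mode: {mode}")
    if mode not in model.modes:
        raise ValueError(f"{model.model} is not enabled for {mode}")
    defaults[mode] = model.model

# Example: a free model enabled for chat and edit, set as the chat default.
m = BYOKModel("OpenRouter", "mistralai/mistral-7b-instruct:free", 0.0, 0.0, {"chat", "edit"})
defaults = {}
set_default(defaults, "chat", m)
print(defaults)  # {'chat': 'mistralai/mistral-7b-instruct:free'}
```

Attempting to set the same model as the autocomplete default would raise an error, since that mode was not selected for it; this mirrors why the mode multi-select matters when configuring defaults.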