Connect to your Large Language Model (LLM)
In addition to selecting Azure OpenAI and Google Vertex AI models as the LLM for your Spotter experience, you can now connect your own LLM via a custom gateway. This option gives you full control over your AI stack by connecting ThoughtSpot directly to your private LLM endpoints. Note that when you connect your own LLM, you are responsible for its performance, cost, and maintenance. Before you connect, gather the required credentials for your LLM provider, such as API keys, endpoint URLs, or service account details.
| To edit the LLM configuration of your instance, you must have administrator privileges. | 
Supported models
For the best results when connecting to your own LLM, we recommend using the following models:
- GPT-4.1
- Claude Sonnet 4
 
You must ensure that the models you configure are available at your specified endpoint.
Connect to your LLM
We currently support connections to Azure OpenAI, Google Vertex AI, and custom LLM gateways.
Connect to Azure OpenAI
You will need:
- Your endpoint URL.
- An API key.
- The deployment name for your Default model.
 
For steps on how to create an Azure OpenAI resource, see this article. For steps on setting up and configuring a GPT deployment using the Azure OpenAI service, see this article.
Configure your Azure OpenAI connection
To configure your Azure OpenAI connection, follow these steps:
1. In the Connect to Your LLM setup, select ThoughtSpot Enabled. Click Manual and select Azure OpenAI from the list of providers.
2. In the configuration form, enter the following details:
   - Model Alias: A user-friendly name for your connection (e.g., Azure-Marketing-GPT4).
   - Endpoint: Your Azure OpenAI resource endpoint URL.
   - API Key: Your Azure OpenAI API key.
3. Configure your models:
   - Default Model (Required): Enter the Azure deployment name for your primary model.
4. Click Test Connection to validate your credentials.
5. If the test is successful, click Save.
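Before you rely on Test Connection, you can optionally verify the endpoint, API key, and deployment name outside ThoughtSpot. The sketch below sends a minimal chat completion request directly to Azure OpenAI with Python and the requests library; the endpoint, deployment name, and API version shown are placeholders, so substitute the values from your own resource.

```python
# Minimal sketch: verify an Azure OpenAI deployment with a direct REST call.
# The endpoint, deployment name, and API version below are placeholders --
# replace them with the values from your Azure OpenAI resource.
import os
import requests

endpoint = "https://YOUR-RESOURCE.openai.azure.com"  # Endpoint field
deployment = "gpt-4-1"                               # Default Model deployment name (placeholder)
api_version = "2024-02-01"                           # an API version your resource supports
api_key = os.environ["AZURE_OPENAI_API_KEY"]         # API Key field

url = f"{endpoint}/openai/deployments/{deployment}/chat/completions?api-version={api_version}"
response = requests.post(
    url,
    headers={"api-key": api_key, "Content-Type": "application/json"},
    json={"messages": [{"role": "user", "content": "ping"}], "max_tokens": 5},
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

If this request returns a completion, the same values should pass Test Connection.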
 
If you need to connect to Azure OpenAI using OAuth 2.0 (via Entra ID), you must set up an API gateway. Follow the guide to connect to an LLM Gateway.
Connect to Google Vertex AI
You will need:
- The JSON key file for a service account with the Vertex AI User role.
- Your Google Cloud Project ID.
- The Region for your models.
- The Model ID for your Default model.
 
For steps on how to create Service account keys, see this article.
An administrator must create the service account and give it the correct permissions to access Vertex AI models. Once it is created, grant the service account either the Vertex AI User (roles/aiplatform.user) or Vertex AI Administrator (roles/aiplatform.admin) IAM role so it has permission to make API calls.
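If you want to confirm that a downloaded key file is usable before pasting it into ThoughtSpot, the sketch below exchanges the key for an access token using the google-auth Python library; the file path is a placeholder. This only proves the key itself is valid; whether Vertex AI calls succeed still depends on the IAM role granted above.

```python
# Minimal sketch: confirm a service account JSON key can obtain an access token.
# Requires the google-auth package; the key file path is a placeholder.
import google.auth.transport.requests
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "key.json",  # path to the downloaded JSON key (placeholder)
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
credentials.refresh(google.auth.transport.requests.Request())  # exchanges the key for a token
print("Service account:", credentials.service_account_email)
print("Token acquired:", bool(credentials.token))
```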
Configure your Google Vertex AI connection
1. In the Connect to Your LLM setup, select ThoughtSpot Enabled. Click Manual and select Google Vertex AI from the list of providers.
2. In the configuration form, enter the following details:
   - Model Alias: A user-friendly name for your connection (e.g., Vertex-Gemini-Finance).
   - Project ID: Your Google Cloud Project ID.
   - Region: The region where your Vertex AI models are deployed (e.g., us-central1).
   - Service Account JSON: Paste the entire content of your service account JSON key file into this field.
3. Configure your models:
   - Default Model: Enter the Model ID for your primary model.
4. Click Test Connection to validate your credentials.
5. If the test is successful, click Save.
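To check that the Project ID, Region, service account key, and Model ID work together, you can call the model directly with the Vertex AI Python SDK. A minimal sketch, assuming the google-cloud-aiplatform package is installed; the key path, project ID, region, and model ID are placeholders.

```python
# Minimal sketch: call a Vertex AI model with the same values entered in the form.
# Requires the google-cloud-aiplatform package; all values below are placeholders.
import vertexai
from google.oauth2 import service_account
from vertexai.generative_models import GenerativeModel

credentials = service_account.Credentials.from_service_account_file(
    "key.json",  # same key you paste into Service Account JSON (placeholder path)
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)

vertexai.init(
    project="your-project-id",  # Project ID field (placeholder)
    location="us-central1",     # Region field (placeholder)
    credentials=credentials,
)

model = GenerativeModel("gemini-1.5-pro")  # Model ID for your Default Model (placeholder)
print(model.generate_content("ping").text)
```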
 
Connect to an LLM gateway
Before you connect, check the following:
- Your gateway must be fully compatible with the OpenAI /v1/chat/completions API specification and must have tool calling enabled (see the compatibility-check sketch below).
- You will need the credentials for your gateway, which vary based on the authentication method.
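One way to sanity-check compatibility before configuring ThoughtSpot is to send your gateway an OpenAI-style request that includes a tool definition. A minimal sketch with Python and requests; the gateway URL, auth header, model name, and the add tool are all placeholders that depend on your gateway.

```python
# Minimal sketch: confirm a gateway accepts an OpenAI-style /v1/chat/completions
# request that includes tool definitions. All values below are placeholders.
import requests

gateway_url = "https://your-gateway.example.com"    # Gateway URL (placeholder)
headers = {"Authorization": "Bearer YOUR_API_KEY"}  # adjust to your gateway's auth scheme

payload = {
    "model": "gpt-4.1",  # the Default Model Name your gateway expects (placeholder)
    "messages": [{"role": "user", "content": "What is 2 + 2?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "add",
            "description": "Add two numbers.",
            "parameters": {
                "type": "object",
                "properties": {"a": {"type": "number"}, "b": {"type": "number"}},
                "required": ["a", "b"],
            },
        },
    }],
}

response = requests.post(f"{gateway_url}/v1/chat/completions",
                         headers=headers, json=payload, timeout=30)
response.raise_for_status()
print(response.json()["choices"][0]["message"])  # expect content or a tool_calls entry
```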
 
Configure your connection
1. In the Connect to Your LLM setup, select Connect Your Own LLM from the list of providers.
2. Choose your Authentication Method:
   - Select API Key if your gateway is secured by a static API key. Enter the following information:
     - Model Alias: A user-friendly name for your connection.
     - Gateway URL: The base URL of your gateway endpoint.
     - API Key: The API key required to access your gateway.
     - Default Model Name: The model name your gateway expects.
   - Select OAuth2 if your gateway is secured by Machine-to-Machine OAuth 2.0. This is common when connecting to an Azure API Management gateway secured by Entra ID. Enter the following information (a token-request sketch follows these steps):
     - Model Alias: A user-friendly name for your connection.
     - Gateway URL: The base URL of your gateway endpoint.
     - Token Endpoint URL: The URL of the authentication server that issues access tokens.
     - Client ID: The Client ID for your application.
     - Client Secret: The Client Secret for your application.
     - Default Model Name: The model name your gateway expects.
3. After filling in the details for your chosen method, click Test Connection.
4. If the test is successful, click Save.
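For the OAuth2 method, ThoughtSpot obtains access tokens from your Token Endpoint URL using the machine-to-machine client-credentials grant. You can check the Client ID and Client Secret yourself with a request like the sketch below; the token endpoint and scope shown are Entra ID-style placeholders (for Entra ID, the scope is typically your API's App ID URI followed by /.default).

```python
# Minimal sketch: request a machine-to-machine token with the OAuth 2.0
# client-credentials grant. The endpoint, IDs, and scope are placeholders.
import requests

token_endpoint = "https://login.microsoftonline.com/YOUR_TENANT_ID/oauth2/v2.0/token"  # Token Endpoint URL
data = {
    "grant_type": "client_credentials",
    "client_id": "YOUR_CLIENT_ID",               # Client ID field
    "client_secret": "YOUR_CLIENT_SECRET",       # Client Secret field
    "scope": "api://your-gateway-app/.default",  # assumption: Entra ID-style scope
}

response = requests.post(token_endpoint, data=data, timeout=30)
response.raise_for_status()
token = response.json()["access_token"]
print("Token acquired:", token[:20], "...")

# The token is then presented to the gateway as a bearer token, for example:
# requests.post(f"{gateway_url}/v1/chat/completions",
#               headers={"Authorization": f"Bearer {token}"}, json=payload)
```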