A chat workspace defines the configuration for a conversational search session, including the LLM provider, system prompt, and search behavior. You can create multiple workspaces targeting different use cases, such as a public-facing knowledge base and an internal support tool.
On Meilisearch Cloud, the default workspace name is cloud. Replace WORKSPACE_NAME with cloud in all API calls. If you need additional workspaces, contact us.

Create a workspace

Create a workspace by sending a PATCH request to /chats/{workspace_uid}/settings. If the workspace does not exist, Meilisearch creates it automatically.
curl \
  -X PATCH 'MEILISEARCH_URL/chats/my-support-bot/settings' \
  -H 'Authorization: Bearer MEILISEARCH_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "source": "openAi",
    "apiKey": "YOUR_OPENAI_API_KEY",
    "prompts": {
      "system": "You are a helpful support assistant. Answer questions based only on the provided context."
    }
  }'
The workspace_uid in the URL (in this example, my-support-bot) is a unique identifier you choose. Use a descriptive name that reflects the workspace’s purpose.

Configure the LLM provider

The source field determines which LLM provider Meilisearch uses. Each provider has slightly different requirements:
| Provider | `source` value | Required fields | Optional fields |
|---|---|---|---|
| OpenAI | `openAi` | `apiKey` | `baseUrl`, `orgId`, `projectId` |
| Azure OpenAI | `azureOpenAi` | `apiKey`, `baseUrl` | `deploymentId`, `apiVersion` |
| Mistral | `mistral` | `apiKey`, `baseUrl` | |
| vLLM | `vLlm` | `baseUrl` | |
baseUrl is required for all providers except OpenAI. For OpenAI, it is optional and only needed when using a custom endpoint.
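For example, a self-hosted vLLM deployment needs only a baseUrl. The URL below is a placeholder: vLLM typically exposes an OpenAI-compatible API under /v1, but use whatever address your server listens on:

```shell
curl \
  -X PATCH 'MEILISEARCH_URL/chats/my-support-bot/settings' \
  -H 'Authorization: Bearer MEILISEARCH_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "source": "vLlm",
    "baseUrl": "http://localhost:8000/v1"
  }'
```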

Azure OpenAI example

Azure OpenAI requires additional fields for deployment configuration:
curl \
  -X PATCH 'MEILISEARCH_URL/chats/my-support-bot/settings' \
  -H 'Authorization: Bearer MEILISEARCH_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "source": "azureOpenAi",
    "apiKey": "YOUR_AZURE_API_KEY",
    "baseUrl": "https://your-resource.openai.azure.com",
    "deploymentId": "your-deployment-id",
    "apiVersion": "2024-02-01"
  }'

Configure the system prompt

The system prompt gives the conversational agent its baseline instructions. It controls the agent’s behavior, tone, and scope. Set it through the prompts.system field:
curl \
  -X PATCH 'MEILISEARCH_URL/chats/my-support-bot/settings' \
  -H 'Authorization: Bearer MEILISEARCH_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "prompts": {
      "system": "You are a customer support agent for an online bookstore. Only answer questions about books, orders, and shipping. If the user asks about unrelated topics, politely redirect them to the relevant support channel."
    }
  }'
The prompts object accepts additional fields that help the LLM understand how to use Meilisearch’s search capabilities:
| Field | Description |
|---|---|
| `system` | Baseline instructions for the conversational agent |
| `searchDescription` | Describes the search function to the LLM, helping it understand when and how to search |
| `searchQParam` | Describes the query parameter, guiding the LLM on how to formulate search queries |
| `searchFilterParam` | Describes the filter parameter, helping the LLM construct appropriate filters |
| `searchIndexUidParam` | Describes the index UID parameter, guiding the LLM on which index to search |
These fields provide additional context that improves how the agent formulates searches. For guidance on writing effective system prompts, see configure guardrails.
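For instance, a workspace for the bookstore assistant might describe the search tool and its query parameter like this. The field values are illustrative examples, not defaults:

```shell
curl \
  -X PATCH 'MEILISEARCH_URL/chats/my-support-bot/settings' \
  -H 'Authorization: Bearer MEILISEARCH_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "prompts": {
      "searchDescription": "Searches the bookstore catalog and help center. Call this whenever the user asks about books, orders, or shipping.",
      "searchQParam": "A short keyword query describing what the user is looking for, such as \"return policy\" or \"hardcover fantasy novels\"."
    }
  }'
```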

LLM provider parameters passthrough

Meilisearch forwards standard chat completion parameters directly to the configured LLM provider. You can include parameters such as temperature, top_p, frequency_penalty, or presence_penalty in your chat completions requests, and Meilisearch passes them through to the provider as-is. For example, lowering temperature makes responses more deterministic and factual, while raising it produces more varied and creative output:
curl -N \
  -X POST 'MEILISEARCH_URL/chats/WORKSPACE_NAME/chat/completions' \
  -H 'Authorization: Bearer MEILISEARCH_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "model": "PROVIDER_MODEL_UID",
    "messages": [
      {
        "role": "user",
        "content": "What are your return policies?"
      }
    ],
    "temperature": 0.2
  }'
Available parameters and their behavior depend on the LLM provider you configured. Refer to your provider’s documentation for the full list of supported parameters and their effects.

Configure indexes for chat

Before a workspace can search your data, each index must have its chat settings configured. See the dedicated configure index chat settings guide for full documentation on description, documentTemplate, searchParameters, and other fields.
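As a rough sketch, index chat settings live under the index settings endpoint. The payload below is a minimal illustration only; the index name, description, and template are placeholders, and the full payload shape is covered in the guide linked above:

```shell
curl \
  -X PATCH 'MEILISEARCH_URL/indexes/support-articles/settings' \
  -H 'Authorization: Bearer MEILISEARCH_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "chat": {
      "description": "Help center articles about orders, returns, and shipping",
      "documentTemplate": "{{ doc.title }}: {{ doc.content }}"
    }
  }'
```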

Verify workspace configuration

Retrieve the current settings for a workspace at any time:
curl \
  -X GET 'MEILISEARCH_URL/chats/WORKSPACE_NAME/settings' \
  -H "Authorization: Bearer MEILISEARCH_KEY"
This returns the full configuration, including the provider and system prompt. Note that the apiKey value is redacted in the response for security.
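If you have jq installed, you can extract a single field from the response to confirm a specific setting, for example the configured provider:

```shell
curl -s \
  -X GET 'MEILISEARCH_URL/chats/WORKSPACE_NAME/settings' \
  -H 'Authorization: Bearer MEILISEARCH_KEY' \
| jq '.source'
```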

Update workspace settings

Update any workspace setting by sending a PATCH request with only the fields you want to change. Fields you omit remain unchanged:
curl \
  -X PATCH 'MEILISEARCH_URL/chats/WORKSPACE_NAME/settings' \
  -H "Authorization: Bearer MEILISEARCH_KEY" \
  -H "Content-Type: application/json" \
  --data-binary '{ "apiKey": "your-valid-api-key" }'
For example, to update only the system prompt without changing the provider:
curl \
  -X PATCH 'MEILISEARCH_URL/chats/my-support-bot/settings' \
  -H 'Authorization: Bearer MEILISEARCH_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "prompts": {
      "system": "You are a helpful assistant for a tech documentation site. Always include code examples in your answers when relevant."
    }
  }'

Next steps