On Meilisearch Cloud, the default workspace name is `cloud`. Replace `WORKSPACE_NAME` with `cloud` in all API calls. If you need additional workspaces, contact us.

Create a workspace
Create a workspace by sending a `PATCH` request to `/chats/{workspace_uid}/settings`. If the workspace does not exist, Meilisearch creates it automatically.
The `workspace_uid` in the URL (in this example, `my-support-bot`) is a unique identifier you choose. Use a descriptive name that reflects the workspace’s purpose.
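As a sketch, the request can be composed like this in Python. The instance URL, workspace uid, and API key are placeholders; the body uses the provider fields described in the next section:

```python
import json

# Placeholders: a local Meilisearch instance and a workspace uid of
# "my-support-bot" -- substitute your own values.
workspace_uid = "my-support-bot"
url = f"http://localhost:7700/chats/{workspace_uid}/settings"

# Sending this body in a PATCH request to `url` creates the workspace
# if it does not already exist, and updates it otherwise.
settings = {
    "source": "openAi",
    "apiKey": "<your-llm-provider-api-key>",
}

print("PATCH", url)
print(json.dumps(settings, indent=2))
```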
Configure the LLM provider
The `source` field determines which LLM provider Meilisearch uses. Each provider has slightly different requirements:
| Provider | `source` value | Required fields | Optional fields |
|---|---|---|---|
| OpenAI | `openAi` | `apiKey` | `baseUrl`, `orgId`, `projectId` |
| Azure OpenAI | `azureOpenAi` | `apiKey`, `baseUrl` | `deploymentId`, `apiVersion` |
| Mistral | `mistral` | `apiKey`, `baseUrl` | |
| vLLM | `vLlm` | `baseUrl` | |
`baseUrl` is required for all providers except OpenAI. For OpenAI, it is optional and only needed when using a custom endpoint.
Azure OpenAI example
Azure OpenAI requires additional fields for deployment configuration: in addition to `apiKey` and `baseUrl`, the optional `deploymentId` and `apiVersion` fields identify the model deployment and API version to use.

Configure the system prompt
The system prompt gives the conversational agent its baseline instructions. It controls the agent’s behavior, tone, and scope. Set it through the `prompts.system` field:
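For example, a sketch of a settings body that sets the system prompt (the wording is illustrative, not a recommended default):

```python
import json

# Illustrative only: adjust the instructions to your own use case.
settings = {
    "prompts": {
        "system": (
            "You are a support assistant. Answer using only the retrieved "
            "documents, and say so explicitly when the answer is not in them."
        )
    }
}

print(json.dumps(settings, indent=2))
```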
The `prompts` object accepts additional fields that help the LLM understand how to use Meilisearch’s search capabilities:
| Field | Description |
|---|---|
| `system` | Baseline instructions for the conversational agent |
| `searchDescription` | Describes the search function to the LLM, helping it understand when and how to search |
| `searchQParam` | Describes the query parameter, guiding the LLM on how to formulate search queries |
| `searchFilterParam` | Describes the filter parameter, helping the LLM construct appropriate filters |
| `searchIndexUidParam` | Describes the index UID parameter, guiding the LLM on which index to search |
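Putting these together, one plausible `prompts` object might look like the following sketch (all descriptions are illustrative, not canonical defaults):

```python
import json

# Every value here is a hint for the LLM; phrase them for your data.
prompts = {
    "system": "You answer questions about the product documentation.",
    "searchDescription": "Searches the documentation index for relevant passages.",
    "searchQParam": "Keywords capturing what the user is asking about.",
    "searchFilterParam": "A Meilisearch filter expression to narrow results.",
    "searchIndexUidParam": "The uid of the index to search.",
}

print(json.dumps({"prompts": prompts}, indent=2))
```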
LLM provider parameters passthrough
Meilisearch forwards standard chat completion parameters directly to the configured LLM provider. This means you can include parameters like `temperature`, `top_p`, `frequency_penalty`, or `presence_penalty` in your chat completions requests, and Meilisearch will pass them through to the provider as-is.
For example, lowering `temperature` makes responses more deterministic and factual, while raising it produces more varied and creative outputs:
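As a sketch, a chat completions request body with a low `temperature` might look like this. The model name and question are placeholders, and which parameters are accepted depends on your provider:

```python
import json

# OpenAI-compatible chat completions body; `temperature` is forwarded
# to the provider unchanged.
body = {
    "model": "gpt-4o-mini",  # placeholder model name
    "messages": [
        {"role": "user", "content": "How do I reset my password?"}
    ],
    "temperature": 0.2,  # low value: more deterministic, factual output
}

print(json.dumps(body, indent=2))
```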
Available parameters and their behavior depend on the LLM provider you configured. Refer to your provider’s documentation for the full list of supported parameters and their effects.
Configure indexes for chat
Before a workspace can search your data, each index must have its chat settings configured. See the dedicated configure index chat settings guide for full documentation on `description`, `documentTemplate`, `searchParameters`, and other fields.
Verify workspace configuration
Retrieve the current settings for a workspace at any time with a `GET` request to the same `/chats/{workspace_uid}/settings` endpoint. The `apiKey` value is redacted in the response for security.
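A minimal sketch using only the standard library to build (not send) such a request; the URL and key are placeholders:

```python
import urllib.request

workspace_uid = "my-support-bot"
req = urllib.request.Request(
    f"http://localhost:7700/chats/{workspace_uid}/settings",
    headers={"Authorization": "Bearer <your-meilisearch-api-key>"},
    method="GET",
)

# urllib.request.urlopen(req) would return the settings JSON,
# with the apiKey value redacted.
print(req.method, req.full_url)
```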
Update workspace settings
Update any workspace setting by sending a `PATCH` request with only the fields you want to change. Fields you omit remain unchanged:
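For instance, this sketch of a `PATCH` body changes only the system prompt; the provider configuration set earlier is left untouched:

```python
import json

# Partial update: only `prompts.system` changes; omitted fields
# (source, apiKey, ...) keep their current values.
partial = {
    "prompts": {
        "system": "New baseline instructions for the agent."
    }
}

print(json.dumps(partial, indent=2))
```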
Next steps
- Stream chat responses to deliver answers token by token
- Configure guardrails to control the scope and quality of responses
- Reduce hallucination with system prompt engineering and few-shot prompting
- Consult the workspace settings API reference and the chat completions API reference for all available parameters