This guide uses the same /chats API as multi-turn chat, but with a different prompt strategy: instead of building a conversation, you send a single question and receive a summarized answer based on your indexed documents. This is useful for displaying AI-generated answers alongside traditional search results.
Make sure you have completed the setup guide before continuing.
In code examples, replace WORKSPACE_NAME with the name of your workspace. On Meilisearch Cloud, the default workspace name is cloud.

Configure your workspace prompt for summarization
The key difference from a chat interface is the system prompt. For summarization, instruct the model to produce concise, self-contained answers and avoid follow-up questions:

- The system prompt explicitly asks for short, self-contained answers
- The model is told not to ask follow-up questions
- Responses are limited to a few sentences
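As a sketch of those three points, the body of a workspace settings update might look like the following. The prompt wording and the `prompts.system` field shape are assumptions — verify them against the chat settings reference for your Meilisearch version:

```python
import json

# Hypothetical summarization prompt; adapt the wording to your own data.
SYSTEM_PROMPT = (
    "You are a search assistant. Answer the user's question in at most "
    "three sentences, using only the indexed documents. Give a single, "
    "self-contained answer and never ask follow-up questions."
)

# Body for a PATCH to the workspace chat settings endpoint
# (field names assumed; check your version's API reference).
settings_payload = {
    "prompts": {
        "system": SYSTEM_PROMPT,
    }
}

print(json.dumps(settings_payload, indent=2))
```

Sending this body to your workspace's settings endpoint applies the prompt to every subsequent completion in that workspace, so individual requests stay small.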
Send a single question
Send a request to the chat completions endpoint. The difference from multi-turn chat is that you only send one message and do not maintain conversation history.

The _meiliSearchSources tool lets you display the source documents alongside the summarized answer, so users can verify the information. In a real application, you would run this in parallel with a standard Meilisearch search request and display both results together.
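A minimal sketch of such a single-question request body, assuming an OpenAI-compatible payload; the model name and the streaming flag are illustrative assumptions to check against the chat completions reference:

```python
import json

# Body for a POST to the workspace chat completions endpoint.
request_payload = {
    "model": "gpt-4o-mini",  # hypothetical model name; use your provider's
    "stream": True,          # streamed responses (assumed to be required)
    "messages": [
        # A single user message -- no prior conversation history is sent.
        {"role": "user", "content": "What is the refund policy?"}
    ],
}

print(json.dumps(request_payload, indent=2))
```

Because no earlier turns are included, each request is independent: the summarized answer depends only on the question and the indexed documents.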
For summarization, you may want to use a lower temperature value (for example, 0.1 or 0.2) to produce more deterministic, factual answers. Meilisearch passes these parameters through to your LLM provider.
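Extending the request body with a sampling parameter is a one-line change; the field below follows the OpenAI-compatible convention, which Meilisearch is described as passing through to the provider:

```python
# Same single-question body as before, with a low temperature added for
# more deterministic, factual answers (model name is hypothetical).
payload = {
    "model": "gpt-4o-mini",
    "stream": True,
    "temperature": 0.1,  # forwarded to the underlying LLM provider
    "messages": [
        {"role": "user", "content": "Summarize the warranty terms."}
    ],
}
```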
Next steps
Build a chat interface
Create a multi-turn conversational interface with follow-up questions.
Display source documents
Show users which documents were used to generate the summary.
Configure guardrails
Restrict AI responses to topics covered by your data.
Reduce hallucination
Learn techniques to improve accuracy of AI-generated answers.
Chat completions API reference
Full reference for the chat completions endpoint.