Overview
```
POST /api/chat
```

Streaming can be disabled by setting `"stream": false`. The final response object will include statistics and additional data from the request.

Parameters
- `model`: (required) the model name
- `messages`: the messages of the chat; this can be used to keep a chat memory
- `tools`: a list of tools in JSON for the model to use, if supported

The `message` object has the following fields:

- `role`: the role of the message, either `system`, `user`, `assistant`, or `tool`
- `content`: the content of the message
- `images` (optional): a list of images to include in the message (for multimodal models such as `llava`)
- `tool_calls` (optional): a list of tools in JSON that the model wants to use
- `format`: the format to return a response in. Format can be `json` or a JSON schema.
- `options`: additional model parameters listed in the documentation for the Modelfile, such as `temperature`
- `stream`: if `false`, the response will be returned as a single response object rather than a stream of objects
- `keep_alive`: controls how long the model will stay loaded into memory following the request (default: `5m`)
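As a sketch of how these parameters fit together, the following Python snippet builds a request body exercising the fields above and defines a small helper to POST it. The model name `llama3.2` and the default server address `http://localhost:11434` are assumptions; actually sending the request requires a running server, so the helper is defined but not called here.

```python
import json
import urllib.request

# Request body for POST /api/chat, using the parameters described above.
payload = {
    "model": "llama3.2",  # required; assumed to be a locally available model
    "messages": [
        {"role": "system", "content": "You are a terse assistant."},
        {"role": "user", "content": "Why is the sky blue?"},
    ],
    "stream": False,                  # return one response object, not a stream
    "options": {"temperature": 0.7},  # Modelfile-style parameter override
    "keep_alive": "5m",               # how long the model stays loaded (the default)
}

def chat(body, url="http://localhost:11434/api/chat"):
    """POST the body to /api/chat and return the parsed response object."""
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# With a server running, chat(payload)["message"]["content"] would hold
# the assistant's reply; the call is not made here.
```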
Structured outputs

Structured outputs are supported by providing a JSON schema in the `format` parameter. The model will generate a response that matches the schema. See the Chat request (Structured outputs) example below.
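A structured-outputs request looks like a regular chat request with a JSON schema in `format`. This is a sketch only: the schema fields (`age`, `available`) and the model name are illustrative assumptions, not part of the API.

```python
import json

# Illustrative JSON schema; passing it via "format" constrains the
# model's reply to match this shape.
schema = {
    "type": "object",
    "properties": {
        "age": {"type": "integer"},
        "available": {"type": "boolean"},
    },
    "required": ["age", "available"],
}

payload = {
    "model": "llama3.2",  # assumed model name
    "messages": [
        {"role": "user", "content": "Ollama is 22 years old and is busy saving the world. Respond using JSON."}
    ],
    "format": schema,  # the string "json" is also accepted for unconstrained JSON mode
    "stream": False,
}

# POST json.dumps(payload) to /api/chat; the returned message content
# should then parse with json.loads() into an object matching the schema.
```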