Chat
Use the Chat API to create a simple conversation agent.
Available models
Model | Release date | Context Length | Description |
---|---|---|---|
solar-pro-preview | 2024-09-10 | 4096 | A more intelligent, instruction-following Solar LLM with IFEval 80+. The official version with expanded language support and a longer context length will be released in November 2024. solar-pro supports English only at this time. solar-pro is an alias for our latest Solar Pro model (currently solar-pro-preview-240910). |
solar-mini (beta) (Prev. solar-1-mini-chat) | 2024-06-14 | 32768 | A compact LLM offering superior performance to GPT-3.5, with robust multilingual capabilities in both English and Korean, delivering high efficiency in a smaller package. solar-mini is an alias for our latest Solar Mini model (currently solar-mini-240612). |
solar-mini-ja (beta) (Prev. solar-1-mini-chat-ja) | 2024-06-14 | 32768 | A compact LLM that extends the capabilities of Solar Mini with specialization in Japanese, while maintaining high efficiency and performance in English and Korean. solar-mini-ja is an alias for our latest Solar Mini ja model (currently solar-mini-ja-240612). |
Request
POST https://api.upstage.ai/v1/solar/chat/completions
Parameters
The messages parameter is a list of message objects. Each message object has a role (one of "system", "user", or "assistant") and content. The conversation starts with a "system" message, followed by messages from "user" and "assistant".

Users can control the behavior of the assistant throughout the conversation via system messages. For example, they can convey the assistant's role, attitude, or specific instructions that must be followed.

A user message is what the user inputs. Through the user message, the user can request actions from the assistant. An assistant message can either record a previous response from the assistant or serve as an example demonstrating the actions the assistant should take in response to the user's request.
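A minimal sketch of the messages list described above. The conversation content here is illustrative only; note that system prompts are not supported in Solar Pro Preview, so this shape applies to models such as solar-mini.

```python
# Illustrative conversation: a system message sets the assistant's
# behavior, then user and assistant messages follow in turn.
messages = [
    {"role": "system", "content": "You are a concise, friendly assistant."},
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "The capital of France is Paris."},
    {"role": "user", "content": "And what is its population?"},
]
```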
In the tools array, you can specify custom functions for function calling, enabling the model to generate function calls and their arguments in JSON format for integration with external tools.
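A sketch of one entry in the tools array. The function name get_current_weather and its parameters are hypothetical examples, not part of the API; the parameters field follows JSON Schema.

```python
# Hypothetical tool definition for function calling.
# "parameters" is a JSON Schema describing the function's arguments.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["city"],
            },
        },
    }
]
```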
Request headers
Authorization string Required |
Request body
messages list Required |
messages[].content string Optional |
messages[].role string Required ✨ System prompts are not currently supported in Solar Pro Preview. This feature will be available in the official release of Solar Pro. |
messages[].tool_call_id string Optional |
messages[].tool_calls list Optional |
messages[].tool_calls[].id string Required |
messages[].tool_calls[].type string Required |
messages[].tool_calls[].function object Required |
messages[].tool_calls[].function.name string Required |
messages[].tool_calls[].function.arguments string Required |
model string Required |
max_tokens integer Optional |
stream boolean Optional |
temperature float Optional |
top_p float Optional |
tools array Optional |
tools[].type string Required |
tools[].function object Required |
tools[].function.description string Optional |
tools[].function.name string Required |
tools[].function.parameters object Required |
tool_choice string or object Optional |
Response
Return values
Returns a chat.completion
object, or a streamed sequence of chat.completion.chunk
objects if the request is streamed.
The chat completion object
id string |
object string |
created integer |
model string |
system_fingerprint null |
choices list |
choices[].finish_reason string |
choices[].index integer |
choices[].message object |
choices[].message.content string |
choices[].message.role string |
choices[].message.tool_calls list |
choices[].message.tool_calls[].id string |
choices[].message.tool_calls[].type string |
choices[].message.tool_calls[].function object |
choices[].message.tool_calls[].function.name string |
choices[].message.tool_calls[].function.arguments string |
choices[].logprobs null |
usage object |
usage.completion_tokens integer |
usage.prompt_tokens integer |
usage.total_tokens integer |
The chat completion chunk object
id string |
object string |
created integer |
model string |
system_fingerprint null |
choices list |
choices[].finish_reason string |
choices[].index integer |
choices[].delta object |
choices[].delta.content string |
choices[].delta.role string or null |
choices[].delta.tool_calls list |
choices[].delta.tool_calls[].id string |
choices[].delta.tool_calls[].type string |
choices[].delta.tool_calls[].function object |
choices[].delta.tool_calls[].function.name string |
choices[].delta.tool_calls[].function.arguments string |
choices[].logprobs null |
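When stream is true, the full assistant message is reassembled by concatenating delta.content across the chunk objects above. A minimal sketch, using hand-written chunk dicts as stand-ins for parsed server-sent events:

```python
# Illustrative chat.completion.chunk objects (already parsed from SSE).
chunks = [
    {"choices": [{"index": 0, "delta": {"role": "assistant", "content": ""},
                  "finish_reason": None}]},
    {"choices": [{"index": 0, "delta": {"content": "Hello"},
                  "finish_reason": None}]},
    {"choices": [{"index": 0, "delta": {"content": " there!"},
                  "finish_reason": None}]},
    {"choices": [{"index": 0, "delta": {}, "finish_reason": "stop"}]},
]

def assemble(chunks):
    """Concatenate delta.content across streamed chunks into the full reply."""
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0].get("delta", {})
        content = delta.get("content")
        if content:
            parts.append(content)
    return "".join(parts)

print(assemble(chunks))  # Hello there!
```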
Example
Request
curl --location 'https://api.upstage.ai/v1/solar/chat/completions' \
--header 'Authorization: Bearer UPSTAGE_API_KEY' \
--header 'Content-Type: application/json' \
--data '{
"model": "solar-pro",
"messages": [
{
"role": "user",
"content": "Hi, how are you?"
}
],
"stream": true
}'
Response
{
"id": "e1a90437-df41-45cd-acc6-a7bacbdd2a86",
"object": "chat.completion",
"created": 1707269210,
"model": "solar-pro-preview-240910",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "Hello and welcome! How can I assist you today?"
},
"logprobs": null,
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 23,
"completion_tokens": 12,
"total_tokens": 35
},
"system_fingerprint": null
}
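For reference, a Python equivalent of the curl request above can be built with the standard library. The sketch below only constructs the request; the commented-out lines show how it would be sent with a valid API key (UPSTAGE_API_KEY is a placeholder).

```python
import json
import urllib.request

# Same payload as the curl example above (non-streaming here).
payload = {
    "model": "solar-pro",
    "messages": [{"role": "user", "content": "Hi, how are you?"}],
    "stream": False,
}

req = urllib.request.Request(
    "https://api.upstage.ai/v1/solar/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer UPSTAGE_API_KEY",  # placeholder key
        "Content-Type": "application/json",
    },
    method="POST",
)

# To actually send the request (requires a valid API key):
# with urllib.request.urlopen(req) as resp:
#     completion = json.load(resp)
#     print(completion["choices"][0]["message"]["content"])
```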