
Chat

Use the Chat API to build a simple conversational agent.

Available models

Model: solar-1-mini-chat
Release date: 2024-02-22 (beta)
Context length: 16384
Description: A compact LLM offering superior performance to GPT-3.5, with robust multilingual capabilities for both English and Korean, delivering high efficiency in a smaller package.

Request

POST https://api.upstage.ai/v1/solar/chat/completions

Parameters

The messages parameter is a list of message objects. Each message object has a role (one of "system", "user", "assistant", or "tool") and content. The conversation typically starts with a "system" message, followed by messages from "user" and "assistant".

Users can control the behavior of the assistant throughout the conversation via system messages. For example, they can convey the assistant's role, attitude, or specific instructions that need to be followed.

A user message is what the user inputs. Through the user message, the user can request actions from the assistant. An assistant message can either record the previous response from the assistant or serve as an example to demonstrate the actions that the assistant should take based on the user's request.
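
For illustration, a short conversation passed in messages might look like this (the contents are placeholders):

[
  {
    "role": "system",
    "content": "You are a helpful assistant."
  },
  {
    "role": "user",
    "content": "What is the capital of France?"
  },
  {
    "role": "assistant",
    "content": "The capital of France is Paris."
  },
  {
    "role": "user",
    "content": "What is its population?"
  }
]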

In the tools array, you can specify custom functions for function calling, enabling the model to generate function calls and their arguments in JSON format for integration with external tools.

Request headers

Authorization string Required
Authentication token, format: Bearer API_KEY

Request body

messages list Required
A list of messages comprising the conversation so far.

messages[].content string Optional
The contents of the message.

messages[].role string Required
The role of the message's author: one of "system", "user", "assistant", or "tool".

messages[].tool_call_id string Optional
The ID of the tool call that this message is responding to. This field is required when messages[].role is "tool".

messages[].tool_calls list Optional
The tool calls generated by the model in a previous assistant response. messages[].content should be null when you are using this field.

messages[].tool_calls[].id string Required
The ID of the tool call.

messages[].tool_calls[].type string Required
The type of tool. Should be function.

messages[].tool_calls[].function object Required
The function call generated by the model.

messages[].tool_calls[].function.name string Required
The name of the function.

messages[].tool_calls[].function.arguments string Required
The arguments of the function call, generated by the model as JSON. The arguments value must be a valid, escaped JSON string.
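
To illustrate how these fields fit together, an assistant message carrying a tool call and the tool message that returns its result might look like this (the ID, function name, and values are placeholders):

[
  {
    "role": "assistant",
    "content": null,
    "tool_calls": [
      {
        "id": "call_abc123",
        "type": "function",
        "function": {
          "name": "find_location",
          "arguments": "{\"location\": \"Eiffel Tower\"}"
        }
      }
    ]
  },
  {
    "role": "tool",
    "tool_call_id": "call_abc123",
    "content": "48.8584 N, 2.2945 E"
  }
]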

model string Required
The model name to generate the completion.

max_tokens integer Optional
An optional parameter that limits the maximum number of tokens to generate. If max_tokens is set, the sum of the input tokens and max_tokens must be less than or equal to the model's context length. The default value is inf.

stream boolean Optional
An optional parameter that specifies whether the response should be sent as a stream. If set to true, partial message deltas are sent as data-only server-sent events. The default value is false.

temperature float Optional
An optional parameter to set the sampling temperature. The value should lie between 0 and 2. Higher values like 0.8 result in a more random output, whereas lower values such as 0.2 enhance focus and determinism in the output. Default value is 0.7.

top_p float Optional
An optional parameter that enables nucleus sampling: only the tokens comprising the top_p probability mass are considered. For example, setting this value to 0.1 means only the tokens making up the top 10% of probability mass are considered.

tools array Optional
A list of tools the model may use to generate its output. Each entry in tools defines a tool and its JSON input schema. The model will generate JSON for a tool call if it decides one is needed.

tools[].type string Required
The type of tool. Should be function.

tools[].function object Required
A JSON definition of the function.

tools[].function.description string Optional
A description of the function. This field helps the model understand what your function does and when to call it.

tools[].function.name string Required
The name of the function.

tools[].function.parameters object Required
A JSON definition of the input parameters to your function. The parameters field follows the JSON Schema specification.

ex) Parameters for a function that finds the location of a place

{
    "type": "object",
    "properties": {
        "location": {
            "type": "string",
            "description": "Location to search with google map"
        }
    },
    "required": [
        "location"
    ]
}

tool_choice string or object Optional
Chooses which tool is used to generate the response. Use the value auto for automatic selection, or pass a JSON object to select one of your functions.
ex) {"type": "function", "function": {"name": "find_location"}}

Response

Return values

Returns a chat.completion object, or a streamed sequence of chat.completion.chunk objects if the request is streamed.

The chat completion object

id string
A unique identifier for the chat completion.

object string
The object type, which is always chat.completion.

created integer
The Unix timestamp (in seconds) of when the chat completion was created.

model string
A string representing the version of the model being used.

system_fingerprint null
This field is not yet available.

choices list
A list of chat completion choices.

choices[].finish_reason string
The reason the model stopped generating tokens. This will be stop if the model hit a natural stop point or a provided stop sequence, or length if the maximum number of tokens specified in the request was reached.

choices[].index integer
The index of the choice in the list of choices.

choices[].message object
A chat completion message generated by the model.

choices[].message.content string
The contents of the message.

choices[].message.role string
The role of the author of this message.

choices[].message.tool_calls list
A list of tool calls selected by the model.

choices[].message.tool_calls[].id string
The ID of the tool call.

choices[].message.tool_calls[].type string
The type of tool.

choices[].message.tool_calls[].function object
A function object to call.

choices[].message.tool_calls[].function.name string
The name of the function to call.

choices[].message.tool_calls[].function.arguments string
The JSON-encoded arguments to pass to the function.

choices[].logprobs null
This field is not yet available.

usage object
Usage statistics for the completion request.

usage.completion_tokens integer
Number of tokens in the generated completion.

usage.prompt_tokens integer
Number of tokens in the prompt.

usage.total_tokens integer
Total number of tokens used in the request (prompt + completion).

The chat completion chunk object

id string
A unique identifier for the chat completion. Each chunk has the same ID.

object string
The object type, which is always chat.completion.chunk.

created integer
The Unix timestamp (in seconds) of when the chat completion was created. Each chunk has the same timestamp.

model string
A string representing the version of the model being used.

system_fingerprint null
This field is not yet available.

choices list
A list of chat completion choices.

choices[].finish_reason string
The reason the model stopped generating tokens. This will be stop if the model hit a natural stop point or a provided stop sequence, or length if the maximum number of tokens specified in the request was reached.

choices[].index integer
The index of the choice in the list of choices.

choices[].delta object
A partial chat completion message (delta) generated by the model for this chunk.

choices[].delta.content string
The contents of the message.

choices[].delta.role string or null
The role of the author of this message.

choices[].delta.tool_calls list
A list of tool calls selected by the model.

choices[].delta.tool_calls[].id string
The ID of the tool call.

choices[].delta.tool_calls[].type string
The type of tool.

choices[].delta.tool_calls[].function object
A function object to call.

choices[].delta.tool_calls[].function.name string
The name of the function to call.

choices[].delta.tool_calls[].function.arguments string
The JSON-encoded arguments to pass to the function.

choices[].logprobs null
This field is not yet available.
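
For illustration, when stream is set to true each chunk arrives as a data-only server-sent event; a single event might look like the following (the values are illustrative):

data: {"id": "e1a90437-df41-45cd-acc6-a7bacbdd2a86", "object": "chat.completion.chunk", "created": 1707269210, "model": "upstage/solar-1-mini-chat-ko-0108", "system_fingerprint": null, "choices": [{"index": 0, "delta": {"role": "assistant", "content": "Hello"}, "logprobs": null, "finish_reason": null}]}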

Example

Request

curl --location 'https://api.upstage.ai/v1/solar/chat/completions' \
  --header 'Authorization: Bearer YOUR_API_KEY' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "solar-1-mini-chat",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "Hello!"
      }
    ],
    "stream": true
  }'

Response

{
  "id": "e1a90437-df41-45cd-acc6-a7bacbdd2a86",
  "object": "chat.completion",
  "created": 1707269210,
  "model": "upstage/solar-1-mini-chat-ko-0108",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello and welcome! How can I assist you today?"
      },
      "logprobs": null,
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 23,
    "completion_tokens": 12,
    "total_tokens": 35
  },
  "system_fingerprint": null
}