Serverless completions chunk object
Represents a streamed chunk of a completions response returned by the model, based on the provided input.
data: {
    "id": "cmpl-26a1e10db8544bc3adb488d2d205288b",
    "model": "meta-llama-3.1-8b-instruct",
    "object": "text_completion",
    "choices": [
        {
            "index": 0,
            "text": " such",
            "token": 1778,
            "finish_reason": null,
            "logprobs": null
        }
    ],
    "created": 1733382157
}

data: {
    "id": "cmpl-26a1e10db8544bc3adb488d2d205288b",
    "model": "meta-llama-3.1-8b-instruct",
    "object": "text_completion",
    "choices": [
        {
            "index": 0,
            "text": " as",
            "token": 439,
            "finish_reason": null,
            "logprobs": null
        }
    ],
    "created": 1733382157
}

...

data: {
    "id": "cmpl-26a1e10db8544bc3adb488d2d205288b",
    "model": "meta-llama-3.1-8b-instruct",
    "object": "text_completion",
    "choices": [
        {
            "index": 0,
            "text": "",
            "finish_reason": "length",
            "logprobs": null
        }
    ],
    "created": 1733382157
}

data: {
    "id": "cmpl-26a1e10db8544bc3adb488d2d205288b",
    "model": "meta-llama-3.1-8b-instruct",
    "object": "text_completion",
    "choices": [],
    "usage": {
        "prompt_tokens": 5,
        "completion_tokens": 10,
        "total_tokens": 15
    },
    "created": 1733382157
}

data: [DONE]
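A minimal sketch of how a client might consume this stream, assuming an OpenAI-compatible completions endpoint that accepts "stream": true and sends each data: event with its JSON payload on a single line, as is typical for server-sent events. The endpoint URL, API key handling, and request parameters below are illustrative assumptions; the chunk parsing follows the format shown above.

import json
import requests  # assumed HTTP client; any SSE-capable client works

# Hypothetical endpoint and credentials -- substitute your provider's values.
URL = "https://api.example.com/v1/completions"
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}

payload = {
    "model": "meta-llama-3.1-8b-instruct",
    "prompt": "Streaming is useful",
    "max_tokens": 10,
    "stream": True,  # ask the server to send chunks as they are generated
}

completion_text = []
with requests.post(URL, headers=HEADERS, json=payload, stream=True) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines(decode_unicode=True):
        if not line or not line.startswith("data: "):
            continue  # skip blank keep-alive lines
        data = line[len("data: "):]
        if data == "[DONE]":
            break  # sentinel marking the end of the stream
        chunk = json.loads(data)
        for choice in chunk.get("choices", []):
            completion_text.append(choice.get("text", ""))
            if choice.get("finish_reason") == "length":
                print("generation stopped at the max_tokens limit")
        if "usage" in chunk:
            # The final content-free chunk carries the token accounting.
            print("token usage:", chunk["usage"])

print("".join(completion_text))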
id: A unique ID of the completion.
object: The object type, which is always set to text_completion.
model: The model used to generate the completion.
index: The index of the choice in the list of generated choices.
text: The generated text for this chunk.
token: The ID of the generated token.
finish_reason: Termination condition of the generation. stop means the API returned the full completion generated by the model without running into any limits. length means the generation exceeded max_tokens or the conversation exceeded the max context length. Available options: stop, length.
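A short sketch of how a client might react to finish_reason. The chunk shape follows the examples above; the status strings and the suggested follow-up actions are illustrative assumptions, not prescribed by the API.

def handle_finish_reason(choice: dict) -> str | None:
    """Interpret the finish_reason of a streamed choice.

    Returns a short status string, or None while generation is still running.
    """
    reason = choice.get("finish_reason")
    if reason is None:
        return None  # more chunks are still coming for this choice
    if reason == "stop":
        return "complete"  # the model finished the completion on its own
    if reason == "length":
        # Truncated: either max_tokens or the max context length was hit.
        # A caller might re-issue the request with a larger max_tokens,
        # or continue from the text generated so far.
        return "truncated"
    return f"unexpected finish_reason: {reason}"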
logprobs: Log probability information for the choice. When present, it includes:
- The starting character position of each token in the generated text, useful for mapping tokens back to their exact location for detailed analysis.
- The log probabilities of each generated token, indicating the model's confidence in selecting each token.
- A list of individual tokens generated in the completion, representing segments of text such as words or pieces of words.
- A list of dictionaries, where each dictionary represents the top alternative tokens considered by the model at a specific position in the generated text, along with their log probabilities. The number of items in each dictionary matches the value of the logprobs request parameter.
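An illustrative sketch of reading the log probability data described above. In the streamed chunks on this page logprobs is null, so the subfield names used below (tokens, token_logprobs, text_offset, top_logprobs) are assumptions modeled on common OpenAI-compatible completions responses, not field names confirmed by this reference.

import math

# Hypothetical logprobs payload; field names are assumed, not taken from this page.
logprobs = {
    "tokens": [" such", " as"],            # individual generated tokens
    "token_logprobs": [-0.41, -0.73],      # log probability of each token
    "text_offset": [24, 29],               # starting character position of each token
    "top_logprobs": [                      # top alternative tokens per position
        {" such": -0.41, " like": -1.35},
        {" as": -0.73, " that": -1.90},
    ],
}

for tok, lp, off in zip(logprobs["tokens"], logprobs["token_logprobs"], logprobs["text_offset"]):
    # Convert the log probability back to a probability for readability.
    print(f"offset {off:3d}  token {tok!r:10}  p = {math.exp(lp):.2f}")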
created: The Unix timestamp (in seconds) for when the token was sampled.