POST /serverless/v1/tokenize

To run an inference request successfully, you must provide a Friendli Token (e.g. flp_XXX) in the Bearer Token field. Refer to the authentication section of our introduction page to learn how to acquire this token, and visit here to generate one.

Authorizations

Authorization
string
header
required

When using Friendli Endpoints API for inference requests, you need to provide a Friendli Token for authentication and authorization purposes.

For more detailed information, please see here.

Headers

X-Friendli-Team
string

ID of the team to run requests as (optional parameter).

Body

application/json
model
string
required

Code of the model to use. See the list of available models.

prompt
string
required

Input text prompt to tokenize.
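
A minimal request sketch using Python's requests library is shown below. The base URL, token value, team ID, and model code are illustrative assumptions; substitute the values for your own account and see the model list for valid model codes.

```python
import requests

# A minimal sketch of a tokenize request.
# The token, team ID, and model code below are placeholders, not real values.
FRIENDLI_TOKEN = "flp_XXX"  # your Friendli Token

response = requests.post(
    "https://api.friendli.ai/serverless/v1/tokenize",  # assumed base URL
    headers={
        "Authorization": f"Bearer {FRIENDLI_TOKEN}",
        # "X-Friendli-Team": "YOUR_TEAM_ID",  # optional: run the request as a team
        "Content-Type": "application/json",
    },
    json={
        "model": "meta-llama-3.1-8b-instruct",  # placeholder model code
        "prompt": "What is generative AI?",
    },
)
response.raise_for_status()
print(response.json()["tokens"])  # list of integer token IDs
```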

Response

200 - application/json
tokens
integer[]

A list of token IDs.
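
For illustration, a successful response body and how it might be consumed are sketched below; the token IDs are placeholders, not real model output. The length of the tokens array gives the number of tokens the prompt occupies in the model's vocabulary.

```python
import json

# Illustrative response body with placeholder token IDs.
example_body = '{"tokens": [12345, 678, 90]}'

token_ids = json.loads(example_body)["tokens"]
print(len(token_ids))  # number of tokens in the prompt, e.g. 3
```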