Serverless
Tokenization
Given a text input, generates the tokenized output as a list of token IDs.
POST /serverless/v1/tokenize
To run an inference request, you must provide a Friendli Token (e.g. flp_XXX) in the Bearer Token field. Refer to the authentication section on our introduction page to learn how to acquire and generate your token.
Authorizations
Headers
ID of the team to run requests as (optional).
Body
application/json
Code of the model to use. See available model list.
Input text prompt to tokenize.
Response
200 - application/json
A list of token IDs.
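As a sketch of how a request to this endpoint might be assembled, the snippet below builds the Bearer-token header and JSON body described above. The base URL and the field names `model` and `prompt` are assumptions inferred from this page, and the model code is a placeholder; consult the available model list for real values.

```python
import json

# Assumed endpoint URL; verify against the official docs.
API_URL = "https://api.friendli.ai/serverless/v1/tokenize"
FRIENDLI_TOKEN = "flp_XXX"  # replace with your actual Friendli Token

headers = {
    "Authorization": f"Bearer {FRIENDLI_TOKEN}",
    "Content-Type": "application/json",
}
payload = {
    "model": "my-model-code",          # hypothetical model code
    "prompt": "What is generative AI?",  # input text prompt to tokenize
}

# To actually send the request (requires a valid token and network access):
# import requests
# resp = requests.post(API_URL, headers=headers, json=payload)
# resp.json()  # 200 response: a list of token IDs

print(json.dumps(payload))
```

A successful (200) response returns a JSON body containing the list of token IDs for the prompt.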