Documentation Index
Fetch the complete documentation index at: https://friendli.ai/docs/llms.txt
Use this file to discover all available pages before exploring further.
Overview
Friendli Serverless Endpoints let you run state-of-the-art AI models instantly, without provisioning infrastructure, managing deployments, or configuring runtime settings. You can experiment interactively in the UI or integrate models directly into your application using simple API calls. This quickstart walks you through both options.

Explore popular AI models in a chat-style playground
The fastest way to get started is by trying models directly in Friendli Suite. The playground provides an interactive, chat-based experience where you can test prompts, inspect responses, and adjust inference settings.

1. Sign up or log in
Create an account or log in at Friendli Suite.

2. Navigate to the Serverless Endpoints page
Open the Serverless Endpoints page in Friendli Suite to see the list of available models, including GLM-5.1 and other frontier models.

3. Choose a model
Browse the list and select the model you want to use. Each model page provides a brief overview along with usage details.

4. Experiment in the playground
Once a model is selected, you can immediately start interacting with it in the Playground. The Playground offers a chat-style interface designed for rapid experimentation, with built-in tools such as a calculator, Python interpreter, and Linkup web search.

You can also adjust inference parameters, such as temperature and top-p, to fine-tune the model’s behavior and output.
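The inference parameters above map directly onto the request body when you move from the Playground to API calls. A minimal sketch, assuming an OpenAI-compatible chat completions endpoint at `api.friendli.ai/serverless/v1`, a personal access token in a `FRIENDLI_TOKEN` environment variable, and `GLM-5.1` as the model identifier (check the model's page in Friendli Suite and the documentation index above for the exact values — these are assumptions, not confirmed specifics):

```python
import json
import os
import urllib.request

# Assumed base URL for the OpenAI-compatible serverless API.
API_URL = "https://api.friendli.ai/serverless/v1/chat/completions"


def build_chat_request(model: str, prompt: str, *,
                       temperature: float = 0.7,
                       top_p: float = 0.9) -> dict:
    """Assemble the JSON payload. The temperature and top_p fields are
    the same inference parameters exposed as sliders in the Playground."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,  # higher values -> more random sampling
        "top_p": top_p,              # nucleus-sampling probability cutoff
    }


def send(payload: dict, token: str) -> dict:
    """POST the payload with a Bearer token and return the parsed response."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    payload = build_chat_request("GLM-5.1", "Say hello in one sentence.")
    token = os.environ.get("FRIENDLI_TOKEN")
    if token:  # only hit the network when a token is actually configured
        reply = send(payload, token)
        print(reply["choices"][0]["message"]["content"])
```

Because the payload is built by a pure function, you can tweak `temperature` and `top_p` per request, mirroring the experiments you run in the Playground before wiring the model into your application.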