How to use
Before you start, ensure you’ve already obtained your FRIENDLI_TOKEN from the Friendli Suite.
Our products are fully compatible with the OpenAI API, so we use the langchain-openai package and simply point it at the FriendliAI base URL.
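For example, you can make the token available to your code as an environment variable before running any of the snippets below (a minimal sketch; the variable name FRIENDLI_TOKEN is just the convention used here):

```python
# Install the integration first, e.g.: pip install -U langchain-openai
import getpass
import os

# Prompt for the token if it is not already set in the environment.
if not os.environ.get("FRIENDLI_TOKEN"):
    os.environ["FRIENDLI_TOKEN"] = getpass.getpass("Enter your FRIENDLI_TOKEN: ")
```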
Instantiation
Now we can instantiate our model object and generate chat completions. We provide usage examples for each type of endpoint. Choose the one that best suits your needs.
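A minimal instantiation sketch; the base_url and model values below are assumptions — substitute the endpoint URL and model ID shown in your Friendli Suite dashboard:

```python
import os

from langchain_openai import ChatOpenAI

model = ChatOpenAI(
    model="meta-llama-3.1-8b-instruct",  # placeholder: use a model available on your endpoint
    base_url="https://api.friendli.ai/serverless/v1",  # assumed FriendliAI serverless base URL
    api_key=os.environ["FRIENDLI_TOKEN"],
)
```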
Runnable interface
We support both synchronous and asynchronous Runnable methods to generate a response.
Synchronous methods:
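For example, using the model object from the instantiation step above (a sketch):

```python
messages = [
    ("system", "You are a helpful assistant."),
    ("human", "What is the capital of France?"),
]

# invoke: send one request and get the full response back
response = model.invoke(messages)
print(response.content)

# stream: iterate over response chunks as they are generated
for chunk in model.stream(messages):
    print(chunk.content, end="", flush=True)

# batch: run several inputs in parallel
responses = model.batch([messages, messages])
```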
Asynchronous methods:
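Each synchronous method has an async counterpart (ainvoke, astream, abatch); a sketch using the same model object:

```python
import asyncio


async def main() -> None:
    messages = [("human", "What is the capital of France?")]

    # ainvoke: awaitable single request
    response = await model.ainvoke(messages)
    print(response.content)

    # astream: asynchronous iteration over response chunks
    async for chunk in model.astream(messages):
        print(chunk.content, end="", flush=True)

    # abatch: several inputs in parallel
    responses = await model.abatch([messages, messages])


asyncio.run(main())
```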
Chaining
We can chain our model with a prompt template. Prompt templates convert raw user input into better-structured input for the LLM.
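For example, composing a prompt template with the model using the pipe operator (a sketch; the prompt text is illustrative):

```python
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a world-class technical writer."),
        ("human", "Explain {topic} in one short paragraph."),
    ]
)

# The chain formats the user input with the template, then calls the model.
chain = prompt | model

result = chain.invoke({"topic": "tool calling"})
print(result.content)
```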
Tool calling
Describe tools and their parameters, and let the model decide which tool to invoke and with which arguments. Tool calling greatly enhances the model’s ability to provide comprehensive and actionable responses.
Define tools to use
The @tool decorator is used to define a tool.
If you set parse_docstring=True, the decorator parses the docstring to extract descriptions of the arguments.
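A sketch of defining and calling a hypothetical get_weather tool (tool support depends on the model you selected):

```python
from langchain_core.tools import tool


@tool(parse_docstring=True)
def get_weather(city: str, unit: str = "celsius") -> str:
    """Look up the current weather for a city.

    Args:
        city: Name of the city to look up.
        unit: Temperature unit, either "celsius" or "fahrenheit".
    """
    # Placeholder implementation for illustration only.
    return f"It is 20 degrees {unit} in {city}."


# Bind the tool so the model can choose to call it with generated arguments.
model_with_tools = model.bind_tools([get_weather])

ai_msg = model_with_tools.invoke("What's the weather in Paris?")
print(ai_msg.tool_calls)  # e.g. [{"name": "get_weather", "args": {"city": "Paris"}, ...}]
```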