Function Calling
Learn how to do OpenAI compatible function calling on Friendli Serverless Endpoints.
Function calling is a powerful feature that connects large language models (LLMs) with external systems to maximize the model's utility. It goes beyond relying solely on the model's learned knowledge, making it possible to use real-time data and perform complex tasks.
Simple Example
In the example below, which walks through five steps, we define a `get_weather` function that retrieves weather information, ask a question that prompts the model to use the tool, and execute the tool to generate the final response.
Tool definition
Define a function that the model can call (`get_weather`) with a JSON Schema.
The function requires the following parameters:

- `location`: The location to look up weather information for.
- `date`: The date to look up weather information for.

This definition is included in the `tools` array and passed to the model.
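Assembled as an OpenAI-compatible tool object, the definition might look like this (a sketch; the description strings are illustrative):

```python
# JSON-Schema definition of the get_weather tool, wrapped in the
# OpenAI-compatible "tools" array format.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Retrieve weather information for a location and date.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The location to look up weather information for.",
                    },
                    "date": {
                        "type": "string",
                        "description": "The date to look up weather information for.",
                    },
                },
                "required": ["location", "date"],
            },
        },
    }
]
```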
Calling the model
When a user asks a question, this request is passed to the model as a `messages` array.
For example, the request "What's the weather like in Paris today?" would be passed as:
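In the OpenAI-compatible message format, this is:

```python
# A single user turn in the chat messages array.
messages = [
    {"role": "user", "content": "What's the weather like in Paris today?"}
]
```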
Call the model using the `tools` and `messages` defined above.
Executing the tool
The API caller runs the tool based on the model's function call information.
For example, the `get_weather` function is executed as follows:
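A stand-in implementation (hypothetical: it returns canned data, where a real version would query a weather service) together with the argument handling:

```python
import json


def get_weather(location: str, date: str) -> str:
    # Hypothetical implementation: returns canned data instead of querying
    # a real weather service. The result is JSON-encoded so it can be
    # placed directly into a tool message.
    return json.dumps(
        {"location": location, "date": date, "forecast": "partly cloudy", "temperature_c": 14}
    )


# The model supplies arguments as a JSON string, so parse them first.
raw_arguments = '{"location": "Paris", "date": "today"}'  # illustrative values
result = get_weather(**json.loads(raw_arguments))
```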
Adding tool responses
Add the tool's response to the `messages` array and pass it back to the model.

- Append the tool call information
- Append the tool's execution result

This ensures the model has all the necessary information to generate a response.
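In the OpenAI-compatible format, that means appending an assistant message carrying `tool_calls`, followed by a `tool` message whose `tool_call_id` matches. A sketch (the call id and argument values are illustrative stand-ins for what the model actually returned):

```python
messages = [
    {"role": "user", "content": "What's the weather like in Paris today?"},
]

# 1. Append the tool call information from the model's response.
messages.append({
    "role": "assistant",
    "tool_calls": [{
        "id": "call_abc123",  # hypothetical id; use the one the model returned
        "type": "function",
        "function": {
            "name": "get_weather",
            "arguments": '{"location": "Paris", "date": "today"}',
        },
    }],
})

# 2. Append the tool's execution result, linked by tool_call_id.
messages.append({
    "role": "tool",
    "tool_call_id": "call_abc123",
    "content": '{"location": "Paris", "forecast": "partly cloudy", "temperature_c": 14}',
})
```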
Generating the final response
The model generates the final response based on the tool’s output:
Parameters
To use function calling, set the `tool_choice`, `tools`, and `parallel_tool_calls` parameters.
| Parameter | Description | Default |
| --- | --- | --- |
| `tool_choice` | Specifies how the model should choose tools. Has four options: `"none"`, `"auto"`, `"required"`, or named tool choice. | `"auto"` |
| `tools` | The list of tool objects that define the functions the model can call. | - |
| `parallel_tool_calls` | Boolean value (`True` or `False`) specifying whether to make tool calls in parallel. | `True` |
`tool_choice` options
By default, the model automatically chooses whether to call a function and which function to call.
However, you can use the `tool_choice` parameter to tell the model to use a function.
- `none`: Disables the use of tools.
- `auto`: Enables the model to decide whether to use tools and which ones to use.
- `required`: Forces the model to use a tool, but the model chooses which one.
- Named tool choice: Forces the model to use a specific tool. It must be in the following format:
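For example, to force the model to call the `get_weather` function defined earlier:

```python
# Named tool choice: the model must call this specific function.
tool_choice = {
    "type": "function",
    "function": {"name": "get_weather"},
}
```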
Supported models
- `meta-llama-3.1-8b-instruct`
- `meta-llama-3.1-70b-instruct`
References
- Building an AI Agent for Google Calendar
- Friendli Tools Blog Series