Function calling is a powerful feature that connects large language models (LLMs) with external systems to maximize the model's utility. It goes beyond relying solely on the model's learned knowledge, making it possible to use real-time data and perform complex tasks.

Simple Example

In the example below, which walks through steps 1 to 5, we define a get_weather function that retrieves weather information, ask a question that prompts the model to use the tool, and execute the tool so the model can generate the final response.

Step 1: Tool Definition

Define a function that the model can call (get_weather) with a JSON Schema.
The function requires the following parameters:

  • location: The location to look up weather information for.
  • date: The date to look up weather information for.

This definition is included in the tools array and passed to the model.

tools = [
  {
      "type": "function",
      "function": {
          "name": "get_weather",
          "parameters": {
              "type": "object",
              "properties": {
                  "location": {"type": "string"},
                  "date": {"type": "string", "format": "date"}
              },
          },
      },
  }
]
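The schema above is minimal. JSON Schema also supports description fields and a required list, which generally help the model fill in arguments correctly. A richer variant of the same definition (the description texts are illustrative, not from the original):

```python
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the weather forecast for a location on a given date.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city to look up, e.g. 'Paris'.",
                    },
                    "date": {
                        "type": "string",
                        "format": "date",
                        "description": "The date to look up, in YYYY-MM-DD format.",
                    },
                },
                # Both arguments must be supplied by the model.
                "required": ["location", "date"],
            },
        },
    }
]
```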
Step 2: Calling the model

When a user asks a question, this request is passed to the model as a messages array.
For example, the request “What’s the weather like in Paris today?” would be passed as:

from datetime import datetime

today = datetime.now()
messages = [
    {"role": "system", "content": f"You are a helpful assistant. today is {today}."},
    {"role": "user", "content": "What's the weather like in Paris today?"}
]
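Interpolating datetime.now() directly puts a full timestamp (including microseconds) in the system prompt. A minor variation that formats only the date, which matches the YYYY-MM-DD format the date parameter expects:

```python
from datetime import datetime

# Keep only the date portion, e.g. "2025-01-01".
today = datetime.now().strftime("%Y-%m-%d")
messages = [
    {"role": "system", "content": f"You are a helpful assistant. Today is {today}."},
    {"role": "user", "content": "What's the weather like in Paris today?"}
]
```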

Call the model using the tools and messages defined above.

from openai import OpenAI
import os

token = os.getenv("FRIENDLI_TOKEN") or "<YOUR_FRIENDLI_TOKEN>"

client = OpenAI(
    base_url = "https://api.friendli.ai/serverless/v1",
    api_key = token
)

completion = client.chat.completions.create(
  model="meta-llama-3.1-8b-instruct",
  messages=messages,
  tools=tools,
)

print(completion.choices[0].message.tool_calls)
Step 3: Executing the tool

The API caller executes the tool using the function call information returned by the model.
For example, the get_weather function is executed as follows:

import json
import random

def get_weather(location: str, date: str):
    # Mock implementation: returns random weather data for demonstration.
    temperature = random.randint(60, 80)
    return {"temperature": temperature, "forecast": "sunny"}

tool_call = completion.choices[0].message.tool_calls[0]

tool_response = locals()[tool_call.function.name](**json.loads(tool_call.function.arguments))
print(tool_response)
Result:
{'temperature': 65, 'forecast': 'sunny'}
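Looking the function up in locals() works in a short script, but a small dispatch table is safer in real code, since only explicitly registered functions can be invoked by a tool call. A sketch of that pattern, using a SimpleNamespace as a stand-in for the real tool_call object:

```python
import json
from types import SimpleNamespace

def get_weather(location: str, date: str):
    # Fixed values here so the example is deterministic.
    return {"temperature": 65, "forecast": "sunny"}

# Only functions registered here can be called from a tool call.
TOOL_REGISTRY = {"get_weather": get_weather}

# Stand-in for completion.choices[0].message.tool_calls[0]
tool_call = SimpleNamespace(
    function=SimpleNamespace(
        name="get_weather",
        arguments=json.dumps({"location": "Paris", "date": "2025-01-01"}),
    )
)

func = TOOL_REGISTRY[tool_call.function.name]
tool_response = func(**json.loads(tool_call.function.arguments))
print(tool_response)  # → {'temperature': 65, 'forecast': 'sunny'}
```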
Step 4: Adding tool responses

Add the tool’s response to the messages array and pass it back to the model.

  1. Append tool call information
  2. Append the tool’s execution result

This ensures the model has all the necessary information to generate a response.

model_response = completion.choices[0].message

# Append the response from the model
messages.append(
    {
        "role": model_response.role,
        "content": model_response.content,
        "tool_calls": [
            tool_call.model_dump()
            for tool_call in model_response.tool_calls
        ]
    }
)

# Append the response from the tool
messages.append(
    {
        "role": "tool",
        "content": json.dumps(tool_response),
        "tool_call_id": tool_call.id
    }
)

print(json.dumps(messages, indent=2))
Step 5: Generating the final response

The model generates the final response based on the tool’s output:

next_completion = client.chat.completions.create(
    model="meta-llama-3.1-8b-instruct",
    messages=messages,
    tools=tools
)

print(next_completion.choices[0].message.content)
Final output:
According to the forecast, it's going to be a sunny day in Paris with a temperature of 65 degrees.

Parameters

Function calling is controlled by the tool_choice, tools, and parallel_tool_calls parameters.

| Parameter | Description | Default |
| --- | --- | --- |
| tool_choice | Specifies how the model should choose tools. Has four options: "none", "auto", "required", or named tool choice. | auto |
| tools | The list of tool objects that define the functions the model can call. | - |
| parallel_tool_calls | Boolean value (True or False) specifying whether to make tool calls in parallel. | True |
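When parallel tool calls are enabled, a single model response may contain several tool calls, and each one needs its own "tool" message matched by tool_call_id. A sketch of handling that case, with SimpleNamespace objects standing in for the real tool_calls:

```python
import json
from types import SimpleNamespace

def get_weather(location: str, date: str):
    return {"temperature": 65, "forecast": "sunny"}

TOOL_REGISTRY = {"get_weather": get_weather}

# Stand-ins for message.tool_calls when the model calls the tool twice.
tool_calls = [
    SimpleNamespace(id="call_1", function=SimpleNamespace(
        name="get_weather",
        arguments=json.dumps({"location": "Paris", "date": "2025-01-01"}))),
    SimpleNamespace(id="call_2", function=SimpleNamespace(
        name="get_weather",
        arguments=json.dumps({"location": "London", "date": "2025-01-01"}))),
]

messages = []
# Append one tool-result message per call, matched by tool_call_id.
for tc in tool_calls:
    result = TOOL_REGISTRY[tc.function.name](**json.loads(tc.function.arguments))
    messages.append({
        "role": "tool",
        "content": json.dumps(result),
        "tool_call_id": tc.id,
    })

print(len(messages))  # → 2
```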

tool_choice options

By default, the model automatically decides whether to call a function and, if so, which function to call.
You can use the tool_choice parameter to control this behavior.

  • none: Disables the use of tools.
  • auto: Enables the model to decide whether to use tools and which ones to use.
  • required: Forces the model to use a tool, but the model chooses which one.
  • Named tool choice: Forces the model to use a specific tool. It must be in the following format:
    {
      "type": "function",
      "function": {
        "name": "get_current_weather" // The function name you want to specify
      }
    }
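In the Python client, a named tool choice is passed as the tool_choice argument of the completion call. A sketch building it for the get_weather tool defined earlier (the API call itself is shown commented out):

```python
# Force the model to call the get_weather tool defined earlier.
tool_choice = {
    "type": "function",
    "function": {"name": "get_weather"},
}

# Passed alongside tools when creating the completion, e.g.:
# completion = client.chat.completions.create(
#     model="meta-llama-3.1-8b-instruct",
#     messages=messages,
#     tools=tools,
#     tool_choice=tool_choice,
# )
```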
    

Supported models

  • meta-llama-3.1-8b-instruct
  • meta-llama-3.1-70b-instruct

References

Building an AI Agent for Google Calendar
Friendli Tools Blog Series