  • July 22, 2024
  • 6 min read

Building AI Agents Using Function Calling with LLMs

Friendli Tools Series: Part 2 of 3

As the second installment of our blog series on the latest Friendli Tools (aka Function Calls) release, this post shows how to build an AI agent application using multiple tools, some for taking actions and some for retrieving data. Friendli Tools is an essential feature for building fast and accurate agents, and we will highlight how our LLMs' function calling feature serves as a gateway for creating LLM agents.

The basics of function calling for language models, with a runnable Python script, have been covered in our previous post: Function Calling: Connecting LLMs with Functions and APIs.

Technology used

  • Friendli Serverless Endpoints
    • Chat completions API
    • Function calling

What are AI Agents?

Have you ever wanted a large language model to handle a task that unfolds as a sequence of function calls driven by the user, such as booking an airplane ticket? The flow could start with searching the web for the best ticket deals and presenting the results to the user, followed by booking the chosen ticket. AI agents built from function calling LLMs can accomplish these sequences of actions with a single command!

Integrating multiple tools with an LLM is a process that can significantly accelerate application development and enhance the capabilities of LLMs. By leveraging LLMs as the intelligent core of applications, connecting and utilizing diverse APIs can become effortless. This agentic workflow concept is evolving into multi-agent workflows, potentially offering a glimpse into the future of application development.

Tools with Different Roles

Tools used by LLMs can roughly be categorized into two types. The first are tools that enable LLMs to take a specific action, such as sending a Slack message or booking a hotel. The second are tools that allow LLMs to retrieve information, such as acquiring data on the current weather or a sports game. We will cover both types of tools with concrete examples in this tutorial. In addition, we will learn how to create a conversational function calling agent in Python with code examples.

Building on this foundational understanding of function calling, it's time to dive into the practical implementation of these concepts. We will explore how specific tools can be integrated into an AI agent application, demonstrating their utility in realistic use cases.

By following the outlined steps, you will:

  1. Implement a Slack Message Tool: Learn how to set up and use a function to send messages to a Slack channel.
  2. Implement a Weather Tool: Fetch current weather data for any location and understand how to use this information effectively.
  3. Combine Tools in a Conversational Agent: Discover how to build a Python application that uses both tools.

You can check out the runnable Python notebook code for this article at Building AI Agents Using Function Calling with LLMs.ipynb.

Tool #1: A Slack Message Tool

The first function we will add as a tool is the send_slack_message function. This function accepts a string argument and sends the value of the argument to a designated Slack channel. The Slack Messaging API is used to implement this function.

Here’s a high-level demonstration of the send_slack_message function: the message, “Hello world!”, passed as the string argument, is sent to the WeatherHi Slack application’s channel.

python
send_slack_message("Hello world!")

When this function is used as a tool for language models, the LLM will fill in the function parameter accordingly, and the program will be able to send the Slack message by calling this function.

An example of a conversation where the LLM decides to send a Slack message containing a lovely quote by Theodore Roosevelt might be:

User: Could you share a lovely quote with cute emojis in the Slack app?

FriendliAI: I'm glad I could share a lovely quote with you through the Slack app!

Set up

Before getting into the code, we need to set up a couple of things to use the Slack Messaging API: the values of SLACK_BOT_TOKEN and CHANNEL_ID, both of which become available once you create a Slack application.

By following steps one through three here, you can finish installing your application for the Slack message tool. In this tutorial, we send messages to the application's own channel, so the only Bot Token Scope we need to add is chat:write.

Step three in the aforementioned document explains how to find an access token under 'OAuth Tokens for Your Workspace'. We will export the Bot User OAuth Token as an environment variable called SLACK_BOT_TOKEN.

Lastly, find the application channel ID, which will be exported as CHANNEL_ID. Launch the Slack Desktop App, log in, and click on the application that you created. Then left-click on the application name at the top; this opens a page displaying the channel ID at the bottom.

You can install the required slack_sdk library and export relevant variables as:

$ pip install slack_sdk
$ export SLACK_BOT_TOKEN=[FILL_IN_YOUR_TOKEN]
$ export CHANNEL_ID=[FILL_IN_CHANNEL_ID]
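
Before wiring the function into an agent, you can quickly verify that the exported token works. Here is a minimal, optional sanity check (not part of the original tutorial) using slack_sdk's auth_test call:

python
import os
from slack_sdk import WebClient

# Optional sanity check: auth_test() raises SlackApiError if the
# exported token is invalid.
client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])
print(client.auth_test()["ok"])  # True if the token is accepted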

Here’s the full implementation of the send_slack_message function: it returns the text of the message that was sent and includes error-handling code. The required argument, message, is annotated with a description that will be included in the JSON schema.

python
import os
from slack_sdk import WebClient
from slack_sdk.errors import SlackApiError
from typing import Annotated

def send_slack_message(message: Annotated[str, "text message to send"]):
    """
    Send a slack message to WeatherHi slack app
    """
    client = WebClient(token=os.environ['SLACK_BOT_TOKEN'])

    try:
        response = client.chat_postMessage(channel=os.environ['CHANNEL_ID'], text=message)
        return response["message"]["text"]
    except SlackApiError as e:
        # You will get a SlackApiError if "ok" is False
        assert e.response["ok"] is False
        assert e.response["error"]  # str like 'invalid_auth', 'channel_not_found'
        print(f"Got an error: {e.response['error']}")
        # Also receive a corresponding status_code
        assert isinstance(e.response.status_code, int)
        print(f"Received a response status_code: {e.response.status_code}")

Tool #2: A Weather Tool

The second function we will add as a tool is the get_current_weather function. This function takes a location (i.e., a city or a state) as input and returns the current temperature in Celsius or Fahrenheit for that location. The OpenWeatherMap Current Weather Data API is used to implement this function.

Here’s a high-level demonstration of the get_current_weather function: the value it returns matches the temperature reported by OpenWeatherMap.

python
print(get_current_weather("San Francisco", "Fahrenheit"))
# 60.98

Set up

Before diving into the code, we need to obtain the value for the WEATHER_TOKEN. You can get your OpenWeatherMap API key here after signing in.

You can install the required requests library and export the relevant variable as:

$ pip install requests
$ export WEATHER_TOKEN=[FILL_IN_YOUR_TOKEN]
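
If you want to confirm the key is active before using it in the tool, a one-off request works (an optional check; OpenWeatherMap reports a cod of 200 on success):

python
import os
import requests

# Optional sanity check for the exported API key.
token = os.environ["WEATHER_TOKEN"]
url = f"https://api.openweathermap.org/data/2.5/weather?q=London&appid={token}"
print(requests.get(url).json().get("cod"))  # 200 means the key works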

Here’s the full implementation of the get_current_weather function: It returns the current temperature as a string variable. The required arguments, location and unit, are annotated with descriptions that will be included in the JSON schema.

python
import os
import requests
from typing import Annotated

def get_current_weather(location: Annotated[str, "name of current location"], unit: Annotated[str, "unit of temperature degree, Celsius or Fahrenheit"]):
    """
    Get the current weather in a given location.
    """
    token = os.environ['WEATHER_TOKEN']
    url = f"https://api.openweathermap.org/data/2.5/weather?q={location}&appid={token}"

    # OpenWeatherMap defaults to Kelvin; request metric (Celsius) or
    # imperial (Fahrenheit) units explicitly.
    if unit.lower() == "celsius":
        url += "&units=metric"
    elif unit.lower() == "fahrenheit":
        url += "&units=imperial"

    data = requests.get(url).json()
    temperature = data['main']['temp']

    return str(temperature)
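
Note that this implementation assumes the lookup succeeds: if OpenWeatherMap does not recognize the location, the error payload has no main field and the indexing raises a KeyError. In an agent setting, a small defensive wrapper (a hypothetical helper, not part of the original script) lets the conversation continue gracefully:

python
def get_current_weather_safe(location: str, unit: str) -> str:
    """Hypothetical wrapper: return an error string instead of raising."""
    try:
        return get_current_weather(location, unit)
    except KeyError:
        # OpenWeatherMap returned an error payload, e.g. "city not found".
        return f"Could not retrieve weather data for {location!r}."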

JSON Schema Conversion

Now that we have finished implementing the functions, we need to convert the raw function code into JSON schemas. Recall that functions must be provided to the LLM in JSON format. In this blog, we will write a get_tool_schema function to automate the conversion from Python code to a JSON schema and apply it to our two functions.

Luckily, the function-schema library provides the get_function_schema function, which handles most of the conversion for us. We slightly modify the result of this function to create our own get_tool_schema function. Note how the annotated descriptions are included in the JSON schemas.

You can install the required libraries as:

$ pip install function_schema
$ pip install packaging

Here’s the full implementation of the get_tool_schema function:

python
from function_schema import get_function_schema

def get_tool_schema(function):
    return {
        "type": "function",
        "function": get_function_schema(function)
    }

Below is the JSON schema for the send_slack_message function:

python
print(get_tool_schema(send_slack_message))
{
  "type": "function",
  "function": {
    "name": "send_slack_message",
    "description": "Send a slack message to WeatherHi slack app",
    "parameters": {
      "type": "object",
      "properties": {
        "message": {
          "type": "string",
          "description": "text message to send"
        }
      },
      "required": ["message"]
    }
  }
}

Below is the JSON schema for the get_current_weather function:

python
print(get_tool_schema(get_current_weather))
{
  "type": "function",
  "function": {
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
      "type": "object",
      "properties": {
        "location": {
          "type": "string",
          "description": "name of current location"
        },
        "unit": {
          "type": "string",
          "description": "unit of temperature degree, Celsius or Fahrenheit"
        }
      },
      "required": ["location", "unit"]
    }
  }
}
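
With both schemas in hand, handing the tools to the model is just a matter of collecting them into a list. A minimal sketch, matching the tools parameter of the chat completions API covered in our previous post:

python
# Collect the JSON schemas to pass as the `tools` parameter of the
# chat completions API.
tools = [
    get_tool_schema(send_slack_message),
    get_tool_schema(get_current_weather),
]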

Conversational Agent

The final part of this blog integrates the tools we have built into a complete conversational agent. We will also demonstrate how the LLM provided by FriendliAI can accurately call one or more tools in an example conversation with a user. Additionally, you will need one more token: the Friendli Personal Access Token.

You can install the required libraries and export relevant variables as:

$ pip install friendli-client -U
$ pip install slack_sdk requests function_schema packaging typer
$ export FRIENDLI_TOKEN=[FILL_IN_YOUR_TOKEN]
$ export SLACK_BOT_TOKEN=[FILL_IN_YOUR_TOKEN]
$ export CHANNEL_ID=[FILL_IN_CHANNEL_ID]
$ export WEATHER_TOKEN=[FILL_IN_YOUR_TOKEN]

Here’s the commented Python script, starting with the tool-schema helper and a registry mapping tool names to the functions that implement them (the complete runnable version is in the notebook linked above):

python
import os
import json
import typer
from friendli import Friendli
from function_schema import get_function_schema
from weather import get_current_weather  # the weather tool (weather.py above)
from slack import send_slack_message  # the Slack tool (slack.py above)

# Function to get the schema of a tool
def get_tool_schema(function):
    return {
        "type": "function",
        "function": get_function_schema(function)
    }

# Dictionary of available tools
available_tools = {
    "get_current_weather": get_current_weather,
    "send_slack_message": send_slack_message,
}
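
Continuing the script above, here is a minimal sketch of the conversation loop. It assumes friendli-client's OpenAI-compatible chat.completions interface; the model name is illustrative, and the linked notebook contains the exact runnable version. The loop keeps executing tool calls, feeding each result back to the model as a tool message, until the model replies in plain text:

python
client = Friendli(token=os.environ["FRIENDLI_TOKEN"])

tools = [get_tool_schema(f) for f in available_tools.values()]
messages = []

def chat(user_input: str) -> str:
    messages.append({"role": "user", "content": user_input})
    while True:
        response = client.chat.completions.create(
            model="meta-llama-3-70b-instruct",  # illustrative model name
            messages=messages,
            tools=tools,
        )
        message = response.choices[0].message
        if not message.tool_calls:
            break  # plain-text answer; no more tools to run

        # Record the assistant's tool-call turn, then execute each call.
        messages.append(message.model_dump())  # pydantic model -> plain dict
        for tool_call in message.tool_calls:
            function = available_tools[tool_call.function.name]
            arguments = json.loads(tool_call.function.arguments)
            messages.append({
                "role": "tool",
                "tool_call_id": tool_call.id,
                "content": function(**arguments),  # run the Python function
            })

    messages.append({"role": "assistant", "content": message.content})
    return message.content

# Simple REPL; the full script in the notebook uses typer for the CLI.
if __name__ == "__main__":
    while True:
        print("FriendliAI:", chat(input("User: ")))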

Here’s a realistic agent usage scenario that first uses the weather tool and then the Slack message tool:

User: I live in sf. What should I wear right now?

TOOL CALLS: get_current_weather({"location": "San Francisco", "unit": "Fahrenheit"})

FriendliAI: It looks like the current temperature in San Francisco is around 60.75°F.
Considering that, I'd recommend dressing in layers. You might want to wear a lightweight
jacket or sweater to keep yourself comfortable. It's always a good idea to check the winds as
well, but based on the temperature alone, you should be okay with a moderate level of
clothing.

User: Can you send this information to the WeatherHi app?

TOOL CALLS: send_slack_message({"message": "Current temperature in San Francisco is 60.75°F. Dress in layers!"})

FriendliAI: The message has been sent to the WeatherHi app! You can now share the current
weather conditions in San Francisco with others. They'll receive the message: "Current
temperature in San Francisco is 60.75°F. Dress in layers!"

Conclusion

In conclusion, this post detailed how to build an AI agent using Friendli Tools. We covered the setup and implementation of a Slack message tool and a weather tool, showing how LLMs can use these tools to sequentially retrieve information and send messages. This approach showcases the potential of LLMs in streamlining application development, paving the way for more advanced and efficient multi-agent workflows in the future.

Our next blog post will showcase the benchmark results of the newly released Friendli Tools. Designed to simplify and optimize function calling, these tools offer significant performance enhancements. Stay tuned for more detailed use cases on leveraging these tools to unlock the full potential of your generative AI applications.

Written by

FriendliAI Tech & Research