  • July 22, 2024
  • 11 min read

Building AI Agents Using Function Calling with LLMs

Friendli Tools Series: Part 2 of 3

As the second installment of our blog post series on the latest Friendli Tools (aka Function Calls) release, we will learn how to build an AI agent application using multiple tools, each facilitating either an action or data retrieval. Friendli Tools is an essential feature for building fast and accurate agents. This post will highlight how our LLMs' function calling feature serves as a gateway to creating LLM agents.

The basics of function calling for language models, with a runnable Python script, have been covered in our previous post: Function Calling: Connecting LLMs with Functions and APIs.

Technology used

  • Friendli Serverless Endpoints
    • Chat completions API
    • Function calling
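
To ground how these pieces fit together, here is a minimal sketch of a function-calling request against Friendli Serverless Endpoints through an OpenAI-compatible client. The base URL, model name, and tool schema below are illustrative assumptions; consult the Friendli documentation for the exact values.

```python
import os
from openai import OpenAI  # Friendli Serverless Endpoints expose an OpenAI-compatible API

# Placeholder base URL and model name; check the Friendli docs for the current values.
client = OpenAI(
    base_url="https://inference.friendli.ai/v1",
    api_key=os.environ["FRIENDLI_TOKEN"],
)

# A single illustrative tool definition: a weather lookup the model may choose to call.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a location.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name, e.g. Seoul"},
                },
                "required": ["location"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="meta-llama-3-70b-instruct",  # placeholder model name
    messages=[{"role": "user", "content": "What's the weather in Seoul right now?"}],
    tools=tools,
)

# If the model decides a tool is needed, the call appears here instead of plain text.
print(response.choices[0].message.tool_calls)
```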

What are AI Agents?

Have you ever wanted a large language model to manage tasks that require sequential user flows, such as booking an airplane ticket? The flow could start by searching the web for the best ticket deals and presenting the results to the user, followed by booking the chosen ticket. AI agents built from function-calling LLMs can accomplish these sequences of actions with a single command!

Integrating multiple tools with an LLM can significantly accelerate application development and enhance the capabilities of LLMs. With an LLM as the intelligent core of an application, connecting and utilizing diverse APIs becomes effortless. This agentic workflow concept is evolving into multi-agent workflows, potentially offering a glimpse into the future of application development.

Tools with Different Roles

Tools used by LLMs can roughly be categorized into two types. The first are tools that enable LLMs to take a specific action, such as sending a Slack message or booking a hotel. The second are tools that allow LLMs to retrieve information, such as acquiring data on the current weather or a sports game. We will cover both types of tools with concrete examples in this tutorial. In addition, we will learn how to create a conversational function-calling agent in Python with code examples.

Building on this foundational understanding of function calling, it's time to dive into the practical implementation of these concepts. We will explore how specific tools can be integrated into an AI agent application, demonstrating their utility in realistic use cases.

By following the outlined steps, you will:

  1. Implement a Slack Message Tool: Learn how to set up and use a function to send messages to a Slack channel.
  2. Implement a Weather Tool: Fetch current weather data for any location and understand how to use this information effectively.
  3. Combine Tools in a Conversational Agent: Discover how to build a Python application that uses both tools; see the sketch right after this list.
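
As a preview of step 3, the sketch below shows one common way to wire such an agent: map the model's tool calls to local Python functions, append each result back as a tool message, and query the model again until it responds with plain text. The get_current_weather and send_slack_message stubs are placeholders for the implementations built later in this post, and the client object is assumed to be the OpenAI-compatible client from the snippet above.

```python
import json

# Placeholder stubs; the real implementations are developed later in this tutorial.
def get_current_weather(location):
    return json.dumps({"location": location, "temperature_c": 23})

def send_slack_message(message):
    return json.dumps({"ok": True})

TOOL_REGISTRY = {
    "get_current_weather": get_current_weather,
    "send_slack_message": send_slack_message,
}

def run_agent_turn(client, model, messages, tools):
    """One conversational turn: let the model call tools until it replies in plain text."""
    while True:
        response = client.chat.completions.create(model=model, messages=messages, tools=tools)
        message = response.choices[0].message
        if not message.tool_calls:
            return message.content  # No more tools needed; return the final answer.
        messages.append(message)  # Keep the assistant's tool-call turn in the history.
        for tool_call in message.tool_calls:
            func = TOOL_REGISTRY[tool_call.function.name]
            args = json.loads(tool_call.function.arguments)
            result = func(**args)
            # Feed each tool result back so the model can use it in its next response.
            messages.append({
                "role": "tool",
                "tool_call_id": tool_call.id,
                "content": result,
            })
```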

You can check out the runnable Python notebook code for this article at Building AI Agents Using Function Calling with LLMs.ipynb.

Tool #1: A Slack Message Tool

The first function we will add as a tool is the send_slack_message function. This function accepts a string argument and sends the value of the argument to a designated Slack channel. The Slack Messaging API is used to implement this function.
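
Before the demonstration, here is a minimal sketch of what such a function could look like, assuming the slack_sdk package and a bot token exposed through an environment variable. The SLACK_BOT_TOKEN variable and the #weatherhi channel name are placeholders, and the version in the accompanying notebook may differ.

```python
import os
from slack_sdk import WebClient  # assumes the slack_sdk package is installed

def send_slack_message(message: str) -> str:
    """Post `message` to a Slack channel via the Slack Web API (chat.postMessage)."""
    client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])  # placeholder env var
    response = client.chat_postMessage(channel="#weatherhi", text=message)  # placeholder channel
    return "Message sent." if response["ok"] else "Failed to send message."
```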

Here’s a high-level demonstration of the send_slack_message function: You can see how the message, “Hello world!”, given as the string argument, is sent to the WeatherHi Slack application’s channel.

```python
send_slack_message("Hello world!")
```