Goals

  • Use tool calling to build your own AI agent with Friendli Serverless Endpoints
  • Check out the examples below to see how you can interact with state-of-the-art language models while letting them search the web, run Python code, etc.
  • Feel free to make your own custom tools!

Getting Started

  1. Head to https://suite.friendli.ai and create an account.
  2. Grab a FRIENDLI_TOKEN to use Friendli Serverless Endpoints within an agent.
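
The code in the later steps reads this token from the FRIENDLI_TOKEN environment variable, so a quick sanity check before running the examples looks like this:

    import os
    
    # The snippets below fall back to a placeholder string if the variable is unset.
    if os.environ.get("FRIENDLI_TOKEN") is None:
        print("Set the FRIENDLI_TOKEN environment variable before running the examples.")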

🚀 Step 1. Playground UI

Experience tool calling on the Playground

  1. On your dashboard, click the “Go to Playground” button for Friendli Serverless Endpoints.
  2. Choose a model that best aligns with your desired use case.
  3. Click a web:search tool calling example and see the response. 😀

🚀 Step 2. Tool Calling

Search for interesting information using the web:search tool. This time, let’s try it by writing Python code.

  1. Turn on the web:search tool on the playground.

  2. Ask something interesting!

    Find information on the popular movies currently showing in theaters and provide their ratings.
    
  3. Click the “View code” button to get the tool-calling code in Python/JavaScript.

  4. Copy/Paste the code on your IDE.

  5. Generate a Friendli Token in Friendli Suite (https://suite.friendli.ai).

  6. Fill in the token value in the copied code and run it. (A sketch of what the generated code may look like follows below.)
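
This is a minimal sketch of that generated code, assuming built-in tools are enabled by passing the tool name as its type; the exact snippet from “View code” may differ, so prefer the copied version.

    import os
    
    from friendli import Friendli
    
    client = Friendli(token=os.environ.get("FRIENDLI_TOKEN") or "YOUR_FRIENDLI_TOKEN")
    
    # Assumed shape for enabling the built-in web:search tool (beta).
    chat = client.chat.completions.create(
        model="meta-llama-3.1-70b-instruct",
        messages=[
            {
                "role": "user",
                "content": (
                    "Find information on the popular movies currently showing "
                    "in theaters and provide their ratings."
                ),
            },
        ],
        tools=[{"type": "web:search"}],
    )
    
    print(chat.choices[0].message.content)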

🚀 Step 3. Multiple Tool Calling

Use multiple tools at once to calculate “how long it would take to buy a house in the San Francisco Bay Area based on your annual salary”. Here is the list of available built-in tools (beta); a sketch of enabling several of them in one request follows the list.

  • math:calculator (tool for calculating arithmetic operations)
  • math:statistics (tool for analyzing statistical data)
  • math:calendar (tool for handling date-related data)
  • web:search (tool for retrieving data through web search)
  • web:url (tool for extracting data from a given website)
  • code:python-interpreter (tool for writing and executing Python code)
  • file:text (tool for extracting text data from a given file)
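
As referenced above, here is a minimal sketch of enabling several built-in tools in one request (reusing the client from the Step 2 sketch; the tools shape is an assumption about the beta, so check “View code” on the playground for the exact form):

    # Assumed shape: enable multiple built-in tools in a single request.
    chat = client.chat.completions.create(
        model="meta-llama-3.1-70b-instruct",
        messages=[
            {
                "role": "user",
                "content": (
                    "My annual salary is $100k. How long will it take to buy "
                    "a house in the San Francisco Bay Area?"
                ),
            },
        ],
        tools=[
            {"type": "web:search"},       # look up current Bay Area housing prices
            {"type": "math:calculator"},  # divide the price by the annual salary
        ],
    )
    
    print(chat.choices[0].message.content)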

Example Answer Sheet

Prompt: My annual salary is $100k. How long will it take to buy a house in the San Francisco Bay Area? (`web:search` & `math:calculator` used)

Answer: Based on the web search results, the median price of an existing single-family home in the Bay Area is around $1.25 million.
Using a calculator to determine how long it would take to buy a house in the San Francisco Bay Area with an annual salary of $100,000, we get:
$1,250,000 (house price) / $100,000 (annual salary) = 12.5 years
So, it would take approximately 12.5 years to buy a house in the San Francisco Bay Area with an annual salary of $100,000,
assuming you save your entire salary each year and don't consider other factors like interest rates, taxes, and living expenses.

🚀 Step 4. Build a Custom Tool

Build your own creative tool. We will show you how to make a custom tool that retrieves temperature information. (The complete code snippet is provided at the bottom.)

  1. Define a function to use as a custom tool

    def get_temperature(location: str) -> int:
        """Mock function that returns the city temperature"""
        if "new york" in location.lower():
            return 45
        if "san francisco" in location.lower():
            return 72
        return 30
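
    Called directly, get_temperature("New York") returns 45; in the next step, the model extracts this argument from the user’s prompt for you.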
    
  2. Send a function call inference request

    1. Add the user’s input as a user role message.
    2. The information about the custom function (e.g., get_temperature) goes into the tools option. The function’s parameters are described using JSON Schema.
    3. The response includes an arguments field containing values extracted from the user’s input, which can be passed as parameters to the custom function (see the illustrative accessors after the code below).
    import os
    
    from friendli import Friendli
    
    token = os.environ.get("FRIENDLI_TOKEN") or "YOUR_FRIENDLI_TOKEN"
    client = Friendli(token=token)
    user_prompt = "I live in New York. What should I wear for today's weather?"
    
    messages = [
      {
        "role": "user",
        "content": user_prompt,
      },
    ]
    
    tools = [
      {
        "type": "function",
        "function": {
          "name": "get_temperature",
          "description": "Get the temperature information in a given location.",
          "parameters": {
            "type": "object",
            "properties": {
              "location": {
                "type": "string",
                "description": "The name of the current location, e.g., New York",
              },
            },
            "required": ["location"],
          },
        },
      },
    ]
    
    chat = client.chat.completions.create(
        model="meta-llama-3.1-70b-instruct",
        messages=messages,
        tools=tools,
        temperature=0,
        frequency_penalty=1,
    )
    
    print(chat)
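
    The printed response carries the model’s tool call. Its relevant fields can be inspected as below (illustrative accessors; the values in the comments are what this prompt would plausibly produce, not verbatim output):
    
    tool_call = chat.choices[0].message.tool_calls[0]
    print(tool_call.function.name)       # "get_temperature"
    print(tool_call.function.arguments)  # e.g. '{"location": "New York"}'
    print(tool_call.id)                  # referenced later as tool_call_id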
    
  3. Generate the final response using the tool calling results

    1. Add the tool_calls response as an assistant role message.
    2. Add the result obtained by calling the get_temperature function as a tool role message, and call the Chat API again.
    import json
    
    # Parse the arguments the model extracted, then call the local function.
    func_kwargs = json.loads(chat.choices[0].message.tool_calls[0].function.arguments)
    temperature_info = get_temperature(**func_kwargs)
    
    messages.append(
        {
            "role": "assistant",
            "tool_calls": [
                tool_call.model_dump()
                for tool_call in chat.choices[0].message.tool_calls
            ]
        }
    )
    messages.append(
        {
            "role": "tool",
            "content": str(temperature_info),
            "tool_call_id": chat.choices[0].message.tool_calls[0].id
        }
    )
    
    chat_w_info = client.chat.completions.create(
        model="meta-llama-3.1-70b-instruct",
        tools=tools,
        messages=messages,
    )
    
    for choice in chat_w_info.choices:
        print(choice.message.content)
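
    When a single request can trigger more than one tool call, a small dispatch loop keeps this step general. Below is a minimal sketch under the same setup; TOOL_FUNCTIONS is a hypothetical helper that maps tool names to local functions, not part of the Friendli SDK:
    
    # Hypothetical dispatch table: tool name -> local function.
    TOOL_FUNCTIONS = {"get_temperature": get_temperature}
    
    # Record the assistant turn that requested the tool calls...
    messages.append(
        {
            "role": "assistant",
            "tool_calls": [
                tc.model_dump() for tc in chat.choices[0].message.tool_calls
            ],
        }
    )
    # ...then answer every tool call, whichever tool produced it.
    for tool_call in chat.choices[0].message.tool_calls:
        kwargs = json.loads(tool_call.function.arguments)
        result = TOOL_FUNCTIONS[tool_call.function.name](**kwargs)
        messages.append(
            {
                "role": "tool",
                "content": str(result),
                "tool_call_id": tool_call.id,
            }
        )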
    
  • Complete Code Snippet

    import json
    import os
    
    from friendli import Friendli
    
    token = os.environ.get("FRIENDLI_TOKEN") or "YOUR_FRIENDLI_TOKEN"
    client = Friendli(token=token)
    user_prompt = "I live in New York. What should I wear for today's weather?"
    
    messages = [
      {
        "role": "user",
        "content": user_prompt,
      },
    ]
    
    tools = [
      {
        "type": "function",
        "function": {
          "name": "get_temperature",
          "description": "Get the temperature information in a given location.",
          "parameters": {
            "type": "object",
            "properties": {
              "location": {
                "type": "string",
                "description": "The name of the current location, e.g., New York",
              },
            },
            "required": ["location"],
          },
        },
      },
    ]
    
    # Define the function that implements the custom tool.
    def get_temperature(location: str) -> int:
        """Mock function that returns the city temperature"""
        if "new york" in location.lower():
            return 45
        if "san francisco" in location.lower():
            return 72
        return 30
    
    chat = client.chat.completions.create(
        model="meta-llama-3.1-70b-instruct",
        messages=messages,
        tools=tools,
        temperature=0,
        frequency_penalty=1,
    )
    
    func_kwargs = json.loads(chat.choices[0].message.tool_calls[0].function.arguments)
    temperature_info = get_temperature(**func_kwargs)
    
    messages.append(
        {
            "role": "assistant",
            "tool_calls": [
                tool_call.model_dump()
                for tool_call in chat.choices[0].message.tool_calls
            ]
        }
    )
    messages.append(
        {
            "role": "tool",
            "content": str(temperature_info),
            "tool_call_id": chat.choices[0].message.tool_calls[0].id
        }
    )
    
    chat_w_info = client.chat.completions.create(
        model="meta-llama-3.1-70b-instruct",
        tools=tools,
        messages=messages,
    )
    
    for choice in chat_w_info.choices:
        print(choice.message.content)
    

🎉 Congratulations!

Following the instructions above, you’ve walked through the whole process of defining and using a custom tool to generate accurate, rich answers from an LLM!

Brainstorm creative ideas for your agent by reading our blog articles!