Tutorials
Build an agent with LangChain
Build an AI agent with LangChain and Friendli Serverless Endpoints, integrating tool calling for dynamic and efficient responses.
Introduction
This article walks you through creating an agent using LangChain and Friendli Serverless Endpoints.
Setup
pip install -qU langchain-openai langchain-community langchain wikipedia
Get your Friendli Token from https://suite.friendli.ai/ before running the code below.
import getpass
import os
if not os.environ.get("FRIENDLI_TOKEN"):
    os.environ["FRIENDLI_TOKEN"] = getpass.getpass("Enter your Friendli Token: ")
Instantiation
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(
    model="meta-llama-3.1-8b-instruct",
    base_url="https://api.friendli.ai/serverless/v1",
    api_key=os.environ["FRIENDLI_TOKEN"],
)
Create Agent with LangChain
Step 1. Create Tool
from langchain_community.tools import WikipediaQueryRun
from langchain_community.utilities import WikipediaAPIWrapper
api_wrapper = WikipediaAPIWrapper(top_k_results=1, doc_content_chars_max=100)
wiki = WikipediaQueryRun(api_wrapper=api_wrapper)
tools = [wiki]
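Under the hood, the agent selects a tool by its registered name (the Wikipedia tool registers itself as `wikipedia`) and passes it a query string. A minimal pure-Python sketch of that name-based dispatch — the registry and lambda below are illustrative stand-ins, not LangChain internals:

```python
# Illustrative stand-in for how an agent routes a call to a named tool.
def dispatch(tools, name, query):
    # Look the tool up by its registered name and run it on the query string.
    return tools[name](query)

# Fake tool standing in for WikipediaQueryRun's "wikipedia" entry.
tools = {"wikipedia": lambda q: f"Page: {q}"}
print(dispatch(tools, "wikipedia", "Linux kernel"))  # → Page: Linux kernel
```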
Step 2. Create Prompt
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant"),
        MessagesPlaceholder("chat_history"),
        ("user", "{input}"),
        ("placeholder", "{agent_scratchpad}"),
    ]
)
prompt.messages
Step 3. Create Agent
from langchain.agents import AgentExecutor
from langchain.agents import create_tool_calling_agent
agent = create_tool_calling_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
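`AgentExecutor` runs the think–act–observe loop for you: it calls the model, executes any tool the model requests, feeds the observation back, and stops when the model produces a final answer. A rough pure-Python sketch of that loop, using a fake model and tool — every name here is illustrative, not LangChain's API:

```python
# Conceptual sketch of the executor loop: call the model, run any
# requested tool, append the observation, repeat until a final answer.
def run_agent(model, tools, user_input):
    scratchpad = []
    while True:
        action = model(user_input, scratchpad)
        if action["type"] == "final":
            return action["content"]
        observation = tools[action["tool"]](action["input"])
        scratchpad.append((action, observation))

def fake_model(user_input, scratchpad):
    # First step: ask for the wikipedia tool; second step: answer
    # with the observation from the previous step.
    if not scratchpad:
        return {"type": "tool", "tool": "wikipedia", "input": user_input}
    return {"type": "final", "content": scratchpad[-1][1]}

tools = {"wikipedia": lambda q: f"Page: {q}"}
print(run_agent(fake_model, tools, "Linux kernel"))  # → Page: Linux kernel
```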
Step 4. Run the Agent
chat_history = []
while True:
    user_input = input("Enter your message: ")
    result = agent_executor.invoke(
        {"input": user_input, "chat_history": chat_history},
    )
    chat_history.append({"role": "user", "content": user_input})
    chat_history.append({"role": "assistant", "content": result["output"]})
When you run the code, it waits for your input, then prints the agent's response. If you ask about a topic covered on Wikipedia, the agent automatically calls the wikipedia tool and uses the result in its answer.
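The `chat_history` list grows by two OpenAI-style message dicts per turn, which is what lets the agent resolve follow-up questions. A small helper capturing that shape (`record_turn` is a hypothetical name, not part of LangChain):

```python
def record_turn(history, user_msg, assistant_msg):
    # Mirror the loop above: one user dict and one assistant dict per turn.
    history.append({"role": "user", "content": user_msg})
    history.append({"role": "assistant", "content": assistant_msg})
    return history

history = record_turn([], "hello", "Hi! How can I help?")
record_turn(history, "What does the Linux kernel do?", "It manages hardware resources.")
print(len(history))  # → 4
```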
Final result
Enter your Friendli Token: ··········
Enter your message: hello
> Entering new AgentExecutor chain...
Hello, it's nice to meet you. I'm here to help with any questions or topics you'd like to discuss. Is there something in particular you'd like to talk about, or do you need assistance with something?
> Finished chain.
Enter your message: What does the Linux kernel do?
> Entering new AgentExecutor chain...
Invoking: `wikipedia` with `{'query': 'Linux kernel'}`
responded: The Linux kernel is the core component of the Linux operating system. It acts as a bridge between the computer hardware and the user space applications. The kernel manages the system's hardware resources, such as memory, CPU, and I/O devices. It provides a set of interfaces and APIs that allow user space applications to interact with the hardware.
Page: Linux kernel
Summary: The Linux kernel is a free and open source, UNIX-like kernel that is responsible for managing the system's hardware resources, such as memory, CPU, and I/O devices. It provides a set of interfaces and APIs that allow user space applications to interact with the hardware. The kernel is the core component of the Linux operating system, and it plays a crucial role in ensuring the stability and security of the system.
> Finished chain.
Enter your message:
Full Example Code
import getpass
import os
from langchain_openai import ChatOpenAI
from langchain_community.tools import WikipediaQueryRun
from langchain_community.utilities import WikipediaAPIWrapper
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.agents import AgentExecutor
from langchain.agents import create_tool_calling_agent
if not os.environ.get("FRIENDLI_TOKEN"):
    os.environ["FRIENDLI_TOKEN"] = getpass.getpass("Enter your Friendli Token: ")
llm = ChatOpenAI(
    model="meta-llama-3.1-8b-instruct",
    base_url="https://api.friendli.ai/serverless/v1",
    api_key=os.environ["FRIENDLI_TOKEN"],
)
api_wrapper = WikipediaAPIWrapper(top_k_results=1, doc_content_chars_max=100)
wiki = WikipediaQueryRun(api_wrapper=api_wrapper)
tools = [wiki]
# Get the prompt to use - you can modify this!
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant"),
        MessagesPlaceholder("chat_history"),
        ("user", "{input}"),
        ("placeholder", "{agent_scratchpad}"),
    ]
)
agent = create_tool_calling_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
chat_history = []
while True:
    user_input = input("Enter your message: ")
    result = agent_executor.invoke(
        {"input": user_input, "chat_history": chat_history},
    )
    chat_history.append({"role": "user", "content": user_input})
    chat_history.append({"role": "assistant", "content": result["output"]})