You can use LlamaIndex to interact with FriendliAI, which makes migrating existing applications that already use LlamaIndex particularly easy.

How to use

Before you start, ensure you have obtained a Friendli token (API key) from Friendli Suite > Personal Settings > API Keys, then install the packages:
pip install llama-index llama-index-llms-friendli

Instantiation

Now we can instantiate the model object and generate chat completions. If no model is specified, the default (meta-llama-3.3-70b-instruct) is used.
import os
from llama_index.llms.friendli import Friendli

# The Friendli integration reads your token from the FRIENDLI_TOKEN environment variable.
os.environ["FRIENDLI_TOKEN"] = "YOUR_FRIENDLI_TOKEN"

llm = Friendli(model="meta-llama-3.3-70b-instruct")

Chat completion

Generate a response from a given conversation.
from llama_index.core.llms import ChatMessage, MessageRole

message = ChatMessage(role=MessageRole.USER, content="Tell me a joke.")
resp = llm.chat([message])

print(resp)
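Like other LlamaIndex LLMs, the model can also stream a chat response token by token via the standard stream_chat interface. A minimal sketch (the network call is guarded behind a FRIENDLI_TOKEN check so the snippet is safe to paste and run without credentials; the guard is an illustration choice, not part of the API):

```python
import os

# A multi-turn conversation as plain (role, content) pairs.
history = [
    ("system", "You are a concise assistant."),
    ("user", "Tell me a joke."),
]

# Only call the API when a Friendli token is configured.
if os.environ.get("FRIENDLI_TOKEN"):
    from llama_index.core.llms import ChatMessage, MessageRole
    from llama_index.llms.friendli import Friendli

    llm = Friendli(model="meta-llama-3.3-70b-instruct")
    messages = [ChatMessage(role=MessageRole(r), content=c) for r, c in history]

    # stream_chat yields incremental ChatResponse objects; .delta holds the newest text.
    for chunk in llm.stream_chat(messages):
        print(chunk.delta, end="", flush=True)
```

Streaming is useful for interactive applications, where showing partial output reduces perceived latency.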

Completion

Generate a response from a given prompt.
prompt = "Draft a cover letter for a role in software engineering."
resp = llm.complete(prompt)

print(resp)
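Plain completions can be streamed the same way with stream_complete, which yields incremental CompletionResponse objects. A sketch under the same assumption as above (the network call runs only when FRIENDLI_TOKEN is set):

```python
import os

prompt = "Draft a cover letter for a role in software engineering."

# Only call the API when a Friendli token is configured.
if os.environ.get("FRIENDLI_TOKEN"):
    from llama_index.llms.friendli import Friendli

    llm = Friendli(model="meta-llama-3.3-70b-instruct")

    # .delta carries only the newly generated text for each chunk.
    for chunk in llm.stream_complete(prompt):
        print(chunk.delta, end="", flush=True)
```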
Last modified on April 17, 2026