You can use the Vercel AI SDK to interact with FriendliAI. This makes migrating existing applications that already use the Vercel AI SDK particularly easy.

How to use

Before you start, make sure you have obtained your FRIENDLI_TOKEN from the Friendli Suite.

Instantiation

Instantiate your models using a Friendli provider instance. We provide usage examples for each type of endpoint. Choose the one that best suits your needs:
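As a minimal sketch, the package exports a default provider instance that you call with a model ID (the ID below is illustrative; substitute any model available on your endpoint):

```typescript
import { friendli } from "@friendliai/ai-provider";

// Create a language model from the default Friendli provider instance.
// The model ID is an example; use any model your endpoint serves.
const model = friendli("meta-llama-3.1-70b-instruct");
```

The resulting model object is then passed to SDK functions such as generateText or streamText.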

Example: Generating text

Generate a response with the generateText function:
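A minimal sketch of a one-shot completion; the model ID and prompt are placeholders:

```typescript
import { friendli } from "@friendliai/ai-provider";
import { generateText } from "ai";

// generateText resolves once the full response is available.
const { text } = await generateText({
  model: friendli("meta-llama-3.1-70b-instruct"),
  prompt: "What is the capital of South Korea?",
});

console.log(text);
```

Use generateText for non-interactive use cases; for incremental output, streamText (shown below) yields tokens as they arrive.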

Example: Using Enforcing Patterns (Regex)

Constrain your LLM's output to a particular pattern (e.g., CSV), a character set, or the characters of a specific language (e.g., Korean Hangul).
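A sketch of this feature, assuming the provider settings accept a regex option (the pattern below restricts output to Hangul syllables, digits, whitespace, and basic punctuation):

```typescript
import { friendli } from "@friendliai/ai-provider";
import { generateText } from "ai";

const { text } = await generateText({
  model: friendli("meta-llama-3.1-70b-instruct", {
    // Allow only Hangul syllables (U+AC00–U+D7AF), digits,
    // whitespace, and common punctuation in the output.
    regex: new RegExp("[\\n ,.?!0-9\\uac00-\\ud7af]*"),
  }),
  prompt: "Who is the first king of the Joseon Dynasty?",
});

console.log(text);
```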

Example: Using built-in tools

This feature is in Beta and available only on the Serverless Endpoints.

Using the tool-assisted chat completion API, models can call built-in tools prepared for tool use, enhancing their ability to provide more comprehensive and actionable responses.

Available tools are listed here.

import { friendli } from "@friendliai/ai-provider";
import { streamText } from "ai";

const result = await streamText({
  model: friendli("meta-llama-3.1-70b-instruct", {
    tools: [
      { type: "web:search" },
      { type: "math:calculator" },
    ],
  }),
  prompt: "Find the current USD to CAD exchange rate and calculate how much $5,000 USD would be in Canadian dollars.",
});

for await (const textPart of result.textStream) {
  console.log(textPart);
}

OpenAI Compatibility

You can also use @ai-sdk/openai as the APIs are OpenAI-compatible.

import { createOpenAI } from '@ai-sdk/openai';

const friendli = createOpenAI({
  baseURL: 'https://api.friendli.ai/serverless/v1',
  apiKey: process.env.FRIENDLI_TOKEN,
});
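The resulting instance plugs into the SDK exactly like the native provider; for example (model ID illustrative):

```typescript
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

// OpenAI-compatible provider pointed at Friendli Serverless Endpoints.
const friendli = createOpenAI({
  baseURL: 'https://api.friendli.ai/serverless/v1',
  apiKey: process.env.FRIENDLI_TOKEN,
});

const { text } = await generateText({
  model: friendli('meta-llama-3.1-70b-instruct'),
  prompt: 'Summarize the benefits of OpenAI-compatible APIs in one sentence.',
});

console.log(text);
```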

If you are using Dedicated Endpoints, point the base URL at the dedicated API instead:

import { createOpenAI } from '@ai-sdk/openai';

const friendli = createOpenAI({
  baseURL: 'https://api.friendli.ai/dedicated/v1',
  apiKey: process.env.FRIENDLI_TOKEN,
});

Further resources