Instantiate your models using a Friendli provider instance.
We provide usage examples for each type of endpoint. Choose the one that best suits your needs:
```ts
import { friendli } from "@friendliai/ai-provider";

// Automatically select serverless endpoints
const model = friendli("meta-llama-3.3-70b-instruct");

// Or explicitly select the serverless endpoint type
const model = friendli("meta-llama-3.3-70b-instruct", {
  endpoint: "serverless",
});
```
Constrain your LLM's output to a specific pattern (e.g., CSV), a character set, or language-specific characters (e.g., Korean Hangul).
```ts
import { friendli } from "@friendliai/ai-provider";
import { generateText } from "ai";

const { text } = await generateText({
  model: friendli("meta-llama-3.3-70b-instruct", {
    // Restrict output to whitespace, basic punctuation, digits,
    // and Korean Hangul syllables (U+AC00–U+D7AF).
    regex: new RegExp("[\n ,.?!0-9\uac00-\ud7af]*"),
  }),
  prompt: "Who is the first king of the Joseon Dynasty?",
});

console.log(text);
```
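The section above also mentions CSV as a candidate pattern. A minimal sketch of such a pattern, validated locally before handing it to the `regex` option (the exact pattern is an assumption; adapt it to your schema):

```typescript
// Assumed pattern: rows of comma-separated integers, one row per line.
// Checking candidate strings locally is a quick way to sanity-check the
// pattern before supplying it to the model.
const csvPattern = new RegExp("^([0-9]+(,[0-9]+)*\\n?)*$");

console.log(csvPattern.test("1,2,3\n4,5,6")); // true
console.log(csvPattern.test("a,b,c")); // false

// The same pattern can then be passed to the provider, e.g.
// friendli("meta-llama-3.3-70b-instruct", { regex: csvPattern })
```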
This feature is in Beta and is available only on Serverless Endpoints.
With the tool-assisted chat completion API, models can use built-in tools prepared for tool calls, enhancing their ability to provide more comprehensive and actionable responses.
```ts
import { friendli } from "@friendliai/ai-provider";
import { streamText } from "ai";

const result = await streamText({
  model: friendli("meta-llama-3.3-70b-instruct", {
    tools: [
      { type: "web:search" },
      { type: "math:calculator" },
    ],
  }),
  prompt:
    "Find the current USD to CAD exchange rate and calculate how much $5,000 USD would be in Canadian dollars.",
});

for await (const textPart of result.textStream) {
  console.log(textPart);
}
```
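Since `result.textStream` is an `AsyncIterable<string>`, the streamed parts can also be collected into one string instead of printed as they arrive. A minimal, self-contained sketch (`mockStream` is a hypothetical stand-in for a live model response):

```typescript
// Collect an async-iterable text stream into a single string.
// Works with any AsyncIterable<string>, including result.textStream.
async function collectText(stream: AsyncIterable<string>): Promise<string> {
  let full = "";
  for await (const part of stream) {
    full += part;
  }
  return full;
}

// Hypothetical stand-in for a streamed model response.
async function* mockStream(): AsyncGenerator<string> {
  yield "The exchange rate is ";
  yield "1.35 CAD per USD.";
}

collectText(mockStream()).then((text) => console.log(text));
// → "The exchange rate is 1.35 CAD per USD."
```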