Before you start, make sure your `baseURL` and `apiKey` point to FriendliAI. Because our products are fully compatible with the OpenAI SDK, you can then follow the examples below as-is.
This feature is in Beta and is available only on Serverless Endpoints.
With the tool-assisted chat completion API, models can use built-in tools prepared for tool calls, enhancing their ability to provide more comprehensive and actionable responses.
```javascript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.friendli.ai/serverless/v1",
  apiKey: process.env.FRIENDLI_TOKEN,
});

async function main() {
  const messages = [
    {
      role: "user",
      content:
        "What is the current average home price in New York City, and if I put 15% down, how much will my mortgage be?",
    },
  ];

  // Built-in tools the model may call while answering.
  const tools = [{ type: "code:python-interpreter" }, { type: "web:search" }];

  const completion = await client.chat.completions.create({
    model: "meta-llama-3.1-8b-instruct",
    messages: messages,
    tools: tools,
    tool_choice: "auto",
    stream: true,
  });

  for await (const chunk of completion) {
    if (chunk.choices === undefined) {
      // Tool-related events carry `event` and `data` instead of `choices`.
      console.log(`event: ${chunk.event}, data: ${JSON.stringify(chunk.data)}`);
    } else {
      // Regular streamed model output.
      console.log(chunk.choices[0].delta.content);
    }
  }
}

main();
```
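The streaming loop above branches on whether a chunk carries `choices` (regular model output) or a tool event with `event` and `data` fields. As a small sketch, that branching can be factored into a helper; `formatChunk` is an illustrative name, not part of the SDK:

```javascript
// Format a streamed chunk for logging. Chunks without `choices` are
// treated as tool events carrying `event` and `data` fields, mirroring
// the branch in the example above.
function formatChunk(chunk) {
  if (chunk.choices === undefined) {
    return `event: ${chunk.event}, data: ${JSON.stringify(chunk.data)}`;
  }
  return chunk.choices[0].delta.content ?? "";
}

console.log(formatChunk({ event: "tool_status", data: { tool: "web:search" } }));
// → event: tool_status, data: {"tool":"web:search"}
console.log(formatChunk({ choices: [{ delta: { content: "Hello" } }] }));
// → Hello
```

Keeping this logic in one place makes it easier to extend later, for example to render tool progress differently from model text.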