- August 21, 2025
- 2 min read
Partnering with Linkup: Built‑in AI Web Search in Friendli Serverless Endpoints

We’re excited to announce a new partnership with Linkup, bringing their high‑quality, AI‑powered web search directly into the Friendli Suite. Starting today, the Linkup Search API is available as a built-in tool in the Serverless Endpoints playground.
Why It Matters
Large language models can’t know what happened five minutes ago, but production AI agents need fresh, factual, and source-backed data. With Linkup now inside Friendli Serverless Endpoints, developers get real-time, trustworthy search results through a single integration, compatible with the AI models FriendliAI serves. Linkup ranks #1 on OpenAI’s SimpleQA benchmark, so whether you’re building agents for customer support, news, or market insights, reliable answers are now just a tool call away.
This integration helps teams build more reliable, production-ready AI agents by grounding LLM outputs with up‑to‑date information from trusted sources.
About Linkup
Linkup provides an AI‑powered web search API designed specifically for AI agents and LLMs, enabling access to accurate, structured, and source-cited web content to enhance AI performance and reduce hallucinations.
Linkup stands out as the world’s best search for AI apps, achieving the highest score on OpenAI's SimpleQA factuality benchmark.
Getting Started
In the Playground
All you need to do is open the Serverless Endpoints playground and enable the linkup:search tool.
That’s it! You can now prompt your models to trigger Linkup web search tool calls.
Via API and SDK
For API and SDK access, you’ll need a Linkup API key. To enable Linkup’s web search tool:
- Go to https://app.linkup.so and get your Linkup API key.
- In Friendli Suite, open “Personal settings > Integrations” and add your Linkup API key.
You can now send requests via API and SDK.
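As a rough sketch of what such a request could look like, assuming Friendli’s OpenAI‑compatible chat completions endpoint and assuming the built-in tool is referenced by the same linkup:search identifier shown in the playground (the exact endpoint path, model name, and tool schema below are illustrative — consult the documentation linked below for the authoritative format):

```python
import json
import urllib.request

# Hypothetical request payload for an OpenAI-compatible chat completions
# API with the Linkup built-in web search tool enabled. The model name
# and tool schema are assumptions for illustration only.
payload = {
    "model": "meta-llama-3.1-8b-instruct",  # example model name
    "messages": [
        {"role": "user", "content": "What happened in AI news today?"}
    ],
    "tools": [{"type": "linkup:search"}],  # enable Linkup web search
}

def build_request(token: str) -> urllib.request.Request:
    """Build (but do not send) the HTTP request for the chat completion."""
    return urllib.request.Request(
        "https://api.friendli.ai/serverless/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# To actually send the request (requires a valid Friendli token):
# req = build_request(os.environ["FRIENDLI_TOKEN"])
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

The same payload shape should carry over to any OpenAI-compatible SDK; only the base URL and token change.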
For more information, please refer to our documentation and guides, as well as Linkup’s documentation.
Looking Ahead
At FriendliAI, we're committed to delivering the best AI inference platform for powering real-world AI applications. We welcome partnerships as part of our ongoing effort to provide even greater value and better services to our users.
This partnership represents another step forward in our mission to make AI more accessible, efficient, and impactful. By combining FriendliAI’s cutting-edge AI inference technology with Linkup’s advanced AI web search API, we're enabling smarter, faster, and more reliable AI experiences for developers and businesses.
Stay tuned — we've got more to come soon.
Written by
FriendliAI Tech & Research
General FAQ
What is FriendliAI?
FriendliAI is a GPU-inference platform that lets you deploy, scale, and monitor large language and multimodal models in production, without owning or managing GPU infrastructure. We offer three things for your AI models: unmatched speed, cost efficiency, and operational simplicity. Find out which product is the best fit for you here.
How does FriendliAI help my business?
Our Friendli Inference allows you to squeeze more tokens-per-second out of every GPU. Because you need fewer GPUs to serve the same load, the true metric—tokens per dollar—comes out higher even if the hourly GPU rate looks similar on paper. View pricing
Which models and modalities are supported?
Over 380,000 text, vision, audio, and multimodal models are deployable out of the box. You can also upload custom models or LoRA adapters. Explore models
Can I deploy models from Hugging Face directly?
Yes. Selecting “Friendli Endpoints” on the Hugging Face Hub takes you directly to our model deployment page for one-click deployment. The page provides an easy-to-use interface for setting up Friendli Dedicated Endpoints, a managed service for generative AI inference. Learn more about our Hugging Face partnership
Still have questions?
If you want a customized solution for the key issue that is slowing your growth, email contact@friendli.ai or click Talk to an expert; our experts (not a bot) will reply within one business day.