New: Bring Your Own LLM to Forest Admin
Author
Guillaume Rigal
Published
Feb 20, 2026
Take full control of your AI stack. Route AI calls from Forest Admin directly to your Anthropic, OpenAI, Mistral, or Gemini models for total privacy.
Up to now, AI calls from Workflows used our AI instance.
Easy? Yes. Private? We kept it safe. But not "zero-knowledge" safe.
To address this, we're bringing more options to LLM routing.
You can now bring your own LLM.
Every AI call is redirected to your agent.
As Alban from our dev team put it: "What does it mean? Every AI call is redirected to the customer’s agent. Zero requests on our side, no privacy violation."
We’re launching with support for Anthropic, OpenAI, Mistral AI, and Google Gemini.
This comes on top of 50+ SaaS integrations and 25 new Fintech solutions announced this week.
Our goal is to make it easier for your team to set up Forest Admin as a layer above your fragmented stack, providing a single point of contact for Ops teams.
A few lines of code to set up your own LLM instance in Forest Admin
It really takes just a couple of lines of code to configure your Forest Admin agent. From then on, your agent routes every AI request to your model.
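To give a sense of what this could look like in a Node.js agent, here is a minimal sketch. Note that the `ai` option, its field names, and the provider identifiers below are illustrative assumptions, not Forest Admin's documented API; refer to the official documentation for the exact configuration keys.

```typescript
// Hypothetical sketch: the `ai` block below is an assumed shape for
// illustration only — check the Forest Admin docs for the real option names.
import { createAgent } from '@forestadmin/agent';

const agent = createAgent({
  authSecret: process.env.FOREST_AUTH_SECRET!,
  envSecret: process.env.FOREST_ENV_SECRET!,
  isProduction: process.env.NODE_ENV === 'production',

  // Assumption: point every AI call at your own provider and API key,
  // so no request ever touches Forest Admin's LLM instance.
  ai: {
    provider: 'anthropic', // or 'openai' | 'mistral' | 'gemini'
    model: 'your-model-id',
    apiKey: process.env.ANTHROPIC_API_KEY!,
  },
});
```

The key design point is that the API key stays in your own environment variables: Forest Admin only knows which provider to call, never the credentials or the request contents.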
As you can see, it's a pretty easy setup, and the full documentation walks you through it.
And get in touch if you have questions or need assistance to make it happen!
