Brave Search + Leo AI with local LLM via ollama

First of all thank you for a wonderful browser :slight_smile:
Description of the issue:
I managed to configure Leo AI to use my local LLM model via ollama, and this works perfectly when I open the sidebar to chat with Leo. However, when I choose “Answer with AI” in the Brave search field, the AI model used is not my local ollama model, but Brave’s default model.
This is a feature I like (and use) a lot; it summarises the answers from the web pages on the results page wonderfully. Is it possible to use the local LLM for this feature as well? For reference, the local setup can be sanity-checked as in the sketch below.
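A minimal sketch for verifying the local Ollama server before pointing Leo’s “Bring your own model” settings at it. It assumes Ollama’s default port (11434) and a hypothetical model name `llama3`; replace these with whatever you have pulled locally.

```python
# Sanity-check a local Ollama server (assumed default port 11434).
import json
import urllib.request

OLLAMA_BASE = "http://localhost:11434"
MODEL = "llama3"  # hypothetical model name; replace with the one you pulled

# List the models Ollama knows about.
with urllib.request.urlopen(f"{OLLAMA_BASE}/api/tags") as resp:
    tags = json.load(resp)
print("Available models:", [m["name"] for m in tags.get("models", [])])

# Send one chat message through Ollama's OpenAI-compatible endpoint,
# the same style of endpoint Leo's BYOM settings are typically pointed at.
payload = json.dumps({
    "model": MODEL,
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "stream": False,
}).encode("utf-8")
req = urllib.request.Request(
    f"{OLLAMA_BASE}/v1/chat/completions",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)
print(reply["choices"][0]["message"]["content"])
```

If both calls succeed, the local model is reachable; the question above is only about whether “Answer with AI” on the Search results page can use it too.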

Brave Version (check About Brave):
Version 1.76.74 Chromium: 134.0.6998.89 (Official Build) (arm64)

Operating System:
macOS Sequoia 15.3.2
MacBook Pro M2 Max

At this time, Brave Search uses a custom model for its Answer with AI feature. I believe we have plans to introduce additional models and/or BYO models for Search in future releases.


You might want to make this clear when advertising the BYOM possibility for Leo AI. I must admit I was misled into believing it also applied to the Answer with AI feature.