First of all, thank you for a wonderful browser.
Description of the issue:
I managed to configure Leo AI to use my local LLM via Ollama, and this works perfectly when I open the sidebar to chat with Leo. However, when I choose "Answer with AI" in the Brave Search field, the model used is not my local Ollama model but Brave's default model.
This is a feature I like (and use) a lot, and it seems to summarise the answers from the web pages on the results page wonderfully. Is it possible to use the local LLM for this feature as well?
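For reference, here is roughly how I verified that the local endpoint Leo points at is reachable. This is a minimal sketch: it assumes Ollama's default port 11434, its OpenAI-compatible `/v1/chat/completions` endpoint, and a model named `llama3`; substitute whatever you entered under Leo's "Bring your own model" settings.

```python
# Minimal check that the local Ollama server Leo is configured against is up.
# Assumptions: default Ollama port 11434, the OpenAI-compatible
# /v1/chat/completions endpoint, and a model named "llama3"; adjust these
# to match your own "Bring your own model" settings.
import json
import urllib.request

payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Say hello"}],
    "stream": False,
}
req = urllib.request.Request(
    "http://localhost:11434/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)
    print(reply["choices"][0]["message"]["content"])
```

If that request succeeds (it does for me, and the sidebar chat works), the local setup itself seems fine, which is why I suspect "Answer with AI" simply doesn't route through the configured model.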
Brave Version (check About Brave):
Version 1.76.74 Chromium: 134.0.6998.89 (Official Build) (arm64)
Operating System:
macOS Sequoia 15.3.2
MacBook Pro M2 Max