Local AI Model use for Leo?

Feature Request:

  • Allow Leo to use a self-hosted AI model running locally on the user’s device.

I don’t know if this is even technologically feasible at the moment. However, it would be nice if Leo were capable of using an AI model (such as Liberty Edge) hosted by the user.

The reasons I think this would be useful:

  • Privacy
  • Option to utilize “uncensored” models
  • Reduced server load for Brave

Just a few more words. The free models available to Leo are far too…‘corporate’ and can be rather unhelpful at times. The paid models seem to be fine; however, the Premium subscription is rather pricey given how little I use Leo (the Brave AI Search assistant is so helpful, it’s my new bestie <3).

I do not think this would take away the incentive to subscribe to Brave: there would still be the appeal of faster (and perhaps more relevant) responses than a user’s local model can provide.

Anyway, I feel this would give users more options, especially those who are more privacy-oriented and/or do not mind a less “sanitized” model.

Check out this blog post by Brave; you can test it out in the beta right now. The blog post says you have to download the Nightly build, but enough time has passed that this feature is now available in the beta as well.


Thank you so much for bringing this to my attention. I look forward to giving it a try!


Hope Brave will allow integration with the openrouter.ai, Groq, and Google Gemini APIs.

How can I make Leo pass 100% of a webpage’s content when using BYOM (Ollama)?
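I can’t speak to how Leo itself truncates page content, but as a point of comparison you can bypass the browser entirely and send the full page text straight to a local Ollama server yourself. This is a minimal sketch, assuming the default Ollama endpoint (`http://localhost:11434/api/generate`) and a locally pulled model named `llama3` — both are assumptions about your setup, not anything Leo does:

```python
# Hedged sketch: query a local Ollama model with a webpage's full text.
# Assumptions: Ollama is running on its default port, and a model named
# "llama3" has been pulled. The model's own context window still limits
# how much text it can actually attend to.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(page_text: str, question: str, model: str = "llama3") -> dict:
    """Bundle the entire page text into one prompt so nothing is
    truncated client-side before it reaches the model."""
    prompt = (
        "Here is the full text of a web page:\n\n"
        f"{page_text}\n\n"
        f"Question: {question}"
    )
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_model(page_text: str, question: str) -> str:
    """POST the payload to the local Ollama server and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(page_text, question)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Even with the whole page in the prompt, a small local model may silently ignore text beyond its context window, so “100% of the content” is ultimately bounded by the model you run, not the transport.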