Brave Leo Customization

I just went premium for Leo, and my first impression is that it has huge potential as a native part of the browser.

I also think that Leo needs to be customizable in some way: I found the responses largely unusable, and they tend to be too short most of the time.

In my case I’m already paying for several products offering inference, such as Raycast, TypingMind, OpenAI, Claude, etc.

At this point it is pretty much unusable for me, and I feel bad paying $15 a month just for access to a 70B LLaMA model that can run on a 32GB MacBook. I know there are operating costs and everything, but please offer us a way to customize it, perhaps as a Premium-only feature.

Some ideas that could be very useful in my case:

  • Bring your custom LLM
  • Customizing system prompts
  • Creating ‘shortcut’-like templates that can be run on the current page
  • Chat history browsing (with client-side state)

Also, the 2,000-token limit is absurd to me, since it results in losing the page’s context. Either offer a larger context size for LLM inference and charge more, or let us use our own model and charge for “using a custom LLM”. Pricing- and feature-wise, Leo is just a disappointment for me so far.

Say I work with my team and publish a pull request allowing us to customize endpoints in the settings — would the devs allow the merge?
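To make the request concrete, here is a minimal sketch of what a “custom endpoint” setting could map to. Everything here is hypothetical — the settings keys, endpoint URL, and model name are illustrative, not actual Brave/Leo configuration — but it shows the shape of an OpenAI-compatible chat request built from user-configurable values, including a larger context budget than the current 2,000-token limit:

```python
import json

# Hypothetical settings a "bring your own LLM" option might expose.
# None of these keys are real Brave/Leo settings; they are illustrative.
CUSTOM_LLM_SETTINGS = {
    "endpoint": "http://localhost:11434/v1/chat/completions",  # e.g. a local server
    "model": "llama-3-70b",                                    # placeholder model name
    "system_prompt": "Summarize the page in detail.",          # user-customized prompt
    "max_context_tokens": 8000,                                # > the current 2000 limit
}

def build_chat_request(page_text: str, settings: dict) -> dict:
    """Build an OpenAI-compatible chat payload, truncating the page text
    to the configured context budget (rough 4-chars-per-token heuristic)."""
    budget_chars = settings["max_context_tokens"] * 4
    return {
        "model": settings["model"],
        "messages": [
            {"role": "system", "content": settings["system_prompt"]},
            {"role": "user", "content": page_text[:budget_chars]},
        ],
    }

payload = build_chat_request("Example page contents ...", CUSTOM_LLM_SETTINGS)
print(json.dumps(payload, indent=2))
```

The browser would POST this payload to the configured `endpoint`; since many local and hosted servers already speak the OpenAI-compatible chat format, a single settings page like this would cover most “custom LLM” use cases.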
