[Leo AI] Possibility to send LLM prompts as one message or define a template

I created a topic here that describes the problem in detail: Using Perplexity.ai not possible - #4 by Lajtowo

TL;DR:

When you add a custom LLM model in Leo AI and try to “Summarize the page”, Leo actually sends an array of two messages with the same role “user” to your LLM service’s API (one containing the page context, the other the “Summarize…” prompt).

Some APIs, however, restrict consecutive messages with the same role - they expect an alternating pattern (user > assistant > user > assistant).

In the case of such restricted APIs, a 400 Bad Request is returned. This isn’t a bug per se, as it’s still consistent with the OpenAI API, which allows any combination of user/assistant messages.

However, some APIs (like the Perplexity API in my case) enforce stricter input formats, and this "alternating pattern" seems to be gaining traction (at least according to ChatGPT). It might therefore be worth adding a checkbox in the LLM API configuration (per model) that forces Leo to concatenate consecutive user messages into a single one, separated by a double line break (\n\n).
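To illustrate, here is a minimal sketch of what that concatenation could look like. The message shape ({"role": ..., "content": ...} dicts, OpenAI-style) and the example payload are assumptions based on the behavior described above, not Leo's actual internals:

```python
def merge_consecutive_messages(messages, separator="\n\n"):
    """Collapse runs of adjacent messages that share a role into one message."""
    merged = []
    for msg in messages:
        if merged and merged[-1]["role"] == msg["role"]:
            # Same role as the previous message: append to its content.
            merged[-1]["content"] += separator + msg["content"]
        else:
            merged.append({"role": msg["role"], "content": msg["content"]})
    return merged

# Hypothetical version of the two consecutive "user" messages Leo sends:
payload = [
    {"role": "user", "content": "<page context>"},
    {"role": "user", "content": "Summarize the page."},
]
merged = merge_consecutive_messages(payload)
```

With the checkbox enabled, the API would receive a single user message instead of two, which satisfies the strict alternating-role requirement.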

As an additional enhancement, consider supporting simple templating with variables, allowing custom prompts to be defined more flexibly.

Example:

You are an amazing assistant. Your name is Leo.
You summarize pages using George R.R. Martin's writing style.
Be concise and funny.

Here is page context to summarize:
{{page_context}}
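In code, rendering such a template could be as simple as naive {{variable}} substitution. The function name, variable names, and the fill-in text below are hypothetical, just to show the idea:

```python
TEMPLATE = """You are an amazing assistant. Your name is Leo.
You summarize pages using George R.R. Martin's writing style.
Be concise and funny.

Here is page context to summarize:
{{page_context}}"""

def render_template(template, variables):
    # Replace each {{name}} placeholder with its value.
    # No escaping or error handling; a real implementation would
    # want to validate unknown/missing variables.
    for name, value in variables.items():
        template = template.replace("{{" + name + "}}", value)
    return template

prompt = render_template(TEMPLATE, {"page_context": "Winter is coming..."})
```

The rendered prompt could then be sent as a single user message, which also sidesteps the consecutive-role problem described above.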

We could add multiple templates per model, and they could be visible here as clickable actions: