Commit Graph

4 Commits

Jared Van Bortel
b1ebe63820 llamamodel: restore leading space removal logic
When llama.cpp was updated, I removed the space removal logic, but it
turns out it is still needed. This is now a proper parameter, as we
specifically want to disable only the *leading* space when we are
tokenizing input that comes after a normal token.

This fixes a regression in commit 290c6294 ("backend: rebase llama.cpp
submodule on latest upstream (#2694)").

Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-07-30 17:52:15 -04:00
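
The leading-space behaviour described in the commit above is easiest to see with a toy example. The sketch below is only an illustration with hypothetical names (toyTokenize, tokenize, and removeLeadingSpace are not the actual gpt4all or llama.cpp APIs): a SentencePiece-style tokenizer acts as if a space were prepended to its input, which is fine at the start of a prompt but injects a spurious space when the text continues directly after an already-tokenized token, so the caller passes a flag asking for only that leading space token to be dropped.

```cpp
// Minimal, self-contained sketch (hypothetical names, not the gpt4all code).
#include <cassert>
#include <string>
#include <vector>

using Token = std::string;  // stand-in for a real token id type

// Toy tokenizer that always emits a leading space token, mimicking the
// "prepend a dummy space" behaviour of SentencePiece-style models.
static std::vector<Token> toyTokenize(const std::string &text)
{
    std::vector<Token> out{" "};
    std::string cur;
    for (char c : text) {
        if (c == ' ') {
            if (!cur.empty()) out.push_back(cur);
            cur.clear();
            out.push_back(" ");
        } else {
            cur += c;
        }
    }
    if (!cur.empty()) out.push_back(cur);
    return out;
}

// The flag mirrors the idea in the commit: only strip the injected *leading*
// space when this text continues directly after a normal token.
static std::vector<Token> tokenize(const std::string &text, bool removeLeadingSpace)
{
    std::vector<Token> toks = toyTokenize(text);
    if (removeLeadingSpace && !toks.empty() && toks.front() == " ")
        toks.erase(toks.begin());
    return toks;
}

int main()
{
    // Start of a prompt: keep the tokenizer's leading space.
    assert(tokenize("Hello world", /*removeLeadingSpace=*/false).front() == " ");
    // Continuation after a normal token: drop it so no extra space appears.
    assert(tokenize("world", /*removeLeadingSpace=*/true).front() == "world");
    return 0;
}
```

Making the removal an explicit parameter, rather than stripping the space unconditionally, keeps the default tokenization of a fresh prompt unchanged.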
Jared Van Bortel
41c9013fa4 chat: don't use incomplete types with signals/slots/Q_INVOKABLE (#2408)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-06-06 11:59:28 -04:00
Jared Van Bortel
d3d777bc51 chat: fix #includes with include-what-you-use (#2401)
Also use qGuiApp instead of qApp.

Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2024-06-04 14:47:11 -04:00
Olyxz16
2c0a660e6e feat: Add support for Mistral API models (#2053)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
Signed-off-by: Cédric Sazos <cedric.sazos@tutanota.com>
Co-authored-by: Jared Van Bortel <jared@nomic.ai>
2024-03-13 18:23:57 -04:00