Adam Treat
6d943917f1
Fail early/gracefully if incompatible hardware is detected, and default to universal builds on macOS.
2023-05-08 08:23:00 -04:00
Adam Treat
e397fda250
Bump the version and save up to an order of magnitude of disk space for chat files.
2023-05-05 20:12:00 -04:00
Adam Treat
6d4d86d07c
Bump the version.
2023-05-05 11:43:25 -04:00
Adam Treat
06bb6960d4
Add about dialog.
2023-05-05 10:47:05 -04:00
Adam Treat
f291853e51
First attempt at providing a persistent chat list experience.
...
Limitations:
1) Context is not restored for gpt-j models
2) When you switch between different model types in an existing chat,
the context and all of the conversation are lost
3) The settings are not chat- or conversation-specific
4) The sizes of the persisted chat files are very large due to how much
data the llama.cpp backend tries to persist. We need to investigate how
we can shrink this.
2023-05-04 15:31:41 -04:00
Adam Treat
4a09f0f0ec
More extensive usage stats to help diagnose errors and problems in the UI.
2023-05-02 20:31:17 -04:00
Adam Treat
118e0bdc44
Allow removing chats.
2023-05-01 20:56:53 -04:00
Adam Treat
a48226613c
Turn the chat list into a model.
2023-05-01 17:13:20 -04:00
Adam Treat
8f80f8e3a2
Break out the drawer into its own component.
2023-05-01 13:51:46 -04:00
Adam Treat
4d87c46948
Major refactor in prep for multiple conversations.
2023-05-01 09:10:05 -04:00
Adam Treat
d1e3198b65
Add new C++ version of the chat model. Getting ready for chat history.
2023-04-30 20:28:43 -04:00
Adam Treat
13401fc52f
Bump the version.
2023-04-29 21:04:47 -04:00
Adam Treat
727a74de6c
Make an offline installer option.
2023-04-29 12:13:11 -04:00
Adam Treat
9ebf2537fa
Bump the version.
2023-04-29 08:56:53 -04:00
Adam Treat
22441a460b
Fix the icons more.
2023-04-28 21:48:10 -04:00
Adam Treat
1df4035679
Fix icons.
2023-04-28 21:40:45 -04:00
Adam Treat
3f7852f384
Correct the macOS symlink.
2023-04-28 21:26:38 -04:00
Adam Treat
a9bbe3f949
Fix icons and try to make the macOS experience happier.
2023-04-28 21:19:12 -04:00
Aaron Miller
af83056a4f
Put chat.exe in the 'bin' folder of the build tree
...
because this is also set in llama.cpp's CMakeLists:
https://github.com/ggerganov/llama.cpp/blob/master/CMakeLists.txt#L11
That is where libllama.dll winds up, which causes attempts to run the chat UI
from Qt Creator on Windows to fail because libllama.dll is not found. I've been
working around this by copying libllama.dll *out* of bin/, but I've been
bitten a few times by forgetting to keep doing that and letting the build get
out of sync.
2023-04-28 20:45:02 -04:00
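The commit above describes a CMake output-directory clash rather than showing it; the sketch below illustrates the general fix, assuming a target named `chat` and that llama.cpp's CMakeLists sets `CMAKE_RUNTIME_OUTPUT_DIRECTORY` to `${CMAKE_BINARY_DIR}/bin` as the linked line suggests. The target name and exact property here are assumptions, not necessarily what the repository uses.

```cmake
# Hypothetical excerpt from the chat UI's CMakeLists.txt.
# llama.cpp sets CMAKE_RUNTIME_OUTPUT_DIRECTORY to ${CMAKE_BINARY_DIR}/bin,
# so that is where libllama.dll lands on Windows. Emitting the chat
# executable into the same directory lets a run from Qt Creator find the
# DLL without copying it out of bin/ by hand.
set_target_properties(chat PROPERTIES
    RUNTIME_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/bin")
```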
Adam Treat
43eef81ca8
New startup dialog features.
2023-04-28 11:03:16 -04:00
Adam Treat
fbce5f2078
Unnecessary after all.
2023-04-26 18:35:53 -04:00
Adam Treat
aadeb47026
Put this before.
2023-04-26 13:54:25 -04:00
Adam Treat
8f913c382c
Signing ident.
2023-04-26 13:33:33 -04:00
Adam Treat
74c611b49a
Add back option.
2023-04-26 11:02:05 -04:00
Adam Treat
3c9139b5d2
Move the backend code into its own subdirectory and make it a shared library. Begin fleshing out the C API wrapper that bindings can use.
2023-04-26 08:22:38 -04:00
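A minimal sketch of what "backend in its own subdirectory, built as a shared library with a C API wrapper" can look like in CMake; the directory name, target name, and source files below are illustrative assumptions, not the repository's actual layout.

```cmake
# Hypothetical top-level CMakeLists.txt: pull in the backend subdirectory.
add_subdirectory(llmodel)

# Hypothetical llmodel/CMakeLists.txt: build the backend as a shared library
# and include a thin C API translation unit that language bindings can call.
add_library(llmodel SHARED
    llmodel.cpp      # C++ backend implementation
    llmodel_c.cpp)   # C API wrapper for bindings
target_include_directories(llmodel PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})
```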
Aaron Miller
cd03c5b7d5
Add QuickDialogs2 to CMake component list
2023-04-25 16:24:55 -04:00
Adam Treat
c9888a285e
Force AVX2 off if avx_only is checked.
2023-04-24 17:44:57 -04:00
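The message reads like a consistency guard in the build configuration; a hedged CMake sketch of that kind of guard follows, using hypothetical `AVX_ONLY` and `AVX2` option names rather than whatever the project actually calls them.

```cmake
# Hypothetical option names; the real build may spell these differently.
option(AVX_ONLY "Build for CPUs that support AVX but not AVX2" OFF)
option(AVX2     "Enable AVX2 instructions"                     ON)

# If an AVX-only build was requested, force AVX2 off so the two settings
# can never contradict each other.
if(AVX_ONLY)
    set(AVX2 OFF CACHE BOOL "Enable AVX2 instructions" FORCE)
endif()
```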
Adam Treat
f456756ba8
Make clear this is optional and bump the version.
2023-04-24 13:40:10 -04:00
Adam Treat
e6a8681dbe
Always download to a local directory outside of the binary directory otherwise
...
models will be deleted when updates occur. Update the version.
2023-04-24 11:31:41 -04:00
Adam Treat
e1159cd997
Make it easier to test and build installers for localhost and avx only.
2023-04-24 01:08:13 -04:00
Adam Treat
76e5b85128
Try again with macOS icon.
2023-04-24 00:44:02 -04:00
Adam Treat
57276d3520
See if we can get the icon for macOS associated with the bundle.
2023-04-24 00:33:57 -04:00
Adam Treat
e974b41b2b
Change this back on Linux/Windows.
2023-04-23 23:42:55 -04:00
Adam Treat
cd352b958d
Working on macOS now.
2023-04-23 23:38:12 -04:00
Adam Treat
1d37ebc826
Change name of exe.
2023-04-23 22:57:37 -04:00
Adam Treat
f8dc47e796
Need a subdir.
2023-04-23 22:48:27 -04:00
Adam Treat
134b4dd286
macOS-specific CMake changes experiment.
2023-04-23 22:43:30 -04:00
Adam Treat
93f54742b9
Small fixes.
2023-04-23 22:05:24 -04:00
Adam Treat
e06cff8b48
Consolidate all colors into a central theme object.
2023-04-23 09:42:35 -04:00
Adam Treat
c366fc8054
Move the popup dialog into its own file and disable network for now.
2023-04-23 07:05:43 -04:00
Adam Treat
889d7d8563
Move the settings dialog into its own file.
2023-04-23 06:58:07 -04:00
Adam Treat
1f65e381ee
New thumbs up/down support for gpt4all-datalake.
2023-04-22 22:09:14 -04:00
Adam Treat
cca2a88e47
Getting ready for next update.
2023-04-21 23:23:57 -04:00
Adam Treat
bec8072fe1
Fix logic.
2023-04-21 13:46:50 -04:00
eachadea
116f740fb5
Don't build test_hw on Apple Silicon
2023-04-21 11:25:03 -04:00
Adam Treat
14831cd1c0
Add a small program that tests hardware.
2023-04-20 19:34:56 -04:00
eachadea
b09ca009c5
Don't build a universal binary
...
unless -DBUILD_UNIVERSAL=ON
2023-04-20 06:37:54 -04:00
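A minimal sketch of how an opt-in universal-binary switch is typically wired up in CMake, assuming it simply drives `CMAKE_OSX_ARCHITECTURES`; the project's real logic may differ.

```cmake
# Hypothetical guard: only produce a macOS universal (x86_64 + arm64) binary
# when the user explicitly opts in with -DBUILD_UNIVERSAL=ON.
# (CMAKE_OSX_ARCHITECTURES is normally set before project() is called.)
option(BUILD_UNIVERSAL "Build a universal macOS binary" OFF)
if(APPLE AND BUILD_UNIVERSAL)
    set(CMAKE_OSX_ARCHITECTURES "x86_64;arm64" CACHE STRING "macOS architectures" FORCE)
endif()
```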
Adam Treat
55084333a9
Add llama.cpp support for loading llama-based models in the GUI. We now
...
support loading both gptj-derived models and llama-derived models.
2023-04-20 06:19:09 -04:00
Adam Treat
e6cb6a2ae3
Add a new model download feature.
2023-04-18 21:10:06 -04:00
Adam Treat
bbf838354e
Don't add the version number to the installer or the install location.
2023-04-17 15:59:14 -04:00