Adam Treat
3cf8f0da13
New version of icns made on a mac.
2023-04-29 08:40:54 -04:00
Adam Treat
8cd3838480
Add 1024 resolution to icns.
2023-04-29 04:39:55 -04:00
Adam Treat
7ed1af3c94
Fixup icns
2023-04-29 04:38:36 -04:00
Adam Treat
c0f97fa76c
Rework the icon a bit to more closely match macOS style guidelines.
2023-04-29 04:31:06 -04:00
Adam Treat
4a968a8c88
Always hardcode.
2023-04-29 04:06:26 -04:00
Adam Treat
233505c48f
Require a direct choice for opt-in
2023-04-29 03:55:06 -04:00
Adam Treat
e6b919ee49
Always hardcode.
2023-04-28 22:46:01 -04:00
Adam Treat
c794488b25
Fixup.
2023-04-28 22:37:59 -04:00
Adam Treat
23f3ba5b78
Try to fix uninstall of symlink.
2023-04-28 22:28:11 -04:00
Adam Treat
9979c78c6c
Set the folder when the browse opens
2023-04-28 22:24:59 -04:00
Adam Treat
792cdd60fd
Force ini format for all platforms.
2023-04-28 22:21:23 -04:00
Adam Treat
977d5d7956
No need to install so many icons.
2023-04-28 22:10:41 -04:00
Adam Treat
364eeb2ce5
Don't delete symlink unless we're uninstalling.
2023-04-28 22:07:37 -04:00
Adam Treat
bba50d1aec
Remove symlink when uninstalling.
2023-04-28 21:51:39 -04:00
Adam Treat
22441a460b
Fix the icons more.
2023-04-28 21:48:10 -04:00
Adam Treat
1df4035679
Fix icons.
2023-04-28 21:40:45 -04:00
Adam Treat
3f7852f384
Correct the macOS symlink.
2023-04-28 21:26:38 -04:00
Adam Treat
a9bbe3f949
Fix icons and try to make macOS experience happier.
2023-04-28 21:19:12 -04:00
Aaron Miller
ad2cb91d5a
use C locale for DoubleValidator
Closes https://github.com/nomic-ai/gpt4all-chat/issues/126
2023-04-28 20:45:40 -04:00
Aaron Miller
af83056a4f
put chat.exe in 'bin' folder of build tree
because this is also in llama.cpp's CMakeLists:
https://github.com/ggerganov/llama.cpp/blob/master/CMakeLists.txt#L11
this is where libllama.dll winds up, causing attempts to run the chat UI
from Qt Creator on Windows to fail because libllama.dll is not found. I've been
working around this by copying libllama.dll *out* of bin/, but have been
bitten a few times by forgetting to keep doing that and letting the build get
out of sync.
2023-04-28 20:45:02 -04:00
Adam Treat
9b4a5e7e9c
Convert new ico and icns logos.
2023-04-28 20:40:35 -04:00
Adam Treat
bc77d95def
Add a requires field to the models.json for future-proofing.
2023-04-28 20:30:52 -04:00
Adam Treat
69f92d8ea8
Load models from filepath only.
2023-04-28 20:15:10 -04:00
Adam Treat
ca2af100cd
Update ignore.
2023-04-28 14:11:56 -04:00
Adam Treat
b3a0bd158c
Fix bug with startup order and new logos.
2023-04-28 14:11:18 -04:00
Adam Treat
d982dc0529
Update to latest llama.cpp
2023-04-28 11:03:16 -04:00
Adam Treat
43eef81ca8
New startup dialog features.
2023-04-28 11:03:16 -04:00
Adam Treat
f8754cbe1b
Fix settings dialog to use onClosed handler.
2023-04-28 11:03:16 -04:00
Aaron Miller
305c9dc30c
make download path in settings match dl dialog
2023-04-27 17:41:38 -04:00
Adam Treat
8a13d638d4
Small fix.
2023-04-27 16:45:24 -04:00
Adam Treat
6256b4fd33
Have to be able to change the download path from the download dialog and other fixes.
2023-04-27 16:27:53 -04:00
Adam Treat
b00da454e4
Provide a description and make the downloader cleaner and prettier.
2023-04-27 14:52:40 -04:00
Adam Treat
62a885de40
Always try to load the default model first. Groovy is the default default.
2023-04-27 13:52:29 -04:00
Adam Treat
97baf3d486
Make the input area wrap automatically.
2023-04-27 11:54:53 -04:00
Adam Treat
db3acf9980
Silence warning.
2023-04-27 11:44:41 -04:00
Adam Treat
5a7d40f604
Move the saving of the tokens to the impl and not the callback's responsibility.
2023-04-27 11:16:51 -04:00
Adam Treat
ba4b28fcd5
Move the promptCallback to own function.
2023-04-27 11:08:15 -04:00
Adam Treat
0e9f85bcda
Provide an initial impl. of the C interface. NOTE: has not been tested.
2023-04-27 09:43:24 -04:00
Adam Treat
386ce08fca
Track check for updates.
2023-04-27 07:41:23 -04:00
Adam Treat
b19d2f2c21
Add this and unbreak the build.
2023-04-26 22:45:10 -04:00
Aaron Miller
5641c365af
download: don't read whole file into ram to md5 it
We go to the trouble of using a tempfile, and then reintroduce
a case of reading the whole file into RAM anyway?
2023-04-26 22:14:21 -04:00
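The chunked-hashing approach this commit describes can be sketched as follows. This is an illustrative Python sketch only (the function name `md5_of_file` and chunk size are assumptions, and the project's actual implementation is in Qt/C++), showing how to verify a download's MD5 without ever holding the whole file in RAM:

```python
import hashlib


def md5_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute a file's MD5 by streaming it in fixed-size chunks.

    Only one chunk (here 1 MiB) is resident in memory at a time, so
    even multi-gigabyte model downloads can be checksummed cheaply.
    (Hypothetical helper; not the project's actual code.)
    """
    digest = hashlib.md5()
    with open(path, "rb") as f:
        # iter() with a sentinel keeps reading until read() returns b"".
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

The key point is that `hashlib` hash objects accept incremental `update()` calls, so the digest of the streamed chunks equals the digest of the whole file.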
Aaron Miller
18fa61c025
download: atomically move tempfile when possible
Should save time and unnecessary I/O, and eliminate the possibility
of the file being improperly truncated, when the temp file is on
the same filesystem as the destination path.
2023-04-26 22:14:21 -04:00
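The "atomic when possible" move described in this commit can be sketched like so. Again a hedged Python illustration (the helper name `finalize_download` is an assumption, and the real code is Qt/C++): a rename is atomic on the same filesystem, with a copy-based fallback when the temp file lives on a different device:

```python
import os
import shutil


def finalize_download(tmp_path: str, dest_path: str) -> None:
    """Move a completed temp file into its final place.

    os.replace() is an atomic rename when src and dest are on the same
    filesystem: readers see either the old file or the complete new one,
    never a truncated half-write, and no data is re-copied. If the rename
    fails (e.g. cross-device), fall back to a non-atomic copy-and-delete.
    (Hypothetical helper; not the project's actual code.)
    """
    try:
        os.replace(tmp_path, dest_path)  # atomic rename, no extra I/O
    except OSError:
        # Cross-filesystem move: shutil.move copies then removes the source.
        shutil.move(tmp_path, dest_path)
```

This is why downloading to a temp file *on the destination filesystem* matters: it makes the cheap atomic path the common case.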
Adam Treat
ee5c58c26c
Initial support for opt-in telemetry.
2023-04-26 22:05:56 -04:00
Adam Treat
a3d97fa009
Don't crash when prompt is too large.
2023-04-26 19:08:37 -04:00
Adam Treat
fbce5f2078
Unnecessary after all.
2023-04-26 18:35:53 -04:00
Adam Treat
aadeb47026
Put this before.
2023-04-26 13:54:25 -04:00
Adam Treat
8f913c382c
Signing ident.
2023-04-26 13:33:33 -04:00
Adam Treat
7da3bc07cc
Update llama.cpp submodule to latest.
2023-04-26 11:50:05 -04:00
Adam Treat
74c611b49a
Add back option.
2023-04-26 11:02:05 -04:00
Adam Treat
739ef41325
Add optional.
2023-04-26 09:48:49 -04:00