Commit Graph

123 Commits

Author SHA1 Message Date
Adam Treat
4c970fdc9c Pin llama.cpp to a slightly older version. 2023-04-20 07:34:15 -04:00
Adam Treat
43e6d05d21 Don't crash starting with no model. 2023-04-20 07:17:07 -04:00
Adam Treat
d336db9fe9 Don't use versions for model downloader. 2023-04-20 06:48:13 -04:00
eachadea
b09ca009c5 Don't build a universal binary
unless -DBUILD_UNIVERSAL=ON
2023-04-20 06:37:54 -04:00
Adam Treat
55084333a9 Add llama.cpp support for loading llama-based models in the GUI. We now
support loading both gptj-derived and llama-derived models.
2023-04-20 06:19:09 -04:00
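As a rough illustration of what "loading both gptj-derived and llama-derived models" can look like, here is a hypothetical C++ sketch, not the actual gpt4all-chat code: a common `LLModel` interface with stub GPT-J and LLaMA backends, trying each in turn until one accepts the file. All class and function names are made up for illustration.

```cpp
// Hypothetical sketch (not the actual gpt4all-chat code): expose GPT-J- and
// LLaMA-derived ggml models behind one interface and try each backend in
// turn until one accepts the file.
#include <iostream>
#include <memory>
#include <string>
#include <vector>

struct LLModel {                      // illustrative common interface
    virtual ~LLModel() = default;
    virtual bool loadModel(const std::string &path) = 0;
    virtual std::string name() const = 0;
};

struct GPTJModel : LLModel {          // stub standing in for a gptj backend
    bool loadModel(const std::string &path) override {
        return path.find("gpt4all-j") != std::string::npos;  // placeholder check
    }
    std::string name() const override { return "gptj"; }
};

struct LlamaModel : LLModel {         // stub standing in for a llama.cpp backend
    bool loadModel(const std::string &path) override {
        return path.find("llama") != std::string::npos;      // placeholder check
    }
    std::string name() const override { return "llama"; }
};

std::unique_ptr<LLModel> loadAnyModel(const std::string &path) {
    std::vector<std::unique_ptr<LLModel>> backends;
    backends.push_back(std::make_unique<GPTJModel>());
    backends.push_back(std::make_unique<LlamaModel>());
    for (auto &b : backends)
        if (b->loadModel(path))
            return std::move(b);      // first backend that accepts the file wins
    return nullptr;
}

int main() {
    if (auto model = loadAnyModel("ggml-gpt4all-j.bin"))
        std::cout << "loaded with backend: " << model->name() << "\n";
}
```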
Aaron Miller
f1b87d0b56 Add thread count setting 2023-04-19 08:33:13 -04:00
Adam Treat
e6cb6a2ae3 Add a new model download feature. 2023-04-18 21:10:06 -04:00
Adam Treat
1eda8f030e Allow unloading/loading/changing of models. 2023-04-18 11:42:38 -04:00
Aaron Miller
3a82a1d96c remove fill color for prompt template box 2023-04-18 08:47:37 -04:00
Adam Treat
a842f6c33f Fix link color to have consistency across platforms. 2023-04-18 08:45:21 -04:00
Adam Treat
0928c01ddb Make the GUI accessible. 2023-04-18 08:40:04 -04:00
Pavol Rusnak
0e599e6b8a readme: GPL -> MIT license 2023-04-17 16:45:29 -04:00
Adam Treat
ef711b305b Changing to MIT license. 2023-04-17 16:37:50 -04:00
Adam Treat
bbf838354e Don't add version number to the installer or the install location. 2023-04-17 15:59:14 -04:00
Adam Treat
9f4e3cb7f4 Bump the version for the context bug fix. 2023-04-17 15:37:24 -04:00
Adam Treat
15ae0a4441 Fix the context. 2023-04-17 14:11:41 -04:00
Adam Treat
801107a12c Set a new default temperature that is more conservative. 2023-04-17 09:49:59 -04:00
AT
ea7179e2e8 Update README.md 2023-04-17 09:02:26 -04:00
Adam Treat
7dbf81ed8f Update submodule. 2023-04-17 08:04:40 -04:00
Adam Treat
42fb215f61 Bump version to 2.1 to decrease confusion, as this has been referred
to far and wide as GPT4All v2. Also, make the version number visible in the
title bar.
2023-04-17 07:50:39 -04:00
Adam Treat
1dcd4dce58 Update the bundled model name. 2023-04-16 22:10:26 -04:00
Adam Treat
7ea548736b New version. 2023-04-16 19:20:43 -04:00
Adam Treat
659ab13665 Don't allow empty prompts. Keep the past context always greater than or equal to zero. 2023-04-16 14:57:58 -04:00
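A minimal sketch of the two guards this commit describes, assuming illustrative names (`shouldGenerate`, `prompt`, `n_past`) rather than the actual gpt4all-chat variables:

```cpp
// Minimal sketch; `shouldGenerate`, `prompt` and `n_past` are illustrative
// names, not the actual gpt4all-chat variables.
#include <algorithm>
#include <iostream>
#include <string>

bool shouldGenerate(const std::string &prompt, int &n_past) {
    if (prompt.empty())
        return false;                // reject empty prompts outright
    n_past = std::max(n_past, 0);    // the past-context counter must never go negative
    return true;
}

int main() {
    int n_past = -3;                 // e.g. after an over-eager context rollback
    std::cout << std::boolalpha << shouldGenerate("Hello", n_past)
              << " n_past=" << n_past << "\n";   // prints: true n_past=0
}
```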
Adam Treat
7e9ca06366 Trim trailing whitespace at the end of generation. 2023-04-16 14:19:59 -04:00
Adam Treat
fdf7f20d90 Remove newlines too. 2023-04-16 14:04:25 -04:00
Adam Treat
f8b962d50a More conservative default params and trim leading whitespace from response. 2023-04-16 13:56:56 -04:00
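The last three commits above (trim trailing whitespace, remove newlines, trim leading whitespace) all deal with stripping stray characters from the model's response. A minimal Qt sketch of that behavior: `QString::trimmed()` removes leading and trailing whitespace, including newlines; the function name `cleanResponse` is illustrative only.

```cpp
// Minimal Qt sketch; `cleanResponse` is an illustrative name, not the actual code.
#include <QDebug>
#include <QString>

QString cleanResponse(const QString &raw) {
    // QString::trimmed() strips whitespace (spaces, tabs, newlines) from both ends.
    return raw.trimmed();
}

int main() {
    qDebug().noquote() << cleanResponse(QStringLiteral("\n  Hello there!  \n\n"));
    return 0;
}
```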
TheBloke
7215b9f3fb Change the example CLI prompt to something more appropriate, as this is not a Llama model! :) 2023-04-16 12:52:23 -04:00
TheBloke
16f6b04a47 Fix repo name 2023-04-16 12:52:23 -04:00
TheBloke
67fcfeea8b Update README to include instructions for building the CLI only
Users may want to play around with gpt4all-j from the command line. But they may not have Qt, might not want to get it, or may find it very hard to do so, e.g. when using Google Colab or a similar hosted service.

It's easy to build the CLI tools just by building the `ggml` subfolder, so this commit adds instructions for doing that, including an example invocation of the `gpt-j` binary.
2023-04-16 12:52:23 -04:00
TheBloke
605b3d18ad Update .gitignore to ignore a local build directory. 2023-04-16 12:52:23 -04:00
TheBloke
0abea1db35 Update git clone command in README to point to the main nomic repo
I'm not sure whether it was intentional that the build instructions tell the user to clone `manyoso/gpt4all-chat.git`.

But I would think this should be cloning the main repo at `nomic-ai/gpt4all-chat` instead.  Otherwise users following this command might get changes not yet merged into the main repo, which could be confusing.
2023-04-16 12:52:23 -04:00
AT
a29420cbc8 Update README.md 2023-04-16 11:53:02 -04:00
Adam Treat
71ff6bc6f4 Rearrange the buttons and provide a message explaining what the copy button does. 2023-04-16 11:44:55 -04:00
Adam Treat
185dc2460e Check for ###Prompt: or ###Response and stop generating; also modify the default template a little bit. 2023-04-16 11:25:48 -04:00
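A rough sketch of the stop check this commit names, not the actual implementation: the marker strings are taken verbatim from the commit message (the real template may format them differently), and the function name is illustrative.

```cpp
// Rough sketch, not the actual implementation; the marker strings are taken
// verbatim from the commit message above.
#include <array>
#include <iostream>
#include <string>

bool hitStopSequence(const std::string &response) {
    static const std::array<std::string, 2> stops = {"###Prompt:", "###Response"};
    for (const auto &s : stops)
        if (response.find(s) != std::string::npos)
            return true;              // the model started echoing the template markers
    return false;
}

int main() {
    const std::string partial = "Sure, here is the answer.\n###Prompt:";
    std::cout << std::boolalpha << hitStopSequence(partial) << "\n";  // prints: true
}
```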
Aaron Miller
d4767478fc add tooltips to settings dialog 2023-04-16 11:16:30 -04:00
Aaron Miller
421a3ed8e7 add "restore defaults" button 2023-04-16 11:16:30 -04:00
Aaron Miller
cb6d2128d3 use the settings dialog settings when generating 2023-04-16 11:16:30 -04:00
Aaron Miller
17c3fa820b add settings dialog 2023-04-16 11:16:30 -04:00
Aaron Miller
be0375e32d add settings icon 2023-04-16 11:16:30 -04:00
Adam Treat
2354779ac1 Provide an instruct/chat template. 2023-04-15 16:33:37 -04:00
Aaron Miller
0f9b80e6b6 Use completeBaseName to display model name
This cuts the filename at the *final* dot instead of the first, allowing
model names with version numbers to be displayed correctly.
2023-04-15 13:29:51 -04:00
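For reference, the Qt behavior this commit relies on: `QFileInfo::baseName()` cuts the file name at the first dot, while `QFileInfo::completeBaseName()` cuts at the last one. A small standalone example (the model file name below is made up):

```cpp
// Standalone illustration; the model file name below is made up.
#include <QDebug>
#include <QFileInfo>

int main() {
    QFileInfo info(QStringLiteral("ggml-model-v1.3.bin"));
    qDebug() << info.baseName();          // "ggml-model-v1"   (cut at the first dot)
    qDebug() << info.completeBaseName();  // "ggml-model-v1.3" (cut at the last dot)
    return 0;
}
```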
Adam Treat
2f3a46c17f Erase the correct number of logits when regenerating, which is not the same
as the number of tokens.
2023-04-15 09:19:54 -04:00
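A hedged sketch of the point this commit makes: the logits buffer holds `n_vocab` floats per evaluated token, so rolling back `n_discard` tokens means erasing `n_discard * n_vocab` logits, not `n_discard` entries. All names below are illustrative, not the actual gpt4all-chat code.

```cpp
// Illustrative names only (`logits`, `n_vocab`, `n_discard`); not the actual code.
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

void rollbackLogits(std::vector<float> &logits, std::size_t n_vocab, std::size_t n_discard) {
    const std::size_t n_erase = std::min(logits.size(), n_discard * n_vocab);
    logits.erase(logits.end() - n_erase, logits.end());  // drop the last n_discard tokens' logits
}

int main() {
    const std::size_t n_vocab = 4;                 // tiny vocabulary for the example
    std::vector<float> logits(3 * n_vocab, 0.0f);  // logits for 3 evaluated tokens
    rollbackLogits(logits, n_vocab, 2);            // regenerate the last 2 tokens
    std::cout << logits.size() << "\n";            // prints 4: one token's worth remains
}
```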
Adam Treat
12bf78bf24 Fix crash with recent change to erase context. 2023-04-15 09:10:34 -04:00
Adam Treat
f8005cff45 When regenerating, erase the previous response and prompt from the context. 2023-04-15 09:10:27 -04:00
AT
aa836fa6d5 Merge pull request #28 from TheBloke/macOS_Universal
Add support for building a Universal binary on macOS
2023-04-14 14:06:47 -04:00
TheBloke
2c64c8972d Remove Qt dir 2023-04-14 17:33:54 +01:00
TheBloke
ccde3f8111 Remove test debug lines 2023-04-14 17:28:44 +01:00
TheBloke
a8a6b8ae30 Add support for building a Universal binary on macOS 2023-04-14 17:19:03 +01:00
Zach Nussbaum
dcd802de4d Update README.md 2023-04-14 06:17:02 -07:00
AT
ab33720b0a Merge branch 'manyoso:master' into master 2023-04-14 09:06:54 -04:00