chat: update and improve translations for v3.3.0 (#2970)

Signed-off-by: Jared Van Bortel <jared@nomic.ai>
Signed-off-by: Riccardo Giovanetti <riccardo.giovanetti@gmail.com>
Co-authored-by: Riccardo Giovanetti <riccardo.giovanetti@gmail.com>
Jared Van Bortel authored on 2024-09-19 14:35:53 -04:00, committed by GitHub
parent 3682b242e7
commit 5d454603d3
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
10 changed files with 187 additions and 852 deletions


@@ -29,7 +29,8 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
 - Fix the antenna icon tooltip when using the local server ([#2922](https://github.com/nomic-ai/gpt4all/pull/2922))
 - Fix a few issues with locating files and handling errors when loading remote models on startup ([#2875](https://github.com/nomic-ai/gpt4all/pull/2875))
 - Significantly improve API server request parsing and response correctness ([#2929](https://github.com/nomic-ai/gpt4all/pull/2929))
-- Removed unnecessary dependency on Qt WaylandCompositor module ([#2949](https://github.com/nomic-ai/gpt4all/pull/2949))
+- Remove unnecessary dependency on Qt WaylandCompositor module ([#2949](https://github.com/nomic-ai/gpt4all/pull/2949))
+- Update translations ([#2970](https://github.com/nomic-ai/gpt4all/pull/2970))
 
 ## [3.2.1] - 2024-08-13


@@ -32,15 +32,15 @@ MySettingsTab {
         anchors.centerIn: parent
         modal: false
         padding: 20
+        width: 40 + 400 * theme.fontScale
         Text {
+            anchors.fill: parent
             horizontalAlignment: Text.AlignJustify
-            text: qsTr("ERROR: Update system could not find the MaintenanceTool used<br>
-               to check for updates!<br><br>
-               Did you install this application using the online installer? If so,<br>
-               the MaintenanceTool executable should be located one directory<br>
-               above where this application resides on your filesystem.<br><br>
-               If you can't start it manually, then I'm afraid you'll have to<br>
-               reinstall.")
+            text: qsTr("ERROR: Update system could not find the MaintenanceTool used to check for updates!<br/><br/>"
+                       + "Did you install this application using the online installer? If so, the MaintenanceTool "
+                       + "executable should be located one directory above where this application resides on your "
+                       + "filesystem.<br/><br/>If you can't start it manually, then I'm afraid you'll have to reinstall.")
+            wrapMode: Text.WordWrap
             color: theme.textErrorColor
             font.pixelSize: theme.fontSizeLarge
             Accessible.role: Accessible.Dialog
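For reference, the hunk above swaps a multi-line string literal full of hard `<br>` breaks for one concatenated `qsTr()` literal plus `wrapMode`. A minimal sketch of that pattern in isolation (the `theme` bindings and surrounding dialog from the diff are omitted; the truncated message text here is illustrative):

```qml
import QtQuick

// Sketch of the translated-error-text pattern adopted by this commit.
// Adjacent string literals joined with "+" inside qsTr() are merged by
// lupdate into a single <source> entry, so translators see one clean
// paragraph; visual line breaking is delegated to Text.WordWrap instead
// of hard-coded <br> tags mid-sentence.
Text {
    anchors.fill: parent
    wrapMode: Text.WordWrap
    horizontalAlignment: Text.AlignJustify
    text: qsTr("ERROR: Update system could not find the MaintenanceTool used "
               + "to check for updates!<br/><br/>"
               + "Did you install this application using the online installer?")
}
```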


@@ -52,11 +52,18 @@ MyDialog {
     MyTextArea {
         id: textOptIn
         width: 1024 - 40
-        text: qsTr("By enabling this feature, you will be able to participate in the democratic process of training a large language model by contributing data for future model improvements.
-
-When a GPT4All model responds to you and you have opted-in, your conversation will be sent to the GPT4All Open Source Datalake. Additionally, you can like/dislike its response. If you dislike a response, you can suggest an alternative response. This data will be collected and aggregated in the GPT4All Datalake.
-
-NOTE: By turning on this feature, you will be sending your data to the GPT4All Open Source Datalake. You should have no expectation of chat privacy when this feature is enabled. You should; however, have an expectation of an optional attribution if you wish. Your chat data will be openly available for anyone to download and will be used by Nomic AI to improve future GPT4All models. Nomic AI will retain all attribution information attached to your data and you will be credited as a contributor to any GPT4All model release that uses your data!")
+        text: qsTr("By enabling this feature, you will be able to participate in the democratic process of "
+                   + "training a large language model by contributing data for future model improvements.\n\n"
+                   + "When a GPT4All model responds to you and you have opted-in, your conversation will be sent to "
+                   + "the GPT4All Open Source Datalake. Additionally, you can like/dislike its response. If you "
+                   + "dislike a response, you can suggest an alternative response. This data will be collected and "
+                   + "aggregated in the GPT4All Datalake.\n\n"
+                   + "NOTE: By turning on this feature, you will be sending your data to the GPT4All Open Source "
+                   + "Datalake. You should have no expectation of chat privacy when this feature is enabled. You "
+                   + "should; however, have an expectation of an optional attribution if you wish. Your chat data "
+                   + "will be openly available for anyone to download and will be used by Nomic AI to improve "
+                   + "future GPT4All models. Nomic AI will retain all attribution information attached to your data "
+                   + "and you will be credited as a contributor to any GPT4All model release that uses your data!")
         focus: false
         readOnly: true
         Accessible.role: Accessible.Paragraph


@@ -370,17 +370,6 @@
         <source>opt-in to share feedback/conversations</source>
         <translation type="unfinished"></translation>
     </message>
-    <message>
-        <location filename="../src/qml/ApplicationSettings.qml" line="37"/>
-        <source>ERROR: Update system could not find the MaintenanceTool used&lt;br&gt;
-to check for updates!&lt;br&gt;&lt;br&gt;
-Did you install this application using the online installer? If so,&lt;br&gt;
-the MaintenanceTool executable should be located one directory&lt;br&gt;
-above where this application resides on your filesystem.&lt;br&gt;&lt;br&gt;
-If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to&lt;br&gt;
-reinstall.</source>
-        <translation type="unfinished"></translation>
-    </message>
     <message>
         <location filename="../src/qml/ApplicationSettings.qml" line="48"/>
         <source>Error dialog</source>
@@ -416,6 +405,11 @@
         <source>Light</source>
         <translation type="unfinished"></translation>
     </message>
+    <message>
+        <location filename="../src/qml/ApplicationSettings.qml" line="39"/>
+        <source>ERROR: Update system could not find the MaintenanceTool used to check for updates!&lt;br/&gt;&lt;br/&gt;Did you install this application using the online installer? If so, the MaintenanceTool executable should be located one directory above where this application resides on your filesystem.&lt;br/&gt;&lt;br/&gt;If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to reinstall.</source>
+        <translation type="unfinished"></translation>
+    </message>
     <message>
         <location filename="../src/qml/ApplicationSettings.qml" line="114"/>
         <source>LegacyDark</source>
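The two .ts hunks above illustrate the usual Qt Linguist round-trip: when a source string changes in the QML, `lupdate` drops the old `<message>` entry (or keeps it flagged `type="vanished"` when a translation already existed, as in the Spanish file further down) and adds a fresh entry flagged `type="unfinished"` until a translator fills it in. Schematically (the filename, line number, and strings here are illustrative, not from the diff):

```xml
<!-- Old entry: source text no longer exists in the code. Kept only if it
     had a translation, then flagged "vanished"; lupdate -no-obsolete
     removes such entries entirely. -->
<message>
    <source>Old wording of the string</source>
    <translation type="vanished">Redacción antigua</translation>
</message>
<!-- New entry: re-extracted from the current source location, awaiting
     translation. -->
<message>
    <location filename="../src/qml/Example.qml" line="39"/>
    <source>New wording of the string</source>
    <translation type="unfinished"></translation>
</message>
```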
@@ -1098,37 +1092,37 @@ model to get started</source>
 <context>
     <name>Download</name>
     <message>
-        <location filename="../src/download.cpp" line="279"/>
+        <location filename="../src/download.cpp" line="278"/>
         <source>Model &quot;%1&quot; is installed successfully.</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/download.cpp" line="289"/>
+        <location filename="../src/download.cpp" line="288"/>
         <source>ERROR: $MODEL_NAME is empty.</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/download.cpp" line="295"/>
+        <location filename="../src/download.cpp" line="294"/>
         <source>ERROR: $API_KEY is empty.</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/download.cpp" line="301"/>
+        <location filename="../src/download.cpp" line="300"/>
         <source>ERROR: $BASE_URL is invalid.</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/download.cpp" line="307"/>
+        <location filename="../src/download.cpp" line="306"/>
         <source>ERROR: Model &quot;%1 (%2)&quot; is conflict.</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/download.cpp" line="326"/>
+        <location filename="../src/download.cpp" line="325"/>
         <source>Model &quot;%1 (%2)&quot; is installed successfully.</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/download.cpp" line="350"/>
+        <location filename="../src/download.cpp" line="349"/>
         <source>Model &quot;%1&quot; is removed.</source>
         <translation type="unfinished"></translation>
     </message>
@@ -2066,47 +2060,47 @@ NOTE: By turning on this feature, you will be sending your data to the GPT4All O
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/qml/NetworkDialog.qml" line="63"/>
+        <location filename="../src/qml/NetworkDialog.qml" line="70"/>
         <source>Terms for opt-in</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/qml/NetworkDialog.qml" line="64"/>
+        <location filename="../src/qml/NetworkDialog.qml" line="71"/>
         <source>Describes what will happen when you opt-in</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/qml/NetworkDialog.qml" line="72"/>
+        <location filename="../src/qml/NetworkDialog.qml" line="79"/>
         <source>Please provide a name for attribution (optional)</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/qml/NetworkDialog.qml" line="74"/>
+        <location filename="../src/qml/NetworkDialog.qml" line="81"/>
         <source>Attribution (optional)</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/qml/NetworkDialog.qml" line="75"/>
+        <location filename="../src/qml/NetworkDialog.qml" line="82"/>
         <source>Provide attribution</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/qml/NetworkDialog.qml" line="88"/>
+        <location filename="../src/qml/NetworkDialog.qml" line="95"/>
         <source>Enable</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/qml/NetworkDialog.qml" line="89"/>
+        <location filename="../src/qml/NetworkDialog.qml" line="96"/>
         <source>Enable opt-in</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/qml/NetworkDialog.qml" line="93"/>
+        <location filename="../src/qml/NetworkDialog.qml" line="100"/>
         <source>Cancel</source>
         <translation type="unfinished"></translation>
     </message>
     <message>
-        <location filename="../src/qml/NetworkDialog.qml" line="94"/>
+        <location filename="../src/qml/NetworkDialog.qml" line="101"/>
         <source>Cancel opt-in</source>
         <translation type="unfinished"></translation>
     </message>


@@ -370,17 +370,6 @@
         <source>opt-in to share feedback/conversations</source>
         <translation>optar por compartir comentarios/conversaciones</translation>
     </message>
-    <message>
-        <location filename="../src/qml/ApplicationSettings.qml" line="37"/>
-        <source>ERROR: Update system could not find the MaintenanceTool used&lt;br&gt;
-to check for updates!&lt;br&gt;&lt;br&gt;
-Did you install this application using the online installer? If so,&lt;br&gt;
-the MaintenanceTool executable should be located one directory&lt;br&gt;
-above where this application resides on your filesystem.&lt;br&gt;&lt;br&gt;
-If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to&lt;br&gt;
-reinstall.</source>
-        <translation type="unfinished"></translation>
-    </message>
     <message>
         <location filename="../src/qml/ApplicationSettings.qml" line="48"/>
         <source>Error dialog</source>
@@ -416,6 +405,11 @@
         <source>Light</source>
         <translation>Claro</translation>
     </message>
+    <message>
+        <location filename="../src/qml/ApplicationSettings.qml" line="39"/>
+        <source>ERROR: Update system could not find the MaintenanceTool used to check for updates!&lt;br/&gt;&lt;br/&gt;Did you install this application using the online installer? If so, the MaintenanceTool executable should be located one directory above where this application resides on your filesystem.&lt;br/&gt;&lt;br/&gt;If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to reinstall.</source>
+        <translation>ERROR: El sistema de actualización no pudo encontrar la Herramienta de Mantenimiento utilizada para buscar actualizaciones.&lt;br&gt;&lt;br&gt;¿Instaló esta aplicación utilizando el instalador en línea? Si es así, el ejecutable de la Herramienta de Mantenimiento debería estar ubicado un directorio por encima de donde reside esta aplicación en su sistema de archivos.&lt;br&gt;&lt;br&gt;Si no puede iniciarlo manualmente, me temo que tendrá que reinstalar la aplicación.</translation>
+    </message>
     <message>
         <location filename="../src/qml/ApplicationSettings.qml" line="114"/>
         <source>LegacyDark</source>
@@ -558,23 +552,13 @@
     <message>
         <location filename="../src/qml/ApplicationSettings.qml" line="505"/>
         <source>Enable Local API Server</source>
-        <translation type="unfinished"></translation>
+        <translation>Habilitar el servidor API local</translation>
     </message>
     <message>
         <location filename="../src/qml/ApplicationSettings.qml" line="506"/>
         <source>Expose an OpenAI-Compatible server to localhost. WARNING: Results in increased resource usage.</source>
         <translation>Exponer un servidor compatible con OpenAI a localhost. ADVERTENCIA: Resulta en un mayor uso de recursos.</translation>
     </message>
-    <message>
-        <source>Enable Local Server</source>
-        <translation type="vanished">Habilitar servidor local</translation>
-    </message>
-    <message>
-        <source>Expose an OpenAI-Compatible server to localhost. WARNING: Results in increased
-resource usage.</source>
-        <translation type="vanished">Exponer un servidor compatible con OpenAI a localhost. ADVERTENCIA: Resulta
-en un mayor uso de recursos.</translation>
-    </message>
     <message>
         <location filename="../src/qml/ApplicationSettings.qml" line="522"/>
         <source>API Server Port</source>
@@ -616,22 +600,6 @@
         <source>Application default</source>
         <translation>Predeterminado de la aplicación</translation>
     </message>
-    <message>
-        <source>ERROR: Update system could not find the MaintenanceTool used&lt;br&gt;
-to check for updates!&lt;br&gt;&lt;br&gt;
-Did you install this application using the online installer? If so,&lt;br&gt;
-the MaintenanceTool executable should be located one directory&lt;br&gt;
-above where this application resides on your filesystem.&lt;br&gt;&lt;br&gt;
-If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to&lt;br&gt;
-reinstall.</source>
-        <translation type="vanished">ERROR: El sistema de actualización no pudo encontrar la Herramienta de Mantenimiento utilizada&lt;br&gt;
-para buscar actualizaciones.&lt;br&gt;&lt;br&gt;
-¿Instaló esta aplicación utilizando el instalador en línea? Si es así,&lt;br&gt;
-el ejecutable de la Herramienta de Mantenimiento debería estar ubicado un directorio&lt;br&gt;
-por encima de donde reside esta aplicación en su sistema de archivos.&lt;br&gt;&lt;br&gt;
-Si no puede iniciarlo manualmente, me temo que tendrá que&lt;br&gt;
-reinstalar la aplicación.</translation>
-    </message>
 </context>
 <context>
     <name>Chat</name>
@@ -904,10 +872,6 @@
         <source>You</source>
         <translation></translation>
     </message>
-    <message>
-        <source>recalculating context ...</source>
-        <translation type="vanished">recalculando contexto ...</translation>
-    </message>
     <message>
         <location filename="../src/qml/ChatView.qml" line="878"/>
         <source>response stopped ...</source>
@@ -1134,37 +1098,37 @@ modelo para comenzar
 <context>
     <name>Download</name>
     <message>
-        <location filename="../src/download.cpp" line="279"/>
+        <location filename="../src/download.cpp" line="278"/>
         <source>Model &quot;%1&quot; is installed successfully.</source>
         <translation>El modelo &quot;%1&quot; se ha instalado correctamente.</translation>
     </message>
     <message>
-        <location filename="../src/download.cpp" line="289"/>
+        <location filename="../src/download.cpp" line="288"/>
         <source>ERROR: $MODEL_NAME is empty.</source>
         <translation>ERROR: $MODEL_NAME está vacío.</translation>
     </message>
     <message>
-        <location filename="../src/download.cpp" line="295"/>
+        <location filename="../src/download.cpp" line="294"/>
         <source>ERROR: $API_KEY is empty.</source>
         <translation>ERROR: $API_KEY está vacía.</translation>
     </message>
     <message>
-        <location filename="../src/download.cpp" line="301"/>
+        <location filename="../src/download.cpp" line="300"/>
         <source>ERROR: $BASE_URL is invalid.</source>
         <translation>ERROR: $BASE_URL no es válida.</translation>
     </message>
     <message>
-        <location filename="../src/download.cpp" line="307"/>
+        <location filename="../src/download.cpp" line="306"/>
         <source>ERROR: Model &quot;%1 (%2)&quot; is conflict.</source>
         <translation>ERROR: El modelo &quot;%1 (%2)&quot; está en conflicto.</translation>
     </message>
     <message>
-        <location filename="../src/download.cpp" line="326"/>
+        <location filename="../src/download.cpp" line="325"/>
         <source>Model &quot;%1 (%2)&quot; is installed successfully.</source>
         <translation>El modelo &quot;%1 (%2)&quot; se ha instalado correctamente.</translation>
     </message>
     <message>
-        <location filename="../src/download.cpp" line="350"/>
+        <location filename="../src/download.cpp" line="349"/>
         <source>Model &quot;%1&quot; is removed.</source>
         <translation>El modelo &quot;%1&quot; ha sido eliminado.</translation>
     </message>
@@ -1284,12 +1248,6 @@ modelo para comenzar
         <source>Allowed File Extensions</source>
         <translation>Extensiones de archivo permitidas</translation>
     </message>
-    <message>
-        <source>Comma-separated list. LocalDocs will only attempt to process files with these
-extensions.</source>
-        <translation type="vanished">Lista separada por comas. DocumentosLocales solo intentará procesar
-archivos con estas extensiones.</translation>
-    </message>
     <message>
         <location filename="../src/qml/LocalDocsSettings.qml" line="100"/>
         <source>Embedding</source>
@@ -1300,12 +1258,6 @@ modelo para comenzar
         <source>Use Nomic Embed API</source>
         <translation>Usar API de incrustación Nomic</translation>
     </message>
-    <message>
-        <source>Embed documents using the fast Nomic API instead of a private local model.
-Requires restart.</source>
-        <translation type="vanished">Incrustar documentos usando la rápida API de Nomic en lugar de un modelo
-local privado. Requiere reinicio.</translation>
-    </message>
     <message>
         <location filename="../src/qml/LocalDocsSettings.qml" line="130"/>
         <source>Nomic API Key</source>
@@ -1381,11 +1333,6 @@ modelo para comenzar
         <source>Max best N matches of retrieved document snippets to add to the context for prompt. Larger numbers increase likelihood of factual responses, but also result in slower generation.</source>
         <translation>Máximo de N mejores coincidencias de fragmentos de documentos recuperados para añadir al contexto del prompt. Números más grandes aumentan la probabilidad de respuestas verídicas, pero también resultan en una generación más lenta.</translation>
     </message>
-    <message>
-        <source> Values too large may cause localdocs failure, extremely slow responses or failure to respond at all. Roughly speaking, the {N chars x N snippets} are added to the model&apos;s context window. More info &lt;a href=&quot;https://docs.gpt4all.io/gpt4all_desktop/localdocs.html&quot;&gt;here&lt;/a&gt;.
-</source>
-        <translation type="vanished"> Valores demasiado grandes pueden causar fallos en documentos locales, respuestas extremadamente lentas o falta de respuesta. En términos generales, los {N caracteres x N fragmentos} se agregan a la ventana de contexto del modelo. Más información &lt;a href=&quot;https://docs.gpt4all.io/gpt4all_desktop/localdocs.html&quot;&gt;aquí&lt;/a&gt;.</translation>
-    </message>
     <message>
         <location filename="../src/qml/LocalDocsSettings.qml" line="266"/>
         <source>Document snippet size (characters)</source>
@@ -1414,10 +1361,6 @@ modelo para comenzar
         <source> Add Collection</source>
         <translation> Agregar colección</translation>
     </message>
-    <message>
-        <source>ERROR: The LocalDocs database is not valid.</source>
-        <translation type="vanished">ERROR: La base de datos de DocumentosLocales no es válida.</translation>
-    </message>
     <message>
         <location filename="../src/qml/LocalDocsView.qml" line="109"/>
         <source>No Collections Installed</source>
@@ -1562,12 +1505,6 @@ modelo para comenzar
         <source>&lt;ul&gt;&lt;li&gt;Requires personal OpenAI API key.&lt;/li&gt;&lt;li&gt;WARNING: Will send your chats to OpenAI!&lt;/li&gt;&lt;li&gt;Your API key will be stored on disk&lt;/li&gt;&lt;li&gt;Will only be used to communicate with OpenAI&lt;/li&gt;&lt;li&gt;You can apply for an API key &lt;a href=&quot;https://platform.openai.com/account/api-keys&quot;&gt;here.&lt;/a&gt;&lt;/li&gt;</source>
         <translation>&lt;ul&gt;&lt;li&gt;Requiere clave API personal de OpenAI.&lt;/li&gt;&lt;li&gt;ADVERTENCIA: ¡Enviará sus chats a OpenAI!&lt;/li&gt;&lt;li&gt;Su clave API se almacenará en el disco&lt;/li&gt;&lt;li&gt;Solo se usará para comunicarse con OpenAI&lt;/li&gt;&lt;li&gt;Puede solicitar una clave API &lt;a href=&quot;https://platform.openai.com/account/api-keys&quot;&gt;aquí.&lt;/a&gt;&lt;/li&gt;</translation>
     </message>
-    <message>
-        <source>&lt;strong&gt;OpenAI&apos;s ChatGPT model GPT-3.5 Turbo&lt;/strong&gt;&lt;br&gt;
-%1</source>
-        <translation type="vanished">&lt;strong&gt;Modelo ChatGPT GPT-3.5 Turbo de
-OpenAI&lt;/strong&gt;&lt;br&gt; %1</translation>
-    </message>
     <message>
         <location filename="../src/modellist.cpp" line="1585"/>
         <source>&lt;strong&gt;OpenAI&apos;s ChatGPT model GPT-3.5 Turbo&lt;/strong&gt;&lt;br&gt; %1</source>
@@ -1617,12 +1554,12 @@ modelo para comenzar
         <location filename="../src/modellist.cpp" line="1226"/>
         <location filename="../src/modellist.cpp" line="1277"/>
         <source>cannot open &quot;%1&quot;: %2</source>
-        <translation type="unfinished"></translation>
+        <translation>no se puede abrir &quot;%1&quot;: %2</translation>
     </message>
     <message>
         <location filename="../src/modellist.cpp" line="1238"/>
         <source>cannot create &quot;%1&quot;: %2</source>
-        <translation type="unfinished"></translation>
+        <translation>no se puede crear &quot;%1&quot;: %2</translation>
     </message>
     <message>
         <location filename="../src/modellist.cpp" line="1289"/>
@ -2132,19 +2069,6 @@ NOTA: No surte efecto hasta que recargue el modelo.</translation>
<source>Contribute data to the GPT4All Opensource Datalake.</source> <source>Contribute data to the GPT4All Opensource Datalake.</source>
<translation>Contribuir datos al Datalake de código abierto de GPT4All.</translation> <translation>Contribuir datos al Datalake de código abierto de GPT4All.</translation>
</message> </message>
<message>
<source>By enabling this feature, you will be able to participate in the democratic process of training a large language model by contributing data for future model improvements.
When a GPT4All model responds to you and you have opted-in, your conversation will
be sent to the GPT4All Open Source Datalake. Additionally, you can like/dislike its
response. If you dislike a response, you can suggest an alternative response. This
data will be collected and aggregated in the GPT4All Datalake.
NOTE: By turning on this feature, you will be sending your data to the GPT4All Open Source Datalake. You should have no expectation of chat privacy when this feature is enabled. You should; however, have an expectation of an optional attribution if you wish. Your chat data will be openly available for anyone to download and will be used by Nomic AI to improve future GPT4All models. Nomic AI will retain all attribution information attached to your data and you will be credited as a contributor to any GPT4All model release that uses your data!</source>
<translation type="vanished">Al habilitar esta función, podrá participar en el proceso democrático de entrenamiento de un modelo de lenguaje grande contribuyendo con datos para futuras mejoras del modelo. Cuando un modelo GPT4All le responda y usted haya aceptado, su conversación se enviará al Datalake de código abierto de GPT4All. Además, puede dar me gusta/no me gusta a su respuesta. Si no le gusta una respuesta, puede sugerir una alternativa. Estos datos se recopilarán y agregarán en el Datalake de GPT4All.
NOTA: Al activar esta función, enviará sus datos al Datalake de código abierto de GPT4All. No debe esperar privacidad en el chat cuando esta función esté habilitada. Sin embargo, puede esperar una atribución opcional si lo desea. Sus datos de chat estarán disponibles abiertamente para que cualquiera los descargue y serán utilizados por Nomic AI para mejorar futuros modelos de GPT4All. Nomic AI conservará toda la información de atribución adjunta a sus datos y se le acreditará como contribuyente en cualquier lanzamiento de modelo GPT4All que utilice sus datos.</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="55"/>
<source>By enabling this feature, you will be able to participate in the democratic process of training a large language model by contributing data for future model improvements.
@@ -2159,47 +2083,47 @@ Cuando un modelo GPT4All te responda y hayas aceptado participar, tu conversaci
NOTA: Al activar esta función, estarás enviando tus datos al Datalake de Código Abierto de GPT4All. No debes esperar privacidad en el chat cuando esta función esté habilitada. Sin embargo, puedes esperar una atribución opcional si lo deseas. Tus datos de chat estarán disponibles abiertamente para que cualquiera los descargue y serán utilizados por Nomic AI para mejorar futuros modelos de GPT4All. Nomic AI conservará toda la información de atribución adjunta a tus datos y se te acreditará como contribuyente en cualquier lanzamiento de modelo GPT4All que utilice tus datos.</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="70"/>
<source>Terms for opt-in</source>
<translation>Términos para optar por participar</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="71"/>
<source>Describes what will happen when you opt-in</source>
<translation>Describe lo que sucederá cuando opte por participar</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="79"/>
<source>Please provide a name for attribution (optional)</source>
<translation>Por favor, proporcione un nombre para la atribución (opcional)</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="81"/>
<source>Attribution (optional)</source>
<translation>Atribución (opcional)</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="82"/>
<source>Provide attribution</source>
<translation>Proporcionar atribución</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="95"/>
<source>Enable</source>
<translation>Habilitar</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="96"/>
<source>Enable opt-in</source>
<translation>Habilitar participación</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="100"/>
<source>Cancel</source>
<translation>Cancelar</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="101"/>
<source>Cancel opt-in</source>
<translation>Cancelar participación</translation>
</message>
@@ -2240,13 +2164,6 @@ NOTA: Al activar esta función, estarás enviando tus datos al Datalake de Códi
<translation>Se muestra cuando la ventana emergente está ocupada</translation>
</message>
</context>
<context>
<name>QObject</name>
<message>
<source>Default</source>
<translation type="vanished">Predeterminado</translation>
</message>
</context>
<context>
<name>SettingsView</name>
<message>
@@ -2289,7 +2206,10 @@ NOTA: Al activar esta función, estarás enviando tus datos al Datalake de Códi
%1&lt;br/&gt;
### Contributors
%2</source>
<translation>### Notas de la versión
%1&lt;br/&gt;
### Colaboradores
%2</translation>
</message>
<message>
<location filename="../src/qml/StartupDialog.qml" line="71"/>
@@ -2301,16 +2221,6 @@ NOTA: Al activar esta función, estarás enviando tus datos al Datalake de Códi
<source>Release notes for this version</source>
<translation>Notas de la versión para esta versión</translation>
</message>
<message>
<source>### Release notes
%1### Contributors
%2
</source>
<translation type="vanished">### Notas de la versión
%1### Colaboradores
%2
</translation>
</message>
<message>
<source>### Opt-ins for anonymous usage analytics and datalake By enabling these features, you will be able to participate in the democratic process of training a large language model by contributing data for future model improvements.
@@ -2391,15 +2301,6 @@ NOTA: Al activar esta función, estarás enviando tus datos al Datalake de Códi
<source>Allow opt-out anonymous sharing of chats to the GPT4All Datalake</source>
<translation>Permitir rechazar el compartir anónimo de chats con el Datalake de GPT4All</translation>
</message>
<message>
<source>### Release notes
%1### Contributors
%2</source>
<translation type="vanished">### Notas de la versión
%1### Colaboradores
%2
</translation>
</message>
<message>
<location filename="../src/qml/StartupDialog.qml" line="87"/>
<source>### Opt-ins for anonymous usage analytics and datalake
@@ -2434,12 +2335,6 @@ lanzamiento de modelo GPT4All que utilice sus datos.</translation>
</context>
<context>
<name>SwitchModelDialog</name>
<message>
<source>&lt;b&gt;Warning:&lt;/b&gt; changing the model will erase the current
conversation. Do you wish to continue?</source>
<translation type="vanished">&lt;b&gt;Advertencia:&lt;/b&gt; cambiar el modelo borrará la conversación
actual. ¿Desea continuar?</translation>
</message>
<message>
<location filename="../src/qml/SwitchModelDialog.qml" line="22"/>
<source>&lt;b&gt;Warning:&lt;/b&gt; changing the model will erase the current conversation. Do you wish to continue?</source>
@@ -370,17 +370,6 @@
<source>opt-in to share feedback/conversations</source>
<translation>aderisci per condividere feedback/conversazioni</translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="37"/>
<source>ERROR: Update system could not find the MaintenanceTool used&lt;br&gt;
to check for updates!&lt;br&gt;&lt;br&gt;
Did you install this application using the online installer? If so,&lt;br&gt;
the MaintenanceTool executable should be located one directory&lt;br&gt;
above where this application resides on your filesystem.&lt;br&gt;&lt;br&gt;
If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to&lt;br&gt;
reinstall.</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="48"/>
<source>Error dialog</source>
@@ -416,6 +405,11 @@
<source>Light</source>
<translation>Chiaro</translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="39"/>
<source>ERROR: Update system could not find the MaintenanceTool used to check for updates!&lt;br/&gt;&lt;br/&gt;Did you install this application using the online installer? If so, the MaintenanceTool executable should be located one directory above where this application resides on your filesystem.&lt;br/&gt;&lt;br/&gt;If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to reinstall.</source>
<translation type="unfinished"></translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="114"/>
<source>LegacyDark</source>
@@ -575,7 +569,7 @@
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="505"/>
<source>Enable Local API Server</source>
<translation>Abilita il server API locale</translation>
</message>
<message>
<source>Enable Local Server</source>
@@ -885,10 +879,6 @@ modello per iniziare</translation>
<source>You</source>
<translation>Tu</translation>
</message>
<message>
<source>recalculating context ...</source>
<translation type="vanished">ricalcolo contesto ...</translation>
</message>
<message>
<location filename="../src/qml/ChatView.qml" line="878"/>
<source>response stopped ...</source>
@@ -1112,37 +1102,37 @@ modello per iniziare</translation>
<context>
<name>Download</name>
<message>
<location filename="../src/download.cpp" line="278"/>
<source>Model &quot;%1&quot; is installed successfully.</source>
<translation>Il modello &quot;%1&quot; è stato installato correttamente.</translation>
</message>
<message>
<location filename="../src/download.cpp" line="288"/>
<source>ERROR: $MODEL_NAME is empty.</source>
<translation>ERRORE: $MODEL_NAME è vuoto.</translation>
</message>
<message>
<location filename="../src/download.cpp" line="294"/>
<source>ERROR: $API_KEY is empty.</source>
<translation>ERRORE: $API_KEY è vuoto.</translation>
</message>
<message>
<location filename="../src/download.cpp" line="300"/>
<source>ERROR: $BASE_URL is invalid.</source>
<translation>ERRORE: $BASE_URL non è valido.</translation>
</message>
<message>
<location filename="../src/download.cpp" line="306"/>
<source>ERROR: Model &quot;%1 (%2)&quot; is conflict.</source>
<translation>ERRORE: il modello &quot;%1 (%2)&quot; è in conflitto.</translation>
</message>
<message>
<location filename="../src/download.cpp" line="325"/>
<source>Model &quot;%1 (%2)&quot; is installed successfully.</source>
<translation>Il modello &quot;%1 (%2)&quot; è stato installato correttamente.</translation>
</message>
<message>
<location filename="../src/download.cpp" line="349"/>
<source>Model &quot;%1&quot; is removed.</source>
<translation>Il modello &quot;%1&quot; è stato rimosso.</translation>
</message>
@@ -1376,10 +1366,6 @@ modello per iniziare</translation>
<source> Add Collection</source>
<translation> Aggiungi raccolta</translation>
</message>
<message>
<source>ERROR: The LocalDocs database is not valid.</source>
<translation type="vanished">ERRORE: il database di LocalDocs non è valido.</translation>
</message>
<message>
<location filename="../src/qml/LocalDocsView.qml" line="85"/>
<source>&lt;h3&gt;ERROR: The LocalDocs database cannot be accessed or is not valid.&lt;/h3&gt;&lt;br&gt;&lt;i&gt;Note: You will need to restart after trying any of the following suggested fixes.&lt;/i&gt;&lt;br&gt;&lt;ul&gt;&lt;li&gt;Make sure that the folder set as &lt;b&gt;Download Path&lt;/b&gt; exists on the file system.&lt;/li&gt;&lt;li&gt;Check ownership as well as read and write permissions of the &lt;b&gt;Download Path&lt;/b&gt;.&lt;/li&gt;&lt;li&gt;If there is a &lt;b&gt;localdocs_v2.db&lt;/b&gt; file, check its ownership and read/write permissions, too.&lt;/li&gt;&lt;/ul&gt;&lt;br&gt;If the problem persists and there are any &apos;localdocs_v*.db&apos; files present, as a last resort you can&lt;br&gt;try backing them up and removing them. You will have to recreate your collections, however.</source>
@@ -1558,12 +1544,12 @@ modello per iniziare</translation>
<location filename="../src/modellist.cpp" line="1226"/>
<location filename="../src/modellist.cpp" line="1277"/>
<source>cannot open &quot;%1&quot;: %2</source>
<translation>impossibile aprire &quot;%1&quot;: %2</translation>
</message>
<message>
<location filename="../src/modellist.cpp" line="1238"/>
<source>cannot create &quot;%1&quot;: %2</source>
<translation>impossibile creare &quot;%1&quot;: %2</translation>
</message>
<message>
<location filename="../src/modellist.cpp" line="1288"/>
@@ -2096,47 +2082,47 @@ Quando un modello di GPT4All ti risponde e tu hai aderito, la tua conversazione
NOTA: attivando questa funzione, invierai i tuoi dati al Datalake Open Source di GPT4All. Non dovresti avere aspettative sulla privacy della chat quando questa funzione è abilitata. Dovresti, tuttavia, aspettarti un&apos;attribuzione facoltativa, se lo desideri. I tuoi dati di chat saranno liberamente disponibili per essere scaricati da chiunque e verranno utilizzati da Nomic AI per migliorare i futuri modelli GPT4All. Nomic AI conserverà tutte le informazioni di attribuzione allegate ai tuoi dati e verrai accreditato come collaboratore a qualsiasi versione del modello GPT4All che utilizza i tuoi dati!</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="70"/>
<source>Terms for opt-in</source>
<translation>Termini per l&apos;adesione</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="71"/>
<source>Describes what will happen when you opt-in</source>
<translation>Descrive cosa accadrà quando effettuerai l&apos;adesione</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="79"/>
<source>Please provide a name for attribution (optional)</source>
<translation>Fornisci un nome per l&apos;attribuzione (facoltativo)</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="81"/>
<source>Attribution (optional)</source>
<translation>Attribuzione (facoltativo)</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="82"/>
<source>Provide attribution</source>
<translation>Fornire attribuzione</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="95"/>
<source>Enable</source>
<translation>Abilita</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="96"/>
<source>Enable opt-in</source>
<translation>Abilita l&apos;adesione</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="100"/>
<source>Cancel</source>
<translation>Annulla</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="101"/>
<source>Cancel opt-in</source>
<translation>Annulla l&apos;adesione</translation>
</message>
@@ -370,23 +370,6 @@
<source>opt-in to share feedback/conversations</source>
<translation>Compartilhar feedback e conversas</translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="37"/>
<source>ERROR: Update system could not find the MaintenanceTool used&lt;br&gt;
to check for updates!&lt;br&gt;&lt;br&gt;
Did you install this application using the online installer? If so,&lt;br&gt;
the MaintenanceTool executable should be located one directory&lt;br&gt;
above where this application resides on your filesystem.&lt;br&gt;&lt;br&gt;
If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to&lt;br&gt;
reinstall.</source>
<translation>ERRO: O sistema de atualização não encontrou a Ferramenta de Manutenção&lt;br&gt;
necessária para verificar atualizações!&lt;br&gt;&lt;br&gt;
Você instalou este aplicativo usando o instalador online? Se sim,&lt;br&gt;
o executável da Ferramenta de Manutenção deve estar localizado um diretório&lt;br&gt;
acima de onde este aplicativo está instalado.&lt;br&gt;&lt;br&gt;
Se você não conseguir iniciá-lo manualmente, será necessário&lt;br&gt;
reinstalar o aplicativo.</translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="48"/>
<source>Error dialog</source>
@@ -422,6 +405,11 @@
<source>Light</source>
<translation>Modo Claro</translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="39"/>
<source>ERROR: Update system could not find the MaintenanceTool used to check for updates!&lt;br/&gt;&lt;br/&gt;Did you install this application using the online installer? If so, the MaintenanceTool executable should be located one directory above where this application resides on your filesystem.&lt;br/&gt;&lt;br/&gt;If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to reinstall.</source>
<translation>ERRO: O sistema de atualização não encontrou a Ferramenta de Manutenção necessária para verificar atualizações!&lt;br&gt;&lt;br&gt;Você instalou este aplicativo usando o instalador online? Se sim, o executável da Ferramenta de Manutenção deve estar localizado um diretório acima de onde este aplicativo está instalado.&lt;br&gt;&lt;br&gt;Se você não conseguir iniciá-lo manualmente, será necessário reinstalar o aplicativo.</translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="114"/>
<source>LegacyDark</source>
@@ -578,11 +566,7 @@
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="505"/>
<source>Enable Local API Server</source>
<translation>Ativar servidor de API local</translation>
</message>
<message>
<source>Enable Local Server</source>
<translation type="vanished">Ativar Servidor Local</translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="506"/>
@@ -888,10 +872,6 @@ modelo instalado para funcionar</translation>
<source>You</source>
<translation>Você</translation>
</message>
<message>
<source>recalculating context ...</source>
<translation type="vanished">recalculando contexto...</translation>
</message>
<message>
<location filename="../src/qml/ChatView.qml" line="878"/>
<source>response stopped ...</source>
@@ -1115,37 +1095,37 @@ modelo instalado para funcionar</translation>
<context>
<name>Download</name>
<message>
<location filename="../src/download.cpp" line="278"/>
<source>Model &quot;%1&quot; is installed successfully.</source>
<translation>Modelo &quot;%1&quot; instalado com sucesso.</translation>
</message>
<message>
<location filename="../src/download.cpp" line="288"/>
<source>ERROR: $MODEL_NAME is empty.</source>
<translation>ERRO: O nome do modelo ($MODEL_NAME) está vazio.</translation>
</message>
<message>
<location filename="../src/download.cpp" line="294"/>
<source>ERROR: $API_KEY is empty.</source>
<translation>ERRO: A chave da API ($API_KEY) está vazia.</translation>
</message>
<message>
<location filename="../src/download.cpp" line="300"/>
<source>ERROR: $BASE_URL is invalid.</source>
<translation>ERRO: A URL base ($BASE_URL) é inválida.</translation>
</message>
<message>
<location filename="../src/download.cpp" line="306"/>
<source>ERROR: Model &quot;%1 (%2)&quot; is conflict.</source>
<translation>ERRO: Conflito com o modelo &quot;%1 (%2)&quot;.</translation>
</message>
<message>
<location filename="../src/download.cpp" line="325"/>
<source>Model &quot;%1 (%2)&quot; is installed successfully.</source>
<translation>Modelo &quot;%1 (%2)&quot; instalado com sucesso.</translation>
</message>
<message>
<location filename="../src/download.cpp" line="349"/>
<source>Model &quot;%1&quot; is removed.</source>
<translation>Modelo &quot;%1&quot; removido.</translation>
</message>
@@ -1379,10 +1359,6 @@ modelo instalado para funcionar</translation>
<source> Add Collection</source>
<translation> Adicionar Coleção</translation>
</message>
<message>
<source>ERROR: The LocalDocs database is not valid.</source>
<translation type="vanished">ERRO: O banco de dados do LocalDocs não é válido.</translation>
</message>
<message>
<location filename="../src/qml/LocalDocsView.qml" line="85"/>
<source>&lt;h3&gt;ERROR: The LocalDocs database cannot be accessed or is not valid.&lt;/h3&gt;&lt;br&gt;&lt;i&gt;Note: You will need to restart after trying any of the following suggested fixes.&lt;/i&gt;&lt;br&gt;&lt;ul&gt;&lt;li&gt;Make sure that the folder set as &lt;b&gt;Download Path&lt;/b&gt; exists on the file system.&lt;/li&gt;&lt;li&gt;Check ownership as well as read and write permissions of the &lt;b&gt;Download Path&lt;/b&gt;.&lt;/li&gt;&lt;li&gt;If there is a &lt;b&gt;localdocs_v2.db&lt;/b&gt; file, check its ownership and read/write permissions, too.&lt;/li&gt;&lt;/ul&gt;&lt;br&gt;If the problem persists and there are any &apos;localdocs_v*.db&apos; files present, as a last resort you can&lt;br&gt;try backing them up and removing them. You will have to recreate your collections, however.</source>
@@ -1561,12 +1537,12 @@ modelo instalado para funcionar</translation>
<location filename="../src/modellist.cpp" line="1226"/>
<location filename="../src/modellist.cpp" line="1277"/>
<source>cannot open &quot;%1&quot;: %2</source>
<translation>não é possível abrir &quot;%1&quot;: %2</translation>
</message>
<message>
<location filename="../src/modellist.cpp" line="1238"/>
<source>cannot create &quot;%1&quot;: %2</source>
<translation>não é possível criar &quot;%1&quot;: %2</translation>
</message>
<message>
<location filename="../src/modellist.cpp" line="1288"/>
@@ -2099,47 +2075,47 @@ Quando um modelo GPT4All responder a você e você tiver optado por participar,
OBS.: Ao ativar este recurso, você estará enviando seus dados para o Datalake de Código Aberto do GPT4All. Você não deve ter nenhuma expectativa de privacidade no chat quando este recurso estiver ativado. No entanto, você deve ter a expectativa de uma atribuição opcional, se desejar. Seus dados de chat estarão disponíveis para qualquer pessoa baixar e serão usados pela Nomic AI para melhorar os futuros modelos GPT4All. A Nomic AI manterá todas as informações de atribuição anexadas aos seus dados e você será creditado como colaborador em qualquer versão do modelo GPT4All que utilize seus dados!</translation> OBS.: Ao ativar este recurso, você estará enviando seus dados para o Datalake de Código Aberto do GPT4All. Você não deve ter nenhuma expectativa de privacidade no chat quando este recurso estiver ativado. No entanto, você deve ter a expectativa de uma atribuição opcional, se desejar. Seus dados de chat estarão disponíveis para qualquer pessoa baixar e serão usados pela Nomic AI para melhorar os futuros modelos GPT4All. A Nomic AI manterá todas as informações de atribuição anexadas aos seus dados e você será creditado como colaborador em qualquer versão do modelo GPT4All que utilize seus dados!</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="63"/> <location filename="../src/qml/NetworkDialog.qml" line="70"/>
<source>Terms for opt-in</source> <source>Terms for opt-in</source>
<translation>Termos de participação</translation> <translation>Termos de participação</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="64"/> <location filename="../src/qml/NetworkDialog.qml" line="71"/>
<source>Describes what will happen when you opt-in</source> <source>Describes what will happen when you opt-in</source>
<translation>Descrição do que acontece ao participar</translation> <translation>Descrição do que acontece ao participar</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="72"/> <location filename="../src/qml/NetworkDialog.qml" line="79"/>
<source>Please provide a name for attribution (optional)</source> <source>Please provide a name for attribution (optional)</source>
<translation>Forneça um nome para atribuição (opcional)</translation> <translation>Forneça um nome para atribuição (opcional)</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="74"/> <location filename="../src/qml/NetworkDialog.qml" line="81"/>
<source>Attribution (optional)</source> <source>Attribution (optional)</source>
<translation>Atribuição (opcional)</translation> <translation>Atribuição (opcional)</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="75"/> <location filename="../src/qml/NetworkDialog.qml" line="82"/>
<source>Provide attribution</source> <source>Provide attribution</source>
<translation>Fornecer atribuição</translation> <translation>Fornecer atribuição</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="88"/> <location filename="../src/qml/NetworkDialog.qml" line="95"/>
<source>Enable</source> <source>Enable</source>
<translation>Habilitar</translation> <translation>Habilitar</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="89"/> <location filename="../src/qml/NetworkDialog.qml" line="96"/>
<source>Enable opt-in</source> <source>Enable opt-in</source>
<translation>Ativar participação</translation> <translation>Ativar participação</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="93"/> <location filename="../src/qml/NetworkDialog.qml" line="100"/>
<source>Cancel</source> <source>Cancel</source>
<translation>Cancelar</translation> <translation>Cancelar</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="94"/> <location filename="../src/qml/NetworkDialog.qml" line="101"/>
<source>Cancel opt-in</source> <source>Cancel opt-in</source>
<translation>Cancelar participação</translation> <translation>Cancelar participação</translation>
</message> </message>
@@ -2181,13 +2157,6 @@ OBS.: Ao ativar este recurso, você estará enviando seus dados para o Datalake
<translation>Visível durante o processamento</translation> <translation>Visível durante o processamento</translation>
</message> </message>
</context> </context>
<context>
<name>QObject</name>
<message>
<source>Default</source>
<translation type="vanished">Padrão</translation>
</message>
</context>
<context> <context>
<name>SettingsView</name> <name>SettingsView</name>
<message> <message>


@@ -13,11 +13,6 @@
<source>Add Document Collection</source> <source>Add Document Collection</source>
<translation>Adaugă o Colecţie de documente</translation> <translation>Adaugă o Colecţie de documente</translation>
</message> </message>
<message>
<source>Add a folder containing plain text files, PDFs, or Markdown. Configure
additional extensions in Settings.</source>
<translation type="vanished">Adaugă un folder care conţine fişiere în cu text-simplu, PDF sau Markdown. Extensii suplimentare pot fi specificate în Configurare.</translation>
</message>
<message> <message>
<location filename="../src/qml/AddCollectionView.qml" line="78"/> <location filename="../src/qml/AddCollectionView.qml" line="78"/>
<source>Add a folder containing plain text files, PDFs, or Markdown. Configure additional extensions in Settings.</source> <source>Add a folder containing plain text files, PDFs, or Markdown. Configure additional extensions in Settings.</source>
@@ -243,12 +238,6 @@
<source>&lt;strong&gt;&lt;font size=&quot;2&quot;&gt;WARNING: Not recommended for your hardware. Model requires more memory (%1 GB) than your system has available (%2).&lt;/strong&gt;&lt;/font&gt;</source> <source>&lt;strong&gt;&lt;font size=&quot;2&quot;&gt;WARNING: Not recommended for your hardware. Model requires more memory (%1 GB) than your system has available (%2).&lt;/strong&gt;&lt;/font&gt;</source>
<translation>&lt;strong&gt;&lt;font size=&quot;2&quot;&gt;ATENŢIE: Nerecomandat pentru acest hardware. Modelul necesită mai multă memorie (%1 GB) decât are acest sistem (%2).&lt;/strong&gt;&lt;/font&gt;</translation> <translation>&lt;strong&gt;&lt;font size=&quot;2&quot;&gt;ATENŢIE: Nerecomandat pentru acest hardware. Modelul necesită mai multă memorie (%1 GB) decât are acest sistem (%2).&lt;/strong&gt;&lt;/font&gt;</translation>
</message> </message>
<message>
<source>&lt;strong&gt;&lt;font size=&quot;2&quot;&gt;WARNING: Not recommended for your
hardware. Model requires more memory (%1 GB) than your system has available
(%2).&lt;/strong&gt;&lt;/font&gt;</source>
<translation type="vanished">&lt;strong&gt;&lt;font size=&quot;2&quot;&gt;ATENţIE: Nerecomandat pentru acest hardware. Modelul necesită mai multă memorie (%1 GB) decât are acest sistem (%2).&lt;/strong&gt;&lt;/font&gt;</translation>
</message>
<message> <message>
<location filename="../src/qml/AddModelView.qml" line="716"/> <location filename="../src/qml/AddModelView.qml" line="716"/>
<source>%1 GB</source> <source>%1 GB</source>
@@ -265,11 +254,6 @@
<source>Describes an error that occurred when downloading</source> <source>Describes an error that occurred when downloading</source>
<translation>Descrie eroarea apărută în timpul descărcării</translation> <translation>Descrie eroarea apărută în timpul descărcării</translation>
</message> </message>
<message>
<source>&lt;strong&gt;&lt;font size=&quot;1&quot;&gt;&lt;a
href=&quot;#eroare&quot;&gt;Eroare&lt;/a&gt;&lt;/strong&gt;&lt;/font&gt;</source>
<translation type="vanished">&lt;strong&gt;&lt;font size=&quot;1&quot;&gt;&lt;a href=&quot;#eroare&quot;&gt;Eroare&lt;/a&gt;&lt;/strong&gt;&lt;/font&gt;</translation>
</message>
<message> <message>
<location filename="../src/qml/AddModelView.qml" line="527"/> <location filename="../src/qml/AddModelView.qml" line="527"/>
<source>Error for incompatible hardware</source> <source>Error for incompatible hardware</source>
@@ -387,15 +371,9 @@
<translation>optional: partajarea (share) de comentarii/conversatii</translation> <translation>optional: partajarea (share) de comentarii/conversatii</translation>
</message> </message>
<message> <message>
<source>ERROR: Update system could not find the MaintenanceTool used&lt;br&gt; <location filename="../src/qml/ApplicationSettings.qml" line="39"/>
to check for updates!&lt;br&gt;&lt;br&gt; <source>ERROR: Update system could not find the MaintenanceTool used to check for updates!&lt;br/&gt;&lt;br/&gt;Did you install this application using the online installer? If so, the MaintenanceTool executable should be located one directory above where this application resides on your filesystem.&lt;br/&gt;&lt;br/&gt;If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to reinstall.</source>
Did you install this application using the online installer? If so,&lt;br&gt; <translation>EROARE: Sistemul de Update nu poate găsi componenta MaintenanceTool necesară căutării de versiuni noi!&lt;br&gt;&lt;br&gt; Ai instalat acest program folosind kitul online? Dacă da, atunci MaintenanceTool trebuie fie un nivel mai sus de folderul unde ai instalat programul.&lt;br&gt;&lt;br&gt; Dacă nu poate fi lansată manual, atunci programul trebuie reinstalat.</translation>
the MaintenanceTool executable should be located one directory&lt;br&gt;
above where this application resides on your filesystem.&lt;br&gt;&lt;br&gt;
If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have
to&lt;br&gt;
reinstall.</source>
<translation type="vanished">EROARE: Sistemul de actualizare nu poate găsi componenta MaintenanceTool&lt;br&gt; necesară căutării de versiuni noi!&lt;br&gt;&lt;br&gt; Ai instalat acest program folosind kitul online? Dacă da,&lt;br&gt; atunci MaintenanceTool trebuie fie un nivel mai sus de folderul&lt;br&gt; unde ai instalat programul.&lt;br&gt;&lt;br&gt; Dacă nu poate fi lansată manual, atunci programul trebuie reinstalat.</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/ApplicationSettings.qml" line="48"/> <location filename="../src/qml/ApplicationSettings.qml" line="48"/>
@@ -457,17 +435,6 @@
Metal.</source> Metal.</source>
<translation type="vanished">Dispozitivul de calcul utilizat pentru generarea de text. &quot;Auto&quot; apelează la Vulkan sau la Metal.</translation> <translation type="vanished">Dispozitivul de calcul utilizat pentru generarea de text. &quot;Auto&quot; apelează la Vulkan sau la Metal.</translation>
</message> </message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="37"/>
<source>ERROR: Update system could not find the MaintenanceTool used&lt;br&gt;
to check for updates!&lt;br&gt;&lt;br&gt;
Did you install this application using the online installer? If so,&lt;br&gt;
the MaintenanceTool executable should be located one directory&lt;br&gt;
above where this application resides on your filesystem.&lt;br&gt;&lt;br&gt;
If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to&lt;br&gt;
reinstall.</source>
<translation>EROARE: Sistemul de Update nu poate găsi componenta MaintenanceTool&lt;br&gt; necesară căutării de versiuni noi!&lt;br&gt;&lt;br&gt; Ai instalat acest program folosind kitul online? Dacă da,&lt;br&gt; atunci MaintenanceTool trebuie fie un nivel mai sus de folderul&lt;br&gt; unde ai instalat programul.&lt;br&gt;&lt;br&gt; Dacă nu poate fi lansată manual, atunci programul trebuie reinstalat.</translation>
</message>
<message> <message>
<location filename="../src/qml/ApplicationSettings.qml" line="151"/> <location filename="../src/qml/ApplicationSettings.qml" line="151"/>
<source>Small</source> <source>Small</source>
@@ -609,16 +576,6 @@
<source>Expose an OpenAI-Compatible server to localhost. WARNING: Results in increased resource usage.</source> <source>Expose an OpenAI-Compatible server to localhost. WARNING: Results in increased resource usage.</source>
<translation>Activează pe localhost un Server compatibil cu Open-AI. ATENŢIE: Creşte consumul de resurse.</translation> <translation>Activează pe localhost un Server compatibil cu Open-AI. ATENŢIE: Creşte consumul de resurse.</translation>
</message> </message>
<message>
<source>Save the chat model&apos;s state to disk for faster loading. WARNING: Uses ~2GB
per chat.</source>
<translation type="vanished">Salvează pe disc starea modelului pentru încărcare mai rapidă. ATENŢIE: Consumă ~2GB/conversaţie.</translation>
</message>
<message>
<source>Expose an OpenAI-Compatible server to localhost. WARNING: Results in increased
resource usage.</source>
<translation type="vanished">Activează pe localhost un Server compatibil cu Open-AI. ATENŢIE: Creşte consumul de resurse.</translation>
</message>
<message> <message>
<location filename="../src/qml/ApplicationSettings.qml" line="522"/> <location filename="../src/qml/ApplicationSettings.qml" line="522"/>
<source>API Server Port</source> <source>API Server Port</source>
@@ -881,11 +838,6 @@
<source>No Model Installed</source> <source>No Model Installed</source>
<translation>Niciun model instalat</translation> <translation>Niciun model instalat</translation>
</message> </message>
<message>
<source>GPT4All requires that you install at least one
model to get started</source>
<translation type="vanished">GPT4All necesită cel puţin un model pentru a putea porni</translation>
</message>
<message> <message>
<location filename="../src/qml/ChatView.qml" line="58"/> <location filename="../src/qml/ChatView.qml" line="58"/>
<source>&lt;h3&gt;Encountered an error loading model:&lt;/h3&gt;&lt;br&gt;&lt;i&gt;&quot;%1&quot;&lt;/i&gt;&lt;br&gt;&lt;br&gt;Model loading failures can happen for a variety of reasons, but the most common causes include a bad file format, an incomplete or corrupted download, the wrong file type, not enough system RAM or an incompatible model type. Here are some suggestions for resolving the problem:&lt;br&gt;&lt;ul&gt;&lt;li&gt;Ensure the model file has a compatible format and type&lt;li&gt;Check the model file is complete in the download folder&lt;li&gt;You can find the download folder in the settings dialog&lt;li&gt;If you&apos;ve sideloaded the model ensure the file is not corrupt by checking md5sum&lt;li&gt;Read more about what models are supported in our &lt;a href=&quot;https://docs.gpt4all.io/&quot;&gt;documentation&lt;/a&gt; for the gui&lt;li&gt;Check out our &lt;a href=&quot;https://discord.gg/4M2QFmTt2k&quot;&gt;discord channel&lt;/a&gt; for help</source> <source>&lt;h3&gt;Encountered an error loading model:&lt;/h3&gt;&lt;br&gt;&lt;i&gt;&quot;%1&quot;&lt;/i&gt;&lt;br&gt;&lt;br&gt;Model loading failures can happen for a variety of reasons, but the most common causes include a bad file format, an incomplete or corrupted download, the wrong file type, not enough system RAM or an incompatible model type. Here are some suggestions for resolving the problem:&lt;br&gt;&lt;ul&gt;&lt;li&gt;Ensure the model file has a compatible format and type&lt;li&gt;Check the model file is complete in the download folder&lt;li&gt;You can find the download folder in the settings dialog&lt;li&gt;If you&apos;ve sideloaded the model ensure the file is not corrupt by checking md5sum&lt;li&gt;Read more about what models are supported in our &lt;a href=&quot;https://docs.gpt4all.io/&quot;&gt;documentation&lt;/a&gt; for the gui&lt;li&gt;Check out our &lt;a href=&quot;https://discord.gg/4M2QFmTt2k&quot;&gt;discord channel&lt;/a&gt; for help</source>
@@ -941,10 +893,6 @@ model to get started</source>
<source>You</source> <source>You</source>
<translation>Tu</translation> <translation>Tu</translation>
</message> </message>
<message>
<source>recalculating context ...</source>
<translation type="vanished">se recalculează contextul...</translation>
</message>
<message> <message>
<location filename="../src/qml/ChatView.qml" line="878"/> <location filename="../src/qml/ChatView.qml" line="878"/>
<source>response stopped ...</source> <source>response stopped ...</source>
@@ -1318,10 +1266,6 @@ model to get started</source>
<source>nomic.ai</source> <source>nomic.ai</source>
<translation>nomic.ai</translation> <translation>nomic.ai</translation>
</message> </message>
<message>
<source>GitHub</source>
<translation type="vanished">GitHub</translation>
</message>
<message> <message>
<location filename="../src/qml/HomeView.qml" line="282"/> <location filename="../src/qml/HomeView.qml" line="282"/>
<source>Subscribe to Newsletter</source> <source>Subscribe to Newsletter</source>
@@ -1350,11 +1294,6 @@ model to get started</source>
<source>Allowed File Extensions</source> <source>Allowed File Extensions</source>
<translation>Extensii compatibile de fişier</translation> <translation>Extensii compatibile de fişier</translation>
</message> </message>
<message>
<source>Comma-separated list. LocalDocs will only attempt to process files with these
extensions.</source>
<translation type="vanished">Extensiile, separate prin virgulă. LocalDocs va încerca procesarea numai a fişierelor cu aceste extensii.</translation>
</message>
<message> <message>
<location filename="../src/qml/LocalDocsSettings.qml" line="100"/> <location filename="../src/qml/LocalDocsSettings.qml" line="100"/>
<source>Embedding</source> <source>Embedding</source>
@@ -1365,22 +1304,11 @@ model to get started</source>
<source>Use Nomic Embed API</source> <source>Use Nomic Embed API</source>
<translation>Folosesc Nomic Embed API</translation> <translation>Folosesc Nomic Embed API</translation>
</message> </message>
<message>
<source>Embed documents using the fast Nomic API instead of a private local model.
Requires restart.</source>
<translation type="vanished">Embedding pe documente folosind API de la Nomic în locul unui model local. Necesită repornire.</translation>
</message>
<message> <message>
<location filename="../src/qml/LocalDocsSettings.qml" line="130"/> <location filename="../src/qml/LocalDocsSettings.qml" line="130"/>
<source>Nomic API Key</source> <source>Nomic API Key</source>
<translation>Cheia API Nomic</translation> <translation>Cheia API Nomic</translation>
</message> </message>
<message>
<source>API key to use for Nomic Embed. Get one from the Atlas &lt;a
href=&quot;https://atlas.nomic.ai/cli-login&quot;&gt;API keys page&lt;/a&gt;.
Requires restart.</source>
<translation type="vanished">Cheia API de utilizat cu Nomic Embed. Obţine o cheie prin Atlas: &lt;a href=&quot;https://atlas.nomic.ai/cli-login&quot;&gt;pagina cheilor API&lt;/a&gt; Necesită repornire.</translation>
</message>
<message> <message>
<location filename="../src/qml/LocalDocsSettings.qml" line="165"/> <location filename="../src/qml/LocalDocsSettings.qml" line="165"/>
<source>Embeddings Device</source> <source>Embeddings Device</source>
@@ -1451,36 +1379,16 @@ model to get started</source>
<source>Max best N matches of retrieved document snippets to add to the context for prompt. Larger numbers increase likelihood of factual responses, but also result in slower generation.</source> <source>Max best N matches of retrieved document snippets to add to the context for prompt. Larger numbers increase likelihood of factual responses, but also result in slower generation.</source>
<translation>Numărul maxim al citatelor ce corespund şi care vor fi adăugate la contextul pentru prompt. Numere mari amplifică probabilitatea unor replici corecte, dar de asemenea cauzează generare lentă.</translation> <translation>Numărul maxim al citatelor ce corespund şi care vor fi adăugate la contextul pentru prompt. Numere mari amplifică probabilitatea unor replici corecte, dar de asemenea cauzează generare lentă.</translation>
</message> </message>
<message>
<source>
Values too large may cause localdocs failure, extremely slow responses or
failure to respond at all. Roughly speaking, the {N chars x N snippets} are added to
the model&apos;s context window. More info &lt;a
href=&quot;https://docs.gpt4all.io/gpt4all_desktop/localdocs.html&quot;&gt;here&lt;/a&gt;.</source>
<translation type="vanished">
Valori prea mari pot cauza erori cu LocalDocs, replici lente sau absenţa lor completă. în mare, numărul {N caractere x N citate} este adăugat la Context Window/Size/Length a modelului. Mai multe informaţii: &lt;a href=&quot;https://docs.gpt4all.io/gpt4all_desktop/localdocs.html&quot;&gt;aici&lt;/a&gt;.</translation>
</message>
<message> <message>
<location filename="../src/qml/LocalDocsSettings.qml" line="266"/> <location filename="../src/qml/LocalDocsSettings.qml" line="266"/>
<source>Document snippet size (characters)</source> <source>Document snippet size (characters)</source>
<translation>Lungimea (în caractere) a citatelor din documente</translation> <translation>Lungimea (în caractere) a citatelor din documente</translation>
</message> </message>
<message>
<source>Number of characters per document snippet. Larger numbers increase likelihood of
factual responses, but also result in slower generation.</source>
<translation type="vanished">numărul caracterelor din fiecare citat. Numere mari amplifică probabilitatea unor replici corecte, dar de asemenea pot cauza generare lentă.</translation>
</message>
<message> <message>
<location filename="../src/qml/LocalDocsSettings.qml" line="292"/> <location filename="../src/qml/LocalDocsSettings.qml" line="292"/>
<source>Max document snippets per prompt</source> <source>Max document snippets per prompt</source>
<translation>Numărul maxim de citate per prompt</translation> <translation>Numărul maxim de citate per prompt</translation>
</message> </message>
<message>
<source>Max best N matches of retrieved document snippets to add to the context for
prompt. Larger numbers increase likelihood of factual responses, but also result in
slower generation.</source>
<translation type="vanished">Numărul maxim al citatelor ce corespund şi care vor fi adăugate la contextul pentru prompt. Numere mari amplifică probabilitatea unor replici corecte, dar de asemenea pot cauza generare lentă.</translation>
</message>
</context> </context>
<context> <context>
<name>LocalDocsView</name> <name>LocalDocsView</name>
@@ -1499,10 +1407,6 @@ model to get started</source>
<source> Add Collection</source> <source> Add Collection</source>
<translation> Adaugă o Colecţie</translation> <translation> Adaugă o Colecţie</translation>
</message> </message>
<message>
<source>ERROR: The LocalDocs database is not valid.</source>
<translation type="vanished">EROARE: Baza de date LocalDocs nu e validă.</translation>
</message>
<message> <message>
<location filename="../src/qml/LocalDocsView.qml" line="85"/> <location filename="../src/qml/LocalDocsView.qml" line="85"/>
<source>&lt;h3&gt;ERROR: The LocalDocs database cannot be accessed or is not valid.&lt;/h3&gt;&lt;br&gt;&lt;i&gt;Note: You will need to restart after trying any of the following suggested fixes.&lt;/i&gt;&lt;br&gt;&lt;ul&gt;&lt;li&gt;Make sure that the folder set as &lt;b&gt;Download Path&lt;/b&gt; exists on the file system.&lt;/li&gt;&lt;li&gt;Check ownership as well as read and write permissions of the &lt;b&gt;Download Path&lt;/b&gt;.&lt;/li&gt;&lt;li&gt;If there is a &lt;b&gt;localdocs_v2.db&lt;/b&gt; file, check its ownership and read/write permissions, too.&lt;/li&gt;&lt;/ul&gt;&lt;br&gt;If the problem persists and there are any &apos;localdocs_v*.db&apos; files present, as a last resort you can&lt;br&gt;try backing them up and removing them. You will have to recreate your collections, however.</source> <source>&lt;h3&gt;ERROR: The LocalDocs database cannot be accessed or is not valid.&lt;/h3&gt;&lt;br&gt;&lt;i&gt;Note: You will need to restart after trying any of the following suggested fixes.&lt;/i&gt;&lt;br&gt;&lt;ul&gt;&lt;li&gt;Make sure that the folder set as &lt;b&gt;Download Path&lt;/b&gt; exists on the file system.&lt;/li&gt;&lt;li&gt;Check ownership as well as read and write permissions of the &lt;b&gt;Download Path&lt;/b&gt;.&lt;/li&gt;&lt;li&gt;If there is a &lt;b&gt;localdocs_v2.db&lt;/b&gt; file, check its ownership and read/write permissions, too.&lt;/li&gt;&lt;/ul&gt;&lt;br&gt;If the problem persists and there are any &apos;localdocs_v*.db&apos; files present, as a last resort you can&lt;br&gt;try backing them up and removing them. You will have to recreate your collections, however.</source>
@@ -1644,32 +1548,16 @@ model to get started</source>
</context> </context>
<context> <context>
<name>ModelList</name> <name>ModelList</name>
<message>
<source>
&lt;ul&gt;&lt;li&gt;Requires personal OpenAI API
key.&lt;/li&gt;&lt;li&gt;WARNING: Will send your chats to
OpenAI!&lt;/li&gt;&lt;li&gt;Your API key will be stored on
disk&lt;/li&gt;&lt;li&gt;Will only be used to communicate with
OpenAI&lt;/li&gt;&lt;li&gt;You can apply for an API key &lt;a
href=&quot;https://platform.openai.com/account/api-keys&quot;&gt;here.&lt;/a&gt;&lt;/li&gt;</source>
<translation type="vanished">
&lt;ul&gt;&lt;li&gt;Necesită o cheie API OpenAI personală. &lt;/li&gt;&lt;li&gt;ATENţIE: Conversaţiile tale vor fi trimise la OpenAI! &lt;/li&gt;&lt;li&gt;Cheia ta API va fi stocată pe disc (local) &lt;/li&gt;&lt;li&gt;Va fi utilizată numai pentru comunicarea cu OpenAI&lt;/li&gt;&lt;li&gt;Poţi solicita o cheie API aici: &lt;a href=&quot;https://platform.openai.com/account/api-keys&quot;&gt;aici.&lt;/a&gt;&lt;/li&gt;</translation>
</message>
<message>
<source>&lt;strong&gt;OpenAI&apos;s ChatGPT model GPT-3.5 Turbo&lt;/strong&gt;&lt;br&gt;
%1</source>
<translation type="vanished">&lt;strong&gt;Modelul ChatGPT GPT-3.5 Turbo al OpenAI&lt;/strong&gt;&lt;br&gt; %1</translation>
</message>
<message> <message>
<location filename="../src/modellist.cpp" line="1226"/> <location filename="../src/modellist.cpp" line="1226"/>
<location filename="../src/modellist.cpp" line="1277"/> <location filename="../src/modellist.cpp" line="1277"/>
<source>cannot open &quot;%1&quot;: %2</source> <source>cannot open &quot;%1&quot;: %2</source>
<translation type="unfinished"></translation> <translation>nu se poate deschide %1: %2</translation>
</message> </message>
<message> <message>
<location filename="../src/modellist.cpp" line="1238"/> <location filename="../src/modellist.cpp" line="1238"/>
<source>cannot create &quot;%1&quot;: %2</source> <source>cannot create &quot;%1&quot;: %2</source>
<translation type="unfinished"></translation> <translation>nu se poate crea %1: %2</translation>
</message> </message>
<message> <message>
<location filename="../src/modellist.cpp" line="1288"/> <location filename="../src/modellist.cpp" line="1288"/>
@@ -1736,29 +1624,6 @@ model to get started</source>
<source>&lt;strong&gt;Created by %1.&lt;/strong&gt;&lt;br&gt;&lt;ul&gt;&lt;li&gt;Published on %2.&lt;li&gt;This model has %3 likes.&lt;li&gt;This model has %4 downloads.&lt;li&gt;More info can be found &lt;a href=&quot;https://huggingface.co/%5&quot;&gt;here.&lt;/a&gt;&lt;/ul&gt;</source> <source>&lt;strong&gt;Created by %1.&lt;/strong&gt;&lt;br&gt;&lt;ul&gt;&lt;li&gt;Published on %2.&lt;li&gt;This model has %3 likes.&lt;li&gt;This model has %4 downloads.&lt;li&gt;More info can be found &lt;a href=&quot;https://huggingface.co/%5&quot;&gt;here.&lt;/a&gt;&lt;/ul&gt;</source>
<translation>&lt;strong&gt;Creat de către %1.&lt;/strong&gt;&lt;br&gt;&lt;ul&gt;&lt;li&gt;Publicat in: %2.&lt;li&gt;Acest model are %3 Likes.&lt;li&gt;Acest model are %4 download-uri.&lt;li&gt;Mai multe informaţii pot fi găsite la: &lt;a href=&quot;https://huggingface.co/%5&quot;&gt;aici.&lt;/a&gt;&lt;/ul&gt;</translation> <translation>&lt;strong&gt;Creat de către %1.&lt;/strong&gt;&lt;br&gt;&lt;ul&gt;&lt;li&gt;Publicat in: %2.&lt;li&gt;Acest model are %3 Likes.&lt;li&gt;Acest model are %4 download-uri.&lt;li&gt;Mai multe informaţii pot fi găsite la: &lt;a href=&quot;https://huggingface.co/%5&quot;&gt;aici.&lt;/a&gt;&lt;/ul&gt;</translation>
</message> </message>
<message>
<source>&lt;br&gt;&lt;br&gt;&lt;i&gt;* Even if you pay OpenAI for ChatGPT-4 this does
not guarantee API key access. Contact OpenAI for more info.</source>
<translation type="vanished">&lt;br&gt;&lt;br&gt;&lt;i&gt;* Chiar dacă plăteşti la OpenAI pentru ChatGPT-4, aceasta nu garantează accesul la cheia API. Contactează OpenAI pentru mai multe informaţii.</translation>
</message>
<message>
<source>
&lt;ul&gt;&lt;li&gt;Requires personal Mistral API
key.&lt;/li&gt;&lt;li&gt;WARNING: Will send your chats to
Mistral!&lt;/li&gt;&lt;li&gt;Your API key will be stored on
disk&lt;/li&gt;&lt;li&gt;Will only be used to communicate with
Mistral&lt;/li&gt;&lt;li&gt;You can apply for an API key &lt;a
href=&quot;https://console.mistral.ai/user/api-keys&quot;&gt;here&lt;/a&gt;.&lt;/li&gt;</source>
<translation type="vanished">
&lt;ul&gt;&lt;li&gt;Necesită cheia personală Mistral API. &lt;/li&gt;&lt;li&gt;ATENţIE: Conversaţiile tale vor fi trimise la Mistral!&lt;/li&gt;&lt;li&gt;Cheia ta API va fi stocată pe disc (local)&lt;/li&gt;&lt;li&gt;Va fi utilizată numai pentru comunicarea cu Mistral&lt;/li&gt;&lt;li&gt;Poţi solicita o cheie API aici: &lt;a href=&quot;https://console.mistral.ai/user/api-keys&quot;&gt;aici&lt;/a&gt;.&lt;/li&gt;</translation>
</message>
<message>
<source>&lt;strong&gt;Created by
%1.&lt;/strong&gt;&lt;br&gt;&lt;ul&gt;&lt;li&gt;Published on %2.&lt;li&gt;This model
has %3 likes.&lt;li&gt;This model has %4 downloads.&lt;li&gt;More info can be found
&lt;a href=&quot;https://huggingface.co/%5&quot;&gt;here.&lt;/a&gt;&lt;/ul&gt;</source>
<translation type="vanished">&lt;strong&gt;Creat de către %1.&lt;/strong&gt;&lt;br&gt;&lt;ul&gt;&lt;li&gt;Publicat in: %2.&lt;li&gt;Acest model are %3 Likes.&lt;li&gt;Acest model are %4 download-uri.&lt;li&gt;Mai multe informaţii pot fi găsite la: &lt;a href=&quot;https://huggingface.co/%5&quot;&gt;aici.&lt;/a&gt;&lt;/ul&gt;</translation>
</message>
</context> </context>
<context> <context>
<name>ModelSettings</name> <name>ModelSettings</name>
@@ -1797,11 +1662,6 @@ model to get started</source>
<source>System Prompt</source> <source>System Prompt</source>
<translation>System Prompt</translation> <translation>System Prompt</translation>
</message> </message>
<message>
<source>Prefixed at the beginning of every conversation. Must contain the appropriate
framing tokens.</source>
<translation type="vanished">Plasat la Începutul fiecărei conversaţii. Trebuie conţină token-uri(le) adecvate de Încadrare.</translation>
</message>
<message> <message>
<location filename="../src/qml/ModelSettings.qml" line="205"/> <location filename="../src/qml/ModelSettings.qml" line="205"/>
<source>Prompt Template</source> <source>Prompt Template</source>
@@ -1812,11 +1672,6 @@ model to get started</source>
<source>The template that wraps every prompt.</source> <source>The template that wraps every prompt.</source>
<translation>Standardul de formulare a fiecărui prompt.</translation> <translation>Standardul de formulare a fiecărui prompt.</translation>
</message> </message>
<message>
<source>Must contain the string &quot;%1&quot; to be replaced with the user&apos;s
input.</source>
<translation type="vanished">Trebuie să conţină textul &quot;%1&quot; care va fi înlocuit cu ceea ce scrie utilizatorul.</translation>
</message>
<message> <message>
<location filename="../src/qml/ModelSettings.qml" line="255"/> <location filename="../src/qml/ModelSettings.qml" line="255"/>
<source>Chat Name Prompt</source> <source>Chat Name Prompt</source>
@ -1847,12 +1702,6 @@ model to get started</source>
<source>Number of input and output tokens the model sees.</source> <source>Number of input and output tokens the model sees.</source>
<translation>Numărul token-urilor de input şi de output văzute de model.</translation> <translation>Numărul token-urilor de input şi de output văzute de model.</translation>
</message> </message>
<message>
<source>Maximum combined prompt/response tokens before information is lost.
Using more context than the model was trained on will yield poor results.
NOTE: Does not take effect until you reload the model.</source>
<translation type="vanished">Numărul maxim combinat al token-urilor în prompt+replică înainte de a se pierde informaţie. Utilizarea unui context mai mare decât cel cu care a fost instruit modelul va întoarce rezultate mai slabe. NOTĂ: Nu are efect până la reîncărcarea modelului.</translation>
</message>
<message> <message>
<location filename="../src/qml/ModelSettings.qml" line="412"/> <location filename="../src/qml/ModelSettings.qml" line="412"/>
<source>Temperature</source> <source>Temperature</source>
@ -1863,11 +1712,6 @@ model to get started</source>
<source>Randomness of model output. Higher -&gt; more variation.</source> <source>Randomness of model output. Higher -&gt; more variation.</source>
<translation>Libertatea/Confuzia din replica modelului. Mai mare -&gt; mai multă libertate.</translation> <translation>Libertatea/Confuzia din replica modelului. Mai mare -&gt; mai multă libertate.</translation>
</message> </message>
<message>
<source>Temperature increases the chances of choosing less likely tokens.
NOTE: Higher temperature gives more creative but less predictable outputs.</source>
<translation type="vanished">Temperatura creşte probabilitatea de alegere a unor token-uri puţin probabile. NOTĂ: O temperatură tot mai înaltă determină replici tot mai creative şi mai puţin predictibile.</translation>
</message>
<message> <message>
<location filename="../src/qml/ModelSettings.qml" line="458"/> <location filename="../src/qml/ModelSettings.qml" line="458"/>
<source>Top-P</source> <source>Top-P</source>
@ -1878,11 +1722,6 @@ model to get started</source>
<source>Nucleus Sampling factor. Lower -&gt; more predictable.</source> <source>Nucleus Sampling factor. Lower -&gt; more predictable.</source>
<translation>Factorul de Nucleus Sampling. Mai mic -&gt; predictibilitate mai mare.</translation> <translation>Factorul de Nucleus Sampling. Mai mic -&gt; predictibilitate mai mare.</translation>
</message> </message>
<message>
<source>Only the most likely tokens up to a total probability of top_p can be chosen.
NOTE: Prevents choosing highly unlikely tokens.</source>
<translation type="vanished">Pot fi alese numai cele mai probabile token-uri a căror probabilitate totală este Top-P. NOTĂ: Se evită selectarea token-urilor foarte improbabile.</translation>
</message>
<message> <message>
<location filename="../src/qml/ModelSettings.qml" line="159"/> <location filename="../src/qml/ModelSettings.qml" line="159"/>
<source>Prefixed at the beginning of every conversation. Must contain the appropriate framing tokens.</source> <source>Prefixed at the beginning of every conversation. Must contain the appropriate framing tokens.</source>
@ -1975,11 +1814,6 @@ Lower values increase CPU load and RAM usage, and make inference slower.
NOTE: Does not take effect until you reload the model.</source> NOTE: Does not take effect until you reload the model.</source>
<translation>Cât de multe layere ale modelului să fie încărcate în VRAM. Valori mici trebuie folosite dacă GPT4All rămâne fără VRAM în timp ce încarcă modelul. Valorile tot mai mici cresc utilizarea CPU şi a RAM şi încetinesc inferenţa. NOTĂ: Nu are efect până la reîncărcarea modelului.</translation> <translation>Cât de multe layere ale modelului să fie încărcate în VRAM. Valori mici trebuie folosite dacă GPT4All rămâne fără VRAM în timp ce încarcă modelul. Valorile tot mai mici cresc utilizarea CPU şi a RAM şi încetinesc inferenţa. NOTĂ: Nu are efect până la reîncărcarea modelului.</translation>
</message> </message>
<message>
<source>Amount of prompt tokens to process at once.
NOTE: Higher values can speed up reading prompts but will use more RAM.</source>
<translation type="vanished">Numărul token-urilor procesate simultan. NOTĂ: Valori tot mai mari pot accelera citirea prompt-urilor, dar şi utiliza mai multă RAM.</translation>
</message>
<message> <message>
<location filename="../src/qml/ModelSettings.qml" line="690"/> <location filename="../src/qml/ModelSettings.qml" line="690"/>
<source>Repeat Penalty</source> <source>Repeat Penalty</source>
@ -2010,13 +1844,6 @@ NOTE: Does not take effect until you reload the model.</source>
<source>Number of model layers to load into VRAM.</source> <source>Number of model layers to load into VRAM.</source>
<translation>Numărul layerelor modelului ce vor fi încărcate în VRAM.</translation> <translation>Numărul layerelor modelului ce vor fi încărcate în VRAM.</translation>
</message> </message>
<message>
<source>How many model layers to load into VRAM. Decrease this if GPT4All runs out of
VRAM while loading this model.
Lower values increase CPU load and RAM usage, and make inference slower.
NOTE: Does not take effect until you reload the model.</source>
<translation type="vanished">Cât de multe layere ale modelului să fie încărcate în VRAM. Valori mici trebuie folosite dacă GPT4All rămâne fără VRAM în timp ce încarcă modelul. Valorile tot mai mici cresc utilizarea CPU şi a RAM şi încetinesc inferenţa. NOTĂ: Nu are efect până la reîncărcarea modelului.</translation>
</message>
</context> </context>
<context> <context>
<name>ModelsView</name> <name>ModelsView</name>
@ -2107,17 +1934,6 @@ NOTE: Does not take effect until you reload the model.</source>
<source>Install online model</source> <source>Install online model</source>
<translation>Instalez un model din online</translation> <translation>Instalez un model din online</translation>
</message> </message>
<message>
<source>&lt;strong&gt;&lt;font size=&quot;1&quot;&gt;&lt;a
href=&quot;#error&quot;&gt;Error&lt;/a&gt;&lt;/strong&gt;&lt;/font&gt;</source>
<translation type="vanished">&lt;strong&gt;&lt;font size=&quot;1&quot;&gt;&lt;a href=&quot;#eroare&quot;&gt;Eroare&lt;/a&gt;&lt;/strong&gt;&lt;/font&gt;</translation>
</message>
<message>
<source>&lt;strong&gt;&lt;font size=&quot;2&quot;&gt;WARNING: Not recommended for your
hardware. Model requires more memory (%1 GB) than your system has available
(%2).&lt;/strong&gt;&lt;/font&gt;</source>
<translation type="vanished">&lt;strong&gt;&lt;font size=&quot;2&quot;&gt;ATENţIE: Nerecomandat pentru acest hardware. Modelul necesită mai multă memorie (%1 GB) decât are sistemul tău (%2).&lt;/strong&gt;&lt;/font&gt;</translation>
</message>
<message> <message>
<location filename="../src/qml/ModelsView.qml" line="496"/> <location filename="../src/qml/ModelsView.qml" line="496"/>
<source>%1 GB</source> <source>%1 GB</source>
@ -2288,39 +2104,6 @@ NOTE: Does not take effect until you reload the model.</source>
<source>Contribute data to the GPT4All Opensource Datalake.</source> <source>Contribute data to the GPT4All Opensource Datalake.</source>
<translation>Contribuie cu date/informaţii la componenta Open-source DataLake a GPT4All.</translation> <translation>Contribuie cu date/informaţii la componenta Open-source DataLake a GPT4All.</translation>
</message> </message>
<message>
<source>By enabling this feature, you will be able to participate in the democratic
process of training a large language model by contributing data for future model
improvements.
When a GPT4All model responds to you and you have opted-in, your conversation will
be sent to the GPT4All Open Source Datalake. Additionally, you can like/dislike its
response. If you dislike a response, you can suggest an alternative response. This
data will be collected and aggregated in the GPT4All Datalake.
NOTE: By turning on this feature, you will be sending your data to the GPT4All Open
Source Datalake. You should have no expectation of chat privacy when this feature is
enabled. You should; however, have an expectation of an optional attribution if you
wish. Your chat data will be openly available for anyone to download and will be
used by Nomic AI to improve future GPT4All models. Nomic AI will retain all
attribution information attached to your data and you will be credited as a
contributor to any GPT4All model release that uses your data!</source>
<translation type="vanished">Dacă activezi această funcţionalitate, vei participa la procesul democratic
de instruire a unui model LLM prin contribuţia ta cu date la îmbunătăţirea modelului.
Când un model în GPT4All îţi răspunde şi îi accepţi replica, atunci conversaţia va fi
trimisă la componenta Open-source DataLake a GPT4All. Mai mult - îi poţi aprecia replica.
Dacă răspunsul nu îţi place, poţi sugera unul alternativ.
Aceste date vor fi colectate şi agregate în componenta DataLake a GPT4All.
NOTĂ: Dacă activezi această funcţionalitate, vei trimite datele tale la componenta
DataLake a GPT4All. Nu te vei putea aştepta la intimitatea (privacy) conversaţiei dacă activezi
această funcţionalitate. Totuşi, te poţi aştepta la a beneficia de apreciere - opţional, dacă doreşti.
Datele din conversaţie vor fi disponibile pentru oricine vrea să le descarce şi vor fi
utilizate de către Nomic AI pentru a îmbunătăţi modele viitoare în GPT4All. Nomic AI va păstra
toate informaţiile despre atribuire asociate datelor tale şi vei fi menţionat ca
participant contribuitor la orice lansare a unui model GPT4All care foloseşte datele tale!</translation>
</message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="55"/> <location filename="../src/qml/NetworkDialog.qml" line="55"/>
<source>By enabling this feature, you will be able to participate in the democratic process of training a large language model by contributing data for future model improvements. <source>By enabling this feature, you will be able to participate in the democratic process of training a large language model by contributing data for future model improvements.
@ -2328,64 +2111,54 @@ NOTE: Does not take effect until you reload the model.</source>
When a GPT4All model responds to you and you have opted-in, your conversation will be sent to the GPT4All Open Source Datalake. Additionally, you can like/dislike its response. If you dislike a response, you can suggest an alternative response. This data will be collected and aggregated in the GPT4All Datalake. When a GPT4All model responds to you and you have opted-in, your conversation will be sent to the GPT4All Open Source Datalake. Additionally, you can like/dislike its response. If you dislike a response, you can suggest an alternative response. This data will be collected and aggregated in the GPT4All Datalake.
NOTE: By turning on this feature, you will be sending your data to the GPT4All Open Source Datalake. You should have no expectation of chat privacy when this feature is enabled. You should; however, have an expectation of an optional attribution if you wish. Your chat data will be openly available for anyone to download and will be used by Nomic AI to improve future GPT4All models. Nomic AI will retain all attribution information attached to your data and you will be credited as a contributor to any GPT4All model release that uses your data!</source> NOTE: By turning on this feature, you will be sending your data to the GPT4All Open Source Datalake. You should have no expectation of chat privacy when this feature is enabled. You should; however, have an expectation of an optional attribution if you wish. Your chat data will be openly available for anyone to download and will be used by Nomic AI to improve future GPT4All models. Nomic AI will retain all attribution information attached to your data and you will be credited as a contributor to any GPT4All model release that uses your data!</source>
<translation>Dacă activezi această funcţionalitate, vei participa la procesul democratic <translation>Dacă activezi această funcţionalitate, vei participa la procesul democratic de instruire a unui model LLM prin contribuţia ta cu date la îmbunătăţirea modelului.
de instruire a unui model LLM prin contribuţia ta cu date la îmbunătăţirea modelului.
Când un model în GPT4All îţi răspunde şi îi accepţi replica, atunci conversaţia va fi Când un model în GPT4All îţi răspunde şi îi accepţi replica, atunci conversaţia va fi trimisă la componenta Open-source DataLake a GPT4All. Mai mult - îi poţi aprecia replica. Dacă răspunsul nu îţi place, poţi sugera unul alternativ. Aceste date vor fi colectate şi agregate în componenta DataLake a GPT4All.
trimisă la componenta Open-source DataLake a GPT4All. Mai mult - îi poţi aprecia replica.
Dacă răspunsul nu îţi place, poţi sugera unul alternativ.
Aceste date vor fi colectate şi agregate în componenta DataLake a GPT4All.
NOTĂ: Dacă activezi această funcţionalitate, vei trimite datele tale la componenta NOTĂ: Dacă activezi această funcţionalitate, vei trimite datele tale la componenta DataLake a GPT4All. Atunci nu te vei putea aştepta la intimitatea (privacy) conversaţiei dacă activezi această funcţionalitate. Totuşi, te poţi aştepta la a beneficia de apreciere - opţional, dacă doreşti. Datele din conversaţie vor fi disponibile pentru oricine vrea să le descarce şi vor fi utilizate de către Nomic AI pentru a îmbunătăţi modele viitoare în GPT4All. Nomic AI va păstra toate informaţiile despre atribuire asociate datelor tale şi vei fi menţionat ca participant contribuitor la orice lansare a unui model GPT4All care foloseşte datele tale!</translation>
DataLake a GPT4All. Atunci nu te vei putea aştepta la intimitatea (privacy) conversaţiei dacă activezi
această funcţionalitate. Totuşi, te poţi aştepta la a beneficia de apreciere - opţional, dacă doreşti.
Datele din conversaţie vor fi disponibile pentru oricine vrea să le descarce şi vor fi
utilizate de către Nomic AI pentru a îmbunătăţi modele viitoare în GPT4All. Nomic AI va păstra
toate informaţiile despre atribuire asociate datelor tale şi vei fi menţionat ca
participant contribuitor la orice lansare a unui model GPT4All care foloseşte datele tale!</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="63"/> <location filename="../src/qml/NetworkDialog.qml" line="70"/>
<source>Terms for opt-in</source> <source>Terms for opt-in</source>
<translation>Termenii pentru participare</translation> <translation>Termenii pentru participare</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="64"/> <location filename="../src/qml/NetworkDialog.qml" line="71"/>
<source>Describes what will happen when you opt-in</source> <source>Describes what will happen when you opt-in</source>
<translation>Descrie ce se întâmplă când participi</translation> <translation>Descrie ce se întâmplă când participi</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="72"/> <location filename="../src/qml/NetworkDialog.qml" line="79"/>
<source>Please provide a name for attribution (optional)</source> <source>Please provide a name for attribution (optional)</source>
<translation>Specifică o denumire pentru această apreciere (opţional)</translation> <translation>Specifică o denumire pentru această apreciere (opţional)</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="74"/> <location filename="../src/qml/NetworkDialog.qml" line="81"/>
<source>Attribution (optional)</source> <source>Attribution (optional)</source>
<translation>Apreciere (opţional)</translation> <translation>Apreciere (opţional)</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="75"/> <location filename="../src/qml/NetworkDialog.qml" line="82"/>
<source>Provide attribution</source> <source>Provide attribution</source>
<translation>Apreciază</translation> <translation>Apreciază</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="88"/> <location filename="../src/qml/NetworkDialog.qml" line="95"/>
<source>Enable</source> <source>Enable</source>
<translation>Activează</translation> <translation>Activează</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="89"/> <location filename="../src/qml/NetworkDialog.qml" line="96"/>
<source>Enable opt-in</source> <source>Enable opt-in</source>
<translation>Activează participarea</translation> <translation>Activează participarea</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="93"/> <location filename="../src/qml/NetworkDialog.qml" line="100"/>
<source>Cancel</source> <source>Cancel</source>
<translation>Anulare</translation> <translation>Anulare</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/NetworkDialog.qml" line="94"/> <location filename="../src/qml/NetworkDialog.qml" line="101"/>
<source>Cancel opt-in</source> <source>Cancel opt-in</source>
<translation>Anulează participarea</translation> <translation>Anulează participarea</translation>
</message> </message>
@ -2462,14 +2235,6 @@ NOTE: By turning on this feature, you will be sending your data to the GPT4All O
<source>Welcome!</source> <source>Welcome!</source>
<translation>Bun venit!</translation> <translation>Bun venit!</translation>
</message> </message>
<message>
<source>### Release notes
%1### Contributors
%2</source>
<translation type="vanished">### Despre versiune
%1### Contributori
%2</translation>
</message>
<message> <message>
<location filename="../src/qml/StartupDialog.qml" line="71"/> <location filename="../src/qml/StartupDialog.qml" line="71"/>
<source>Release notes</source> <source>Release notes</source>
@ -2527,21 +2292,16 @@ NOTE: By turning on this feature, you will be sending your data to the GPT4All O
participant contribuitor la orice lansare a unui model GPT4All participant contribuitor la orice lansare a unui model GPT4All
care foloseşte datele tale!</translation> care foloseşte datele tale!</translation>
</message> </message>
<message>
<source>### Release notes
%1### Contributors
%2</source>
<translation type="vanished">### Despre versiune
%1### Contributori
%2</translation>
</message>
<message> <message>
<location filename="../src/qml/StartupDialog.qml" line="67"/> <location filename="../src/qml/StartupDialog.qml" line="67"/>
<source>### Release Notes <source>### Release Notes
%1&lt;br/&gt; %1&lt;br/&gt;
### Contributors ### Contributors
%2</source> %2</source>
<translation type="unfinished"></translation> <translation>### Despre versiune
%1&lt;br/&gt;
### Contributori
%2</translation>
</message> </message>
<message> <message>
<location filename="../src/qml/StartupDialog.qml" line="87"/> <location filename="../src/qml/StartupDialog.qml" line="87"/>
@ -2651,11 +2411,6 @@ care foloseşte datele tale!</translation>
</context> </context>
<context> <context>
<name>SwitchModelDialog</name> <name>SwitchModelDialog</name>
<message>
<source>&lt;b&gt;Warning:&lt;/b&gt; changing the model will erase the current
conversation. Do you wish to continue?</source>
<translation type="vanished">&lt;b&gt;Atenţie:&lt;/b&gt; schimbarea modelului va şterge conversaţia curentă. Confirmi aceasta?</translation>
</message>
<message> <message>
<location filename="../src/qml/SwitchModelDialog.qml" line="22"/> <location filename="../src/qml/SwitchModelDialog.qml" line="22"/>
<source>&lt;b&gt;Warning:&lt;/b&gt; changing the model will erase the current conversation. Do you wish to continue?</source> <source>&lt;b&gt;Warning:&lt;/b&gt; changing the model will erase the current conversation. Do you wish to continue?</source>
@ -2713,38 +2468,11 @@ care foloseşte datele tale!</translation>
</context> </context>
<context> <context>
<name>main</name> <name>main</name>
<message>
<source>
&lt;h3&gt;Encountered an error starting
up:&lt;/h3&gt;&lt;br&gt;&lt;i&gt;&quot;Incompatible hardware
detected.&quot;&lt;/i&gt;&lt;br&gt;&lt;br&gt;Unfortunately, your CPU does not meet
the minimal requirements to run this program. In particular, it does not support AVX
intrinsics which this program requires to successfully run a modern large language
model. The only soluţion at this time is to upgrade your hardware to a more modern
CPU.&lt;br&gt;&lt;br&gt;See here for more information: &lt;a
href=&quot;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&quot;&gt;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&lt;/a&gt;</source>
<translation type="vanished">
&lt;h3&gt;A apărut o eroare la iniţializare:&lt;/h3&gt;&lt;br&gt;&lt;i&gt;&quot;Hardware incompatibil.&quot;&lt;/i&gt;&lt;br&gt;&lt;br&gt;Din păcate, procesorul (CPU) nu întruneşte condiţiile minime pentru a rula acest program. În particular, nu suportă instrucţiunile AVX pe care programul le necesită pentru a integra un model conversaţional modern. În acest moment, unica soluţie este să îţi aduci la zi sistemul hardware cu un CPU mai recent.&lt;br&gt;&lt;br&gt;Aici sunt mai multe informaţii: &lt;a href=&quot;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&quot;&gt;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&lt;/a&gt;</translation>
</message>
<message> <message>
<location filename="../src/main.qml" line="23"/> <location filename="../src/main.qml" line="23"/>
<source>GPT4All v%1</source> <source>GPT4All v%1</source>
<translation>GPT4All v%1</translation> <translation>GPT4All v%1</translation>
</message> </message>
<message>
<source>&lt;h3&gt;Encountered an error starting
up:&lt;/h3&gt;&lt;br&gt;&lt;i&gt;&quot;Inability to access settings
file.&quot;&lt;/i&gt;&lt;br&gt;&lt;br&gt;Unfortunately, something is preventing the
program from accessing the settings file. This could be caused by incorrect
permissions in the local app config directory where the settings file is located.
Check out our &lt;a href=&quot;https://discord.gg/4M2QFmTt2k&quot;&gt;discord
channel&lt;/a&gt; for help.</source>
<translation type="vanished">&lt;h3&gt;A apărut o eroare la iniţializare:; &lt;/h3&gt;&lt;br&gt;&lt;i&gt;&quot;Nu poate fi accesat fişierul de configurare a programului.&quot;&lt;/i&gt;&lt;br&gt;&lt;br&gt;Din păcate, ceva împiedică programul în a accesa acel fişier. Cauza poate fi un set de permisiuni incorecte pe directorul/folderul local de configurare unde se află acel fişier. Poţi parcurge canalul nostru &lt;a href=&quot;https://discord.gg/4M2QFmTt2k&quot;&gt;Discord&lt;/a&gt; unde vei putea primi asistenţă.</translation>
</message>
<message>
<source>&lt;h3&gt;Encountered an error starting up:&lt;/h3&gt;&lt;br&gt;&lt;i&gt;&quot;Incompatible hardware detected.&quot;&lt;/i&gt;&lt;br&gt;&lt;br&gt;Unfortunately, your CPU does not meet the minimal requirements to run this program. In particular, it does not support AVX intrinsics which this program requires to successfully run a modern large language model. The only soluţion at this time is to upgrade your hardware to a more modern CPU.&lt;br&gt;&lt;br&gt;See here for more information: &lt;a href=&quot;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&quot;&gt;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&lt;/a&gt;</source>
<translation type="vanished">&lt;h3&gt;A apărut o eroare la iniţializare:; &lt;/h3&gt;&lt;br&gt;&lt;i&gt;&quot;Hardware incompatibil. &quot;&lt;/i&gt;&lt;br&gt;&lt;br&gt;Din păcate, procesorul (CPU) nu întruneşte condiţiile minime pentru a rula acest program. În particular, nu suportă instrucţiunile AVX pe care programul le necesită pentru a integra un model conversaţional modern. În acest moment, unica soluţie este îţi aduci la zi sistemul hardware cu un CPU mai recent.&lt;br&gt;&lt;br&gt;Aici sunt mai multe informaţii: &lt;a href=&quot;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&quot;&gt;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&lt;/a&gt;</translation>
</message>
<message> <message>
<location filename="../src/main.qml" line="111"/> <location filename="../src/main.qml" line="111"/>
<source>&lt;h3&gt;Encountered an error starting up:&lt;/h3&gt;&lt;br&gt;&lt;i&gt;&quot;Incompatible hardware detected.&quot;&lt;/i&gt;&lt;br&gt;&lt;br&gt;Unfortunately, your CPU does not meet the minimal requirements to run this program. In particular, it does not support AVX intrinsics which this program requires to successfully run a modern large language model. The only solution at this time is to upgrade your hardware to a more modern CPU.&lt;br&gt;&lt;br&gt;See here for more information: &lt;a href=&quot;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&quot;&gt;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&lt;/a&gt;</source> <source>&lt;h3&gt;Encountered an error starting up:&lt;/h3&gt;&lt;br&gt;&lt;i&gt;&quot;Incompatible hardware detected.&quot;&lt;/i&gt;&lt;br&gt;&lt;br&gt;Unfortunately, your CPU does not meet the minimal requirements to run this program. In particular, it does not support AVX intrinsics which this program requires to successfully run a modern large language model. The only solution at this time is to upgrade your hardware to a more modern CPU.&lt;br&gt;&lt;br&gt;See here for more information: &lt;a href=&quot;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&quot;&gt;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&lt;/a&gt;</source>


@ -86,10 +86,6 @@
<source>Text field for discovering and filtering downloadable models</source> <source>Text field for discovering and filtering downloadable models</source>
<translation></translation> <translation></translation>
</message> </message>
<message>
<source>Searching · </source>
<translation type="vanished"></translation>
</message>
<message> <message>
<location filename="../src/qml/AddModelView.qml" line="171"/> <location filename="../src/qml/AddModelView.qml" line="171"/>
<source>Initiate model discovery and filtering</source> <source>Initiate model discovery and filtering</source>
@ -120,10 +116,6 @@
<source>Recent</source> <source>Recent</source>
<translation></translation> <translation></translation>
</message> </message>
<message>
<source>Sort by: </source>
<translation type="vanished"></translation>
</message>
<message> <message>
<location filename="../src/qml/AddModelView.qml" line="216"/> <location filename="../src/qml/AddModelView.qml" line="216"/>
<source>Asc</source> <source>Asc</source>
@ -134,23 +126,11 @@
<source>Desc</source> <source>Desc</source>
<translation></translation> <translation></translation>
</message> </message>
<message>
<source>Sort dir: </source>
<translation type="vanished">:</translation>
</message>
<message> <message>
<location filename="../src/qml/AddModelView.qml" line="252"/> <location filename="../src/qml/AddModelView.qml" line="252"/>
<source>None</source> <source>None</source>
<translation></translation> <translation></translation>
</message> </message>
<message>
<source>Limit: </source>
<translation type="vanished"></translation>
</message>
<message>
<source>Network error: could not retrieve http://gpt4all.io/models/models3.json</source>
<translation type="vanished">访 http://gpt4all.io/models/models3.json</translation>
</message>
<message> <message>
<location filename="../src/qml/AddModelView.qml" line="101"/> <location filename="../src/qml/AddModelView.qml" line="101"/>
<source>Searching · %1</source> <source>Searching · %1</source>
@ -294,27 +274,11 @@
<source>?</source> <source>?</source>
<translation></translation> <translation></translation>
</message> </message>
<message>
<source>&lt;a href=&quot;#error&quot;&gt;Error&lt;/a&gt;</source>
<translation type="vanished">&lt;a href=&quot;#error&quot;&gt;&lt;/a&gt;</translation>
</message>
<message> <message>
<location filename="../src/qml/AddModelView.qml" line="508"/> <location filename="../src/qml/AddModelView.qml" line="508"/>
<source>Describes an error that occurred when downloading</source> <source>Describes an error that occurred when downloading</source>
<translation></translation> <translation></translation>
</message> </message>
<message>
<source>&lt;strong&gt;&lt;font size=&quot;2&quot;&gt;WARNING: Not recommended for your hardware.</source>
<translation type="vanished">&lt;strong&gt;&lt;font size=&quot;2&quot;&gt;警告: 你的硬件不推荐.</translation>
</message>
<message>
<source> Model requires more memory (</source>
<translation type="vanished"></translation>
</message>
<message>
<source> GB) than your system has available (</source>
<translation type="vanished"> (</translation>
</message>
<message> <message>
<location filename="../src/qml/AddModelView.qml" line="527"/> <location filename="../src/qml/AddModelView.qml" line="527"/>
<source>Error for incompatible hardware</source> <source>Error for incompatible hardware</source>
@ -373,10 +337,6 @@
<source>RAM required</source> <source>RAM required</source>
<translation>RAM </translation> <translation>RAM </translation>
</message> </message>
<message>
<source> GB</source>
<translation type="vanished">GB</translation>
</message>
<message> <message>
<location filename="../src/qml/AddModelView.qml" line="733"/> <location filename="../src/qml/AddModelView.qml" line="733"/>
<source>Parameters</source> <source>Parameters</source>
@ -410,20 +370,6 @@
<source>opt-in to share feedback/conversations</source> <source>opt-in to share feedback/conversations</source>
<translation>/</translation> <translation>/</translation>
</message> </message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="37"/>
<source>ERROR: Update system could not find the MaintenanceTool used&lt;br&gt;
to check for updates!&lt;br&gt;&lt;br&gt;
Did you install this application using the online installer? If so,&lt;br&gt;
the MaintenanceTool executable should be located one directory&lt;br&gt;
above where this application resides on your filesystem.&lt;br&gt;&lt;br&gt;
If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to&lt;br&gt;
reinstall.</source>
<translation> MaintenanceTool
使线
MaintenanceTool
</translation>
</message>
<message> <message>
<location filename="../src/qml/ApplicationSettings.qml" line="48"/> <location filename="../src/qml/ApplicationSettings.qml" line="48"/>
<source>Error dialog</source> <source>Error dialog</source>
@ -459,6 +405,11 @@
<source>Light</source> <source>Light</source>
<translation></translation> <translation></translation>
</message> </message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="39"/>
<source>ERROR: Update system could not find the MaintenanceTool used to check for updates!&lt;br/&gt;&lt;br/&gt;Did you install this application using the online installer? If so, the MaintenanceTool executable should be located one directory above where this application resides on your filesystem.&lt;br/&gt;&lt;br/&gt;If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to reinstall.</source>
<translation> MaintenanceTool&lt;br&gt;&lt;br&gt;使线MaintenanceTool &lt;br&gt;&lt;br&gt;</translation>
</message>
<message> <message>
<location filename="../src/qml/ApplicationSettings.qml" line="114"/> <location filename="../src/qml/ApplicationSettings.qml" line="114"/>
<source>LegacyDark</source> <source>LegacyDark</source>
@ -617,11 +568,7 @@
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="505"/>
<source>Enable Local API Server</source>
<translation type="unfinished"></translation>
<translation> API </translation>
</message>
<message>
<source>Enable Local Server</source>
<translation type="vanished"></translation>
</message> </message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="506"/>
@ -667,14 +614,6 @@
<source>Server Chat</source>
<translation></translation>
</message>
<message>
<source>Prompt: </source>
<translation type="vanished"></translation>
</message>
<message>
<source>Response: </source>
<translation type="vanished"></translation>
</message>
</context>
<context>
<name>ChatAPIWorker</name>
@ -787,14 +726,6 @@
</context>
<context>
<name>ChatView</name>
<message>
<source>&lt;h3&gt;Encountered an error loading model:&lt;/h3&gt;&lt;br&gt;</source>
<translation type="vanished">&lt;h3&gt;:&lt;/h3&gt;&lt;br&gt;</translation>
</message>
<message>
<source>&lt;br&gt;&lt;br&gt;Model loading failures can happen for a variety of reasons, but the most common causes include a bad file format, an incomplete or corrupted download, the wrong file type, not enough system RAM or an incompatible model type. Here are some suggestions for resolving the problem:&lt;br&gt;&lt;ul&gt;&lt;li&gt;Ensure the model file has a compatible format and type&lt;li&gt;Check the model file is complete in the download folder&lt;li&gt;You can find the download folder in the settings dialog&lt;li&gt;If you&apos;ve sideloaded the model ensure the file is not corrupt by checking md5sum&lt;li&gt;Read more about what models are supported in our &lt;a href=&quot;https://docs.gpt4all.io/&quot;&gt;documentation&lt;/a&gt; for the gui&lt;li&gt;Check out our &lt;a href=&quot;https://discord.gg/4M2QFmTt2k&quot;&gt;discord channel&lt;/a&gt; for help</source>
<translation type="vanished">lt;br&gt;&lt;br&gt; RAM &lt;br&gt;&lt;ul&gt;&lt;li&gt;&lt;li&gt;&lt;li&gt;&lt;li&gt; md5sum &lt;li&gt;&lt;a href=&quot;https://docs.gpt4all.io/&quot;&gt;文档&lt;/a&gt;中了解有关支持哪些模型的更多信息对于 gui&lt;li&gt;查看我们的&lt;a href=&quot;https://discord.gg/4M2QFmTt2k&quot;&gt;discord 频道&lt;/a&gt; 获取帮助</translation>
</message>
<message>
<location filename="../src/qml/ChatView.qml" line="77"/>
<source>&lt;h3&gt;Warning&lt;/h3&gt;&lt;p&gt;%1&lt;/p&gt;</source>
@ -820,10 +751,6 @@
<source>Code copied to clipboard.</source>
<translation></translation>
</message>
<message>
<source>Response: </source>
<translation type="vanished"></translation>
</message>
<message>
<location filename="../src/qml/ChatView.qml" line="231"/>
<source>Chat panel</source>
@ -874,14 +801,6 @@
<source>Not found: %1</source>
<translation>: %1</translation>
</message>
<message>
<source>Reload · </source>
<translation type="vanished">· </translation>
</message>
<message>
<source>Loading · </source>
<translation type="vanished">· </translation>
</message>
<message>
<location filename="../src/qml/ChatView.qml" line="463"/>
<source>The top item is the current model</source>
@ -903,14 +822,6 @@
<source>add collections of documents to the chat</source>
<translation></translation>
</message>
<message>
<source>Load · </source>
<translation type="vanished">· </translation>
</message>
<message>
<source> (default) </source>
<translation type="vanished">() </translation>
</message>
<message>
<location filename="../src/qml/ChatView.qml" line="738"/>
<source>Load the default model</source>
@ -962,31 +873,11 @@ model to get started</source>
<source>You</source>
<translation></translation>
</message>
<message>
<source>Busy indicator</source>
<translation type="vanished"></translation>
</message>
<message>
<source>The model is thinking</source>
<translation type="vanished"></translation>
</message>
<message>
<source>recalculating context ...</source>
<translation type="vanished">...</translation>
</message>
<message>
<location filename="../src/qml/ChatView.qml" line="878"/>
<source>response stopped ...</source>
<translation>...</translation>
</message>
<message>
<source>retrieving localdocs: </source>
<translation type="vanished"></translation>
</message>
<message>
<source>searching localdocs: </source>
<translation type="vanished"></translation>
</message>
<message>
<location filename="../src/qml/ChatView.qml" line="881"/>
<source>processing ...</source>
@ -1202,37 +1093,37 @@ model to get started</source>
<context>
<name>Download</name>
<message>
<location filename="../src/download.cpp" line="279"/>
<location filename="../src/download.cpp" line="278"/>
<source>Model &quot;%1&quot; is installed successfully.</source>
<translation> &quot;%1&quot; </translation>
</message>
<message>
<location filename="../src/download.cpp" line="289"/>
<location filename="../src/download.cpp" line="288"/>
<source>ERROR: $MODEL_NAME is empty.</source>
<translation>$MODEL_NAME </translation>
</message>
<message>
<location filename="../src/download.cpp" line="295"/>
<location filename="../src/download.cpp" line="294"/>
<source>ERROR: $API_KEY is empty.</source>
<translation>$API_KEY为空</translation>
</message>
<message>
<location filename="../src/download.cpp" line="301"/>
<location filename="../src/download.cpp" line="300"/>
<source>ERROR: $BASE_URL is invalid.</source>
<translation>$BASE_URL </translation>
</message>
<message>
<location filename="../src/download.cpp" line="307"/>
<location filename="../src/download.cpp" line="306"/>
<source>ERROR: Model &quot;%1 (%2)&quot; is conflict.</source>
<translation>错误: 模型 &quot;%1 (%2)&quot; .</translation>
</message>
<message>
<location filename="../src/download.cpp" line="326"/>
<location filename="../src/download.cpp" line="325"/>
<source>Model &quot;%1 (%2)&quot; is installed successfully.</source>
<translation> &quot;%1 (%2)&quot; .</translation>
</message>
<message>
<location filename="../src/download.cpp" line="350"/>
<location filename="../src/download.cpp" line="349"/>
<source>Model &quot;%1&quot; is removed.</source>
<translation> &quot;%1&quot; .</translation>
</message>
@ -1465,10 +1356,6 @@ model to get started</source>
<source> Add Collection</source>
<translation> </translation>
</message>
<message>
<source>ERROR: The LocalDocs database is not valid.</source>
<translation type="vanished">错误: 本地文档数据库错误.</translation>
</message>
<message>
<location filename="../qml/LocalDocsView.qml" line="85"/>
<location filename="../../build_gpt4all-chat_Desktop_Qt_6_7_2/gpt4all/qml/LocalDocsView.qml" line="85"/>
@ -1616,12 +1503,12 @@ model to get started</source>
<location filename="../src/modellist.cpp" line="1226"/>
<location filename="../src/modellist.cpp" line="1277"/>
<source>cannot open &quot;%1&quot;: %2</source>
<translation type="unfinished"></translation>
<translation>%1%2</translation>
</message>
<message>
<location filename="../src/modellist.cpp" line="1238"/>
<source>cannot create &quot;%1&quot;: %2</source>
<translation type="unfinished"></translation>
<translation>%1%2</translation>
</message>
<message>
<location filename="../src/modellist.cpp" line="1288"/>
@ -1673,36 +1560,16 @@ model to get started</source>
<source>&lt;strong&gt;Connect to OpenAI-compatible API server&lt;/strong&gt;&lt;br&gt; %1</source>
<translation>&lt;strong&gt; OpenAI API &lt;/strong&gt;&lt;br&gt; %1</translation>
</message>
<message>
<source>&lt;strong&gt;OpenAI&apos;s ChatGPT model GPT-3.5 Turbo&lt;/strong&gt;&lt;br&gt;</source>
<translation type="vanished">&lt;strong&gt;OpenAI&apos;s ChatGPT model GPT-3.5 Turbo&lt;/strong&gt;&lt;br&gt;</translation>
</message>
<message>
<location filename="../src/modellist.cpp" line="1598"/>
<source>&lt;br&gt;&lt;br&gt;&lt;i&gt;* Even if you pay OpenAI for ChatGPT-4 this does not guarantee API key access. Contact OpenAI for more info.</source>
<translation>&lt;br&gt;&lt;br&gt;&lt;i&gt;* 使ChatGPT-4OpenAI付款API密钥访问OpenAI获取更多信息</translation>
</message>
<message>
<source>&lt;strong&gt;OpenAI&apos;s ChatGPT model GPT-4&lt;/strong&gt;&lt;br&gt;</source>
<translation type="vanished">&lt;strong&gt;OpenAI&apos;s ChatGPT model GPT-4&lt;/strong&gt;&lt;br&gt;</translation>
</message>
<message>
<location filename="../src/modellist.cpp" line="1625"/>
<source>&lt;ul&gt;&lt;li&gt;Requires personal Mistral API key.&lt;/li&gt;&lt;li&gt;WARNING: Will send your chats to Mistral!&lt;/li&gt;&lt;li&gt;Your API key will be stored on disk&lt;/li&gt;&lt;li&gt;Will only be used to communicate with Mistral&lt;/li&gt;&lt;li&gt;You can apply for an API key &lt;a href=&quot;https://console.mistral.ai/user/api-keys&quot;&gt;here&lt;/a&gt;.&lt;/li&gt;</source>
<translation>&lt;ul&gt;&lt;li&gt;Requires personal Mistral API key.&lt;/li&gt;&lt;li&gt;WARNING: Will send your chats to Mistral!&lt;/li&gt;&lt;li&gt;Your API key will be stored on disk&lt;/li&gt;&lt;li&gt;Will only be used to communicate with Mistral&lt;/li&gt;&lt;li&gt;You can apply for an API key &lt;a href=&quot;https://console.mistral.ai/user/api-keys&quot;&gt;here&lt;/a&gt;.&lt;/li&gt;</translation>
</message>
<message>
<source>&lt;strong&gt;Mistral Tiny model&lt;/strong&gt;&lt;br&gt;</source>
<translation type="vanished">&lt;strong&gt;Mistral Tiny model&lt;/strong&gt;&lt;br&gt;</translation>
</message>
<message>
<source>&lt;strong&gt;Mistral Small model&lt;/strong&gt;&lt;br&gt;</source>
<translation type="vanished">&lt;strong&gt;Mistral Small model&lt;/strong&gt;&lt;br&gt;</translation>
</message>
<message>
<source>&lt;strong&gt;Mistral Medium model&lt;/strong&gt;&lt;br&gt;</source>
<translation type="vanished">&lt;strong&gt;Mistral Medium model&lt;/strong&gt;&lt;br&gt;</translation>
</message>
<message>
<location filename="../src/modellist.cpp" line="2138"/>
<source>&lt;strong&gt;Created by %1.&lt;/strong&gt;&lt;br&gt;&lt;ul&gt;&lt;li&gt;Published on %2.&lt;li&gt;This model has %3 likes.&lt;li&gt;This model has %4 downloads.&lt;li&gt;More info can be found &lt;a href=&quot;https://huggingface.co/%5&quot;&gt;here.&lt;/a&gt;&lt;/ul&gt;</source>
@ -1766,11 +1633,6 @@ model to get started</source>
<source>Must contain the string &quot;%1&quot; to be replaced with the user&apos;s input.</source>
<translation> &quot;%1&quot; &apos;s .</translation>
</message>
<message>
<source>Add
optional image</source>
<translation type="vanished"></translation>
</message>
<message>
<location filename="../src/qml/ModelSettings.qml" line="255"/>
<source>Chat Name Prompt</source>
@ -2075,27 +1937,11 @@ NOTE: Does not take effect until you reload the model.</source>
<source>?</source>
<translation></translation>
</message>
<message>
<source>&lt;a href=&quot;#error&quot;&gt;Error&lt;/a&gt;</source>
<translation type="vanished">&lt;a href=&quot;#&quot;&gt;&lt;/a&gt;</translation>
</message>
<message>
<location filename="../src/qml/ModelsView.qml" line="288"/>
<source>Describes an error that occurred when downloading</source>
<translation></translation>
</message>
<message>
<source>&lt;strong&gt;&lt;font size=&quot;2&quot;&gt;WARNING: Not recommended for your hardware.</source>
<translation type="vanished">&lt;strong&gt;&lt;font size=&quot;2&quot;&gt;警告: 你的硬件不推荐.</translation>
</message>
<message>
<source> Model requires more memory (</source>
<translation type="vanished">(</translation>
</message>
<message>
<source> GB) than your system has available (</source>
<translation type="vanished">GB) (</translation>
</message>
<message>
<location filename="../src/qml/ModelsView.qml" line="307"/>
<source>Error for incompatible hardware</source>
@ -2159,10 +2005,6 @@ NOTE: Does not take effect until you reload the model.</source>
<source>RAM required</source>
<translation> RAM</translation>
</message>
<message>
<source> GB</source>
<translation type="vanished"> GB</translation>
</message>
<message>
<location filename="../src/qml/ModelsView.qml" line="513"/>
<source>Parameters</source>
@ -2234,47 +2076,47 @@ NOTE: By turning on this feature, you will be sending your data to the GPT4All O
GPT4All Nomic AI GPT4All Nomic AI 使 GPT4All </translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="63"/>
<location filename="../src/qml/NetworkDialog.qml" line="70"/>
<source>Terms for opt-in</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="64"/>
<location filename="../src/qml/NetworkDialog.qml" line="71"/>
<source>Describes what will happen when you opt-in</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="72"/>
<location filename="../src/qml/NetworkDialog.qml" line="79"/>
<source>Please provide a name for attribution (optional)</source>
<translation> ()</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="74"/>
<location filename="../src/qml/NetworkDialog.qml" line="81"/>
<source>Attribution (optional)</source>
<translation> ()</translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="75"/>
<location filename="../src/qml/NetworkDialog.qml" line="82"/>
<source>Provide attribution</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="88"/>
<location filename="../src/qml/NetworkDialog.qml" line="95"/>
<source>Enable</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="89"/>
<location filename="../src/qml/NetworkDialog.qml" line="96"/>
<source>Enable opt-in</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="93"/>
<location filename="../src/qml/NetworkDialog.qml" line="100"/>
<source>Cancel</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="94"/>
<location filename="../src/qml/NetworkDialog.qml" line="101"/>
<source>Cancel opt-in</source>
<translation></translation>
</message>
@ -2315,13 +2157,6 @@ NOTE: By turning on this feature, you will be sending your data to the GPT4All O
<translation></translation>
</message>
</context>
<context>
<name>QObject</name>
<message>
<source>Default</source>
<translation type="vanished"></translation>
</message>
</context>
<context>
<name>SettingsView</name>
<message>
@ -2358,16 +2193,6 @@ NOTE: By turning on this feature, you will be sending your data to the GPT4All O
<source>Welcome!</source>
<translation></translation>
</message>
<message>
<source>### Release notes
</source>
<translation type="vanished">### </translation>
</message>
<message>
<source>### Contributors
</source>
<translation type="vanished">### </translation>
</message>
<message>
<location filename="../src/qml/StartupDialog.qml" line="67"/>
<source>### Release Notes
@ -2543,62 +2368,6 @@ model release that uses your data!</source>
</context>
<context>
<name>main</name>
<message>
<source>GPT4All v</source>
<translation type="vanished">GPT4All v</translation>
</message>
<message>
<source>&lt;h3&gt;Encountered an error starting up:&lt;/h3&gt;&lt;br&gt;</source>
<translation type="vanished">&lt;h3&gt;:&lt;/h3&gt;&lt;br&gt;</translation>
</message>
<message>
<source>&lt;i&gt;&quot;Incompatible hardware detected.&quot;&lt;/i&gt;</source>
<translation type="vanished">&lt;i&gt;&quot;&quot;&lt;/i&gt;</translation>
</message>
<message>
<source>&lt;br&gt;&lt;br&gt;Unfortunately, your CPU does not meet the minimal requirements to run </source>
<translation type="vanished">&lt;br&gt;&lt;br&gt;CPU不符合运行的最低要求</translation>
</message>
<message>
<source>this program. In particular, it does not support AVX intrinsics which this </source>
<translation type="vanished">AVX </translation>
</message>
<message>
<source>program requires to successfully run a modern large language model. </source>
<translation type="vanished"></translation>
</message>
<message>
<source>The only solution at this time is to upgrade your hardware to a more modern CPU.</source>
<translation type="vanished">CPU</translation>
</message>
<message>
<source>&lt;br&gt;&lt;br&gt;See here for more information: &lt;a href=&quot;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&quot;&gt;</source>
<translation type="vanished">&lt;br&gt;&lt;br&gt; &lt;a href=&quot;https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&quot;&gt;</translation>
</message>
<message>
<source>https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&lt;/a&gt;</source>
<translation type="vanished">https://en.wikipedia.org/wiki/Advanced_Vector_Extensions&lt;/a&gt;</translation>
</message>
<message>
<source>&lt;i&gt;&quot;Inability to access settings file.&quot;&lt;/i&gt;</source>
<translation type="vanished">&lt;i&gt;&quot;访&quot;&lt;/i&gt;</translation>
</message>
<message>
<source>&lt;br&gt;&lt;br&gt;Unfortunately, something is preventing the program from accessing </source>
<translation type="vanished">&lt;br&gt;&lt;br&gt;西访 </translation>
</message>
<message>
<source>the settings file. This could be caused by incorrect permissions in the local </source>
<translation type="vanished"></translation>
</message>
<message>
<source>app config directory where the settings file is located. </source>
<translation type="vanished"></translation>
</message>
<message>
<source>Check out our &lt;a href=&quot;https://discord.gg/4M2QFmTt2k&quot;&gt;discord channel&lt;/a&gt; for help.</source>
<translation type="vanished"> &lt;a href=&quot;https://discord.gg/4M2QFmTt2k&quot;&gt;discord channel&lt;/a&gt; 寻求.</translation>
</message>
<message>
<location filename="../src/main.qml" line="23"/>
<source>GPT4All v%1</source>


@ -371,21 +371,6 @@
<source>opt-in to share feedback/conversations</source>
<translation>/</translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="37"/>
<source>ERROR: Update system could not find the MaintenanceTool used&lt;br&gt;
to check for updates!&lt;br&gt;&lt;br&gt;
Did you install this application using the online installer? If so,&lt;br&gt;
the MaintenanceTool executable should be located one directory&lt;br&gt;
above where this application resides on your filesystem.&lt;br&gt;&lt;br&gt;
If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to&lt;br&gt;
reinstall.</source>
<translation>使&lt;br&gt;
使&lt;br&gt;
MaintenanceTool&lt;br&gt;
&lt;br&gt;
</translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="48"/>
<source>Error dialog</source>
@ -504,13 +489,18 @@
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="505"/>
<source>Enable Local API Server</source>
<translation type="unfinished"></translation>
<translation> API </translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="340"/>
<source>Generate suggested follow-up questions at the end of responses.</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="39"/>
<source>ERROR: Update system could not find the MaintenanceTool used to check for updates!&lt;br/&gt;&lt;br/&gt;Did you install this application using the online installer? If so, the MaintenanceTool executable should be located one directory above where this application resides on your filesystem.&lt;br/&gt;&lt;br/&gt;If you can&apos;t start it manually, then I&apos;m afraid you&apos;ll have to reinstall.</source>
<translation>使&lt;br&gt;&lt;br&gt;使MaintenanceTool&lt;br&gt;&lt;br&gt;&lt;br&gt;&lt;br&gt;</translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="224"/>
<source>The compute device used for text generation.</source>
@ -577,10 +567,6 @@
<source>Save the chat model&apos;s state to disk for faster loading. WARNING: Uses ~2GB per chat.</source>
<translation>使 2GB</translation>
</message>
<message>
<source>Enable Local Server</source>
<translation type="vanished"></translation>
</message>
<message>
<location filename="../src/qml/ApplicationSettings.qml" line="506"/>
<source>Expose an OpenAI-Compatible server to localhost. WARNING: Results in increased resource usage.</source>
@ -1105,37 +1091,37 @@ model to get started</source>
<context>
<name>Download</name>
<message>
<location filename="../src/download.cpp" line="279"/>
<location filename="../src/download.cpp" line="278"/>
<source>Model &quot;%1&quot; is installed successfully.</source>
<translation>%1</translation>
</message>
<message>
<location filename="../src/download.cpp" line="289"/>
<location filename="../src/download.cpp" line="288"/>
<source>ERROR: $MODEL_NAME is empty.</source>
<translation>$MODEL_NAME </translation>
</message>
<message>
<location filename="../src/download.cpp" line="295"/>
<location filename="../src/download.cpp" line="294"/>
<source>ERROR: $API_KEY is empty.</source>
<translation>$API_KEY </translation>
</message>
<message>
<location filename="../src/download.cpp" line="301"/>
<location filename="../src/download.cpp" line="300"/>
<source>ERROR: $BASE_URL is invalid.</source>
<translation>$BASE_URL </translation>
</message>
<message>
<location filename="../src/download.cpp" line="307"/>
<location filename="../src/download.cpp" line="306"/>
<source>ERROR: Model &quot;%1 (%2)&quot; is conflict.</source>
<translation>%1 %2</translation>
</message>
<message>
<location filename="../src/download.cpp" line="326"/>
<location filename="../src/download.cpp" line="325"/>
<source>Model &quot;%1 (%2)&quot; is installed successfully.</source>
<translation>%1%2</translation>
</message>
<message>
<location filename="../src/download.cpp" line="350"/>
<location filename="../src/download.cpp" line="349"/>
<source>Model &quot;%1&quot; is removed.</source>
<translation>%1</translation>
</message>
@ -1509,12 +1495,12 @@ model to get started</source>
<location filename="../src/modellist.cpp" line="1226"/>
<location filename="../src/modellist.cpp" line="1277"/>
<source>cannot open &quot;%1&quot;: %2</source>
<translation type="unfinished"></translation>
<translation>%1%2</translation>
</message>
<message>
<location filename="../src/modellist.cpp" line="1238"/>
<source>cannot create &quot;%1&quot;: %2</source>
<translation type="unfinished"></translation>
<translation>%1%2</translation>
</message>
<message>
<location filename="../src/modellist.cpp" line="1288"/>
@ -2088,47 +2074,47 @@ NOTE: By turning on this feature, you will be sending your data to the GPT4All O
Nomic AI 使 GPT4All </translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="63"/>
<location filename="../src/qml/NetworkDialog.qml" line="70"/>
<source>Terms for opt-in</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="64"/>
<location filename="../src/qml/NetworkDialog.qml" line="71"/>
<source>Describes what will happen when you opt-in</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="72"/>
<location filename="../src/qml/NetworkDialog.qml" line="79"/>
<source>Please provide a name for attribution (optional)</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="74"/>
<location filename="../src/qml/NetworkDialog.qml" line="81"/>
<source>Attribution (optional)</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="75"/>
<location filename="../src/qml/NetworkDialog.qml" line="82"/>
<source>Provide attribution</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="88"/>
<location filename="../src/qml/NetworkDialog.qml" line="95"/>
<source>Enable</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="89"/>
<location filename="../src/qml/NetworkDialog.qml" line="96"/>
<source>Enable opt-in</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="93"/>
<location filename="../src/qml/NetworkDialog.qml" line="100"/>
<source>Cancel</source>
<translation></translation>
</message>
<message>
<location filename="../src/qml/NetworkDialog.qml" line="94"/>
<location filename="../src/qml/NetworkDialog.qml" line="101"/>
<source>Cancel opt-in</source>
<translation></translation>
</message>