Mirror of https://github.com/0xJacky/nginx-ui.git (synced 2025-05-12 10:55:51 +02:00)

Merge pull request #815 from irexyc/typo

Commit 8cf7884f74
13 changed files with 18 additions and 18 deletions

@@ -2505,11 +2505,11 @@ msgstr ""
 #: src/views/preference/OpenAISettings.vue:48
 #, fuzzy
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
-"لاستخدام نموذج كبير محلي، قم بنشره باستخدام vllm أو imdeploy. فهي توفر نقطة "
+"لاستخدام نموذج كبير محلي، قم بنشره باستخدام vllm أو lmdeploy. فهي توفر نقطة "
 "نهاية API متوافقة مع OpenAI، لذا قم فقط بتعيين baseUrl إلىAPI المحلية الخاصة "
 "بك."

@@ -2657,12 +2657,12 @@ msgstr ""

 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
 "Um ein lokales großes Modell zu verwenden, implementiere es mit ollama, vllm "
-"oder imdeploy. Sie bieten einen OpenAI-kompatiblen API-Endpunkt, also setze "
+"oder lmdeploy. Sie bieten einen OpenAI-kompatiblen API-Endpunkt, also setze "
 "die baseUrl auf deine lokale API."

 #: src/views/preference/OpenAISettings.vue:72

@@ -2598,7 +2598,7 @@ msgstr ""

 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

@@ -2579,11 +2579,11 @@ msgstr ""
 #: src/views/preference/OpenAISettings.vue:48
 #, fuzzy
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
-"Para utilizar un modelo local grande, impleméntelo con vllm o imdeploy. "
+"Para utilizar un modelo local grande, impleméntelo con vllm o lmdeploy. "
 "Estos proporcionan un API endpoint compatible con OpenAI, por lo que solo "
 "debe configurar la baseUrl en su API local."

@@ -2621,7 +2621,7 @@ msgstr ""

 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

@@ -2585,7 +2585,7 @@ msgstr ""

 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

@@ -2400,7 +2400,7 @@ msgid "To make sure the certification auto-renewal can work normally, we need to
 msgstr ""

 #: src/views/preference/OpenAISettings.vue:48
-msgid "To use a local large model, deploy it with ollama, vllm or imdeploy. They provide an OpenAI-compatible API endpoint, so just set the baseUrl to your local API."
+msgid "To use a local large model, deploy it with ollama, vllm or lmdeploy. They provide an OpenAI-compatible API endpoint, so just set the baseUrl to your local API."
 msgstr ""

 #: src/views/preference/OpenAISettings.vue:72

@@ -2565,7 +2565,7 @@ msgstr ""

 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

@@ -2779,11 +2779,11 @@ msgstr ""
 #: src/views/preference/OpenAISettings.vue:48
 #, fuzzy
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
-"Yerel bir büyük model kullanmak için, vllm veya imdeploy ile dağıtın. OpenAI "
+"Yerel bir büyük model kullanmak için, vllm veya lmdeploy ile dağıtın. OpenAI "
 "uyumlu bir API uç noktası sağlarlar, bu nedenle baseUrl'yi yerel API'nize "
 "ayarlamanız yeterlidir."

@@ -2619,7 +2619,7 @@ msgstr ""

 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

@@ -2453,11 +2453,11 @@ msgstr ""

 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
-"要使用本地大型模型,可使用 ollama、vllm 或 imdeploy 进行部署。它们提供了与 "
+"要使用本地大型模型,可使用 ollama、vllm 或 lmdeploy 进行部署。它们提供了与 "
 "OpenAI 兼容的 API 端点,因此只需将 baseUrl 设置为本地 API 即可。"

 #: src/views/preference/OpenAISettings.vue:72

@@ -2504,7 +2504,7 @@ msgstr ""

 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

@@ -45,7 +45,7 @@ const models = shallowRef([
 :validate-status="errors?.openai?.base_url ? 'error' : ''"
 :help="errors?.openai?.base_url === 'url'
 ? $gettext('The url is invalid.')
-: $gettext('To use a local large model, deploy it with ollama, vllm or imdeploy. '
+: $gettext('To use a local large model, deploy it with ollama, vllm or lmdeploy. '
 + 'They provide an OpenAI-compatible API endpoint, so just set the baseUrl to your local API.')"
 >
 <AInput
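The corrected help text tells users to point the `baseUrl` at a locally deployed model served by ollama, vllm, or lmdeploy, all of which expose an OpenAI-compatible API. As an illustrative sketch only (not part of this commit; the port, path, and model name are assumptions — ollama's default is shown), this is what "set the baseUrl to your local API" amounts to when building a request:

```python
import json

def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for a chat-completion call against an
    OpenAI-compatible endpoint, as served by ollama, vllm, or lmdeploy."""
    # OpenAI-compatible servers expose /chat/completions under the base URL.
    url = f"{base_url.rstrip('/')}/chat/completions"
    body = json.dumps({
        "model": model,  # hypothetical local model name
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, body

# Hypothetical local baseUrl (ollama's default OpenAI-compatible endpoint).
url, body = build_chat_request("http://localhost:11434/v1", "llama3", "Hello")
print(url)  # http://localhost:11434/v1/chat/completions
```

The only thing that changes between a hosted and a local deployment is the base URL; the request shape stays the same.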