Mirror of https://github.com/ollama/ollama.git, synced 2025-05-11 10:26:53 +02:00
Clean up documentation (#1506)
* Clean up documentation. Will probably need to update with PRs for new release. Signed-off-by: Matt Williams <m@technovangelist.com>
* Correcting to fit in 0.1.15 changes. Signed-off-by: Matt Williams <m@technovangelist.com>
* Update README.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* addressing comments. Signed-off-by: Matt Williams <m@technovangelist.com>
* more api cleanup. Signed-off-by: Matt Williams <m@technovangelist.com>
* its llava not llama. Signed-off-by: Matt Williams <m@technovangelist.com>
* Update docs/troubleshooting.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Updated hosting to server and documented all env vars. Signed-off-by: Matt Williams <m@technovangelist.com>
* remove last of the cli descriptions. Signed-off-by: Matt Williams <m@technovangelist.com>
* Update README.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* update further per conversation with jeff earlier today. Signed-off-by: Matt Williams <m@technovangelist.com>
* cleanup the doc readme. Signed-off-by: Matt Williams <m@technovangelist.com>
* move upgrade to faq. Signed-off-by: Matt Williams <m@technovangelist.com>
* first change. Signed-off-by: Matt Williams <m@technovangelist.com>
* updated. Signed-off-by: Matt Williams <m@technovangelist.com>
* Update docs/faq.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/README.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update README.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/README.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/README.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/README.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/README.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* examples in parent. Signed-off-by: Matt Williams <m@technovangelist.com>
* add exapmle for create model. Signed-off-by: Matt Williams <m@technovangelist.com>
* update faq. Signed-off-by: Matt Williams <m@technovangelist.com>
* update create model api. Signed-off-by: Matt Williams <m@technovangelist.com>
* Update docs/api.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/faq.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/troubleshooting.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* update the readme in docs. Signed-off-by: Matt Williams <m@technovangelist.com>
* update a few more things. Signed-off-by: Matt Williams <m@technovangelist.com>
* Update docs/troubleshooting.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/faq.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update README.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/modelfile.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
* Update docs/troubleshooting.md. Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>

---------

Signed-off-by: Matt Williams <m@technovangelist.com>
Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
parent 9db28af84e
commit 291700c92d

7 changed files with 380 additions and 249 deletions
docs/faq.md (148 changed lines)
@@ -1,138 +1,90 @@
# FAQ
## How can I upgrade Ollama?

To upgrade Ollama, run the installation process again. On the Mac, click the Ollama icon in the menubar and choose the restart option if an update is available.
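On Linux, re-running the install script performs the same in-place upgrade. A minimal sketch, assuming the standard install script location from the project README of this era:

```bash
# Re-running the installer replaces the binary with the latest release
curl https://ollama.ai/install.sh | sh
```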
## How can I view the logs?

Review the [Troubleshooting](./troubleshooting.md) docs for more about using logs.

On macOS:

```
cat ~/.ollama/logs/server.log
```

On Linux:

```
journalctl -u ollama
```

If you're running `ollama serve` directly, the logs will be printed to the console.

## How do I use Ollama server environment variables on Mac?

On macOS, Ollama runs in the background and is managed by the menubar app. If adding environment variables, Ollama will need to be run manually.

1. Click the menubar icon for Ollama and choose **Quit Ollama**.
2. Open a new terminal window and run the following command (this example uses `OLLAMA_HOST` with an IP address of `123.1.1.1`):

```bash
OLLAMA_HOST=123.1.1.1 ollama serve
```
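To confirm the server picked up the new address, a quick check from a second terminal helps; a small sketch, assuming the server answers a plain GET on its root with a status message:

```bash
# Prints "Ollama is running" if the server is listening on the new address
curl http://123.1.1.1:11434
```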
## How do I use Ollama server environment variables on Linux?

If Ollama is installed with the install script, a systemd service was created, running as the Ollama user. To add an environment variable, such as `OLLAMA_HOST`, follow these steps:

1. Create a `systemd` drop-in directory and add a config file. This is only needed once.

   ```bash
   mkdir -p /etc/systemd/system/ollama.service.d
   echo '[Service]' >>/etc/systemd/system/ollama.service.d/environment.conf
   ```

2. For each environment variable, add it to the config file:

   ```bash
   echo 'Environment="OLLAMA_HOST=0.0.0.0:11434"' >>/etc/systemd/system/ollama.service.d/environment.conf
   ```

3. Reload `systemd` and restart Ollama:

   ```bash
   systemctl daemon-reload
   systemctl restart ollama
   ```
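To confirm the drop-in took effect after the restart, systemd can print the merged configuration back; a quick sketch using stock `systemctl` subcommands:

```bash
# Show the unit file plus any drop-ins, including environment.conf
systemctl cat ollama

# Or query only the environment the service receives
systemctl show ollama --property=Environment
```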
## How can I expose Ollama on my network?

Ollama binds to 127.0.0.1 port 11434 by default. Change the bind address with the `OLLAMA_HOST` environment variable. Refer to the section above for how to use environment variables on your platform.
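For example, when running the server by hand (on macOS, or on Linux outside of systemd), the bind address can be overridden directly on the command line:

```bash
# Listen on all interfaces instead of the loopback-only default
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```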
## How can I allow additional web origins to access Ollama?

Ollama allows cross-origin requests from `127.0.0.1` and `0.0.0.0` by default. Add additional origins with the `OLLAMA_ORIGINS` environment variable. For example, to add all ports on 192.168.1.1 and https://example.com, use:

```shell
OLLAMA_ORIGINS=http://192.168.1.1:*,https://example.com
```

Refer to the section above for how to use environment variables on your platform.
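Applying the platform patterns above, the same value is either passed on the command line (macOS) or added to the systemd drop-in (Linux):

```bash
# macOS: pass the origins when starting the server manually
OLLAMA_ORIGINS=http://192.168.1.1:*,https://example.com ollama serve

# Linux: append to the drop-in, then reload systemd and restart ollama
echo 'Environment="OLLAMA_ORIGINS=http://192.168.1.1:*,https://example.com"' >>/etc/systemd/system/ollama.service.d/environment.conf
```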
## Where are models stored?

- macOS: `~/.ollama/models`
- Linux: `/usr/share/ollama/.ollama/models`

See [the CLI Documentation](./cli.md) for more on this.

Below the models directory you will find a structure similar to the following:

```shell
.
├── blobs
└── manifests
    └── registry.ollama.ai
        ├── f0rodo
        ├── library
        ├── mattw
        └── saikatkumardey
```

There is a `manifests/registry.ollama.ai/namespace` path. In the example above, the user has downloaded models from the official `library` namespace as well as from the `f0rodo`, `mattw`, and `saikatkumardey` user namespaces. Within each of those directories you will find a directory for each model downloaded, and inside that a file named for each tag. Each tag file is the manifest for the model.

The manifest lists all the layers used in this model. You will see a `media type` for each layer, along with a digest. That digest corresponds with a file in the `models/blobs` directory.

### How can I change where Ollama stores models?

If a different directory needs to be used, set the environment variable `OLLAMA_MODELS` to the chosen directory. Refer to the section above for how to use environment variables on your platform.
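As an illustration (the directory paths below are placeholders, not defaults), `OLLAMA_MODELS` follows the same per-platform pattern as the other server variables:

```bash
# macOS: run the server manually with a custom model directory (example path)
OLLAMA_MODELS=/Volumes/data/ollama-models ollama serve

# Linux: add it to the systemd drop-in, then reload systemd and restart ollama (example path)
echo 'Environment="OLLAMA_MODELS=/data/ollama-models"' >>/etc/systemd/system/ollama.service.d/environment.conf
```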
## Does Ollama send my prompts and answers back to Ollama.ai to use in any way?

No, Ollama runs entirely locally, and conversation data will never leave your machine.
## How can I use Ollama in Visual Studio Code?

There is already a large collection of plugins available for VSCode as well as other editors that leverage Ollama. See the list of [extensions & plugins](https://github.com/jmorganca/ollama#extensions--plugins) at the bottom of the main repository readme.
## How do I use Ollama behind a proxy?

Ollama is compatible with proxy servers if `HTTP_PROXY` or `HTTPS_PROXY` is configured. When using either variable, ensure it is set where `ollama serve` can access the value. When using `HTTPS_PROXY`, ensure the proxy certificate is installed as a system certificate. Refer to the section above for how to use environment variables on your platform.
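Concretely, both lines below follow the platform examples earlier in this document (`proxy.example.com` is a placeholder host):

```bash
# macOS: set the proxy when launching the server manually
HTTPS_PROXY=http://proxy.example.com ollama serve

# Linux: append to the systemd drop-in, then reload systemd and restart ollama
echo 'Environment="HTTPS_PROXY=https://proxy.example.com"' >>/etc/systemd/system/ollama.service.d/environment.conf
```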
### How do I use Ollama behind a proxy in Docker?

The Ollama Docker container image can be configured to use a proxy by passing `-e HTTPS_PROXY=https://proxy.example.com` when starting the container.

Alternatively, the Docker daemon can be configured to use a proxy. Instructions are available for Docker Desktop on [macOS](https://docs.docker.com/desktop/settings/mac/#proxies), [Windows](https://docs.docker.com/desktop/settings/windows/#proxies), and [Linux](https://docs.docker.com/desktop/settings/linux/#proxies), and for the Docker [daemon with systemd](https://docs.docker.com/config/daemon/systemd/#httphttps-proxy).

Ensure the certificate is installed as a system certificate when using HTTPS. This may require a new Docker image when using a self-signed certificate.
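For the container route, a minimal sketch; the volume and port mapping follow the project's standard Docker run instructions, and the proxy URL is a placeholder:

```bash
# Start the container with the proxy set in its environment
docker run -d -e HTTPS_PROXY=https://proxy.example.com \
  -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```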