Ollama
This section is dedicated to the library for working with the Ollama API. It describes all the steps needed to get started.
Getting started
- Download Ollama from the official website and run it on the target machine
- (optional) To push your models to the Ollama servers, you also need to create an account and add a public key to the local installation, following the instructions in the Ollama keys section
- (optional) To add authorization and the ability to handle requests from outside, configure proxying through a third-party web server
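One way to set up such proxying is a reverse proxy in front of the local Ollama port. The sketch below uses nginx with HTTP basic auth; the hostname and password-file path are hypothetical placeholders, and any other web server or auth scheme would work just as well:

```nginx
server {
    listen 80;
    server_name ollama.example.com;  # hypothetical external hostname

    location / {
        # forward requests to the local Ollama server (default port)
        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host $host;

        # basic auth as one possible authorization layer
        auth_basic "Ollama";
        auth_basic_user_file /etc/nginx/.htpasswd;  # hypothetical path
    }
}
```

With this in place, outside clients reach Ollama through the proxy and must authenticate, while the Ollama process itself keeps listening only locally.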
By default, the Ollama server is available only on the local machine at
localhost:11434
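Once the server is running, requests go to that local address over plain HTTP. A minimal sketch of preparing a call to the /api/generate endpoint using only the standard library (the model name "llama3" is an assumption; substitute whatever model you have pulled):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default local address of the Ollama server

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an HTTP POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON response instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("llama3", "Why is the sky blue?")  # hypothetical model
# With a running server, urllib.request.urlopen(req) would return the JSON reply.
```

If you proxy the server through a web server as described above, only OLLAMA_URL changes; the request body stays the same.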