The ollama() function provides an interface to local AI models via the Ollama API.
It integrates seamlessly with the main tidyllm verbs such as chat() and embed().
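As a minimal sketch of that integration, assuming a local Ollama server is running on its default port and that the model named below has already been pulled (the model name and the .model argument are assumptions; check your installed tidyllm version for the exact argument names that ollama() forwards):

```r
library(tidyllm)

# Build a message and send it to a local Ollama model via the chat() verb.
# "gemma2" is only an example; substitute any model you have pulled locally.
reply <- llm_message("Explain what a tibble is in one sentence.") |>
  chat(ollama(.model = "gemma2"))

# The returned message object prints the conversation, including the reply.
reply
```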
Arguments
- ...
Parameters to be passed to the appropriate Ollama-specific function, such as model configuration, input text, or API-specific options.
- .called_from
An internal argument specifying the verb (e.g., chat, embed) the function is invoked from. This argument is automatically managed by tidyllm and should not be set by the user.
Details
Some functionalities, like ollama_download_model() or ollama_list_models(),
are unique to the Ollama API and have no general verb counterpart.
These functions can only be accessed directly.
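A hedged sketch of such direct calls, assuming a running local Ollama instance; the exact return shape of ollama_list_models() and the argument name accepted by ollama_download_model() may vary between tidyllm versions:

```r
library(tidyllm)

# List the models currently available on the local Ollama server
# (typically returned as a data frame of model metadata).
local_models <- ollama_list_models()
print(local_models)

# Pull a model from the Ollama registry if it is not installed yet.
# The model name is only an example, and .model is an assumed argument name.
ollama_download_model(.model = "llama3.2")
```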
Supported Verbs:
- chat(): Sends a message to an Ollama model and retrieves the model's response.
- embed(): Generates embeddings for input texts using an Ollama model.
- send_batch(): Behaves differently from the other send_batch() verbs, since it immediately processes the answers (see the sketch after this list).
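The following sketch illustrates embed() and send_batch() with the Ollama provider. The embedding model name, the shape of the inputs, and the exact structure of the returned objects are assumptions based on the verb descriptions above, not guaranteed signatures:

```r
library(tidyllm)

# Generate embeddings for a character vector of texts.
# "nomic-embed-text" is just an example embedding model available through Ollama.
embeddings <- embed(
  c("tidyllm wraps several LLM APIs", "Ollama runs models locally"),
  ollama(.model = "nomic-embed-text")
)

# send_batch() with the Ollama provider processes the messages right away
# instead of queuing them, so the replies are available immediately.
batch <- list(
  llm_message("Name one use case for local models."),
  llm_message("What is an embedding?")
) |>
  send_batch(ollama(.model = "gemma2"))
```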
