Ollama Provider
What Is Ollama Provider?
The Ollama Provider module allows the AI Core module to use an Ollama-hosted LLM as its backend. For more information about Ollama itself, see the Ollama website.
Enabling and configuring the module
- Enable the module as usual.
- Visit /admin/config/ai/providers/ollama and enter the connection details for your Ollama LLM.
- The provider will then be available to the AI module; visit /admin/config/ai/settings to select it as the default provider for your chosen operations.
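If you keep configuration in code, the connection details can also be overridden from settings.php. The sketch below is illustrative only: the configuration object name (`ai_provider_ollama.settings`) and the `host` key are assumptions, so verify them against the configuration the module actually exports before relying on them.

```php
<?php

// settings.php — override the Ollama provider's connection details.
// NOTE: the config object name and key below are assumptions; check the
// module's exported configuration (e.g. via `drush config:get`) for the
// exact names used by your installed version.
$config['ai_provider_ollama.settings']['host'] = 'http://localhost:11434';
```

Ollama listens on port 11434 by default, so a local install is typically reachable at http://localhost:11434.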
Setting up Ollama
To avoid duplicating information that may go out of date, we recommend using the documentation on Ollama's GitHub pages to get started with Ollama. Ollama also provides a Docker image, which can be used in a DDEV local development environment.
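One common way to run the Ollama Docker image alongside DDEV is to add a custom docker-compose file to the project's .ddev directory. The following is a minimal sketch, not a definitive setup: the filename and service wiring are illustrative, and you should adapt them to your project (for example, to add GPU passthrough).

```yaml
# .ddev/docker-compose.ollama.yaml
# Adds an Ollama container to the DDEV project's Docker network.
services:
  ollama:
    image: ollama/ollama:latest
    container_name: ddev-${DDEV_SITENAME}-ollama
    labels:
      com.ddev.site-name: ${DDEV_SITENAME}
      com.ddev.approot: ${DDEV_APPROOT}
    volumes:
      # Persist downloaded models between restarts.
      - ollama:/root/.ollama
volumes:
  ollama:
```

With a setup along these lines, the DDEV web container can reach the Ollama API at http://ollama:11434, which is the address you would then enter at /admin/config/ai/providers/ollama.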