From cf81f7a096ba472afd8eecbb5909e2713911021e Mon Sep 17 00:00:00 2001
From: Michael Neale
Date: Sat, 21 Sep 2024 10:41:49 +1000
Subject: [PATCH] docs: add in ollama (#82)

---
 README.md | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 375723424..81724c7b8 100644
--- a/README.md
+++ b/README.md
@@ -101,14 +101,16 @@ default:
 
 You can edit this configuration file to use different LLMs and toolkits in `goose`. `goose` can also be extended to support any LLM or combination of LLMs.
 
 #### provider
-Provider of LLM. LLM providers that currently are supported by `goose`:
+Provider of LLM. LLM providers currently supported by `goose` (more can be supported by plugins):
 
 | Provider | Required environment variable(s) to access provider |
 | :----- | :------------------------------ |
 | openai | `OPENAI_API_KEY` |
 | anthropic | `ANTHROPIC_API_KEY` |
 | databricks | `DATABRICKS_HOST` and `DATABRICKS_TOKEN` |
+| ollama \* | `OLLAMA_HOST`, plus a running `ollama` server |
+\* ollama is for local LLMs and is considered experimental; it is limited by which tool-calling models you can choose and run on local hardware.
 
 #### processor
 Model for complex, multi-step tasks such as writing code and executing commands. Example: `gpt-4o`. You should choose the model based on the provider you configured.
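
A note for reviewers (trailing text after the diff is ignored by `git am`/`git apply`): the patch documents the new provider but does not show a profile using it. A minimal sketch of what such a profile entry might look like, assuming the `provider`/`processor` keys from the surrounding README section and a hypothetical model name (the exact schema and model are not specified in this patch):

```yaml
default:
  provider: ollama     # requires OLLAMA_HOST to be set and an ollama server running
  processor: llama3.1  # hypothetical example; pick a tool-calling-capable local model
```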