Merge branch 'main' into use-lsp
* main: (23 commits)
  feat: Run with resume session (#153)
  refactor: move langfuse wrapper to a module in exchange instead of a package (#138)
  docs: add subheaders to the 'Other ways to run Goose' section (#155)
  fix: Remove tools from exchange when summarizing files (#157)
  chore: use primitives instead of typing imports and fixes completion … (#149)
  chore: make vcr tests pretty-print JSON (#146)
  chore(release): goose 0.9.5 (#159)
  chore(release): exchange 0.9.5 (#158)
  chore: updates ollama default model from mistral-nemo to qwen2.5 (#150)
  feat: add vision support for Google (#141)
  fix: session resume with arg handled incorrectly (#145)
  docs: add release instructions to CONTRIBUTING.md (#143)
  docs: add link to action, IDE words (#140)
  docs: goosehints doc fix only (#142)
  chore(release): release 0.9.4 (#136)
  revert: "feat: add local langfuse tracing option  (#106)" (#137)
  feat: add local langfuse tracing option  (#106)
  feat: add groq provider (#134)
  feat: add a deep thinking reasoner model (o1-preview/mini) (#68)
  fix: use concrete SessionNotifier (#135)
  ...
lukealvoeiro committed Oct 17, 2024
2 parents bfa4f8b + 5fc0c3a commit ef0aee6
Showing 89 changed files with 2,087 additions and 502 deletions.
3 changes: 3 additions & 0 deletions .gitignore
@@ -103,6 +103,9 @@ celerybeat.pid
.env.*
.venv

# exception for local langfuse init vars
!**/packages/exchange/.env.langfuse.local

# Spyder project settings
.spyderproject
.spyproject
14 changes: 12 additions & 2 deletions .goosehints
@@ -1,3 +1,13 @@
This is a python CLI app that uses UV. Read CONTRIBUTING.md for information on how to build and test it as needed.
Some key concepts are that it is run as a command line interface, depends on the "ai-exchange" package, and has the concept of toolkits which are ways that its behavior can be extended. Look in src/goose and tests.
Once the user has UV installed it should be able to be used effectively along with uvx to run tasks as needed

Some key concepts are that it is run as a command line interface, depends on the "ai-exchange" package (which is in packages/exchange in this repo), and has the concept of toolkits, which are ways that its behavior can be extended. Look in src/goose and tests.

Assume the user has UV installed and ensure UV is used to run any python related commands.

To run tests:

```sh
uv sync && uv run pytest tests -m 'not integration'
```

ideally after each change
38 changes: 38 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,44 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.9.5] - 2024-10-15
- chore: updates ollama default model from mistral-nemo to qwen2.5 (#150)
- feat: add vision support for Google (#141)
- fix: session resume with arg handled incorrectly (#145)
- docs: add release instructions to CONTRIBUTING.md (#143)
- docs: add link to action, IDE words (#140)
- docs: goosehints doc fix only (#142)

## [0.9.4] - 2024-10-10

- revert: "feat: add local langfuse tracing option (#106)"
- feat: add local langfuse tracing option (#106)
- feat: add groq provider (#134)
- feat: add a deep thinking reasoner model (o1-preview/mini) (#68)
- fix: use concrete SessionNotifier (#135)
- feat: add guards to session management (#101)
- fix: Set default model configuration for the Google provider. (#131)
- test: convert Google Gemini tests to VCR (#118)
- chore: Add goose providers list command (#116)
- docs: working ollama for desktop (#125)
- docs: format and clean up warnings/errors (#120)
- docs: update deploy workflow (#124)
- feat: Implement a goose run command (#121)
- feat: saved api_key to keychain for user (#104)
- docs: add callout plugin (#119)
- chore: add a page to docs for Goose application examples (#117)
- fix: exit the goose and show the error message when provider environment variable is not set (#103)
- fix: Update OpenAI pricing per https://openai.com/api/pricing/ (#110)
- fix: update developer tool prompts to use plan task status to match allowable statuses update_plan tool call (#107)
- fix: removed the panel in the output so that the user won't have unnecessary pane borders in the copied content (#109)
- docs: update links to exchange to the new location (#108)
- chore: setup workspace for exchange (#105)
- fix: resolve uvx when using a git client or IDE (#98)
- ci: add include-markdown for mkdocs (#100)
- chore: fix broken badge on readme (#102)
- feat: add global optional user goosehints file (#73)
- docs: update docs (#99)

## [0.9.3] - 2024-09-25

- feat: auto save sessions before next user input (#94)
25 changes: 25 additions & 0 deletions CONTRIBUTING.md
@@ -48,6 +48,21 @@ or, as a shortcut,
just test
```

### Enable traces in Goose with [locally hosted Langfuse](https://langfuse.com/docs/deployment/self-host)
> [!NOTE]
> This integration is experimental and we don't currently have integration tests for it.

Developers can use locally hosted Langfuse tracing by applying the custom `observe_wrapper` decorator defined in `packages/exchange/src/langfuse_wrapper.py` to functions for automatic integration with Langfuse.

- Run `just langfuse-server` to start your local Langfuse server. It requires Docker.
- Go to http://localhost:3000 and log in with the default email/password output by the shell script (values can also be found in the `.env.langfuse.local` file).
- Run Goose with the `--tracing` flag enabled, e.g. `goose session start --tracing`
- View your traces at http://localhost:3000

To extend tracing to additional functions, import `from exchange.langfuse_wrapper import observe_wrapper` and use the `observe_wrapper()` decorator on functions you wish to enable tracing for. `observe_wrapper` functions the same way as Langfuse's observe decorator.

Read more about Langfuse's decorator-based tracing [here](https://langfuse.com/docs/sdk/python/decorators).
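As a rough illustration of how such a wrapper can degrade gracefully, here is a hypothetical sketch (not the actual `langfuse_wrapper.py` code) of a decorator that applies tracing only when a backend is available:

```python
# Hypothetical stand-in for observe_wrapper: if the tracing backend is not
# reachable, return the function unchanged so decorated code behaves the same.
import functools

def observe_wrapper(*obs_args, **obs_kwargs):
    langfuse_available = False  # the real wrapper checks the local Langfuse server

    def decorator(fn):
        if not langfuse_available:
            return fn  # tracing disabled: no wrapping, no overhead

        @functools.wraps(fn)
        def wrapped(*args, **kwargs):
            # here the real wrapper would delegate to Langfuse's observe()
            return fn(*args, **kwargs)

        return wrapped

    return decorator

@observe_wrapper()
def call_function(x: int) -> int:
    return x * 2
```

Either way, the decorator never changes the wrapped function's behavior, only whether calls are recorded.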

## Exchange

The lower level generation behind goose is powered by the [`exchange`][ai-exchange] package, also in this repo.
@@ -73,6 +88,16 @@ Additions to the [developer toolkit][developer] change the core performance, and

This project follows the [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/) specification for PR titles. Conventional Commits make it easier to understand the history of a project and facilitate automation around versioning and changelog generation.

## Release

In order to release a new version of goose, do the following:
1. Update CHANGELOG.md. To get the commit messages since the last release, run: `just release-notes`
2. Update the version in `pyproject.toml` for `goose` and package dependencies such as `exchange`
3. Create a PR and merge it into the main branch
4. Tag the HEAD commit on the main branch. To do this, switch to the main branch and run: `just tag-push`
5. Publish a new release from the [GitHub Release UI](https://github.com/block-open-source/goose/releases)


[issues]: https://github.com/block-open-source/goose/issues
[goose-plugins]: https://github.com/block-open-source/goose-plugins
[ai-exchange]: https://github.com/block-open-source/goose/tree/main/packages/exchange
56 changes: 54 additions & 2 deletions README.md
@@ -107,7 +107,6 @@ To install Goose, use `pipx`. First ensure [pipx][pipx] is installed:
brew install pipx
pipx ensurepath
```
You can also place `.goosehints` in `~/.config/goose/.goosehints` if you like for always loaded hints personal to you.
Then install Goose:

@@ -131,7 +130,21 @@ You will see the Goose prompt `G❯`:
G❯ type your instructions here exactly as you would tell a developer.
```

Now you are interacting with Goose in conversational sessions - something like a natural language driven code interpreter. The default toolkit allows Goose to take actions through shell commands and file edits. You can interrupt Goose with `CTRL+D` or `ESC+Enter` at any time to help redirect its efforts.
Now you are interacting with Goose in conversational sessions - think of it as like giving direction to a junior developer. The default toolkit allows Goose to take actions through shell commands and file edits. You can interrupt Goose with `CTRL+D` or `ESC+Enter` at any time to help redirect its efforts.

> [!TIP]
> You can place a `.goosehints` text file in any directory you launch goose from to give it some background info for new sessions in plain language (e.g. how to test, what instructions to read to get started, or just tell it to read the README!). You can also put a global one at `~/.config/goose/.goosehints` for always-loaded hints personal to you.

### Running a goose task (one-off)

You can run goose to do things as a one-off, such as tidying up, and then exit:

```sh
goose run instructions.md
```

This will run until completion as best it can. You can also pass `--resume-session` and it will re-use the first session it finds for context.


#### Exit the session

@@ -147,16 +160,55 @@ goose session resume

To see more documentation on the CLI commands currently available to Goose check out the documentation [here][cli]. If you’d like to develop your own CLI commands for Goose, check out the [Contributing document][contributing].

### Tracing with Langfuse
> [!NOTE]
> This Langfuse integration is experimental and we don't currently have integration tests for it.

The exchange package provides a [Langfuse](https://langfuse.com/) wrapper module. The wrapper initializes Langfuse appropriately if the Langfuse server is running locally, and otherwise skips applying the Langfuse observe decorators.

#### Start your local Langfuse server

Run `just langfuse-server` to start your local Langfuse server. It requires Docker.

Read more about local Langfuse deployments [here](https://langfuse.com/docs/deployment/local).

#### Exchange and Goose integration

Import `from exchange.langfuse_wrapper import observe_wrapper` and use the `observe_wrapper()` decorator on functions you wish to enable tracing for. `observe_wrapper` functions the same way as Langfuse's observe decorator.

Read more about Langfuse's decorator-based tracing [here](https://langfuse.com/docs/sdk/python/decorators).

In Goose, initialization requires certain environment variables to be present:

- `LANGFUSE_PUBLIC_KEY`: Your Langfuse public key
- `LANGFUSE_SECRET_KEY`: Your Langfuse secret key
- `LANGFUSE_BASE_URL`: The base URL of your Langfuse instance

By default your local deployment and Goose will use the values in `.env.langfuse.local`.
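A minimal sketch of the kind of presence check this implies (a hypothetical helper, not actual Goose code; variable names follow the list above):

```python
# Hypothetical helper: verify the Langfuse variables are set before enabling
# tracing. The variable names come from the documented list above.
import os

REQUIRED_VARS = ("LANGFUSE_PUBLIC_KEY", "LANGFUSE_SECRET_KEY", "LANGFUSE_BASE_URL")

def langfuse_config_ready(env=os.environ) -> bool:
    """Return True only if every required Langfuse variable is set and non-empty."""
    return all(env.get(var) for var in REQUIRED_VARS)

# The defaults shipped in .env.langfuse.local would satisfy such a check:
local_defaults = {
    "LANGFUSE_PUBLIC_KEY": "publickey-local",
    "LANGFUSE_SECRET_KEY": "secretkey-local",
    "LANGFUSE_BASE_URL": "http://localhost:3000",
}
```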



### Next steps

Learn how to modify your Goose `profiles.yaml` file to add and remove functionality (toolkits) and provide context to get the most out of Goose in our [Getting Started Guide][getting-started].

## Other ways to run goose

**Want to move out of the terminal and into an IDE?**

We have some experimental IDE integrations for VSCode and JetBrains IDEs:
* https://github.com/square/goose-vscode
* https://github.com/Kvadratni/goose-intellij

**Goose as a Github Action**

There is also an experimental GitHub Action to run goose as part of your workflow (for example, if you ask it to fix an issue):
https://github.com/marketplace/actions/goose-ai-developer-agent

**With Docker**

There is also a `Dockerfile` in the root of this project you can use if you want to run goose in a sandboxed fashion.

## Getting involved!

There is a lot to do! If you're interested in contributing, a great place to start is picking a `good-first-issue`-labelled ticket from our [issues list][gh-issues]. More details on how to develop Goose can be found in our [Contributing Guide][contributing]. We are a friendly, collaborative group and look forward to working together![^1]
4 changes: 3 additions & 1 deletion docs/plugins/cli.md
@@ -19,11 +19,13 @@ Lists the version of Goose and any associated plugins.

**Usage:**
```sh
goose session start [--profile PROFILE] [--plan PLAN]
goose session start [--profile PROFILE] [--plan PLAN] [--log-level [DEBUG|INFO|WARNING|ERROR|CRITICAL]] [--tracing]
```
Starts a new Goose session.
If you want to enable locally hosted Langfuse tracing, pass the `--tracing` flag after starting your local Langfuse server as outlined in the [Contributing Guide's][contributing] development guidelines.
#### `resume`
**Usage:**
1 change: 1 addition & 0 deletions docs/plugins/providers.md
@@ -8,6 +8,7 @@ Providers in Goose mean "LLM providers" that Goose can interact with. Providers
* Azure
* Bedrock
* Databricks
* Google
* Ollama
* OpenAI

7 changes: 7 additions & 0 deletions justfile
@@ -70,3 +70,10 @@ tag:
tag-push:
just tag
git push origin tag v$(just tag_version)

# get commit messages for a release
release-notes:
git log --pretty=format:"- %s" v$(just tag_version)..HEAD

langfuse-server:
./scripts/setup_langfuse.sh
16 changes: 16 additions & 0 deletions packages/exchange/.env.langfuse.local
@@ -0,0 +1,16 @@
# These variables are default initialization variables for locally hosted Langfuse server
LANGFUSE_INIT_PROJECT_NAME=goose-local
LANGFUSE_INIT_PROJECT_PUBLIC_KEY=publickey-local
LANGFUSE_INIT_PROJECT_SECRET_KEY=secretkey-local
[email protected]
LANGFUSE_INIT_USER_NAME=localdev
LANGFUSE_INIT_USER_PASSWORD=localpwd

LANGFUSE_INIT_ORG_ID=local-id
LANGFUSE_INIT_ORG_NAME=local-org
LANGFUSE_INIT_PROJECT_ID=goose

# These variables are used by Goose
LANGFUSE_PUBLIC_KEY=publickey-local
LANGFUSE_SECRET_KEY=secretkey-local
LANGFUSE_HOST=http://localhost:3000
5 changes: 4 additions & 1 deletion packages/exchange/pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "ai-exchange"
version = "0.9.3"
version = "0.9.5"
description = "a uniform python SDK for message generation with LLMs"
readme = "README.md"
requires-python = ">=3.10"
@@ -13,6 +13,8 @@ dependencies = [
"tiktoken>=0.7.0",
"httpx>=0.27.0",
"tenacity>=9.0.0",
"python-dotenv>=1.0.1",
"langfuse>=2.38.2"
]

[tool.hatch.build.targets.wheel]
@@ -33,6 +35,7 @@ anthropic = "exchange.providers.anthropic:AnthropicProvider"
bedrock = "exchange.providers.bedrock:BedrockProvider"
ollama = "exchange.providers.ollama:OllamaProvider"
google = "exchange.providers.google:GoogleProvider"
groq = "exchange.providers.groq:GroqProvider"

[project.entry-points."exchange.moderator"]
passive = "exchange.moderators.passive:PassiveModerator"
3 changes: 1 addition & 2 deletions packages/exchange/src/exchange/checkpoint.py
@@ -1,5 +1,4 @@
from copy import deepcopy
from typing import List
from attrs import define, field


@@ -31,7 +30,7 @@ class CheckpointData:
total_token_count: int = field(default=0)

# in order list of individual checkpoints in the exchange
checkpoints: List[Checkpoint] = field(factory=list)
checkpoints: list[Checkpoint] = field(factory=list)

# the offset to apply to the message index when calculating the last message index
# this is useful because messages on the exchange behave like a queue, where you can only
8 changes: 4 additions & 4 deletions packages/exchange/src/exchange/content.py
@@ -1,4 +1,4 @@
from typing import Any, Dict, Optional
from typing import Optional

from attrs import define, asdict

@@ -7,11 +7,11 @@


class Content:
def __init_subclass__(cls, **kwargs: Dict[str, Any]) -> None:
def __init_subclass__(cls, **kwargs: dict[str, any]) -> None:
super().__init_subclass__(**kwargs)
CONTENT_TYPES[cls.__name__] = cls

def to_dict(self) -> Dict[str, Any]:
def to_dict(self) -> dict[str, any]:
data = asdict(self, recurse=True)
data["type"] = self.__class__.__name__
return data
@@ -26,7 +26,7 @@ class Text(Content):
class ToolUse(Content):
id: str
name: str
parameters: Any
parameters: any
is_error: bool = False
error_message: Optional[str] = None

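For context on the `content.py` hunk above: `Content.__init_subclass__` maintains a registry keyed by class name, so serialized dicts carrying a `"type"` field can be mapped back to the right class. A simplified stand-in (not the actual exchange code, which uses `attrs`) behaves like this:

```python
# Simplified sketch of the subclass-registry pattern: each Content subclass
# registers itself by class name, enabling round-trip deserialization of the
# "type" field. Names mirror the diff; implementation details are stand-ins.
CONTENT_TYPES: dict = {}

class Content:
    def __init_subclass__(cls, **kwargs) -> None:
        super().__init_subclass__(**kwargs)
        CONTENT_TYPES[cls.__name__] = cls  # e.g. "Text" -> Text

    def to_dict(self) -> dict:
        data = dict(self.__dict__)  # the real code uses attrs.asdict
        data["type"] = self.__class__.__name__
        return data

class Text(Content):
    def __init__(self, text: str) -> None:
        self.text = text

payload = Text("hello").to_dict()
# payload carries both the field data and a "type" key naming the class
```

Since `__init_subclass__` only fires for subclasses, the base `Content` class itself never appears in the registry.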
18 changes: 10 additions & 8 deletions packages/exchange/src/exchange/exchange.py
@@ -1,9 +1,9 @@
import json
import traceback
from copy import deepcopy
from typing import Any, Dict, List, Mapping, Tuple

from typing import Mapping
from attrs import define, evolve, field, Factory
from exchange.langfuse_wrapper import observe_wrapper
from tiktoken import get_encoding

from exchange.checkpoint import Checkpoint, CheckpointData
@@ -41,16 +41,16 @@ class Exchange:
model: str
system: str
moderator: Moderator = field(default=ContextTruncate())
tools: Tuple[Tool] = field(factory=tuple, converter=tuple)
messages: List[Message] = field(factory=list)
tools: tuple[Tool, ...] = field(factory=tuple, converter=tuple)
messages: list[Message] = field(factory=list)
checkpoint_data: CheckpointData = field(factory=CheckpointData)
generation_args: dict = field(default=Factory(dict))

@property
def _toolmap(self) -> Mapping[str, Tool]:
return {tool.name: tool for tool in self.tools}

def replace(self, **kwargs: Dict[str, Any]) -> "Exchange":
def replace(self, **kwargs: dict[str, any]) -> "Exchange":
"""Make a copy of the exchange, replacing any passed arguments"""
# TODO: ensure that the checkpoint data is updated correctly. aka,
# if we replace the messages, we need to update the checkpoint data
@@ -127,6 +127,7 @@ def reply(self, max_tool_use: int = 128) -> Message:

return response

@observe_wrapper()
def call_function(self, tool_use: ToolUse) -> ToolResult:
"""Call the function indicated by the tool use"""
tool = self._toolmap.get(tool_use.name)
@@ -264,7 +265,7 @@ def pop_first_message(self) -> Message:
# we've removed all the checkpoints, so we need to reset the message index offset
self.checkpoint_data.message_index_offset = 0

def pop_last_checkpoint(self) -> Tuple[Checkpoint, List[Message]]:
def pop_last_checkpoint(self) -> tuple[Checkpoint, list[Message]]:
"""
Reverts the exchange back to the last checkpoint, removing associated messages
"""
@@ -275,7 +276,7 @@ def pop_last_checkpoint(self) -> Tuple[Checkpoint, List[Message]]:
messages.append(self.messages.pop())
return removed_checkpoint, messages

def pop_first_checkpoint(self) -> Tuple[Checkpoint, List[Message]]:
def pop_first_checkpoint(self) -> tuple[Checkpoint, list[Message]]:
"""
Pop the first checkpoint from the exchange, removing associated messages
"""
@@ -332,5 +333,6 @@ def is_allowed_to_call_llm(self) -> bool:
# this to be a required method of the provider instead.
return len(self.messages) > 0 and self.messages[-1].role == "user"

def get_token_usage(self) -> Dict[str, Usage]:
@staticmethod
def get_token_usage() -> dict[str, Usage]:
return _token_usage_collector.get_token_usage_group_by_model()
5 changes: 1 addition & 4 deletions packages/exchange/src/exchange/invalid_choice_error.py
@@ -1,8 +1,5 @@
from typing import List


class InvalidChoiceError(Exception):
def __init__(self, attribute_name: str, attribute_value: str, available_values: List[str]) -> None:
def __init__(self, attribute_name: str, attribute_value: str, available_values: list[str]) -> None:
self.attribute_name = attribute_name
self.attribute_value = attribute_value
self.available_values = available_values