Fix Python terminology (#198)
dsblank authored Sep 9, 2024
1 parent 17d9e3f commit 72fcaf5
Showing 2 changed files with 5 additions and 5 deletions.
```diff
@@ -5,7 +5,7 @@ sidebar_label: Log Distributed Traces

 # Log Distributed Traces

-When working with complex LLM applications, it is common to need to track a traces across multiple services. Comet supports distributed tracing out of the box when integrating using function annotators using a mechanism that is similar to how OpenTelemetry implements distributed tracing.
+When working with complex LLM applications, it is common to need to track a traces across multiple services. Comet supports distributed tracing out of the box when integrating using function decorators using a mechanism that is similar to how OpenTelemetry implements distributed tracing.

 For the purposes of this guide, we will assume that you have a simple LLM application that is made up of two services: a client and a server. We will assume that the client will create the trace and span, while the server will add a nested span. In order to do this, the `trace_id` and `span_id` will be passed in the headers of the request from the client to the server.
```
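The mechanism the changed file describes (the client creates the trace and a span, the server attaches a nested span using identifiers forwarded in request headers) can be sketched library-agnostically. Everything below is illustrative: the header names and dictionary shapes are hypothetical, not Opik's actual API.

```python
import uuid

# Hypothetical header names; the real header keys used by Opik may differ.
TRACE_HEADER = "X-Trace-Id"
SPAN_HEADER = "X-Parent-Span-Id"


def client_call():
    """Client side: create a trace and a root span, then forward both IDs."""
    trace_id = str(uuid.uuid4())
    span_id = str(uuid.uuid4())
    # In a real application these headers travel with the HTTP request.
    return {TRACE_HEADER: trace_id, SPAN_HEADER: span_id}


def server_handle(headers):
    """Server side: attach a nested span to the caller's trace."""
    return {
        "trace_id": headers[TRACE_HEADER],       # same trace as the client
        "parent_span_id": headers[SPAN_HEADER],  # nests under the client's span
        "span_id": str(uuid.uuid4()),            # new span for the server work
    }


headers = client_call()
nested_span = server_handle(headers)
```

The key point is that only the two identifiers cross the service boundary; each service creates its own spans locally.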
````diff
@@ -15,15 +15,15 @@ To log traces to the Comet LLM Evaluation platform using the Python SDK, you wil
 pip install opik
 ```

-Once the SDK is installed, you can log traces to using one our Comet's integration, function annotations or manually.
+Once the SDK is installed, you can log traces to using one our Comet's integration, function decorators or manually.

 :::tip
 Opik has a number of integrations for popular LLM frameworks like LangChain or OpenAI, checkout a full list of integrations in the [integrations](/tracing/integrations/overview.md) section.
 :::

-## Log using function annotators
+## Log using function decorators

-If you are manually defining your LLM chains and not using LangChain for example, you can use the `track` function annotators to track LLM calls:
+If you are manually defining your LLM chains and not using LangChain for example, you can use the `track` function decorators to track LLM calls:

 ```python
 from opik import track
@@ -61,7 +61,7 @@ print(result)
 ```

 :::tip
-If the `track` function annotators are used in conjunction with the `track_openai` or `CometTracer` callbacks, the LLM calls will be automatically logged to the corresponding trace.
+If the `track` function decorators are used in conjunction with the `track_openai` or `CometTracer` callbacks, the LLM calls will be automatically logged to the corresponding trace.
 :::

 ## Log traces and spans manually
````
