


Overview

The arize-otel package is a lightweight convenience package that helps you set up OpenTelemetry for tracing LLM applications and send the traces to Arize, Phoenix, or custom collectors.

Installation

Install arize-otel using pip

pip install arize-otel

Quickstart

You only need one import to use this package:

from arize_otel import register_otel, Endpoints

The following examples showcase how to use register_otel to set up OpenTelemetry in order to send traces to a collector. However, this is NOT the same as instrumenting your application. For that, you can use any of our OpenInference AutoInstrumentators. Assuming we use the OpenAI AutoInstrumentation, we need to run instrument() after calling register_otel:

# Setup OTEL via our convenience function
register_otel(
    # See details in examples below...
)

# Instrument your application using OpenInference AutoInstrumentators
from openinference.instrumentation.openai import OpenAIInstrumentor
OpenAIInstrumentor().instrument()
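
With instrumentation in place, regular OpenAI calls are traced automatically. Here is a minimal usage sketch (the model name is only an example, and OPENAI_API_KEY is assumed to be set in your environment):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[{"role": "user", "content": "Hello!"}],
)
# Each call like this is captured as a span and exported to your collector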

The above code snippets yield a fully set up and instrumented application. It is worth noting that this package is completely optional and exists for convenience only: you can set up OpenTelemetry and send traces to Arize and Phoenix without installing this or any other package from Arize.
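
For reference, a minimal sketch of such a manual setup using only the standard opentelemetry-sdk and OTLP exporter packages could look like this (the endpoint is a placeholder, and any authentication headers depend on where you send the traces):

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Export spans to your collector over OTLP/gRPC (endpoint is a placeholder)
exporter = OTLPSpanExporter(endpoint="https://my-collector-endpoint")

# Register a TracerProvider with a batch processor as the global default
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)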

The following sections show examples of how to use the register_otel function:

Send traces to Arize

To send traces to Arize you need to authenticate via your Space ID and API Key, which you can find in the Space Settings page of the Arize platform. In addition, you'll need to specify the model ID, a unique name that identifies your model in the Arize platform. Optionally, you can set the model version, which groups a subset of data under the same model ID so you can compare and track changes.

register_otel(
    endpoints = Endpoints.ARIZE,
    space_id = "your-arize-space-id",
    api_key = "your-arize-api-key",
    model_id = "your-model-id",
    model_version = "your-model-version", # OPTIONAL
)

Send traces to Local Phoenix

To send traces to your local Phoenix server you just need to provide the correct endpoint. The examples below use the default local Phoenix endpoints, but you can also provide your own (see the custom endpoint example further down). Optionally, you can specify a project to send the traces to. A project is a collection of traces related to a single application or service; you can have multiple projects, each with multiple traces.

Send traces via HTTP:

register_otel(
    endpoints = Endpoints.LOCAL_PHOENIX_HTTP,
    project_name = "your-project-name", # OPTIONAL
)

Send traces via gRPC:

register_otel(
    endpoints = Endpoints.LOCAL_PHOENIX_GRPC,
    project_name = "your-project-name", # OPTIONAL
)

Send traces to Hosted Phoenix

To send traces to Hosted Phoenix, also known as Llamatrace, you just need to provide the correct endpoint. The example below specifies that endpoint as well as the required Phoenix API key. Optionally, you can specify a project to send the traces to, exactly as with a local Phoenix instance above.

register_otel(
    endpoints = Endpoints.HOSTED_PHOENIX,
    api_key = "your-hosted-phoenix-api-key",
    project_name = "your-project-name", # OPTIONAL
)

Send traces to Custom Endpoint

Sending traces to a collector on a custom endpoint is simple: you just need to provide the endpoint. If the endpoint corresponds to an Arize or Phoenix deployment, you can add any of the options described in the examples above.

register_otel(
    endpoints = "https://my-custom-endpoint"
    # any other options...
)

Send traces to Multiple Endpoints

In this example we send traces to the default Arize and Phoenix endpoints, as well as to a third custom one. We also set all the options mentioned so far.

register_otel(
    endpoints = [
        Endpoints.ARIZE,
        Endpoints.LOCAL_PHOENIX_HTTP,
        "https://my-custom-endpoint",
    ],
    space_id = "your-space-id",
    api_key = "your-api-key",
    model_id = "your-model-id",
    model_version = "your-model-version", # OPTIONAL
    project_name = "your-project-name", # OPTIONAL
)

Debug

As you're setting up your tracing, it is helpful to print the created spans to the console. You can achieve this by setting log_to_console=True.

register_otel(
    # other options...
    log_to_console=True
)

Turn off batch processing of spans

We default to using BatchSpanProcessor from OpenTelemetry because it exports spans in the background and does not block your application if the telemetry backend goes down. In contrast, SimpleSpanProcessor exports spans synchronously as they are created, which can be helpful in development. You can switch to SimpleSpanProcessor with the option use_batch_processor=False.

register_otel(
    # other options...
    use_batch_processor=False
)

Questions?

Find us in our Slack Community or email [email protected]

Copyright, Patent, and License

Copyright 2024 Arize AI, Inc. All Rights Reserved.

This software is licensed under the terms of the 3-Clause BSD License. See LICENSE.