- Website •
+ Website •
Slack community •
Twitter •
Documentation
@@ -59,17 +59,22 @@ You can use Opik for:
## 🛠️ Installation
-
+Opik is available as a fully open source local installation or as a hosted solution on Comet.com.
The easiest way to get started with Opik is by creating a free Comet account at [comet.com](https://www.comet.com/signup?from=llm?utm_source=opik&utm_medium=github&utm_content=install).
+If you'd like to self-host Opik, you can do so by cloning the repository and starting the platform using Docker Compose:
+```bash
+# Clone the Opik repository
+git clone https://github.com/comet-ml/opik.git
-If you'd like to self-host Opik, you create a simple local version of Opik using::
+# Navigate to the opik/deployment/docker-compose directory
+cd opik/deployment/docker-compose
-```bash
-pip install opik-installer
+# Start the Opik platform
+docker compose up --detach
-opik-server install
+# You can now visit http://localhost:5173 in your browser!
```
For more information about the different deployment options, please see our deployment guides:
@@ -82,18 +87,23 @@ For more information about the different deployment options, please see our depl
## 🏁 Get Started
-If you are logging traces to the Cloud Opik platform, you will need to get your API key from the user menu and set it as the `OPIK_API_KEY` environment variable:
+To get started, you will need to first install the Python SDK:
```bash
-export OPIK_API_KEY=
-export OPIK_WORKSPACE=
+pip install opik
```
-If you are using a local Opik instance, you don't need to set the `OPIK_API_KEY` or `OPIK_WORKSPACE` environment variable and isntead set the environment variable `OPIK_BASE_URL` to point to your local Opik instance:
+Once the SDK is installed, you can configure it by running the `opik configure` command:
```bash
-export OPIK_BASE_URL=http://localhost:5173
+opik configure
```
+This will let you configure Opik either for a local installation, by setting the correct local server address, or for the Cloud platform, by setting your API key.
+
+
+> [!TIP]
+> You can also call the `opik.configure(use_local=True)` method from your Python code to configure the SDK to run on the local installation.
+
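+For reference, here is a minimal sketch of both setups from Python code (the `api_key` and `workspace` keyword arguments are illustrative assumptions; check the SDK reference for the exact signature):
+
+```python
+import opik
+
+# Cloud platform: authenticate against Comet.com
+# (keyword arguments shown here are assumptions for illustration)
+opik.configure(api_key="<your-api-key>", workspace="<your-workspace>")
+
+# Self-hosted: point the SDK at your local installation instead
+opik.configure(use_local=True)
+```
+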
You are now ready to start logging traces using the [Python SDK](https://www.comet.com/docs/opik/python-sdk-reference/?utm_source=opik&utm_medium=github&utm_content=sdk_link2).
@@ -103,9 +113,11 @@ The easiest way to get started is to use one of our integrations. Opik supports:
| Integration | Description | Documentation | Try in Colab |
| ----------- | ----------- | ------------- | ------------ |
-| OpenAI | Log traces for all OpenAI LLM calls | [Documentation](https://www.comet.com//docs/opik/tracing/integrations/openai/?utm_source=opik&utm_medium=github&utm_content=openai_link) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/openai.ipynb) |
+| OpenAI | Log traces for all OpenAI LLM calls | [Documentation](https://www.comet.com/docs/opik/tracing/integrations/openai/?utm_source=opik&utm_medium=github&utm_content=openai_link) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/openai.ipynb) |
| LangChain | Log traces for all LangChain LLM calls | [Documentation](https://www.comet.com/docs/opik/tracing/integrations/langchain/?utm_source=opik&utm_medium=github&utm_content=langchain_link) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/langchain.ipynb) |
| LlamaIndex | Log traces for all LlamaIndex LLM calls | [Documentation](https://www.comet.com/docs/opik/tracing/integrations/llama_index?utm_source=opik&utm_medium=github&utm_content=llama_index_link) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/llama-index.ipynb) |
+| Predibase | Fine-tune and serve open-source Large Language Models | [Documentation](https://www.comet.com/docs/opik/tracing/integrations/predibase?utm_source=opik&utm_medium=github&utm_content=predibase_link) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/predibase.ipynb) |
+| Ragas | Evaluation framework for your Retrieval Augmented Generation (RAG) pipelines | [Documentation](https://www.comet.com/docs/opik/tracing/integrations/ragas?utm_source=opik&utm_medium=github&utm_content=ragas_link) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/ragas.ipynb) |
> [!TIP]
> If the framework you are using is not listed above, feel free to [open an issue](https://github.com/comet-ml/opik/issues) or submit a PR with the integration.
@@ -113,9 +125,11 @@ The easiest way to get started is to use one of our integrations. Opik supports:
If you are not using any of the frameworks above, you can also use the `track` function decorator to [log traces](https://www.comet.com/docs/opik/tracing/log_traces/?utm_source=opik&utm_medium=github&utm_content=traces_link):
```python
-from opik import track
+import opik
+
+opik.configure(use_local=True) # Run locally
-@track
+@opik.track
def my_llm_function(user_question: str) -> str:
# Your LLM code here
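    # Illustrative completion: return your model's answer (hypothetical)
    return f"Answer to: {user_question}"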
diff --git a/apps/opik-backend/config.yml b/apps/opik-backend/config.yml
index 25e11fd1b7..0fce155dce 100644
--- a/apps/opik-backend/config.yml
+++ b/apps/opik-backend/config.yml
@@ -70,4 +70,4 @@ rateLimit:
enabled: ${RATE_LIMIT_ENABLED:-false}
generalEvents:
limit: ${RATE_LIMIT_GENERAL_EVENTS_LIMIT:-5000}
- durationInSeconds: ${RATE_LIMIT_GENERAL_EVENTS_DURATION_SEC:-1}
\ No newline at end of file
+ durationInSeconds: ${RATE_LIMIT_GENERAL_EVENTS_DURATION_IN_SEC:-1}
diff --git a/apps/opik-backend/src/main/java/com/comet/opik/OpikApplication.java b/apps/opik-backend/src/main/java/com/comet/opik/OpikApplication.java
index 3168ce059e..e45aac1319 100644
--- a/apps/opik-backend/src/main/java/com/comet/opik/OpikApplication.java
+++ b/apps/opik-backend/src/main/java/com/comet/opik/OpikApplication.java
@@ -5,6 +5,7 @@
import com.comet.opik.infrastructure.bundle.LiquibaseBundle;
import com.comet.opik.infrastructure.db.DatabaseAnalyticsModule;
import com.comet.opik.infrastructure.db.IdGeneratorModule;
+import com.comet.opik.infrastructure.db.NameGeneratorModule;
import com.comet.opik.infrastructure.ratelimit.RateLimitModule;
import com.comet.opik.infrastructure.redis.RedisModule;
import com.comet.opik.utils.JsonBigDecimalDeserializer;
@@ -60,7 +61,7 @@ public void initialize(Bootstrap<OpikConfiguration> bootstrap) {
.bundles(JdbiBundle.forDatabase((conf, env) -> conf.getDatabase())
.withPlugins(new SqlObjectPlugin(), new Jackson2Plugin()))
.modules(new DatabaseAnalyticsModule(), new IdGeneratorModule(), new AuthModule(), new RedisModule(),
- new RateLimitModule())
+ new RateLimitModule(), new NameGeneratorModule())
.enableAutoConfig()
.build());
}
diff --git a/apps/opik-backend/src/main/java/com/comet/opik/api/Experiment.java b/apps/opik-backend/src/main/java/com/comet/opik/api/Experiment.java
index 22319829b4..3a0da66464 100644
--- a/apps/opik-backend/src/main/java/com/comet/opik/api/Experiment.java
+++ b/apps/opik-backend/src/main/java/com/comet/opik/api/Experiment.java
@@ -21,7 +21,7 @@ public record Experiment(
Experiment.View.Public.class, Experiment.View.Write.class}) UUID id,
@JsonView({Experiment.View.Public.class, Experiment.View.Write.class}) @NotBlank String datasetName,
@JsonView({Experiment.View.Public.class}) @Schema(accessMode = Schema.AccessMode.READ_ONLY) UUID datasetId,
- @JsonView({Experiment.View.Public.class, Experiment.View.Write.class}) @NotBlank String name,
+ @JsonView({Experiment.View.Public.class, Experiment.View.Write.class}) String name,
@JsonView({Experiment.View.Public.class, Experiment.View.Write.class}) JsonNode metadata,
@JsonView({
Experiment.View.Public.class}) @Schema(accessMode = Schema.AccessMode.READ_ONLY) List<FeedbackScore> feedbackScores,
diff --git a/apps/opik-backend/src/main/java/com/comet/opik/api/Project.java b/apps/opik-backend/src/main/java/com/comet/opik/api/Project.java
index 60e62580be..a0479bcad6 100644
--- a/apps/opik-backend/src/main/java/com/comet/opik/api/Project.java
+++ b/apps/opik-backend/src/main/java/com/comet/opik/api/Project.java
@@ -39,8 +39,9 @@ public static class Public {
}
}
- public record ProjectPage(@JsonView( {
- Project.View.Public.class}) int page,
+ public record ProjectPage(
+ @JsonView( {
+ Project.View.Public.class}) int page,
@JsonView({Project.View.Public.class}) int size,
@JsonView({Project.View.Public.class}) long total,
@JsonView({Project.View.Public.class}) List<Project> content)
diff --git a/apps/opik-backend/src/main/java/com/comet/opik/api/TraceCountResponse.java b/apps/opik-backend/src/main/java/com/comet/opik/api/TraceCountResponse.java
new file mode 100644
index 0000000000..e3e97aea69
--- /dev/null
+++ b/apps/opik-backend/src/main/java/com/comet/opik/api/TraceCountResponse.java
@@ -0,0 +1,26 @@
+package com.comet.opik.api;
+
+import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
+import com.fasterxml.jackson.databind.PropertyNamingStrategies;
+import com.fasterxml.jackson.databind.annotation.JsonNaming;
+import lombok.Builder;
+
+import java.util.List;
+
+@Builder(toBuilder = true)
+@JsonIgnoreProperties(ignoreUnknown = true)
+@JsonNaming(PropertyNamingStrategies.SnakeCaseStrategy.class)
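+// Serialized with snake_case JSON keys, e.g. "workspaces_traces_count" and "trace_count".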
+public record TraceCountResponse(
+ List<WorkspaceTraceCount> workspacesTracesCount) {
+ public static TraceCountResponse empty() {
+ return new TraceCountResponse(List.of());
+ }
+
+ @Builder(toBuilder = true)
+ @JsonIgnoreProperties(ignoreUnknown = true)
+ @JsonNaming(PropertyNamingStrategies.SnakeCaseStrategy.class)
+ public record WorkspaceTraceCount(
+ String workspace,
+ int traceCount) {
+ }
+}
diff --git a/apps/opik-backend/src/main/java/com/comet/opik/api/resources/v1/internal/UsageResource.java b/apps/opik-backend/src/main/java/com/comet/opik/api/resources/v1/internal/UsageResource.java
new file mode 100644
index 0000000000..c2fbbb59bf
--- /dev/null
+++ b/apps/opik-backend/src/main/java/com/comet/opik/api/resources/v1/internal/UsageResource.java
@@ -0,0 +1,40 @@
+package com.comet.opik.api.resources.v1.internal;
+
+import com.codahale.metrics.annotation.Timed;
+import com.comet.opik.api.TraceCountResponse;
+import com.comet.opik.domain.TraceService;
+import io.swagger.v3.oas.annotations.Operation;
+import io.swagger.v3.oas.annotations.media.Content;
+import io.swagger.v3.oas.annotations.media.Schema;
+import io.swagger.v3.oas.annotations.responses.ApiResponse;
+import io.swagger.v3.oas.annotations.tags.Tag;
+import jakarta.ws.rs.Consumes;
+import jakarta.ws.rs.GET;
+import jakarta.ws.rs.Path;
+import jakarta.ws.rs.Produces;
+import jakarta.ws.rs.core.MediaType;
+import jakarta.ws.rs.core.Response;
+import lombok.NonNull;
+import lombok.RequiredArgsConstructor;
+import lombok.extern.slf4j.Slf4j;
+
+@Path("/v1/internal/usage")
+@Produces(MediaType.APPLICATION_JSON)
+@Consumes(MediaType.APPLICATION_JSON)
+@Timed
+@Slf4j
+@RequiredArgsConstructor(onConstructor_ = @jakarta.inject.Inject)
+@Tag(name = "System usage", description = "System usage related resource")
+public class UsageResource {
+ private final @NonNull TraceService traceService;
+
+ @GET
+ @Path("/workspace-trace-counts")
+ @Operation(operationId = "getTracesCountForWorkspaces", summary = "Get traces count on previous day for all available workspaces", description = "Get traces count on previous day for all available workspaces", responses = {
+ @ApiResponse(responseCode = "200", description = "TraceCountResponse resource", content = @Content(schema = @Schema(implementation = TraceCountResponse.class)))})
+ public Response getTracesCountForWorkspaces() {
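+ // Blocks on the reactive pipeline and returns yesterday's per-workspace trace counts as JSON.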
+ return traceService.countTracesPerWorkspace()
+ .map(tracesCountResponse -> Response.ok(tracesCountResponse).build())
+ .block();
+ }
+}
diff --git a/apps/opik-backend/src/main/java/com/comet/opik/domain/ExperimentService.java b/apps/opik-backend/src/main/java/com/comet/opik/domain/ExperimentService.java
index 6cb15f7273..6576742993 100644
--- a/apps/opik-backend/src/main/java/com/comet/opik/domain/ExperimentService.java
+++ b/apps/opik-backend/src/main/java/com/comet/opik/domain/ExperimentService.java
@@ -14,6 +14,7 @@
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
+import org.apache.commons.lang3.StringUtils;
import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;
@@ -31,6 +32,7 @@ public class ExperimentService {
private final @NonNull ExperimentDAO experimentDAO;
private final @NonNull DatasetService datasetService;
private final @NonNull IdGenerator idGenerator;
+ private final @NonNull NameGenerator nameGenerator;
public Mono<Experiment.ExperimentPage> find(
int page, int size, @NonNull ExperimentSearchCriteria experimentSearchCriteria) {
@@ -73,34 +75,35 @@ public Mono<Experiment> getById(@NonNull UUID id) {
public Mono<Experiment> create(@NonNull Experiment experiment) {
var id = experiment.id() == null ? idGenerator.generateId() : experiment.id();
IdGenerator.validateVersion(id, "Experiment");
+ var name = StringUtils.getIfBlank(experiment.name(), nameGenerator::generateName);
- return getOrCreateDataset(experiment)
- .onErrorResume(e -> handleDatasetCreationError(e, experiment).map(Dataset::id))
- .flatMap(datasetId -> create(experiment, id, datasetId))
+ return getOrCreateDataset(experiment.datasetName())
+ .onErrorResume(e -> handleDatasetCreationError(e, experiment.datasetName()).map(Dataset::id))
+ .flatMap(datasetId -> create(experiment, id, name, datasetId))
.onErrorResume(exception -> handleCreateError(exception, id));
}
- private Mono<UUID> getOrCreateDataset(Experiment experiment) {
+ private Mono<UUID> getOrCreateDataset(String datasetName) {
return Mono.deferContextual(ctx -> {
String userName = ctx.get(RequestContext.USER_NAME);
String workspaceId = ctx.get(RequestContext.WORKSPACE_ID);
- return Mono.fromCallable(() -> datasetService.getOrCreate(workspaceId, experiment.datasetName(), userName))
+ return Mono.fromCallable(() -> datasetService.getOrCreate(workspaceId, datasetName, userName))
.subscribeOn(Schedulers.boundedElastic());
});
}
- private Mono<Experiment> create(Experiment experiment, UUID id, UUID datasetId) {
- var newExperiment = experiment.toBuilder().id(id).datasetId(datasetId).build();
- return experimentDAO.insert(newExperiment).thenReturn(newExperiment);
+ private Mono<Experiment> create(Experiment experiment, UUID id, String name, UUID datasetId) {
+ experiment = experiment.toBuilder().id(id).name(name).datasetId(datasetId).build();
+ return experimentDAO.insert(experiment).thenReturn(experiment);
}
- private Mono<Dataset> handleDatasetCreationError(Throwable throwable, Experiment experiment) {
+ private Mono<Dataset> handleDatasetCreationError(Throwable throwable, String datasetName) {
if (throwable instanceof EntityAlreadyExistsException) {
return Mono.deferContextual(ctx -> {
String workspaceId = ctx.get(RequestContext.WORKSPACE_ID);
- return Mono.fromCallable(() -> datasetService.findByName(workspaceId, experiment.datasetName()))
+ return Mono.fromCallable(() -> datasetService.findByName(workspaceId, datasetName))
.subscribeOn(Schedulers.boundedElastic());
});
}
diff --git a/apps/opik-backend/src/main/java/com/comet/opik/domain/NameGenerator.java b/apps/opik-backend/src/main/java/com/comet/opik/domain/NameGenerator.java
new file mode 100644
index 0000000000..3f13d31aba
--- /dev/null
+++ b/apps/opik-backend/src/main/java/com/comet/opik/domain/NameGenerator.java
@@ -0,0 +1,28 @@
+package com.comet.opik.domain;
+
+import lombok.Builder;
+import lombok.NonNull;
+
+import java.security.SecureRandom;
+import java.util.List;
+
+@Builder
+public class NameGenerator {
+
+ private final @NonNull SecureRandom secureRandom;
+
+ private final @NonNull List<String> adjectives;
+ private final @NonNull List<String> nouns;
+
+ public String generateName() {
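+ // e.g. "brave_aardvark_1234": a random adjective, noun, and number joined with underscores.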
+ var adjective = getRandom(adjectives);
+ var noun = getRandom(nouns);
+ var number = secureRandom.nextInt(0, 10000);
+ return "%s_%s_%s".formatted(adjective, noun, number);
+ }
+
+ private String getRandom(List<String> strings) {
+ int index = secureRandom.nextInt(0, strings.size());
+ return strings.get(index);
+ }
+}
diff --git a/apps/opik-backend/src/main/java/com/comet/opik/domain/TraceDAO.java b/apps/opik-backend/src/main/java/com/comet/opik/domain/TraceDAO.java
index e31135a3c7..bf967efb43 100644
--- a/apps/opik-backend/src/main/java/com/comet/opik/domain/TraceDAO.java
+++ b/apps/opik-backend/src/main/java/com/comet/opik/domain/TraceDAO.java
@@ -1,6 +1,7 @@
package com.comet.opik.domain;
import com.comet.opik.api.Trace;
+import com.comet.opik.api.TraceCountResponse;
import com.comet.opik.api.TraceSearchCriteria;
import com.comet.opik.api.TraceUpdate;
import com.comet.opik.domain.filter.FilterQueryBuilder;
@@ -64,6 +65,8 @@ interface TraceDAO {
Mono<List<WorkspaceAndResourceId>> getTraceWorkspace(Set<UUID> traceIds, Connection connection);
Mono<Long> batchInsert(List<Trace> traces, Connection connection);
+
+ Flux<TraceCountResponse.WorkspaceTraceCount> countTracesPerWorkspace(Connection connection);
}
@Slf4j
@@ -274,6 +277,16 @@ AND id in (
;
""";
+ private static final String TRACE_COUNT_BY_WORKSPACE_ID = """
+ SELECT
+ workspace_id,
+ COUNT(DISTINCT id) as trace_count
+ FROM traces
+ WHERE created_at BETWEEN toStartOfDay(yesterday()) AND toStartOfDay(today())
+ GROUP BY workspace_id
+ ;
+ """;
+
private static final String COUNT_BY_PROJECT_ID = """
SELECT
count(id) as count
@@ -797,4 +810,14 @@ private String getOrDefault(JsonNode value) {
return value != null ? value.toString() : "";
}
+ @com.newrelic.api.agent.Trace(dispatcher = true)
+ public Flux<TraceCountResponse.WorkspaceTraceCount> countTracesPerWorkspace(Connection connection) {
+
+ var statement = connection.createStatement(TRACE_COUNT_BY_WORKSPACE_ID);
+
+ return Mono.from(statement.execute())
+ .flatMapMany(result -> result.map((row, rowMetadata) -> TraceCountResponse.WorkspaceTraceCount.builder()
+ .workspace(row.get("workspace_id", String.class))
+ .traceCount(row.get("trace_count", Integer.class)).build()));
+ }
}
diff --git a/apps/opik-backend/src/main/java/com/comet/opik/domain/TraceService.java b/apps/opik-backend/src/main/java/com/comet/opik/domain/TraceService.java
index 35dfded1da..1b2ab35dab 100644
--- a/apps/opik-backend/src/main/java/com/comet/opik/domain/TraceService.java
+++ b/apps/opik-backend/src/main/java/com/comet/opik/domain/TraceService.java
@@ -4,6 +4,7 @@
import com.comet.opik.api.Project;
import com.comet.opik.api.Trace;
import com.comet.opik.api.TraceBatch;
+import com.comet.opik.api.TraceCountResponse;
import com.comet.opik.api.TraceSearchCriteria;
import com.comet.opik.api.TraceUpdate;
import com.comet.opik.api.error.EntityAlreadyExistsException;
@@ -57,6 +58,8 @@ public interface TraceService {
Mono<Boolean> validateTraceWorkspace(String workspaceId, Set<UUID> traceIds);
+ Mono<TraceCountResponse> countTracesPerWorkspace();
+
}
@Slf4j
@@ -323,4 +326,15 @@ public Mono<Boolean> validateTraceWorkspace(@NonNull String workspaceId, @NonNul
.allMatch(trace -> workspaceId.equals(trace.workspaceId()))));
}
+ @Override
+ public Mono<TraceCountResponse> countTracesPerWorkspace() {
+ return template.stream(dao::countTracesPerWorkspace)
+ .collectList()
+ .flatMap(items -> Mono.just(
+ TraceCountResponse.builder()
+ .workspacesTracesCount(items)
+ .build()))
+ .switchIfEmpty(Mono.just(TraceCountResponse.empty()));
+ }
+
}
diff --git a/apps/opik-backend/src/main/java/com/comet/opik/infrastructure/db/NameGeneratorModule.java b/apps/opik-backend/src/main/java/com/comet/opik/infrastructure/db/NameGeneratorModule.java
new file mode 100644
index 0000000000..0fb20de797
--- /dev/null
+++ b/apps/opik-backend/src/main/java/com/comet/opik/infrastructure/db/NameGeneratorModule.java
@@ -0,0 +1,38 @@
+package com.comet.opik.infrastructure.db;
+
+import com.comet.opik.domain.NameGenerator;
+import com.comet.opik.infrastructure.OpikConfiguration;
+import com.comet.opik.utils.JsonUtils;
+import com.fasterxml.jackson.core.type.TypeReference;
+import com.google.inject.Provides;
+import jakarta.inject.Singleton;
+import ru.vyarus.dropwizard.guice.module.support.DropwizardAwareModule;
+
+import java.io.FileNotFoundException;
+import java.security.NoSuchAlgorithmException;
+import java.security.SecureRandom;
+import java.util.List;
+
+public class NameGeneratorModule extends DropwizardAwareModule<OpikConfiguration> {
+
+ private static final TypeReference<List<String>> STRING_LIST_TYPE_REFERENCE = new TypeReference<>() {
+ };
+
+ @Provides
+ @Singleton
+ public NameGenerator getNameGenerator() throws FileNotFoundException, NoSuchAlgorithmException {
+ return NameGenerator.builder()
+ .secureRandom(SecureRandom.getInstanceStrong())
+ .adjectives(getResource("/name-generator/adjectives.json"))
+ .nouns(getResource("/name-generator/nouns.json"))
+ .build();
+ }
+
+ private List<String> getResource(String path) throws FileNotFoundException {
+ var inputStream = NameGeneratorModule.class.getResourceAsStream(path);
+ if (inputStream == null) {
+ throw new FileNotFoundException("Resource not found in path '%s'".formatted(path));
+ }
+ return JsonUtils.readValue(inputStream, NameGeneratorModule.STRING_LIST_TYPE_REFERENCE);
+ }
+}
diff --git a/apps/opik-backend/src/main/java/com/comet/opik/infrastructure/ratelimit/RateLimitInterceptor.java b/apps/opik-backend/src/main/java/com/comet/opik/infrastructure/ratelimit/RateLimitInterceptor.java
index 7de4f45006..f7c7364f60 100644
--- a/apps/opik-backend/src/main/java/com/comet/opik/infrastructure/ratelimit/RateLimitInterceptor.java
+++ b/apps/opik-backend/src/main/java/com/comet/opik/infrastructure/ratelimit/RateLimitInterceptor.java
@@ -9,6 +9,7 @@
import lombok.extern.slf4j.Slf4j;
import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
+import org.apache.hc.core5.http.HttpStatus;
import java.lang.reflect.Method;
import java.util.List;
@@ -56,7 +57,7 @@ public Object invoke(MethodInvocation invocation) throws Throwable {
if (Boolean.TRUE.equals(limitExceeded)) {
setLimitHeaders(apiKey, bucket);
- throw new ClientErrorException("Too Many Requests", 429);
+ throw new ClientErrorException("Too Many Requests", HttpStatus.SC_TOO_MANY_REQUESTS);
}
try {
diff --git a/apps/opik-backend/src/main/java/com/comet/opik/utils/JsonUtils.java b/apps/opik-backend/src/main/java/com/comet/opik/utils/JsonUtils.java
index cf36565de7..bb18c6b227 100644
--- a/apps/opik-backend/src/main/java/com/comet/opik/utils/JsonUtils.java
+++ b/apps/opik-backend/src/main/java/com/comet/opik/utils/JsonUtils.java
@@ -13,6 +13,7 @@
import java.io.ByteArrayOutputStream;
import java.io.IOException;
+import java.io.InputStream;
import java.io.UncheckedIOException;
import java.math.BigDecimal;
@@ -46,6 +47,14 @@ public <T> T readValue(@NonNull String content, @NonNull TypeReference<T> valueT
}
}
+ public <T> T readValue(@NonNull InputStream inputStream, @NonNull TypeReference<T> valueTypeRef) {
+ try {
+ return MAPPER.readValue(inputStream, valueTypeRef);
+ } catch (IOException exception) {
+ throw new UncheckedIOException(exception);
+ }
+ }
+
public String writeValueAsString(@NonNull Object value) {
try {
return MAPPER.writeValueAsString(value);
diff --git a/apps/opik-backend/src/main/resources/name-generator/adjectives.json b/apps/opik-backend/src/main/resources/name-generator/adjectives.json
new file mode 100644
index 0000000000..50f91b09d7
--- /dev/null
+++ b/apps/opik-backend/src/main/resources/name-generator/adjectives.json
@@ -0,0 +1,987 @@
+[
+ "able",
+ "above",
+ "absent",
+ "absolute",
+ "abstract",
+ "abundant",
+ "academic",
+ "acceptable",
+ "accepted",
+ "accessible",
+ "accurate",
+ "accused",
+ "active",
+ "actual",
+ "acute",
+ "added",
+ "additional",
+ "adequate",
+ "adjacent",
+ "administrative",
+ "adorable",
+ "advanced",
+ "adverse",
+ "advisory",
+ "aesthetic",
+ "aggregate",
+ "aggressive",
+ "agreeable",
+ "agreed",
+ "agricultural",
+ "alert",
+ "alive",
+ "alleged",
+ "allied",
+ "alone",
+ "alright",
+ "alternative",
+ "amateur",
+ "amazing",
+ "ambitious",
+ "amused",
+ "ancient",
+ "angry",
+ "annoyed",
+ "annual",
+ "anonymous",
+ "anxious",
+ "appalling",
+ "apparent",
+ "applicable",
+ "appropriate",
+ "arbitrary",
+ "architectural",
+ "armed",
+ "arrogant",
+ "artificial",
+ "artistic",
+ "ashamed",
+ "asleep",
+ "assistant",
+ "associated",
+ "atomic",
+ "automatic",
+ "autonomous",
+ "available",
+ "average",
+ "awake",
+ "aware",
+ "back",
+ "bad",
+ "balanced",
+ "bare",
+ "basic",
+ "beautiful",
+ "beneficial",
+ "better",
+ "bewildered",
+ "big",
+ "binding",
+ "biological",
+ "bitter",
+ "bizarre",
+ "blank",
+ "blind",
+ "blonde",
+ "blushing",
+ "boiling",
+ "bold",
+ "bored",
+ "boring",
+ "bottom",
+ "brainy",
+ "brave",
+ "breakable",
+ "breezy",
+ "brief",
+ "bright",
+ "brilliant",
+ "broad",
+ "broken",
+ "bumpy",
+ "burning",
+ "busy",
+ "calm",
+ "capable",
+ "careful",
+ "casual",
+ "causal",
+ "cautious",
+ "central",
+ "certain",
+ "changing",
+ "characteristic",
+ "charming",
+ "cheerful",
+ "chemical",
+ "chief",
+ "chilly",
+ "chosen",
+ "chronic",
+ "circular",
+ "civic",
+ "civil",
+ "civilian",
+ "classic",
+ "classical",
+ "clean",
+ "clear",
+ "clever",
+ "clinical",
+ "close",
+ "closed",
+ "cloudy",
+ "clumsy",
+ "coastal",
+ "cognitive",
+ "coherent",
+ "cold",
+ "collective",
+ "colonial",
+ "colorful",
+ "colossal",
+ "coloured",
+ "colourful",
+ "combined",
+ "comfortable",
+ "coming",
+ "commercial",
+ "common",
+ "compact",
+ "comparable",
+ "comparative",
+ "compatible",
+ "competent",
+ "competitive",
+ "complete",
+ "complex",
+ "complicated",
+ "comprehensive",
+ "conceptual",
+ "concerned",
+ "concrete",
+ "condemned",
+ "confident",
+ "confidential",
+ "confused",
+ "conscious",
+ "conservation",
+ "considerable",
+ "consistent",
+ "constant",
+ "constitutional",
+ "contemporary",
+ "content",
+ "continental",
+ "continued",
+ "continuing",
+ "continuous",
+ "controlled",
+ "controversial",
+ "convenient",
+ "conventional",
+ "convinced",
+ "convincing",
+ "cooing",
+ "cool",
+ "cooperative",
+ "corporate",
+ "correct",
+ "corresponding",
+ "costly",
+ "courageous",
+ "crazy",
+ "creative",
+ "critical",
+ "crooked",
+ "crowded",
+ "crucial",
+ "curious",
+ "current",
+ "daily",
+ "damp",
+ "dear",
+ "decent",
+ "decisive",
+ "deep",
+ "defensive",
+ "defiant",
+ "definite",
+ "deliberate",
+ "delicate",
+ "delicious",
+ "delighted",
+ "delightful",
+ "detailed",
+ "determined",
+ "developed",
+ "developing",
+ "devoted",
+ "different",
+ "difficult",
+ "digital",
+ "diplomatic",
+ "direct",
+ "disappointed",
+ "disastrous",
+ "disciplinary",
+ "distant",
+ "distinct",
+ "distinctive",
+ "distinguished",
+ "diverse",
+ "divine",
+ "dizzy",
+ "double",
+ "doubtful",
+ "dramatic",
+ "driving",
+ "dry",
+ "dual",
+ "due",
+ "dull",
+ "dusty",
+ "dutch",
+ "dynamic",
+ "eager",
+ "early",
+ "easy",
+ "economic",
+ "educational",
+ "eerie",
+ "effective",
+ "efficient",
+ "elaborate",
+ "elated",
+ "electric",
+ "electrical",
+ "electronic",
+ "elegant",
+ "eligible",
+ "embarrassed",
+ "embarrassing",
+ "emotional",
+ "empirical",
+ "empty",
+ "enchanting",
+ "encouraging",
+ "endless",
+ "energetic",
+ "enormous",
+ "enthusiastic",
+ "entire",
+ "entitled",
+ "envious",
+ "environmental",
+ "equal",
+ "equivalent",
+ "essential",
+ "established",
+ "estimated",
+ "ethical",
+ "eventual",
+ "everyday",
+ "evident",
+ "evolutionary",
+ "exact",
+ "excellent",
+ "exceptional",
+ "excess",
+ "excessive",
+ "excited",
+ "exciting",
+ "exclusive",
+ "existing",
+ "expected",
+ "expensive",
+ "experienced",
+ "experimental",
+ "explicit",
+ "extended",
+ "extensive",
+ "external",
+ "extra",
+ "extraordinary",
+ "extreme",
+ "exuberant",
+ "faint",
+ "faithful",
+ "familiar",
+ "famous",
+ "fancy",
+ "fantastic",
+ "far",
+ "fascinating",
+ "fashionable",
+ "fast",
+ "favourable",
+ "favourite",
+ "federal",
+ "fellow",
+ "few",
+ "fierce",
+ "final",
+ "financial",
+ "fine",
+ "firm",
+ "fiscal",
+ "fixed",
+ "flaky",
+ "flat",
+ "flexible",
+ "fluffy",
+ "fluttering",
+ "flying",
+ "following",
+ "fond",
+ "formal",
+ "formidable",
+ "forthcoming",
+ "fortunate",
+ "forward",
+ "frantic",
+ "free",
+ "frequent",
+ "fresh",
+ "friendly",
+ "frightened",
+ "front",
+ "frozen",
+ "full",
+ "fun",
+ "functional",
+ "fundamental",
+ "funny",
+ "furious",
+ "future",
+ "fuzzy",
+ "general",
+ "generous",
+ "genetic",
+ "gentle",
+ "genuine",
+ "geographical",
+ "giant",
+ "gigantic",
+ "given",
+ "glad",
+ "glamorous",
+ "gleaming",
+ "global",
+ "glorious",
+ "golden",
+ "good",
+ "gorgeous",
+ "gothic",
+ "governing",
+ "graceful",
+ "gradual",
+ "grand",
+ "grateful",
+ "greasy",
+ "great",
+ "grim",
+ "growing",
+ "grubby",
+ "grumpy",
+ "happy",
+ "harsh",
+ "head",
+ "healthy",
+ "heavy",
+ "helpful",
+ "helpless",
+ "hidden",
+ "hilarious",
+ "hissing",
+ "historic",
+ "historical",
+ "hollow",
+ "holy",
+ "honest",
+ "horizontal",
+ "huge",
+ "human",
+ "hungry",
+ "hurt",
+ "hushed",
+ "icy",
+ "ideal",
+ "identical",
+ "ideological",
+ "ill",
+ "imaginative",
+ "immediate",
+ "immense",
+ "imperial",
+ "implicit",
+ "important",
+ "impossible",
+ "impressed",
+ "impressive",
+ "improved",
+ "inclined",
+ "increased",
+ "increasing",
+ "incredible",
+ "independent",
+ "indirect",
+ "individual",
+ "industrial",
+ "inevitable",
+ "influential",
+ "informal",
+ "inherent",
+ "initial",
+ "injured",
+ "inland",
+ "inner",
+ "innocent",
+ "innovative",
+ "inquisitive",
+ "instant",
+ "institutional",
+ "intact",
+ "integral",
+ "integrated",
+ "intellectual",
+ "intelligent",
+ "intense",
+ "intensive",
+ "interested",
+ "interesting",
+ "interim",
+ "interior",
+ "intermediate",
+ "internal",
+ "international",
+ "invisible",
+ "involved",
+ "irrelevant",
+ "isolated",
+ "itchy",
+ "jittery",
+ "joint",
+ "jolly",
+ "joyous",
+ "judicial",
+ "just",
+ "keen",
+ "key",
+ "kind",
+ "known",
+ "labour",
+ "large",
+ "late",
+ "lazy",
+ "leading",
+ "left",
+ "legal",
+ "legislative",
+ "legitimate",
+ "lengthy",
+ "level",
+ "lexical",
+ "light",
+ "like",
+ "likely",
+ "limited",
+ "linear",
+ "linguistic",
+ "liquid",
+ "literary",
+ "little",
+ "live",
+ "lively",
+ "living",
+ "local",
+ "logical",
+ "lonely",
+ "long",
+ "loose",
+ "lost",
+ "loud",
+ "lovely",
+ "loyal",
+ "ltd",
+ "lucky",
+ "magic",
+ "magnetic",
+ "magnificent",
+ "main",
+ "major",
+ "mammoth",
+ "managerial",
+ "managing",
+ "manual",
+ "many",
+ "marine",
+ "marked",
+ "marvellous",
+ "massive",
+ "mathematical",
+ "maximum",
+ "mean",
+ "meaningful",
+ "mechanical",
+ "medical",
+ "medieval",
+ "melodic",
+ "melted",
+ "mighty",
+ "mild",
+ "miniature",
+ "minimal",
+ "minimum",
+ "misty",
+ "mobile",
+ "modern",
+ "modest",
+ "molecular",
+ "monetary",
+ "monthly",
+ "moral",
+ "motionless",
+ "muddy",
+ "multiple",
+ "mushy",
+ "musical",
+ "mute",
+ "mutual",
+ "mysterious",
+ "narrow",
+ "national",
+ "native",
+ "natural",
+ "naval",
+ "near",
+ "nearby",
+ "neat",
+ "necessary",
+ "neighbouring",
+ "nervous",
+ "net",
+ "neutral",
+ "new",
+ "nice",
+ "noble",
+ "noisy",
+ "normal",
+ "northern",
+ "nosy",
+ "notable",
+ "novel",
+ "numerous",
+ "nursing",
+ "nutritious",
+ "objective",
+ "obliged",
+ "obnoxious",
+ "obvious",
+ "occasional",
+ "occupational",
+ "odd",
+ "official",
+ "ok",
+ "okay",
+ "olympic",
+ "only",
+ "open",
+ "operational",
+ "opposite",
+ "optimistic",
+ "ordinary",
+ "organic",
+ "organisational",
+ "original",
+ "other",
+ "outdoor",
+ "outer",
+ "outrageous",
+ "outside",
+ "outstanding",
+ "overall",
+ "overseas",
+ "overwhelming",
+ "panicky",
+ "parallel",
+ "parental",
+ "parliamentary",
+ "partial",
+ "particular",
+ "passing",
+ "passive",
+ "past",
+ "patient",
+ "payable",
+ "peaceful",
+ "peculiar",
+ "perfect",
+ "permanent",
+ "persistent",
+ "personal",
+ "petite",
+ "philosophical",
+ "physical",
+ "plain",
+ "planned",
+ "plastic",
+ "pleasant",
+ "pleased",
+ "poised",
+ "polite",
+ "popular",
+ "positive",
+ "possible",
+ "potential",
+ "powerful",
+ "practical",
+ "precious",
+ "precise",
+ "preferred",
+ "preliminary",
+ "premier",
+ "prepared",
+ "present",
+ "presidential",
+ "previous",
+ "prickly",
+ "primary",
+ "prime",
+ "principal",
+ "printed",
+ "prior",
+ "probable",
+ "productive",
+ "professional",
+ "profitable",
+ "profound",
+ "prominent",
+ "promising",
+ "proper",
+ "proposed",
+ "prospective",
+ "protective",
+ "provincial",
+ "public",
+ "puzzled",
+ "quaint",
+ "qualified",
+ "quick",
+ "quickest",
+ "quiet",
+ "rainy",
+ "random",
+ "rapid",
+ "rare",
+ "raspy",
+ "rational",
+ "ready",
+ "real",
+ "realistic",
+ "rear",
+ "reasonable",
+ "recent",
+ "reduced",
+ "redundant",
+ "regional",
+ "registered",
+ "regular",
+ "regulatory",
+ "related",
+ "relative",
+ "relaxed",
+ "relevant",
+ "reliable",
+ "relieved",
+ "reluctant",
+ "remaining",
+ "remarkable",
+ "remote",
+ "renewed",
+ "representative",
+ "required",
+ "resident",
+ "residential",
+ "resonant",
+ "respectable",
+ "respective",
+ "responsible",
+ "resulting",
+ "retail",
+ "right",
+ "rising",
+ "robust",
+ "rolling",
+ "round",
+ "royal",
+ "rubber",
+ "running",
+ "safe",
+ "salty",
+ "scared",
+ "scattered",
+ "scientific",
+ "secondary",
+ "secret",
+ "secure",
+ "select",
+ "selected",
+ "selective",
+ "semantic",
+ "sensible",
+ "sensitive",
+ "separate",
+ "serious",
+ "severe",
+ "shaky",
+ "shallow",
+ "shared",
+ "sharp",
+ "sheer",
+ "shiny",
+ "shivering",
+ "shocked",
+ "short",
+ "shy",
+ "significant",
+ "silent",
+ "silky",
+ "silly",
+ "similar",
+ "simple",
+ "single",
+ "skilled",
+ "sleepy",
+ "slight",
+ "slim",
+ "slimy",
+ "slippery",
+ "slow",
+ "small",
+ "smart",
+ "smiling",
+ "smoggy",
+ "smooth",
+ "social",
+ "soft",
+ "solar",
+ "sole",
+ "solid",
+ "sophisticated",
+ "sore",
+ "sorry",
+ "sound",
+ "sour",
+ "spare",
+ "sparkling",
+ "spatial",
+ "special",
+ "specific",
+ "specified",
+ "spectacular",
+ "spicy",
+ "spiritual",
+ "splendid",
+ "spontaneous",
+ "sporting",
+ "spotless",
+ "spotty",
+ "square",
+ "stable",
+ "stale",
+ "standard",
+ "static",
+ "statistical",
+ "statutory",
+ "steady",
+ "steep",
+ "sticky",
+ "stiff",
+ "still",
+ "stingy",
+ "stormy",
+ "straight",
+ "straightforward",
+ "strange",
+ "strategic",
+ "strict",
+ "striking",
+ "striped",
+ "strong",
+ "structural",
+ "stuck",
+ "subjective",
+ "subsequent",
+ "substantial",
+ "subtle",
+ "successful",
+ "successive",
+ "sudden",
+ "sufficient",
+ "suitable",
+ "sunny",
+ "super",
+ "superb",
+ "superior",
+ "supporting",
+ "supposed",
+ "supreme",
+ "sure",
+ "surprised",
+ "surprising",
+ "surrounding",
+ "surviving",
+ "suspicious",
+ "sweet",
+ "swift",
+ "symbolic",
+ "sympathetic",
+ "systematic",
+ "tall",
+ "tame",
+ "tart",
+ "technical",
+ "technological",
+ "temporary",
+ "tender",
+ "tense",
+ "territorial",
+ "theoretical",
+ "thirsty",
+ "thorough",
+ "thoughtful",
+ "thoughtless",
+ "thundering",
+ "tight",
+ "tired",
+ "top",
+ "total",
+ "tough",
+ "tragic",
+ "tremendous",
+ "tricky",
+ "tropical",
+ "typical",
+ "ultimate",
+ "uncertain",
+ "unchanged",
+ "uncomfortable",
+ "unconscious",
+ "underground",
+ "underlying",
+ "uneven",
+ "unexpected",
+ "uniform",
+ "uninterested",
+ "unique",
+ "united",
+ "universal",
+ "unknown",
+ "unlikely",
+ "unnecessary",
+ "unusual",
+ "unwilling",
+ "upset",
+ "urgent",
+ "useful",
+ "usual",
+ "vague",
+ "valid",
+ "valuable",
+ "variable",
+ "varied",
+ "various",
+ "varying",
+ "vast",
+ "verbal",
+ "vertical",
+ "very",
+ "victorious",
+ "visible",
+ "visiting",
+ "visual",
+ "vital",
+ "vocational",
+ "voluntary",
+ "wandering",
+ "warm",
+ "wasteful",
+ "watery",
+ "weekly",
+ "welcome",
+ "well",
+ "wet",
+ "whispering",
+ "whole",
+ "widespread",
+ "wild",
+ "willing",
+ "wise",
+ "witty",
+ "wonderful",
+ "wooden",
+ "working",
+ "worldwide",
+ "worried",
+ "worrying",
+ "worthwhile",
+ "worthy",
+ "written",
+ "wrong",
+ "yummy",
+ "zany",
+ "zealous",
+ "amaranth",
+ "amber",
+ "amethyst",
+ "apricot",
+ "aqua",
+ "aquamarine",
+ "azure",
+ "beige",
+ "black",
+ "blue",
+ "blush",
+ "bronze",
+ "brown",
+ "chocolate",
+ "coffee",
+ "copper",
+ "coral",
+ "crimson",
+ "cyan",
+ "emerald",
+ "fuchsia",
+ "gold",
+ "gray",
+ "green",
+ "indigo",
+ "ivory",
+ "jade",
+ "lavender",
+ "lime",
+ "magenta",
+ "maroon",
+ "moccasin",
+ "olive",
+ "orange",
+ "peach",
+ "pink",
+ "plum",
+ "purple",
+ "red",
+ "rose",
+ "salmon",
+ "sapphire",
+ "scarlet",
+ "silver",
+ "tan",
+ "teal",
+ "tomato",
+ "turquoise",
+ "violet",
+ "white",
+ "yellow"
+]
diff --git a/apps/opik-backend/src/main/resources/name-generator/nouns.json b/apps/opik-backend/src/main/resources/name-generator/nouns.json
new file mode 100644
index 0000000000..e79e24d9aa
--- /dev/null
+++ b/apps/opik-backend/src/main/resources/name-generator/nouns.json
@@ -0,0 +1,1158 @@
+[
+ "aardvark",
+ "aardwolf",
+ "albatross",
+ "alligator",
+ "alpaca",
+ "amphibian",
+ "anaconda",
+ "angelfish",
+ "anglerfish",
+ "ant",
+ "anteater",
+ "antelope",
+ "antlion",
+ "ape",
+ "aphid",
+ "armadillo",
+ "asp",
+ "baboon",
+ "badger",
+ "bandicoot",
+ "barnacle",
+ "barracuda",
+ "basilisk",
+ "bass",
+ "bat",
+ "bear",
+ "beaver",
+ "bedbug",
+ "bee",
+ "beetle",
+ "bird",
+ "bison",
+ "blackbird",
+ "boa",
+ "boar",
+ "bobcat",
+ "bobolink",
+ "bonobo",
+ "booby",
+ "bovid",
+ "bug",
+ "butterfly",
+ "buzzard",
+ "camel",
+ "canid",
+ "canidae",
+ "capybara",
+ "cardinal",
+ "caribou",
+ "carp",
+ "cat",
+ "caterpillar",
+ "catfish",
+ "catshark",
+ "cattle",
+ "centipede",
+ "cephalopod",
+ "chameleon",
+ "cheetah",
+ "chickadee",
+ "chicken",
+ "chimpanzee",
+ "chinchilla",
+ "chipmunk",
+ "cicada",
+ "clam",
+ "clownfish",
+ "cobra",
+ "cockroach",
+ "cod",
+ "condor",
+ "constrictor",
+ "coral",
+ "cougar",
+ "cow",
+ "coyote",
+ "crab",
+ "crane",
+ "crawdad",
+ "crayfish",
+ "cricket",
+ "crocodile",
+ "crow",
+ "cuckoo",
+ "damselfly",
+ "deer",
+ "dingo",
+ "dinosaur",
+ "dog",
+ "dolphin",
+ "donkey",
+ "dormouse",
+ "dove",
+ "dragon",
+ "dragonfly",
+ "duck",
+ "eagle",
+ "earthworm",
+ "earwig",
+ "echidna",
+ "eel",
+ "egret",
+ "elephant",
+ "elk",
+ "emu",
+ "ermine",
+ "falcon",
+ "felidae",
+ "ferret",
+ "finch",
+ "firefly",
+ "fish",
+ "flamingo",
+ "flea",
+ "fly",
+ "flyingfish",
+ "fowl",
+ "fox",
+ "frog",
+ "galliform",
+ "gamefowl",
+ "gayal",
+ "gazelle",
+ "gecko",
+ "gerbil",
+ "gibbon",
+ "giraffe",
+ "goat",
+ "goldfish",
+ "goose",
+ "gopher",
+ "gorilla",
+ "grasshopper",
+ "grouse",
+ "guan",
+ "guanaco",
+ "guineafowl",
+ "gull",
+ "guppy",
+ "haddock",
+ "halibut",
+ "hamster",
+ "hare",
+ "harrier",
+ "hawk",
+ "hedgehog",
+ "heron",
+ "herring",
+ "hippopotamus",
+ "hookworm",
+ "hornet",
+ "horse",
+ "hoverfly",
+ "hummingbird",
+ "hyena",
+ "iguana",
+ "impala",
+ "jackal",
+ "jaguar",
+ "jay",
+ "jellyfish",
+ "junglefowl",
+ "kangaroo",
+ "kingfisher",
+ "kite",
+ "kiwi",
+ "koala",
+ "koi",
+ "krill",
+ "ladybug",
+ "lamprey",
+ "landfowl",
+ "lark",
+ "leech",
+ "lemming",
+ "lemur",
+ "leopard",
+ "leopon",
+ "limpet",
+ "lion",
+ "lizard",
+ "llama",
+ "lobster",
+ "locust",
+ "loon",
+ "louse",
+ "lungfish",
+ "lynx",
+ "macaw",
+ "mackerel",
+ "magpie",
+ "mammal",
+ "manatee",
+ "mandrill",
+ "marlin",
+ "marmoset",
+ "marmot",
+ "marsupial",
+ "marten",
+ "mastodon",
+ "meadowlark",
+ "meerkat",
+ "mink",
+ "minnow",
+ "mite",
+ "mockingbird",
+ "mole",
+ "mollusk",
+ "mongoose",
+ "monkey",
+ "moose",
+ "mosquito",
+ "moth",
+ "mouse",
+ "mule",
+ "muskox",
+ "narwhal",
+ "newt",
+ "nightingale",
+ "ocelot",
+ "octopus",
+ "opossum",
+ "orangutan",
+ "orca",
+ "ostrich",
+ "otter",
+ "owl",
+ "ox",
+ "panda",
+ "panther",
+ "parakeet",
+ "parrot",
+ "parrotfish",
+ "partridge",
+ "peacock",
+ "peafowl",
+ "pelican",
+ "penguin",
+ "perch",
+ "pheasant",
+ "pig",
+ "pigeon",
+ "pike",
+ "pinniped",
+ "piranha",
+ "planarian",
+ "platypus",
+ "pony",
+ "porcupine",
+ "porpoise",
+ "possum",
+ "prawn",
+ "primate",
+ "ptarmigan",
+ "puffin",
+ "puma",
+ "python",
+ "quail",
+ "quelea",
+ "quokka",
+ "rabbit",
+ "raccoon",
+ "rat",
+ "rattlesnake",
+ "raven",
+ "reindeer",
+ "reptile",
+ "rhinoceros",
+ "roadrunner",
+ "rodent",
+ "rook",
+ "rooster",
+ "roundworm",
+ "sailfish",
+ "salamander",
+ "salmon",
+ "sawfish",
+ "scallop",
+ "scorpion",
+ "seahorse",
+ "shark",
+ "sheep",
+ "shrew",
+ "shrimp",
+ "silkworm",
+ "silverfish",
+ "skink",
+ "skunk",
+ "sloth",
+ "slug",
+ "smelt",
+ "snail",
+ "snake",
+ "snipe",
+ "sole",
+ "sparrow",
+ "spider",
+ "spoonbill",
+ "squid",
+ "squirrel",
+ "starfish",
+ "stingray",
+ "stoat",
+ "stork",
+ "sturgeon",
+ "swallow",
+ "swan",
+ "swift",
+ "swordfish",
+ "swordtail",
+ "tahr",
+ "takin",
+ "tapir",
+ "tarantula",
+ "tarsier",
+ "termite",
+ "tern",
+ "thrush",
+ "tick",
+ "tiger",
+ "tiglon",
+ "toad",
+ "tortoise",
+ "toucan",
+ "trout",
+ "tuna",
+ "turkey",
+ "turtle",
+ "tyrannosaurus",
+ "unicorn",
+ "urial",
+ "vicuna",
+ "viper",
+ "vole",
+ "vulture",
+ "wallaby",
+ "walrus",
+ "warbler",
+ "wasp",
+ "weasel",
+ "whale",
+ "whippet",
+ "whitefish",
+ "wildcat",
+ "wildebeest",
+ "wildfowl",
+ "wolf",
+ "wolverine",
+ "wombat",
+ "woodpecker",
+ "worm",
+ "wren",
+ "xerinae",
+ "yak",
+ "zebra",
+ "apple",
+ "apricot",
+ "avocado",
+ "banana",
+ "bilberry",
+ "blackberry",
+ "blackcurrant",
+ "blueberry",
+ "boysenberry",
+ "currant",
+ "cherry",
+ "cherimoya",
+ "cloudberry",
+ "coconut",
+ "cranberry",
+ "cucumber",
+ "damson",
+ "date",
+ "dragonfruit",
+ "durian",
+ "elderberry",
+ "feijoa",
+ "fig",
+ "gooseberry",
+ "grape",
+ "raisin",
+ "grapefruit",
+ "guava",
+ "honeyberry",
+ "huckleberry",
+ "jabuticaba",
+ "jackfruit",
+ "jambul",
+ "jujube",
+ "kiwano",
+ "kiwifruit",
+ "kumquat",
+ "lemon",
+ "lime",
+ "loquat",
+ "longan",
+ "lychee",
+ "mango",
+ "mangosteen",
+ "marionberry",
+ "melon",
+ "cantaloupe",
+ "honeydew",
+ "watermelon",
+ "mulberry",
+ "nectarine",
+ "nance",
+ "olive",
+ "orange",
+ "clementine",
+ "mandarine",
+ "tangerine",
+ "papaya",
+ "passionfruit",
+ "peach",
+ "pear",
+ "persimmon",
+ "physalis",
+ "plantain",
+ "plum",
+ "prune",
+ "pineapple",
+ "plumcot",
+ "pomegranate",
+ "pomelo",
+ "quince",
+ "raspberry",
+ "salmonberry",
+ "rambutan",
+ "redcurrant",
+ "salak",
+ "satsuma",
+ "soursop",
+ "strawberry",
+ "tamarillo",
+ "tamarind",
+ "yuzu",
+ "abbey",
+ "airport",
+ "arch",
+ "arena",
+ "armory",
+ "bakery",
+ "bank",
+ "barn",
+ "barracks",
+ "bridge",
+ "bunker",
+ "cabana",
+ "cafe",
+ "capitol",
+ "cathedral",
+ "chalet",
+ "chapel",
+ "chateau",
+ "church",
+ "cinema",
+ "cottage",
+ "crypt",
+ "depot",
+ "dome",
+ "dormitory",
+ "duplex",
+ "embassy",
+ "factory",
+ "fort",
+ "fortress",
+ "foundry",
+ "gallery",
+ "garage",
+ "gazebo",
+ "hall",
+ "hangar",
+ "hospital",
+ "hostel",
+ "hotel",
+ "jail",
+ "kiosk",
+ "laboratory",
+ "library",
+ "lighthouse",
+ "lodge",
+ "mall",
+ "manor",
+ "marina",
+ "market",
+ "mill",
+ "monastery",
+ "monument",
+ "mosque",
+ "motel",
+ "museum",
+ "observatory",
+ "pagoda",
+ "palace",
+ "pavilion",
+ "plant",
+ "prison",
+ "rectory",
+ "refinery",
+ "restaurant",
+ "school",
+ "shed",
+ "shrine",
+ "silo",
+ "skyscraper",
+ "spire",
+ "stable",
+ "stadium",
+ "station",
+ "store",
+ "temple",
+ "terminal",
+ "theater",
+ "tower",
+ "triplex",
+ "university",
+ "vault",
+ "amberjack",
+ "anchovy",
+ "angler",
+ "ayu",
+ "barbel",
+ "barracuda",
+ "bass",
+ "betta",
+ "blowfish",
+ "bocaccio",
+ "burbot",
+ "carp",
+ "cobbler",
+ "cod",
+ "eel",
+ "flounder",
+ "grouper",
+ "haddock",
+ "halibut",
+ "herring",
+ "mackerel",
+ "marlin",
+ "mullet",
+ "perch",
+ "pollock",
+ "salmon",
+ "sardine",
+ "scallop",
+ "shark",
+ "snapper",
+ "sole",
+ "tilapia",
+ "trout",
+ "tuna",
+ "acorn",
+ "alfalfa",
+ "bamboo",
+ "bark",
+ "bean",
+ "berry",
+ "blade",
+ "brush",
+ "bud",
+ "bulb",
+ "bush",
+ "cactus",
+ "clover",
+ "cork",
+ "corolla",
+ "fern",
+ "flora",
+ "flower",
+ "forest",
+ "fruit",
+ "garden",
+ "grain",
+ "grass",
+ "grove",
+ "herb",
+ "ivy",
+ "jungle",
+ "juniper",
+ "kelp",
+ "kudzu",
+ "leaf",
+ "lily",
+ "moss",
+ "nectar",
+ "nut",
+ "palm",
+ "petal",
+ "pollen",
+ "resin",
+ "root",
+ "sage",
+ "sap",
+ "seed",
+ "shrub",
+ "spore",
+ "stalk",
+ "spine",
+ "sprout",
+ "stem",
+ "thorn",
+ "tree",
+ "trunk",
+ "twig",
+ "vine",
+ "weed",
+ "wood",
+ "aroma",
+ "bagel",
+ "batter",
+ "beans",
+ "beer",
+ "biscuit",
+ "bread",
+ "broth",
+ "burger",
+ "butter",
+ "cake",
+ "candy",
+ "caramel",
+ "caviar",
+ "cheese",
+ "chili",
+ "chocolate",
+ "cider",
+ "cocoa",
+ "coffee",
+ "cookie",
+ "cream",
+ "croissant",
+ "crumble",
+ "cuisine",
+ "curd",
+ "dessert",
+ "dish",
+ "drink",
+ "eggs",
+ "entree",
+ "filet",
+ "fish",
+ "flour",
+ "food",
+ "glaze",
+ "grill",
+ "hamburger",
+ "ice",
+ "juice",
+ "ketchup",
+ "kitchen",
+ "lard",
+ "margarine",
+ "marinade",
+ "mayo",
+ "mayonnaise",
+ "meat",
+ "milk",
+ "mousse",
+ "muffin",
+ "mushroom",
+ "noodle",
+ "nut",
+ "oil",
+ "olive",
+ "omelette",
+ "pan",
+ "pasta",
+ "paste",
+ "pastry",
+ "pie",
+ "pizza",
+ "plate",
+ "pot",
+ "poutine",
+ "pudding",
+ "raclette",
+ "recipe",
+ "rice",
+ "salad",
+ "salsa",
+ "sandwich",
+ "sauce",
+ "seasoning",
+ "skillet",
+ "soda",
+ "soup",
+ "soy",
+ "spice",
+ "steak",
+ "stew",
+ "syrup",
+ "tartar",
+ "taste",
+ "tea",
+ "toast",
+ "vinegar",
+ "waffle",
+ "water",
+ "wheat",
+ "wine",
+ "wok",
+ "yeast",
+ "yogurt",
+ "account",
+ "accrual",
+ "actuary",
+ "annuity",
+ "appreciation",
+ "asset",
+ "auditor",
+ "balance",
+ "basis",
+ "bond",
+ "book",
+ "budget",
+ "buyout",
+ "callable",
+ "capital",
+ "cash",
+ "change",
+ "collateral",
+ "contingency",
+ "contract",
+ "cost",
+ "cycle",
+ "debt",
+ "dividend",
+ "expenditure",
+ "expense",
+ "flow",
+ "gain",
+ "interest",
+ "inventory",
+ "lease",
+ "ledger",
+ "liability",
+ "loan",
+ "paper",
+ "plan",
+ "price",
+ "report",
+ "shares",
+ "statement",
+ "stock",
+ "trust",
+ "arcade",
+ "arch",
+ "archway",
+ "balcony",
+ "baluster",
+ "balustrade",
+ "belvedere",
+ "brace",
+ "bracket",
+ "colonnade",
+ "column",
+ "cornice",
+ "courtyard",
+ "cupola",
+ "facade",
+ "frieze",
+ "gallerie",
+ "molding",
+ "panel",
+ "parapet",
+ "patio",
+ "pavilion",
+ "pediment",
+ "pergola",
+ "pilaster",
+ "portico",
+ "projection",
+ "roundel",
+ "setback",
+ "spire",
+ "terrace",
+ "tower",
+ "truss",
+ "turret",
+ "veranda",
+ "brightness",
+ "conduction",
+ "convection",
+ "core",
+ "density",
+ "dust",
+ "electron",
+ "energy",
+ "envelope",
+ "flux",
+ "fusion",
+ "gravity",
+ "hadron",
+ "halo",
+ "lepton",
+ "luminosity",
+ "magnitude",
+ "neutrino",
+ "neutron",
+ "nucleus",
+ "omega",
+ "opacity",
+ "parallax",
+ "photometry",
+ "photon",
+ "proton",
+ "pulsar",
+ "quasar",
+ "radian",
+ "radius",
+ "redshift",
+ "relativity",
+ "singularity",
+ "supernova",
+ "cabriolet",
+ "car",
+ "convertible",
+ "coupe",
+ "dragster",
+ "hatchback",
+ "hearse",
+ "hotrod",
+ "humvee",
+ "hybrid",
+ "jeep",
+ "landaulet",
+ "limo",
+ "limousine",
+ "minivan",
+ "roadster",
+ "sedan",
+ "subcompact",
+ "suv",
+ "taxi",
+ "truck",
+ "van",
+ "wagon",
+ "acre",
+ "adapter",
+ "adhesive",
+ "aerator",
+ "aggregate",
+ "airway",
+ "ampere",
+ "apron",
+ "arbor",
+ "asphalt",
+ "balustrade",
+ "beam",
+ "berm",
+ "bevel",
+ "biscuit",
+ "blend",
+ "board",
+ "bow",
+ "bracket",
+ "brad",
+ "breezeway",
+ "buck",
+ "bulldozer",
+ "burl",
+ "cabinet",
+ "cap",
+ "casing",
+ "caulk",
+ "cellulose",
+ "cement",
+ "centerline",
+ "chamfer",
+ "circuit",
+ "clearance",
+ "column",
+ "concrete",
+ "condensation",
+ "conduit",
+ "core",
+ "cornice",
+ "course",
+ "cricket",
+ "damper",
+ "darby",
+ "datum",
+ "detail",
+ "dowel",
+ "drip",
+ "drywall",
+ "easement",
+ "eaves",
+ "elbow",
+ "enamel",
+ "fall",
+ "fascia",
+ "faucet",
+ "filler",
+ "firestop",
+ "fitting",
+ "fixture",
+ "flagstone",
+ "flashing",
+ "flitch",
+ "flue",
+ "footing",
+ "frame",
+ "fuse",
+ "gable",
+ "gauge",
+ "girder",
+ "glazing",
+ "gloss",
+ "grade",
+ "grain",
+ "granite",
+ "gravel",
+ "groove",
+ "grout",
+ "gum",
+ "gusset",
+ "hearth",
+ "heel",
+ "hip",
+ "inlay",
+ "insulation",
+ "jamb",
+ "jig",
+ "jigsaw",
+ "joint",
+ "joist",
+ "kerf",
+ "knot",
+ "lacquer",
+ "laminate",
+ "landing",
+ "lath",
+ "layout",
+ "level",
+ "light",
+ "limestone",
+ "lintel",
+ "louver",
+ "lumber",
+ "lumen",
+ "mantel",
+ "marble",
+ "mason",
+ "mastic",
+ "miter",
+ "molding",
+ "mullion",
+ "muntin",
+ "nailer",
+ "newel",
+ "nosing",
+ "notch",
+ "offset",
+ "paint",
+ "panel",
+ "partition",
+ "patio",
+ "pedestal",
+ "penny",
+ "pergola",
+ "pier",
+ "pigment",
+ "pilaster",
+ "piles",
+ "pitch",
+ "plank",
+ "plaster",
+ "plate",
+ "ply",
+ "plywood",
+ "porch",
+ "post",
+ "preservative",
+ "primer",
+ "pumice",
+ "purlin",
+ "putty",
+ "radial",
+ "radon",
+ "rafter",
+ "rake",
+ "rasp",
+ "ravvet",
+ "resin",
+ "reveal",
+ "ribbon",
+ "ridge",
+ "rise",
+ "riser",
+ "roof",
+ "rosin",
+ "rout",
+ "router",
+ "rubble",
+ "run",
+ "runoff",
+ "saddle",
+ "sanding",
+ "sandstone",
+ "sap",
+ "sapwood",
+ "sash",
+ "scaffold",
+ "scarfing",
+ "screed",
+ "sealer",
+ "section",
+ "setback",
+ "shim",
+ "siding",
+ "sill",
+ "slab",
+ "slate",
+ "sleeper",
+ "slope",
+ "soil",
+ "soldier",
+ "solvent",
+ "span",
+ "spline",
+ "square",
+ "stain",
+ "story",
+ "strata",
+ "stucco",
+ "stud",
+ "subdivision",
+ "subfloor",
+ "sump",
+ "survey",
+ "swale",
+ "taper",
+ "taping",
+ "template",
+ "thinner",
+ "threshold",
+ "tint",
+ "title",
+ "tongue",
+ "transom",
+ "trap",
+ "tread",
+ "trellis",
+ "trim",
+ "truss",
+ "turpentine",
+ "valance",
+ "valley",
+ "canity",
+ "varnish",
+ "vehicle",
+ "veneer",
+ "volt",
+ "warp",
+ "watt",
+ "wattage",
+ "wax",
+ "banquette",
+ "bench",
+ "chair",
+ "chaise",
+ "couch",
+ "futon",
+ "loveseat",
+ "ottoman",
+ "pouf",
+ "sectional",
+ "settee",
+ "sofa",
+ "stool",
+ "altitude",
+ "archipelago",
+ "area",
+ "atlas",
+ "atoll",
+ "azimuth",
+ "bay",
+ "border",
+ "butte",
+ "canal",
+ "canyon",
+ "cape",
+ "capital",
+ "cave",
+ "channel",
+ "chart",
+ "city",
+ "cliff",
+ "compass",
+ "continent",
+ "contour",
+ "country",
+ "cove",
+ "degree",
+ "delta",
+ "desert",
+ "dune",
+ "east",
+ "elevation",
+ "equator",
+ "estuary",
+ "fjord",
+ "geyser",
+ "glacier",
+ "globe",
+ "gulf",
+ "hill",
+ "island",
+ "key",
+ "lagoon",
+ "lake",
+ "land",
+ "landform",
+ "latitude",
+ "legend",
+ "longitude",
+ "map",
+ "marsh",
+ "meridian",
+ "mesa",
+ "mountain",
+ "nation",
+ "north",
+ "oasis",
+ "ocean",
+ "parallel",
+ "peak",
+ "peninsula",
+ "plain",
+ "plateau",
+ "pole",
+ "pond",
+ "prairie",
+ "projection",
+ "range",
+ "reef",
+ "region",
+ "reservoir",
+ "river",
+ "scale",
+ "sea",
+ "sound",
+ "source",
+ "south",
+ "strait",
+ "swamp",
+ "tributary",
+ "tropics",
+ "tundra",
+ "valley",
+ "volcano",
+ "waterfall",
+ "west",
+ "wetland",
+ "world"
+]
diff --git a/apps/opik-backend/src/test/java/com/comet/opik/api/resources/v1/internal/UsageResourceTest.java b/apps/opik-backend/src/test/java/com/comet/opik/api/resources/v1/internal/UsageResourceTest.java
new file mode 100644
index 0000000000..78e22c95ec
--- /dev/null
+++ b/apps/opik-backend/src/test/java/com/comet/opik/api/resources/v1/internal/UsageResourceTest.java
@@ -0,0 +1,193 @@
+package com.comet.opik.api.resources.v1.internal;
+
+import com.comet.opik.api.Trace;
+import com.comet.opik.api.TraceCountResponse;
+import com.comet.opik.api.resources.utils.AuthTestUtils;
+import com.comet.opik.api.resources.utils.ClickHouseContainerUtils;
+import com.comet.opik.api.resources.utils.ClientSupportUtils;
+import com.comet.opik.api.resources.utils.MigrationUtils;
+import com.comet.opik.api.resources.utils.MySQLContainerUtils;
+import com.comet.opik.api.resources.utils.RedisContainerUtils;
+import com.comet.opik.api.resources.utils.TestDropwizardAppExtensionUtils;
+import com.comet.opik.api.resources.utils.TestUtils;
+import com.comet.opik.api.resources.utils.WireMockUtils;
+import com.comet.opik.infrastructure.db.TransactionTemplate;
+import com.comet.opik.podam.PodamFactoryUtils;
+import com.redis.testcontainers.RedisContainer;
+import jakarta.ws.rs.client.Entity;
+import jakarta.ws.rs.core.HttpHeaders;
+import jakarta.ws.rs.core.MediaType;
+import org.jdbi.v3.core.Jdbi;
+import org.junit.jupiter.api.AfterAll;
+import org.junit.jupiter.api.BeforeAll;
+import org.junit.jupiter.api.DisplayName;
+import org.junit.jupiter.api.Nested;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.TestInstance;
+import org.junit.jupiter.api.extension.RegisterExtension;
+import org.testcontainers.containers.ClickHouseContainer;
+import org.testcontainers.containers.MySQLContainer;
+import org.testcontainers.junit.jupiter.Testcontainers;
+import reactor.core.publisher.Mono;
+import ru.vyarus.dropwizard.guice.test.ClientSupport;
+import ru.vyarus.dropwizard.guice.test.jupiter.ext.TestDropwizardAppExtension;
+import uk.co.jemos.podam.api.PodamFactory;
+
+import java.sql.SQLException;
+import java.util.UUID;
+
+import static com.comet.opik.api.resources.utils.ClickHouseContainerUtils.DATABASE_NAME;
+import static com.comet.opik.api.resources.utils.MigrationUtils.CLICKHOUSE_CHANGELOG_FILE;
+import static com.comet.opik.domain.ProjectService.DEFAULT_PROJECT;
+import static com.comet.opik.infrastructure.auth.RequestContext.WORKSPACE_HEADER;
+import static org.assertj.core.api.Assertions.assertThat;
+
+@Testcontainers(parallel = true)
+@DisplayName("Usage Resource Test")
+@TestInstance(TestInstance.Lifecycle.PER_CLASS)
+public class UsageResourceTest {
+ public static final String USAGE_RESOURCE_URL_TEMPLATE = "%s/v1/internal/usage";
+ public static final String TRACE_RESOURCE_URL_TEMPLATE = "%s/v1/private/traces";
+
+ private static final String USER = UUID.randomUUID().toString();
+
+ private static final RedisContainer REDIS = RedisContainerUtils.newRedisContainer();
+
+ private static final MySQLContainer<?> MYSQL_CONTAINER = MySQLContainerUtils.newMySQLContainer();
+
+ private static final ClickHouseContainer CLICK_HOUSE_CONTAINER = ClickHouseContainerUtils.newClickHouseContainer();
+
+ @RegisterExtension
+ private static final TestDropwizardAppExtension app;
+
+ private static final WireMockUtils.WireMockRuntime wireMock;
+
+ static {
+ MYSQL_CONTAINER.start();
+ CLICK_HOUSE_CONTAINER.start();
+ REDIS.start();
+
+ wireMock = WireMockUtils.startWireMock();
+
+ var databaseAnalyticsFactory = ClickHouseContainerUtils.newDatabaseAnalyticsFactory(
+ CLICK_HOUSE_CONTAINER, DATABASE_NAME);
+
+ app = TestDropwizardAppExtensionUtils.newTestDropwizardAppExtension(
+ MYSQL_CONTAINER.getJdbcUrl(), databaseAnalyticsFactory, wireMock.runtimeInfo(), REDIS.getRedisURI());
+ }
+
+ private final PodamFactory factory = PodamFactoryUtils.newPodamFactory();
+
+ private String baseURI;
+ private ClientSupport client;
+ private TransactionTemplate template;
+
+ @BeforeAll
+ void setUpAll(ClientSupport client, Jdbi jdbi, TransactionTemplate template) throws SQLException {
+
+ MigrationUtils.runDbMigration(jdbi, MySQLContainerUtils.migrationParameters());
+
+ try (var connection = CLICK_HOUSE_CONTAINER.createConnection("")) {
+ MigrationUtils.runDbMigration(connection, CLICKHOUSE_CHANGELOG_FILE,
+ ClickHouseContainerUtils.migrationParameters());
+ }
+
+ this.baseURI = "http://localhost:%d".formatted(client.getPort());
+ this.client = client;
+ this.template = template;
+
+ ClientSupportUtils.config(client);
+ }
+
+ @AfterAll
+ void tearDownAll() {
+ wireMock.server().stop();
+ }
+
+ private static void mockTargetWorkspace(String apiKey, String workspaceName, String workspaceId) {
+ AuthTestUtils.mockTargetWorkspace(wireMock.server(), apiKey, workspaceName, workspaceId, USER);
+ }
+
+ @Nested
+ @DisplayName("Opik usage:")
+ @TestInstance(TestInstance.Lifecycle.PER_CLASS)
+ class Usage {
+
+ private final String okApikey = UUID.randomUUID().toString();
+
+ @Test
+ @DisplayName("Get traces count on previous day for all workspaces, no Auth")
+ void tracesCountForWorkspace() {
+ // Setup mock workspace with traces
+ var workspaceName = UUID.randomUUID().toString();
+ var workspaceId = UUID.randomUUID().toString();
+ int tracesCount = setupTracesForWorkspace(workspaceName, workspaceId, okApikey);
+
+ // Change created_at to the previous day in order to capture those traces in count query, since for Stripe we need to count it daily for yesterday
+ String updateCreatedAt = "ALTER TABLE traces UPDATE created_at = subtractDays(created_at, 1) WHERE workspace_id=:workspace_id;";
+ template.nonTransaction(connection -> {
+ var statement = connection.createStatement(updateCreatedAt)
+ .bind("workspace_id", workspaceId);
+ return Mono.from(statement.execute());
+ }).block();
+
+        // Set up a second workspace with traces, but leave created_at set to today so its traces are excluded from the count
+ var workspaceNameForToday = UUID.randomUUID().toString();
+ var workspaceIdForToday = UUID.randomUUID().toString();
+ setupTracesForWorkspace(workspaceNameForToday, workspaceIdForToday, okApikey);
+
+ try (var actualResponse = client.target(USAGE_RESOURCE_URL_TEMPLATE.formatted(baseURI))
+ .path("/workspace-trace-counts")
+ .request()
+ .header(HttpHeaders.AUTHORIZATION, okApikey)
+ .header(WORKSPACE_HEADER, workspaceName)
+ .get()) {
+
+ assertThat(actualResponse.getStatusInfo().getStatusCode()).isEqualTo(200);
+ assertThat(actualResponse.hasEntity()).isTrue();
+
+ var response = actualResponse.readEntity(TraceCountResponse.class);
+ assertThat(response.workspacesTracesCount().size()).isEqualTo(1);
+ assertThat(response.workspacesTracesCount().get(0))
+ .isEqualTo(new TraceCountResponse.WorkspaceTraceCount(workspaceId, tracesCount));
+ }
+ }
+ }
+
+ private int setupTracesForWorkspace(String workspaceName, String workspaceId, String okApikey) {
+ mockTargetWorkspace(okApikey, workspaceName, workspaceId);
+
+ var traces = PodamFactoryUtils.manufacturePojoList(factory, Trace.class)
+ .stream()
+ .map(t -> t.toBuilder()
+ .projectId(null)
+ .projectName(DEFAULT_PROJECT)
+ .feedbackScores(null)
+ .build())
+ .toList();
+
+ traces.forEach(trace -> createTrace(trace, okApikey, workspaceName));
+
+ return traces.size();
+ }
+
+ private UUID createTrace(Trace trace, String apiKey, String workspaceName) {
+ try (var actualResponse = client.target(TRACE_RESOURCE_URL_TEMPLATE.formatted(baseURI))
+ .request()
+ .accept(MediaType.APPLICATION_JSON_TYPE)
+ .header(HttpHeaders.AUTHORIZATION, apiKey)
+ .header(WORKSPACE_HEADER, workspaceName)
+ .post(Entity.json(trace))) {
+
+ assertThat(actualResponse.getStatusInfo().getStatusCode()).isEqualTo(201);
+ assertThat(actualResponse.hasEntity()).isFalse();
+
+ var actualId = TestUtils.getIdFromLocation(actualResponse.getLocation());
+
+ if (trace.id() != null) {
+ assertThat(actualId).isEqualTo(trace.id());
+ }
+ return actualId;
+ }
+ }
+}
diff --git a/apps/opik-backend/src/test/java/com/comet/opik/api/resources/v1/priv/ExperimentsResourceTest.java b/apps/opik-backend/src/test/java/com/comet/opik/api/resources/v1/priv/ExperimentsResourceTest.java
index 2beeb2df2b..3b4e852e8b 100644
--- a/apps/opik-backend/src/test/java/com/comet/opik/api/resources/v1/priv/ExperimentsResourceTest.java
+++ b/apps/opik-backend/src/test/java/com/comet/opik/api/resources/v1/priv/ExperimentsResourceTest.java
@@ -30,6 +30,7 @@
import jakarta.ws.rs.client.Entity;
import jakarta.ws.rs.core.HttpHeaders;
import org.apache.commons.lang3.RandomStringUtils;
+import org.apache.commons.lang3.StringUtils;
import org.assertj.core.api.recursive.comparison.RecursiveComparisonConfiguration;
import org.jdbi.v3.core.Jdbi;
import org.jetbrains.annotations.NotNull;
@@ -44,6 +45,8 @@
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.MethodSource;
+import org.junit.jupiter.params.provider.NullAndEmptySource;
+import org.junit.jupiter.params.provider.ValueSource;
import org.testcontainers.containers.ClickHouseContainer;
import org.testcontainers.containers.MySQLContainer;
import org.testcontainers.junit.jupiter.Testcontainers;
@@ -90,7 +93,7 @@ class ExperimentsResourceTest {
private static final String API_KEY = UUID.randomUUID().toString();
private static final String[] EXPERIMENT_IGNORED_FIELDS = new String[]{
- "id", "datasetId", "feedbackScores", "traceCount", "createdAt", "lastUpdatedAt", "createdBy",
+ "id", "datasetId", "name", "feedbackScores", "traceCount", "createdAt", "lastUpdatedAt", "createdBy",
"lastUpdatedBy"};
public static final String[] IGNORED_FIELDS = {"input", "output", "feedbackScores", "createdAt", "lastUpdatedAt",
"createdBy", "lastUpdatedBy"};
@@ -1246,11 +1249,14 @@ void createAndGetFeedbackAvg() {
.isEqualTo(expectedScores);
}
- @Test
- void createWithoutOptionalFieldsAndGet() {
+ @ParameterizedTest
+ @NullAndEmptySource
+ @ValueSource(strings = {" "})
+ void createWithoutOptionalFieldsAndGet(String name) {
var expectedExperiment = podamFactory.manufacturePojo(Experiment.class)
.toBuilder()
.id(null)
+ .name(name)
.metadata(null)
.build();
var expectedId = createAndAssert(expectedExperiment, API_KEY, TEST_WORKSPACE);
@@ -1592,6 +1598,11 @@ private void assertIgnoredFields(
} else {
assertThat(actualExperiment.datasetId()).isNotNull();
}
+ if (StringUtils.isNotBlank(expectedExperiment.name())) {
+ assertThat(actualExperiment.name()).isEqualTo(expectedExperiment.name());
+ } else {
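+            // Blank or missing names are replaced server-side with a generated name of the form word_word_number, e.g. "amazing_turtle_123"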
+ assertThat(actualExperiment.name()).matches("[a-zA-Z]+_[a-zA-Z]+_\\d+");
+ }
assertThat(actualExperiment.traceCount()).isNotNull();
assertThat(actualExperiment.createdAt()).isAfter(expectedExperiment.createdAt());
diff --git a/apps/opik-backend/src/test/java/com/comet/opik/api/resources/v1/priv/TracesResourceTest.java b/apps/opik-backend/src/test/java/com/comet/opik/api/resources/v1/priv/TracesResourceTest.java
index 8bdf9fb926..4f822c9c9a 100644
--- a/apps/opik-backend/src/test/java/com/comet/opik/api/resources/v1/priv/TracesResourceTest.java
+++ b/apps/opik-backend/src/test/java/com/comet/opik/api/resources/v1/priv/TracesResourceTest.java
@@ -338,18 +338,7 @@ void get__whenApiKeyIsPresent__thenReturnProperResponse(String apiKey, boolean e
var workspaceName = UUID.randomUUID().toString();
var workspaceId = UUID.randomUUID().toString();
- mockTargetWorkspace(okApikey, workspaceName, workspaceId);
-
- var traces = PodamFactoryUtils.manufacturePojoList(factory, Trace.class)
- .stream()
- .map(t -> t.toBuilder()
- .projectId(null)
- .projectName(DEFAULT_PROJECT)
- .feedbackScores(null)
- .build())
- .toList();
-
- traces.forEach(trace -> create(trace, okApikey, workspaceName));
+ int tracesCount = setupTracesForWorkspace(workspaceName, workspaceId, okApikey);
try (var actualResponse = client.target(URL_TEMPLATE.formatted(baseURI))
.queryParam("project_name", DEFAULT_PROJECT)
@@ -363,7 +352,7 @@ void get__whenApiKeyIsPresent__thenReturnProperResponse(String apiKey, boolean e
assertThat(actualResponse.hasEntity()).isTrue();
var response = actualResponse.readEntity(Trace.TracePage.class);
- assertThat(response.content()).hasSize(traces.size());
+ assertThat(response.content()).hasSize(tracesCount);
} else {
assertThat(actualResponse.getStatusInfo().getStatusCode()).isEqualTo(401);
assertThat(actualResponse.readEntity(io.dropwizard.jersey.errors.ErrorMessage.class))
@@ -482,7 +471,6 @@ void feedbackBatch__whenApiKeyIsPresent__thenReturnProperResponse(String apiKey,
}
}
-
}
@Nested
@@ -4695,4 +4683,21 @@ private void assertEqualsForScores(List expected, List actual) {
.ignoringCollectionOrder()
.isEqualTo(expected);
}
+
+ private int setupTracesForWorkspace(String workspaceName, String workspaceId, String okApikey) {
+ mockTargetWorkspace(okApikey, workspaceName, workspaceId);
+
+ var traces = PodamFactoryUtils.manufacturePojoList(factory, Trace.class)
+ .stream()
+ .map(t -> t.toBuilder()
+ .projectId(null)
+ .projectName(DEFAULT_PROJECT)
+ .feedbackScores(null)
+ .build())
+ .toList();
+
+ traces.forEach(trace -> TracesResourceTest.this.create(trace, okApikey, workspaceName));
+
+ return traces.size();
+ }
}
diff --git a/apps/opik-backend/src/test/java/com/comet/opik/infrastructure/ratelimit/RateLimitE2ETest.java b/apps/opik-backend/src/test/java/com/comet/opik/infrastructure/ratelimit/RateLimitE2ETest.java
index aa10a8b2bf..08c7cd98ce 100644
--- a/apps/opik-backend/src/test/java/com/comet/opik/infrastructure/ratelimit/RateLimitE2ETest.java
+++ b/apps/opik-backend/src/test/java/com/comet/opik/infrastructure/ratelimit/RateLimitE2ETest.java
@@ -29,6 +29,7 @@
import jakarta.ws.rs.core.HttpHeaders;
import jakarta.ws.rs.core.MediaType;
import jakarta.ws.rs.core.Response;
+import org.apache.hc.core5.http.HttpStatus;
import org.jdbi.v3.core.Jdbi;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.BeforeAll;
@@ -158,8 +159,8 @@ void rateLimit__whenUsingApiKeyAndLimitIsExceeded__shouldBlockRemainingCalls() {
Map responseMap = triggerCallsWithApiKey(LIMIT * 2, projectName, apiKey, workspaceName);
- assertEquals(LIMIT, responseMap.get(429));
- assertEquals(LIMIT, responseMap.get(201));
+ assertEquals(LIMIT, responseMap.get(HttpStatus.SC_TOO_MANY_REQUESTS));
+ assertEquals(LIMIT, responseMap.get(HttpStatus.SC_CREATED));
try (var response = client.target(BASE_RESOURCE_URI.formatted(baseURI))
.queryParam("project_name", projectName)
@@ -171,7 +172,7 @@ void rateLimit__whenUsingApiKeyAndLimitIsExceeded__shouldBlockRemainingCalls() {
.get()) {
// Verify that traces created are equal to the limit
- assertEquals(200, response.getStatus());
+ assertEquals(HttpStatus.SC_OK, response.getStatus());
TracePage page = response.readEntity(TracePage.class);
assertEquals(LIMIT, page.content().size());
@@ -196,13 +197,13 @@ void rateLimit__whenUsingApiKeyAndLimitIsNotExceededGivenDuration__thenAllowAllC
Map responseMap = triggerCallsWithApiKey(LIMIT, projectName, apiKey, workspaceName);
- assertEquals(LIMIT, responseMap.get(201));
+ assertEquals(LIMIT, responseMap.get(HttpStatus.SC_CREATED));
SingleDelay.timer(LIMIT_DURATION_IN_SECONDS, TimeUnit.SECONDS).blockingGet();
responseMap = triggerCallsWithApiKey(LIMIT, projectName, apiKey, workspaceName);
- assertEquals(LIMIT, responseMap.get(201));
+ assertEquals(LIMIT, responseMap.get(HttpStatus.SC_CREATED));
try (var response = client.target(BASE_RESOURCE_URI.formatted(baseURI))
.queryParam("project_name", projectName)
@@ -213,7 +214,7 @@ void rateLimit__whenUsingApiKeyAndLimitIsNotExceededGivenDuration__thenAllowAllC
.header(WORKSPACE_HEADER, workspaceName)
.get()) {
- assertEquals(200, response.getStatus());
+ assertEquals(HttpStatus.SC_OK, response.getStatus());
TracePage page = response.readEntity(TracePage.class);
assertEquals(LIMIT * 2, page.content().size());
@@ -238,8 +239,8 @@ void rateLimit__whenUsingSessionTokenAndLimitIsExceeded__shouldBlockRemainingCal
Map responseMap = triggerCallsWithCookie(LIMIT * 2, projectName, sessionToken, workspaceName);
- assertEquals(LIMIT, responseMap.get(429));
- assertEquals(LIMIT, responseMap.get(201));
+ assertEquals(LIMIT, responseMap.get(HttpStatus.SC_TOO_MANY_REQUESTS));
+ assertEquals(LIMIT, responseMap.get(HttpStatus.SC_CREATED));
try (var response = client.target(BASE_RESOURCE_URI.formatted(baseURI))
.queryParam("project_name", projectName)
@@ -250,7 +251,7 @@ void rateLimit__whenUsingSessionTokenAndLimitIsExceeded__shouldBlockRemainingCal
.header(WORKSPACE_HEADER, workspaceName)
.get()) {
- assertEquals(200, response.getStatus());
+ assertEquals(HttpStatus.SC_OK, response.getStatus());
TracePage page = response.readEntity(TracePage.class);
assertEquals(LIMIT, page.content().size());
@@ -275,13 +276,13 @@ void rateLimit__whenUsingSessionTokenAndLimitIsNotExceededGivenDuration__thenAll
Map responseMap = triggerCallsWithCookie(LIMIT, projectName, sessionToken, workspaceName);
- assertEquals(LIMIT, responseMap.get(201));
+ assertEquals(LIMIT, responseMap.get(HttpStatus.SC_CREATED));
SingleDelay.timer(LIMIT_DURATION_IN_SECONDS, TimeUnit.SECONDS).blockingGet();
responseMap = triggerCallsWithCookie(LIMIT, projectName, sessionToken, workspaceName);
- assertEquals(LIMIT, responseMap.get(201));
+ assertEquals(LIMIT, responseMap.get(HttpStatus.SC_CREATED));
try (var response = client.target(BASE_RESOURCE_URI.formatted(baseURI))
.queryParam("project_name", projectName)
@@ -293,7 +294,7 @@ void rateLimit__whenUsingSessionTokenAndLimitIsNotExceededGivenDuration__thenAll
.get()) {
// Verify that traces created are equal to the limit
- assertEquals(200, response.getStatus());
+ assertEquals(HttpStatus.SC_OK, response.getStatus());
TracePage page = response.readEntity(TracePage.class);
assertEquals(LIMIT * 2, page.content().size());
@@ -318,7 +319,7 @@ void rateLimit__whenRemainingLimitIsLessThanRequestedSize__thenRejectTheRequest(
Map responseMap = triggerCallsWithApiKey(1, projectName, apiKey, workspaceName);
- assertEquals(1, responseMap.get(201));
+ assertEquals(1, responseMap.get(HttpStatus.SC_CREATED));
List traces = IntStream.range(0, (int) LIMIT)
.mapToObj(i -> factory.manufacturePojo(Trace.class).toBuilder()
@@ -335,7 +336,7 @@ void rateLimit__whenRemainingLimitIsLessThanRequestedSize__thenRejectTheRequest(
.header(WORKSPACE_HEADER, workspaceName)
.post(Entity.json(new TraceBatch(traces)))) {
- assertEquals(429, response.getStatus());
+ assertEquals(HttpStatus.SC_TOO_MANY_REQUESTS, response.getStatus());
var error = response.readEntity(ErrorMessage.class);
assertEquals("Too Many Requests", error.getMessage());
}
@@ -356,7 +357,7 @@ void rateLimit__whenAfterRejectRequestDueToBatchSize__thenAcceptTheRequestWithRe
Map responseMap = triggerCallsWithApiKey(1, projectName, apiKey, workspaceName);
- assertEquals(1, responseMap.get(201));
+ assertEquals(1, responseMap.get(HttpStatus.SC_CREATED));
List traces = IntStream.range(0, (int) LIMIT)
.mapToObj(i -> factory.manufacturePojo(Trace.class).toBuilder()
@@ -373,7 +374,7 @@ void rateLimit__whenAfterRejectRequestDueToBatchSize__thenAcceptTheRequestWithRe
.header(WORKSPACE_HEADER, workspaceName)
.post(Entity.json(new TraceBatch(traces)))) {
- assertEquals(429, response.getStatus());
+ assertEquals(HttpStatus.SC_TOO_MANY_REQUESTS, response.getStatus());
var error = response.readEntity(ErrorMessage.class);
assertEquals("Too Many Requests", error.getMessage());
}
@@ -386,7 +387,7 @@ void rateLimit__whenAfterRejectRequestDueToBatchSize__thenAcceptTheRequestWithRe
.header(WORKSPACE_HEADER, workspaceName)
.post(Entity.json(new TraceBatch(traces.subList(0, (int) LIMIT - 1))))) {
- assertEquals(204, response.getStatus());
+ assertEquals(HttpStatus.SC_NO_CONTENT, response.getStatus());
}
}
@@ -414,12 +415,12 @@ void rateLimit__whenBatchEndpointConsumerRemainingLimit__thenRejectNextRequest(
try (var response = request.method(method, Entity.json(batch))) {
- assertEquals(204, response.getStatus());
+ assertEquals(HttpStatus.SC_NO_CONTENT, response.getStatus());
}
try (var response = request.method(method, Entity.json(batch2))) {
- assertEquals(429, response.getStatus());
+ assertEquals(HttpStatus.SC_TOO_MANY_REQUESTS, response.getStatus());
var error = response.readEntity(ErrorMessage.class);
assertEquals("Too Many Requests", error.getMessage());
}
@@ -450,13 +451,13 @@ void rateLimit__whenOperationFailsAfterAcceptingRequest__thenDecrementTheLimit()
.header(WORKSPACE_HEADER, workspaceName)
.post(Entity.json(trace))) {
- assertEquals(201, response.getStatus());
+ assertEquals(HttpStatus.SC_CREATED, response.getStatus());
}
// consumer limit - 2 from the limit leaving 1 remaining
Map responseMap = triggerCallsWithApiKey(LIMIT - 2, projectName, apiKey, workspaceName);
- assertEquals(LIMIT - 2, responseMap.get(201));
+ assertEquals(LIMIT - 2, responseMap.get(HttpStatus.SC_CREATED));
// consume the remaining limit but fail
try (var response = client.target(BASE_RESOURCE_URI.formatted(baseURI))
@@ -466,18 +467,18 @@ void rateLimit__whenOperationFailsAfterAcceptingRequest__thenDecrementTheLimit()
.header(WORKSPACE_HEADER, workspaceName)
.post(Entity.json(trace))) {
- assertEquals(409, response.getStatus());
+ assertEquals(HttpStatus.SC_CONFLICT, response.getStatus());
}
// consume the remaining limit
responseMap = triggerCallsWithApiKey(1, projectName, apiKey, workspaceName);
- assertEquals(1, responseMap.get(201));
+ assertEquals(1, responseMap.get(HttpStatus.SC_CREATED));
// verify that the limit is now 0
responseMap = triggerCallsWithApiKey(1, projectName, apiKey, workspaceName);
- assertEquals(1, responseMap.get(429));
+ assertEquals(1, responseMap.get(HttpStatus.SC_TOO_MANY_REQUESTS));
}
@Test
@@ -506,7 +507,7 @@ void rateLimit__whenProcessingOperations__thenReturnRemainingLimitAsHeader() {
.post(Entity.json(trace))) {
if (i < LIMIT) {
- assertEquals(201, response.getStatus());
+ assertEquals(HttpStatus.SC_CREATED, response.getStatus());
String remainingLimit = response.getHeaderString(RequestContext.USER_REMAINING_LIMIT);
String userLimit = response.getHeaderString(RequestContext.USER_LIMIT);
@@ -516,7 +517,7 @@ void rateLimit__whenProcessingOperations__thenReturnRemainingLimitAsHeader() {
assertEquals(RateLimited.GENERAL_EVENTS, userLimit);
assertThat(Long.parseLong(remainingTtl)).isStrictlyBetween(0L, Duration.ofSeconds(LIMIT_DURATION_IN_SECONDS).toMillis());
} else {
- assertEquals(429, response.getStatus());
+ assertEquals(HttpStatus.SC_TOO_MANY_REQUESTS, response.getStatus());
}
}
});
diff --git a/apps/opik-documentation/documentation/docs/cookbook/evaluate_hallucination_metric.ipynb b/apps/opik-documentation/documentation/docs/cookbook/evaluate_hallucination_metric.ipynb
index 89903f3dd4..d813672e94 100644
--- a/apps/opik-documentation/documentation/docs/cookbook/evaluate_hallucination_metric.ipynb
+++ b/apps/opik-documentation/documentation/docs/cookbook/evaluate_hallucination_metric.ipynb
@@ -15,9 +15,9 @@
"source": [
"## Creating an account on Comet.com\n",
"\n",
- "[Comet](https://www.comet.com/site) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm) and grab you API Key.\n",
+ "[Comet](https://www.comet.com/site?from=llm&utm_source=opik&utm_medium=colab&utm_content=eval_hall) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm&utm_source=opik&utm_medium=colab&utm_content=eval_hall) and grab you API Key.\n",
"\n",
- "> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/) for more information."
+ "> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/?from=llm&utm_source=opik&utm_medium=colab&utm_content=eval_hall) for more information."
]
},
{
@@ -26,30 +26,9 @@
"metadata": {},
"outputs": [],
"source": [
- "import os\n",
- "import getpass\n",
+ "import opik\n",
"\n",
- "if \"OPIK_API_KEY\" not in os.environ:\n",
- " os.environ[\"OPIK_API_KEY\"] = getpass.getpass(\"Opik API Key: \")\n",
- "if \"OPIK_WORKSPACE\" not in os.environ:\n",
- " os.environ[\"OPIK_WORKSPACE\"] = input(\"Comet workspace (often the same as your username): \")"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "If you are running the Opik platform locally, simply set:"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# import os\n",
- "# os.environ[\"OPIK_URL_OVERRIDE\"] = \"http://localhost:5173/api\""
+ "opik.configure(use_local=False)"
]
},
{
diff --git a/apps/opik-documentation/documentation/docs/cookbook/evaluate_hallucination_metric.md b/apps/opik-documentation/documentation/docs/cookbook/evaluate_hallucination_metric.md
index d6e63d3359..8eedcd58ec 100644
--- a/apps/opik-documentation/documentation/docs/cookbook/evaluate_hallucination_metric.md
+++ b/apps/opik-documentation/documentation/docs/cookbook/evaluate_hallucination_metric.md
@@ -10,21 +10,9 @@ For this guide we will be evaluating the Hallucination metric included in the LL
```python
-import os
-import getpass
-
-if "OPIK_API_KEY" not in os.environ:
- os.environ["OPIK_API_KEY"] = getpass.getpass("Opik API Key: ")
-if "OPIK_WORKSPACE" not in os.environ:
- os.environ["OPIK_WORKSPACE"] = input("Comet workspace (often the same as your username): ")
-```
-
-If you are running the Opik platform locally, simply set:
-
+import opik
-```python
-# import os
-# os.environ["OPIK_URL_OVERRIDE"] = "http://localhost:5173/api"
+opik.configure(use_local=False)
```
## Preparing our environment
diff --git a/apps/opik-documentation/documentation/docs/cookbook/evaluate_moderation_metric.ipynb b/apps/opik-documentation/documentation/docs/cookbook/evaluate_moderation_metric.ipynb
index 4c63f1de9b..9f32b6ed4b 100644
--- a/apps/opik-documentation/documentation/docs/cookbook/evaluate_moderation_metric.ipynb
+++ b/apps/opik-documentation/documentation/docs/cookbook/evaluate_moderation_metric.ipynb
@@ -15,9 +15,9 @@
"source": [
"## Creating an account on Comet.com\n",
"\n",
- "[Comet](https://www.comet.com/site) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm) and grab you API Key.\n",
+ "[Comet](https://www.comet.com/site?from=llm&utm_source=opik&utm_medium=colab&utm_content=eval_mod) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm&utm_source=opik&utm_medium=colab&utm_content=eval_mod) and grab you API Key.\n",
"\n",
- "> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/) for more information."
+ "> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/?from=llm&utm_source=opik&utm_medium=colab&utm_content=eval_mod) for more information."
]
},
{
@@ -26,30 +26,9 @@
"metadata": {},
"outputs": [],
"source": [
- "import os\n",
- "import getpass\n",
+ "import opik\n",
"\n",
- "if \"OPIK_API_KEY\" not in os.environ:\n",
- " os.environ[\"OPIK_API_KEY\"] = getpass.getpass(\"Opik API Key: \")\n",
- "if \"OPIK_WORKSPACE\" not in os.environ:\n",
- " os.environ[\"OPIK_WORKSPACE\"] = input(\"Comet workspace (often the same as your username): \")"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "If you are running the Opik platform locally, simply set:"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "#import os\n",
- "# os.environ[\"OPIK_URL_OVERRIDE\"] = \"http://localhost:5173/api\""
+ "opik.configure(use_local=False)"
]
},
{
diff --git a/apps/opik-documentation/documentation/docs/cookbook/evaluate_moderation_metric.md b/apps/opik-documentation/documentation/docs/cookbook/evaluate_moderation_metric.md
index 495b79e1f6..d1d62eac31 100644
--- a/apps/opik-documentation/documentation/docs/cookbook/evaluate_moderation_metric.md
+++ b/apps/opik-documentation/documentation/docs/cookbook/evaluate_moderation_metric.md
@@ -10,21 +10,9 @@ For this guide we will be evaluating the Moderation metric included in the LLM E
```python
-import os
-import getpass
-
-if "OPIK_API_KEY" not in os.environ:
- os.environ["OPIK_API_KEY"] = getpass.getpass("Opik API Key: ")
-if "OPIK_WORKSPACE" not in os.environ:
- os.environ["OPIK_WORKSPACE"] = input("Comet workspace (often the same as your username): ")
-```
-
-If you are running the Opik platform locally, simply set:
-
+import opik
-```python
-#import os
-# os.environ["OPIK_URL_OVERRIDE"] = "http://localhost:5173/api"
+opik.configure(use_local=False)
```
## Preparing our environment
diff --git a/apps/opik-documentation/documentation/docs/cookbook/langchain.ipynb b/apps/opik-documentation/documentation/docs/cookbook/langchain.ipynb
index 18690d106b..9893168443 100644
--- a/apps/opik-documentation/documentation/docs/cookbook/langchain.ipynb
+++ b/apps/opik-documentation/documentation/docs/cookbook/langchain.ipynb
@@ -21,9 +21,9 @@
"source": [
"## Creating an account on Comet.com\n",
"\n",
- "[Comet](https://www.comet.com/site) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm) and grab you API Key.\n",
+ "[Comet](https://www.comet.com/site?from=llm&utm_source=opik&utm_medium=colab&utm_content=langchain) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm&utm_source=opik&utm_medium=colab&utm_content=langchain) and grab you API Key.\n",
"\n",
- "> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/) for more information."
+ "> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/?from=llm&utm_source=opik&utm_medium=colab&utm_content=langchain) for more information."
]
},
{
@@ -32,30 +32,9 @@
"metadata": {},
"outputs": [],
"source": [
- "import os\n",
- "import getpass\n",
+ "import opik\n",
"\n",
- "if \"OPIK_API_KEY\" not in os.environ:\n",
- " os.environ[\"OPIK_API_KEY\"] = getpass.getpass(\"Opik API Key: \")\n",
- "if \"OPIK_WORKSPACE\" not in os.environ:\n",
- " os.environ[\"OPIK_WORKSPACE\"] = input(\"Comet workspace (often the same as your username): \")"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "If you are running the Opik platform locally, simply set:"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# import os\n",
- "# os.environ[\"OPIK_URL_OVERRIDE\"] = \"http://localhost:5173/api\""
+ "opik.configure(use_local=False)"
]
},
{
diff --git a/apps/opik-documentation/documentation/docs/cookbook/langchain.md b/apps/opik-documentation/documentation/docs/cookbook/langchain.md
index 30af79ac90..dd94cf9531 100644
--- a/apps/opik-documentation/documentation/docs/cookbook/langchain.md
+++ b/apps/opik-documentation/documentation/docs/cookbook/langchain.md
@@ -16,21 +16,9 @@ We will highlight three different parts of the workflow:
```python
-import os
-import getpass
-
-if "OPIK_API_KEY" not in os.environ:
- os.environ["OPIK_API_KEY"] = getpass.getpass("Opik API Key: ")
-if "OPIK_WORKSPACE" not in os.environ:
- os.environ["OPIK_WORKSPACE"] = input("Comet workspace (often the same as your username): ")
-```
-
-If you are running the Opik platform locally, simply set:
-
+import opik
-```python
-# import os
-# os.environ["OPIK_URL_OVERRIDE"] = "http://localhost:5173/api"
+opik.configure(use_local=False)
```
## Preparing our environment
diff --git a/apps/opik-documentation/documentation/docs/cookbook/llama-index.ipynb b/apps/opik-documentation/documentation/docs/cookbook/llama-index.ipynb
index 5beba6faa7..e5c023eeff 100644
--- a/apps/opik-documentation/documentation/docs/cookbook/llama-index.ipynb
+++ b/apps/opik-documentation/documentation/docs/cookbook/llama-index.ipynb
@@ -23,9 +23,9 @@
"source": [
"## Creating an account on Comet.com\n",
"\n",
- "[Comet](https://www.comet.com/site) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm) and grab you API Key.\n",
+ "[Comet](https://www.comet.com/site?from=llm&utm_source=opik&utm_medium=colab&utm_content=llamaindex) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm&utm_source=opik&utm_medium=colab&utm_content=llamaindex) and grab you API Key.\n",
"\n",
- "> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/) for more information."
+ "> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/?from=llm&utm_source=opik&utm_medium=colab&utm_content=llamaindex) for more information."
]
},
{
@@ -34,30 +34,9 @@
"metadata": {},
"outputs": [],
"source": [
- "import os\n",
- "import getpass\n",
+ "import opik\n",
"\n",
- "if \"OPIK_API_KEY\" not in os.environ:\n",
- " os.environ[\"OPIK_API_KEY\"] = getpass.getpass(\"Opik API Key: \")\n",
- "if \"OPIK_WORKSPACE\" not in os.environ:\n",
- " os.environ[\"OPIK_WORKSPACE\"] = input(\"Comet workspace (often the same as your username): \")"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "If you are running the Opik platform locally, simply set:"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# import os\n",
- "# os.environ[\"OPIK_URL_OVERRIDE\"] = \"http://localhost:5173/api\""
+ "opik.configure(use_local=False)"
]
},
{
diff --git a/apps/opik-documentation/documentation/docs/cookbook/llama-index.md b/apps/opik-documentation/documentation/docs/cookbook/llama-index.md
index df20f7cb7b..7728a1792e 100644
--- a/apps/opik-documentation/documentation/docs/cookbook/llama-index.md
+++ b/apps/opik-documentation/documentation/docs/cookbook/llama-index.md
@@ -18,21 +18,9 @@ For this guide we will be downloading the essays from Paul Graham and use them a
```python
-import os
-import getpass
-
-if "OPIK_API_KEY" not in os.environ:
- os.environ["OPIK_API_KEY"] = getpass.getpass("Opik API Key: ")
-if "OPIK_WORKSPACE" not in os.environ:
- os.environ["OPIK_WORKSPACE"] = input("Comet workspace (often the same as your username): ")
-```
+import opik
-If you are running the Opik platform locally, simply set:
-
-
-```python
-# import os
-# os.environ["OPIK_URL_OVERRIDE"] = "http://localhost:5173/api"
+opik.configure(use_local=False)
```
## Preparing our environment
diff --git a/apps/opik-documentation/documentation/docs/cookbook/openai.ipynb b/apps/opik-documentation/documentation/docs/cookbook/openai.ipynb
index 0abc5d1c39..86276c488a 100644
--- a/apps/opik-documentation/documentation/docs/cookbook/openai.ipynb
+++ b/apps/opik-documentation/documentation/docs/cookbook/openai.ipynb
@@ -15,9 +15,9 @@
"source": [
"## Creating an account on Comet.com\n",
"\n",
- "[Comet](https://www.comet.com/site) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm) and grab you API Key.\n",
+ "[Comet](https://www.comet.com/site?from=llm&utm_source=opik&utm_medium=colab&utm_content=openai) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm&utm_source=opik&utm_medium=colab&utm_content=openai) and grab you API Key.\n",
"\n",
- "> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/) for more information."
+ "> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/?from=llm&utm_source=opik&utm_medium=colab&utm_content=openai) for more information."
]
},
{
@@ -26,30 +26,9 @@
"metadata": {},
"outputs": [],
"source": [
- "import os\n",
- "import getpass\n",
+ "import opik\n",
"\n",
- "if \"OPIK_API_KEY\" not in os.environ:\n",
- " os.environ[\"OPIK_API_KEY\"] = getpass.getpass(\"Opik API Key: \")\n",
- "if \"OPIK_WORKSPACE\" not in os.environ:\n",
- " os.environ[\"OPIK_WORKSPACE\"] = input(\"Comet workspace (often the same as your username): \")"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "If you are running the Opik platform locally, simply set:"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# import os\n",
- "# os.environ[\"OPIK_URL_OVERRIDE\"] = \"http://localhost:5173/api\""
+ "opik.configure(use_local=False)"
]
},
{
diff --git a/apps/opik-documentation/documentation/docs/cookbook/openai.md b/apps/opik-documentation/documentation/docs/cookbook/openai.md
index b9264db5e3..6eece284a1 100644
--- a/apps/opik-documentation/documentation/docs/cookbook/openai.md
+++ b/apps/opik-documentation/documentation/docs/cookbook/openai.md
@@ -11,21 +11,9 @@ Opik integrates with OpenAI to provide a simple way to log traces for all OpenAI
```python
-import os
-import getpass
-
-if "OPIK_API_KEY" not in os.environ:
- os.environ["OPIK_API_KEY"] = getpass.getpass("Opik API Key: ")
-if "OPIK_WORKSPACE" not in os.environ:
- os.environ["OPIK_WORKSPACE"] = input("Comet workspace (often the same as your username): ")
-```
+import opik
-If you are running the Opik platform locally, simply set:
-
-
-```python
-# import os
-# os.environ["OPIK_URL_OVERRIDE"] = "http://localhost:5173/api"
+opik.configure(use_local=False)
```
## Preparing our environment
diff --git a/apps/opik-documentation/documentation/docs/cookbook/predibase.ipynb b/apps/opik-documentation/documentation/docs/cookbook/predibase.ipynb
new file mode 100644
index 0000000000..9236226ba0
--- /dev/null
+++ b/apps/opik-documentation/documentation/docs/cookbook/predibase.ipynb
@@ -0,0 +1,254 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Using Opik with Predibase\n",
+ "\n",
+ "This notebook demonstrates how to use Predibase as an LLM provider with LangChain, and how to integrate Comet for tracking and logging.\n",
+ "\n",
+ "## Setup\n",
+ "\n",
+ "First, let's install the necessary packages and set up our environment variables."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "%pip install --upgrade --quiet predibase opik"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "We will now configure Opik and Predibase:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Configure Opik\n",
+ "import opik\n",
+ "import os\n",
+ "\n",
+ "opik.configure(use_local=False)\n",
+ "\n",
+ "# Configure predibase\n",
+ "import getpass\n",
+ "os.environ[\"PREDIBASE_API_TOKEN\"] = getpass.getpass(\"Enter your Predibase API token\")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Creating the Opik Tracer\n",
+ "\n",
+ "In order to log traces to Opik, we will be using the OpikTracer from the LangChain integration."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Import Comet tracer\n",
+ "from opik.integrations.langchain import OpikTracer\n",
+ "\n",
+ "# Initialize Comet tracer\n",
+ "opik_tracer = OpikTracer(\n",
+ " tags=[\"predibase\", \"langchain\"],\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Initial Call\n",
+ "\n",
+ "Let's set up our Predibase model and make an initial call."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from langchain_community.llms import Predibase\n",
+ "import os\n",
+ "\n",
+ "model = Predibase(\n",
+ " model=\"mistral-7b\",\n",
+ " predibase_api_key=os.environ.get(\"PREDIBASE_API_TOKEN\"),\n",
+ ")\n",
+ "\n",
+ "# Test the model with Comet tracing\n",
+ "response = model.invoke(\n",
+ " \"Can you recommend me a nice dry wine?\",\n",
+ " config={\n",
+ " \"temperature\": 0.5,\n",
+ " \"max_new_tokens\": 1024,\n",
+ " \"callbacks\": [opik_tracer]\n",
+ " }\n",
+ ")\n",
+ "print(response)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "In addition to passing the OpikTracer to the invoke method, you can also define it during the creation of the `Predibase` object:\n",
+ "\n",
+ "```python\n",
+ "model = Predibase(\n",
+ " model=\"mistral-7b\",\n",
+ " predibase_api_key=os.environ.get(\"PREDIBASE_API_TOKEN\"),\n",
+ ").with_config({\"callbacks\": [opik_tracer]})\n",
+ "```"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## SequentialChain\n",
+ "\n",
+ "Now, let's create a more complex chain and run it with Comet tracing."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from langchain.chains import LLMChain, SimpleSequentialChain\n",
+ "from langchain_core.prompts import PromptTemplate\n",
+ "\n",
+ "# Synopsis chain\n",
+ "template = \"\"\"You are a playwright. Given the title of play, it is your job to write a synopsis for that title.\n",
+ "\n",
+ "Title: {title}\n",
+ "Playwright: This is a synopsis for the above play:\"\"\"\n",
+ "prompt_template = PromptTemplate(input_variables=[\"title\"], template=template)\n",
+ "synopsis_chain = LLMChain(llm=model, prompt=prompt_template)\n",
+ "\n",
+ "# Review chain\n",
+ "template = \"\"\"You are a play critic from the New York Times. Given the synopsis of play, it is your job to write a review for that play.\n",
+ "\n",
+ "Play Synopsis:\n",
+ "{synopsis}\n",
+ "Review from a New York Times play critic of the above play:\"\"\"\n",
+ "prompt_template = PromptTemplate(input_variables=[\"synopsis\"], template=template)\n",
+ "review_chain = LLMChain(llm=model, prompt=prompt_template)\n",
+ "\n",
+ "# Overall chain\n",
+ "overall_chain = SimpleSequentialChain(\n",
+ " chains=[synopsis_chain, review_chain], verbose=True\n",
+ ")\n",
+ "\n",
+ "# Run the chain with Comet tracing\n",
+ "review = overall_chain.run(\"Tragedy at sunset on the beach\", callbacks=[opik_tracer])\n",
+ "print(review)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Accessing Logged Traces\n",
+ "\n",
+ "We can access the trace IDs collected by the Comet tracer."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "traces = opik_tracer.created_traces()\n",
+ "print(\"Collected trace IDs:\", [trace.id for trace in traces])\n",
+ "\n",
+ "# Flush traces to ensure all data is logged\n",
+ "opik_tracer.flush()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Fine-tuned LLM Example\n",
+ "\n",
+ "Finally, let's use a fine-tuned model with Comet tracing.\n",
+ "\n",
+ "**Note:** In order to use a fine-tuned model, you will need to have access to the model and the correct model ID. The code below will return a `NotFoundError` unless the `model` and `adapter_id` are updated."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "fine_tuned_model = Predibase(\n",
+ " model=\"my-base-LLM\",\n",
+ " predibase_api_key=os.environ.get(\"PREDIBASE_API_TOKEN\"),\n",
+ " predibase_sdk_version=None,\n",
+ " adapter_id=\"my-finetuned-adapter-id\",\n",
+ " adapter_version=1,\n",
+ " **{\n",
+ " \"api_token\": os.environ.get(\"HUGGING_FACE_HUB_TOKEN\"),\n",
+ " \"max_new_tokens\": 5,\n",
+ " },\n",
+ ")\n",
+ "\n",
+ "# Configure the Comet tracer\n",
+ "fine_tuned_model = fine_tuned_model.with_config({\"callbacks\": [opik_tracer]})\n",
+ "\n",
+ "# Invode the fine-tuned model\n",
+ "response = fine_tuned_model.invoke(\n",
+ " \"Can you help categorize the following emails into positive, negative, and neutral?\",\n",
+ " **{\"temperature\": 0.5, \"max_new_tokens\": 1024}\n",
+ ")\n",
+ "print(response)\n",
+ "\n",
+ "# Final flush to ensure all traces are logged\n",
+ "opik_tracer.flush()"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "py312_llm_eval",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.12.4"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/apps/opik-documentation/documentation/docs/cookbook/predibase.md b/apps/opik-documentation/documentation/docs/cookbook/predibase.md
new file mode 100644
index 0000000000..c06b117765
--- /dev/null
+++ b/apps/opik-documentation/documentation/docs/cookbook/predibase.md
@@ -0,0 +1,160 @@
+# Using Opik with Predibase
+
+This notebook demonstrates how to use Predibase as an LLM provider with LangChain, and how to integrate Opik for tracking and logging.
+
+## Setup
+
+First, let's install the necessary packages and set up our environment variables.
+
+
+```python
+%pip install --upgrade --quiet predibase opik langchain langchain-community
+```
+
+We will now configure Opik and Predibase:
+
+
+```python
+# Configure Opik
+import opik
+import os
+
+opik.configure(use_local=False)
+
+# Configure predibase
+import getpass
+os.environ["PREDIBASE_API_TOKEN"] = getpass.getpass("Enter your Predibase API token")
+```
+
+## Creating the Opik Tracer
+
+In order to log traces to Opik, we will be using the OpikTracer from the LangChain integration.
+
+
+```python
+# Import Comet tracer
+from opik.integrations.langchain import OpikTracer
+
+# Initialize Comet tracer
+opik_tracer = OpikTracer(
+ tags=["predibase", "langchain"],
+)
+```
+
+## Initial Call
+
+Let's set up our Predibase model and make an initial call.
+
+
+```python
+from langchain_community.llms import Predibase
+import os
+
+model = Predibase(
+ model="mistral-7b",
+ predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
+)
+
+# Test the model with Comet tracing
+response = model.invoke(
+ "Can you recommend me a nice dry wine?",
+ config={
+ "temperature": 0.5,
+ "max_new_tokens": 1024,
+ "callbacks": [opik_tracer]
+ }
+)
+print(response)
+```
+
+In addition to passing the OpikTracer to the invoke method, you can also define it during the creation of the `Predibase` object:
+
+```python
+model = Predibase(
+ model="mistral-7b",
+ predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
+).with_config({"callbacks": [opik_tracer]})
+```
+
+## SequentialChain
+
+Now, let's create a more complex chain and run it with Comet tracing.
+
+
+```python
+from langchain.chains import LLMChain, SimpleSequentialChain
+from langchain_core.prompts import PromptTemplate
+
+# Synopsis chain
+template = """You are a playwright. Given the title of play, it is your job to write a synopsis for that title.
+
+Title: {title}
+Playwright: This is a synopsis for the above play:"""
+prompt_template = PromptTemplate(input_variables=["title"], template=template)
+synopsis_chain = LLMChain(llm=model, prompt=prompt_template)
+
+# Review chain
+template = """You are a play critic from the New York Times. Given the synopsis of play, it is your job to write a review for that play.
+
+Play Synopsis:
+{synopsis}
+Review from a New York Times play critic of the above play:"""
+prompt_template = PromptTemplate(input_variables=["synopsis"], template=template)
+review_chain = LLMChain(llm=model, prompt=prompt_template)
+
+# Overall chain
+overall_chain = SimpleSequentialChain(
+ chains=[synopsis_chain, review_chain], verbose=True
+)
+
+# Run the chain with Comet tracing
+review = overall_chain.run("Tragedy at sunset on the beach", callbacks=[opik_tracer])
+print(review)
+```
+
+## Accessing Logged Traces
+
+We can access the trace IDs collected by the Comet tracer.
+
+
+```python
+traces = opik_tracer.created_traces()
+print("Collected trace IDs:", [trace.id for trace in traces])
+
+# Flush traces to ensure all data is logged
+opik_tracer.flush()
+```
+
+## Fine-tuned LLM Example
+
+Finally, let's use a fine-tuned model with Comet tracing.
+
+**Note:** In order to use a fine-tuned model, you will need to have access to the model and the correct model ID. The code below will return a `NotFoundError` unless the `model` and `adapter_id` are updated.
+
+
+```python
+fine_tuned_model = Predibase(
+ model="my-base-LLM",
+ predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
+ predibase_sdk_version=None,
+ adapter_id="my-finetuned-adapter-id",
+ adapter_version=1,
+ **{
+ "api_token": os.environ.get("HUGGING_FACE_HUB_TOKEN"),
+ "max_new_tokens": 5,
+ },
+)
+
+# Configure the Comet tracer
+fine_tuned_model = fine_tuned_model.with_config({"callbacks": [opik_tracer]})
+
+# Invoke the fine-tuned model
+response = fine_tuned_model.invoke(
+ "Can you help categorize the following emails into positive, negative, and neutral?",
+ **{"temperature": 0.5, "max_new_tokens": 1024}
+)
+print(response)
+
+# Final flush to ensure all traces are logged
+opik_tracer.flush()
+```
diff --git a/apps/opik-documentation/documentation/docs/cookbook/ragas.ipynb b/apps/opik-documentation/documentation/docs/cookbook/ragas.ipynb
index 8a135e32d6..136f9ae905 100644
--- a/apps/opik-documentation/documentation/docs/cookbook/ragas.ipynb
+++ b/apps/opik-documentation/documentation/docs/cookbook/ragas.ipynb
@@ -20,9 +20,9 @@
"source": [
"## Creating an account on Comet.com\n",
"\n",
- "[Comet](https://www.comet.com/site) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm) and grab you API Key.\n",
+ "[Comet](https://www.comet.com/site?from=llm&utm_source=opik&utm_medium=colab&utm_content=openai) provides a hosted version of the Opik platform, [simply create an account](https://www.comet.com/signup?from=llm&utm_source=opik&utm_medium=colab&utm_content=ragas) and grab you API Key.\n",
"\n",
- "> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/) for more information."
+ "> You can also run the Opik platform locally, see the [installation guide](https://www.comet.com/docs/opik/self-host/overview/?from=llm&utm_source=opik&utm_medium=colab&utm_content=ragas) for more information."
]
},
{
@@ -31,30 +31,9 @@
"metadata": {},
"outputs": [],
"source": [
- "import os\n",
- "import getpass\n",
+ "import opik\n",
"\n",
- "if \"OPIK_API_KEY\" not in os.environ:\n",
- " os.environ[\"OPIK_API_KEY\"] = getpass.getpass(\"Opik API Key: \")\n",
- "if \"OPIK_WORKSPACE\" not in os.environ:\n",
- " os.environ[\"OPIK_WORKSPACE\"] = input(\"Comet workspace (often the same as your username): \")"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "If you are running the Opik platform locally, simply set:"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# import os\n",
- "# os.environ[\"OPIK_URL_OVERRIDE\"] = \"http://localhost:5173/api\""
+ "opik.configure(use_local=False)"
]
},
{
@@ -72,7 +51,7 @@
"metadata": {},
"outputs": [],
"source": [
- "%pip install opik ragas --quiet\n",
+ "%pip install --quiet --upgrade opik ragas\n",
"\n",
"import os\n",
"import getpass\n",
@@ -150,16 +129,18 @@
"import asyncio\n",
"from ragas.integrations.opik import OpikTracer\n",
"from ragas.dataset_schema import SingleTurnSample\n",
+ "import os\n",
"\n",
+ "os.environ[\"OPIK_PROJECT_NAME\"] = \"ragas-integration\"\n",
"\n",
"# Define the scoring function\n",
"def compute_metric(metric, row):\n",
" row = SingleTurnSample(**row)\n",
"\n",
- " opik_tracer = OpikTracer()\n",
+ " opik_tracer = OpikTracer(tags=[\"ragas\"])\n",
"\n",
" async def get_score(opik_tracer, metric, row):\n",
- " score = await metric.single_turn_ascore(row, callbacks=[OpikTracer()])\n",
+ " score = await metric.single_turn_ascore(row, callbacks=[opik_tracer])\n",
" return score\n",
"\n",
" # Run the async function using the current event loop\n",
diff --git a/apps/opik-documentation/documentation/docs/cookbook/ragas.md b/apps/opik-documentation/documentation/docs/cookbook/ragas.md
index 748fd850af..bfb2e4e82b 100644
--- a/apps/opik-documentation/documentation/docs/cookbook/ragas.md
+++ b/apps/opik-documentation/documentation/docs/cookbook/ragas.md
@@ -15,21 +15,9 @@ There are two main ways to use Opik with Ragas:
```python
-import os
-import getpass
-
-if "OPIK_API_KEY" not in os.environ:
- os.environ["OPIK_API_KEY"] = getpass.getpass("Opik API Key: ")
-if "OPIK_WORKSPACE" not in os.environ:
- os.environ["OPIK_WORKSPACE"] = input("Comet workspace (often the same as your username): ")
-```
+import opik
-If you are running the Opik platform locally, simply set:
-
-
-```python
-# import os
-# os.environ["OPIK_URL_OVERRIDE"] = "http://localhost:5173/api"
+opik.configure(use_local=False)
```
## Preparing our environment
@@ -38,7 +26,7 @@ First, we will install the necessary libraries and configure the OpenAI API key.
```python
-%pip install opik ragas --quiet
+%pip install --quiet --upgrade opik ragas
import os
import getpass
@@ -94,16 +82,18 @@ nest_asyncio.apply()
import asyncio
from ragas.integrations.opik import OpikTracer
from ragas.dataset_schema import SingleTurnSample
+import os
+os.environ["OPIK_PROJECT_NAME"] = "ragas-integration"
# Define the scoring function
def compute_metric(metric, row):
row = SingleTurnSample(**row)
- opik_tracer = OpikTracer()
+ opik_tracer = OpikTracer(tags=["ragas"])
async def get_score(opik_tracer, metric, row):
- score = await metric.single_turn_ascore(row, callbacks=[OpikTracer()])
+ score = await metric.single_turn_ascore(row, callbacks=[opik_tracer])
return score
# Run the async function using the current event loop
diff --git a/apps/opik-documentation/documentation/docs/quickstart.md b/apps/opik-documentation/documentation/docs/quickstart.mdx
similarity index 68%
rename from apps/opik-documentation/documentation/docs/quickstart.md
rename to apps/opik-documentation/documentation/docs/quickstart.mdx
index 6cc3cb5f89..42a971a4d2 100644
--- a/apps/opik-documentation/documentation/docs/quickstart.md
+++ b/apps/opik-documentation/documentation/docs/quickstart.mdx
@@ -3,6 +3,9 @@ sidebar_position: 2
sidebar_label: Quickstart
---
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
+
# Quickstart
This guide helps you integrate the Opik platform with your existing LLM application.
@@ -19,21 +22,26 @@ pip install opik
and configuring the SDK with:
+<Tabs>
+  <TabItem value="python" label="Python SDK">
```python
-import os
+import opik
-os.environ["OPIK_API_KEY"] = ""
-os.environ["OPIK_WORKSPACE"] = ""
+opik.configure(use_local=False)
```
-
-You can find your Opik API key in the user menu of the [Opik UI](https://www.comet.com/opik/), the workspace name is the first item of the breadcrumbs and often the same as your username.
-
-
:::tip
-If you are self-hosting the platform, you don't need to set the `OPIK_API_KEY` and `OPIK_WORKSPACE` environment variables. Instead simply set:
-
- export OPIK_URL_OVERRIDE="http://localhost:5173/api"
+If you are self-hosting the platform, simply set the `use_local` flag to True in the `opik.configure` method.
+:::
+  </TabItem>
+  <TabItem value="cli" label="CLI">
+```bash
+opik configure
+```
+:::tip
+If you are self-hosting the platform, simply use the `opik configure --use_local` command.
:::
+  </TabItem>
+</Tabs>
## Integrating with your LLM application
diff --git a/apps/opik-documentation/documentation/docs/self-host/local_deployment.md b/apps/opik-documentation/documentation/docs/self-host/local_deployment.md
index 73b678ef47..8197596c3f 100644
--- a/apps/opik-documentation/documentation/docs/self-host/local_deployment.md
+++ b/apps/opik-documentation/documentation/docs/self-host/local_deployment.md
@@ -34,19 +34,23 @@ docker compose up --detach
Opik will now be available at `http://localhost:5173`.
:::tip
-You will need to make sure that the Opik Python SDK is configured to point to the Opik server you just started. For this, make sure you set the environment variable `OPIK_BASE_URL` to the URL of the Opik server:
+In order to use the Opik Python SDK with your local Opik instance, you will need to run:
```bash
-export OPIK_BASE_URL=http://localhost:5173/api
+pip install opik
+
+opik configure --use_local
```
or in python:
```python
-import os
+import opik
-os.environ["OPIK_BASE_URL"] = "http://localhost:5173/api"
+opik.configure(use_local=True)
```
+
+This will create a `~/.opik.config` file that will store the URL of your local Opik instance.
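+
+If you're curious about what was written, a quick way to inspect the file is shown below. The sample output is an assumption based on the SDK's INI-style layout and may vary between versions:
+
+```python
+from pathlib import Path
+
+# Print the configuration file created by `opik configure` / `opik.configure`
+print((Path.home() / ".opik.config").read_text())
+
+# Sample output (illustrative only; exact keys may differ by SDK version):
+# [opik]
+# url_override = http://localhost:5173/api
+# workspace = default
+```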
:::
All the data logged to the Opik platform will be stored in the `~/opik` directory, which means that you can start and stop the Opik platform without losing any data.
diff --git a/apps/opik-documentation/documentation/docs/tracing/integrations/langchain.md b/apps/opik-documentation/documentation/docs/tracing/integrations/langchain.md
index d69550394c..a13f0587fa 100644
--- a/apps/opik-documentation/documentation/docs/tracing/integrations/langchain.md
+++ b/apps/opik-documentation/documentation/docs/tracing/integrations/langchain.md
@@ -22,6 +22,12 @@ To use the `CometTracer` with LangChain, you'll need to have both the `opik` and
pip install opik langchain langchain_openai
```
+In addition, you can configure Opik using the `opik configure` command, which will prompt you for your local server address or, if you are using the Cloud platform, your API key:
+
+```bash
+opik configure
+```
+
## Using CometTracer
Here's a basic example of how to use the `CometTracer` callback with a LangChain chain:
@@ -69,7 +75,7 @@ opik_tracer = OpikTracer(
## Accessing logged traces
-You can use the `collected_traces` method to access the trace IDs collected by the `CometTracer` callback:
+You can use the `created_traces` method to access the trace IDs collected by the `CometTracer` callback:
```python
from opik.integrations.langchain import OpikTracer
@@ -78,8 +84,8 @@ opik_tracer = OpikTracer()
# Calling Langchain object
-traces = opik_tracer.collected_traces()
-print(traces)
+traces = opik_tracer.created_traces()
+print([trace.id for trace in traces])
```
This can be especially useful if you would like to update or log feedback scores for traces logged using the CometTracer.
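+
+As a minimal sketch of that workflow (the `log_traces_feedback_scores` method and its payload shape are assumptions based on the Python SDK; check the SDK reference for your version):
+
+```python
+import opik
+from opik.integrations.langchain import OpikTracer
+
+opik_tracer = OpikTracer()
+# ... run your LangChain chain with callbacks=[opik_tracer] ...
+
+# Attach a feedback score to every trace created by the callback
+client = opik.Opik()
+client.log_traces_feedback_scores(
+    scores=[
+        {"id": trace.id, "name": "user_feedback", "value": 1.0}
+        for trace in opik_tracer.created_traces()
+    ]
+)
+```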
diff --git a/apps/opik-documentation/documentation/docs/tracing/integrations/llama_index.md b/apps/opik-documentation/documentation/docs/tracing/integrations/llama_index.md
index 96d36f8fd1..48d988fbcd 100644
--- a/apps/opik-documentation/documentation/docs/tracing/integrations/llama_index.md
+++ b/apps/opik-documentation/documentation/docs/tracing/integrations/llama_index.md
@@ -29,6 +29,12 @@ To use the Opik integration with LlamaIndex, you'll need to have both the `opik`
pip install opik llama-index llama-index-agent-openai llama-index-llms-openai
```
+In addition, you can configure Opik using the `opik configure` command, which will prompt you for your local server address or, if you are using the Cloud platform, your API key:
+
+```bash
+opik configure
+```
+
## Using the Opik integration
To use the Opik integration with LLamaIndex, you can set the Opik callback handler as the global callback handler:
diff --git a/apps/opik-documentation/documentation/docs/tracing/integrations/openai.md b/apps/opik-documentation/documentation/docs/tracing/integrations/openai.md
index 393fae4567..19cecea6c7 100644
--- a/apps/opik-documentation/documentation/docs/tracing/integrations/openai.md
+++ b/apps/opik-documentation/documentation/docs/tracing/integrations/openai.md
@@ -14,15 +14,21 @@ This guide explains how to integrate Comet Opik with the OpenAI Python SDK. By u
-## Integration Steps
+## Getting started
-1. First, ensure you have both `opik` and `openai` packages installed:
+First, ensure you have both `opik` and `openai` packages installed:
```bash
pip install opik openai
```
-2. Import the necessary modules and wrap the OpenAI client:
+In addition, you can configure Opik using the `opik configure` command, which will prompt you for your local server address or, if you are using the Cloud platform, your API key:
+
+```bash
+opik configure
+```
+
+## Tracking OpenAI API calls
```python
from opik.integrations.openai import track_openai
@@ -45,7 +51,7 @@ response = openai_client.chat.completions.create(
presence_penalty=0
)
-print(completion.choices[0].message.content)
+print(response.choices[0].message.content)
```
The `track_openai` will automatically track and log the API call, including the input prompt, model used, and response generated. You can view these logs in your Comet project dashboard.
diff --git a/apps/opik-documentation/documentation/docs/tracing/integrations/overview.md b/apps/opik-documentation/documentation/docs/tracing/integrations/overview.md
index 354e70656c..3999e17a88 100644
--- a/apps/opik-documentation/documentation/docs/tracing/integrations/overview.md
+++ b/apps/opik-documentation/documentation/docs/tracing/integrations/overview.md
@@ -13,5 +13,7 @@ Opik aims to make it as easy as possible to log, view and evaluate your LLM trac
| OpenAI | Log traces for all OpenAI LLM calls | [Documentation](/tracing/integrations/openai.md) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/openai.ipynb) |
| LangChain | Log traces for all LangChain LLM calls | [Documentation](/tracing/integrations/langchain.md) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/langchain.ipynb) |
| LlamaIndex | Log traces for all LlamaIndex LLM calls | [Documentation](/tracing/integrations/llama_index.md) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/llama-index.ipynb) |
+| Predibase | Fine-tune and serve open-source LLMs | [Documentation](/tracing/integrations/predibase.md) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/predibase.ipynb) |
+| Ragas | Evaluation framework for your Retrieval Augmented Generation (RAG) pipelines | [Documentation](/tracing/integrations/ragas.md) | [![Open Quickstart In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/ragas.ipynb) |
If you would like to see more integrations, please open an issue on our [GitHub repository](https://github.com/comet-ml/opik).
diff --git a/apps/opik-documentation/documentation/docs/tracing/integrations/predibase.md b/apps/opik-documentation/documentation/docs/tracing/integrations/predibase.md
new file mode 100644
index 0000000000..157c47b353
--- /dev/null
+++ b/apps/opik-documentation/documentation/docs/tracing/integrations/predibase.md
@@ -0,0 +1,78 @@
+---
+sidebar_position: 4
+sidebar_label: Predibase
+---
+
+# Using Opik with Predibase
+
+Predibase is a platform for fine-tuning and serving open-source Large Language Models (LLMs). It is built on top of the open-source [LoRAX](https://loraexchange.ai/) framework.
+
+
+You can check out the Colab Notebook if you'd like to jump straight to the code:
+
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/predibase.ipynb)
+
+## Tracking your LLM calls
+
+Predibase can be used to serve open-source LLMs and is available as a model provider in LangChain. We will leverage the Opik integration with LangChain to track the LLM calls made using Predibase models.
+
+### Getting started
+
+To use the Opik integration with Predibase, you'll need to have the `opik`, `predibase` and `langchain` packages installed. You can install them using pip:
+
+```bash
+pip install --upgrade --quiet opik predibase langchain
+```
+
+You can then configure Opik using the `opik configure` command, which will prompt you for your local server address or, if you are using the Cloud platform, for your API key:
+
+```bash
+opik configure
+```
+
+You will also need to set the `PREDIBASE_API_TOKEN` environment variable to your Predibase API token:
+
+```bash
+export PREDIBASE_API_TOKEN=
+```
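+
+If you are working in a notebook, a sketch of the equivalent setup from Python (using the standard library's `getpass` so the token is not hard-coded) is:
+
+```python
+import os
+from getpass import getpass
+
+# Prompt for the Predibase API token if it is not already set
+if "PREDIBASE_API_TOKEN" not in os.environ:
+    os.environ["PREDIBASE_API_TOKEN"] = getpass("Predibase API token: ")
+```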
+
+### Tracing your Predibase LLM calls
+
+To use Predibase through the LangChain interface, we will start by creating a Predibase model and then invoke it with the Opik tracing callback:
+
+```python
+import os
+from langchain_community.llms import Predibase
+from opik.integrations.langchain import OpikTracer
+
+model = Predibase(
+ model="mistral-7b",
+ predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
+)
+
+# Invoke the model with the Opik tracing callback
+response = model.invoke(
+ "Can you recommend me a nice dry wine?",
+ config={
+ "temperature": 0.5,
+ "max_new_tokens": 1024,
+ "callbacks": [OpikTracer(tags=["predibase", "mistral-7b"])]
+ }
+)
+print(response)
+```
+
+:::tip
+You can learn more about the Opik integration with LangChain in our [LangChain integration guide](/tracing/integrations/langchain.md) or in the [Predibase cookbook](/cookbook/predibase.md).
+:::
+
+The trace will now be available in the Opik UI for further analysis.
+
+
+![predibase](/img/tracing/predibase_opik_trace.png)
+
+## Tracking your fine-tuning training runs
+
+If you are using Predibase to fine-tune an LLM, we recommend using Predibase's integration with Comet's Experiment Management functionality. You can learn more about how to set this up in the [Comet integration guide](https://docs.predibase.com/user-guide/integrations/comet) in the Predibase documentation. If you are already using an experiment tracking platform, it is worth checking whether it has an integration with Predibase.
diff --git a/apps/opik-documentation/documentation/docs/tracing/integrations/ragas.md b/apps/opik-documentation/documentation/docs/tracing/integrations/ragas.md
index 3d432cd8d5..bdc33c7304 100644
--- a/apps/opik-documentation/documentation/docs/tracing/integrations/ragas.md
+++ b/apps/opik-documentation/documentation/docs/tracing/integrations/ragas.md
@@ -12,6 +12,27 @@ There are two main ways to use Ragas with Opik:
1. Using Ragas to score traces or spans.
2. Using Ragas to evaluate a RAG pipeline.
+
+You can check out the Colab Notebook if you'd like to jump straight to the code:
+
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/comet-ml/opik/blob/master/apps/opik-documentation/documentation/docs/cookbook/ragas.ipynb)
+
+## Getting started
+
+You will first need to install the `opik` and `ragas` packages:
+
+```bash
+pip install opik ragas
+```
+
+In addition, you can configure Opik using the `opik configure` command, which will prompt you for your local server address or, if you are using the Cloud platform, for your API key:
+
+```bash
+opik configure
+```
+
## Using Ragas to score traces or spans
Ragas provides a set of metrics that can be used to evaluate the quality of a RAG pipeline, a full list of the supported metrics can be found in the [Ragas documentation](https://docs.ragas.io/en/latest/references/metrics.html#).
@@ -21,23 +42,27 @@ In addition to being able to track these feedback scores in Opik, you can also u
Due to the asynchronous nature of the score calculation, we will need to define a coroutine to compute the score:
```python
+import asyncio
+
# Import the metric
from ragas.metrics import AnswerRelevancy
# Import some additional dependencies
from langchain_openai.chat_models import ChatOpenAI
from langchain_openai.embeddings import OpenAIEmbeddings
-from ragas.llms import LangchainLLMWrapper
+from ragas.dataset_schema import SingleTurnSample
from ragas.embeddings import LangchainEmbeddingsWrapper
-
-import asyncio
from ragas.integrations.opik import OpikTracer
+from ragas.llms import LangchainLLMWrapper
+
# Initialize the Ragas metric
llm = LangchainLLMWrapper(ChatOpenAI())
emb = LangchainEmbeddingsWrapper(OpenAIEmbeddings())
answer_relevancy_metric = AnswerRelevancy(llm=llm, embeddings=emb)
+
# Define the scoring function
def compute_metric(metric, row):
row = SingleTurnSample(**row)
@@ -96,7 +121,7 @@ def rag_pipeline(question):
return answer
-rag_pipeline("What is the capital of France?")
+print(rag_pipeline("What is the capital of France?"))
```
In the Opik UI, you will be able to see the full trace including the score calculation:
@@ -109,13 +134,6 @@ In the Opik UI, you will be able to see the full trace including the score calcu
We recommend using the Opik [evaluation framework](/evaluation/evaluate_your_llm) to evaluate your RAG pipeline. It shares similar concepts with the Ragas `evaluate` functionality but has a tighter integration with Opik.
-
- You can check out the Colab Notebook if you'd like to jump straight to the code:
-
-
-
-
-
:::
If you are using the Ragas `evaluate` functionality, you can use the `OpikTracer` callback to keep track of the score calculation in Opik. This will track as traces the computation of each evaluation metric:
diff --git a/apps/opik-documentation/documentation/docs/tracing/log_traces.md b/apps/opik-documentation/documentation/docs/tracing/log_traces.mdx
similarity index 85%
rename from apps/opik-documentation/documentation/docs/tracing/log_traces.md
rename to apps/opik-documentation/documentation/docs/tracing/log_traces.mdx
index cec2c5a786..2aa29e6ccd 100644
--- a/apps/opik-documentation/documentation/docs/tracing/log_traces.md
+++ b/apps/opik-documentation/documentation/docs/tracing/log_traces.mdx
@@ -3,6 +3,9 @@ sidebar_position: 1
sidebar_label: Log Traces
---
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
+
# Log Traces
You can log traces to the Comet LLM Evaluation platform using either the REST API or the `opik` Python SDK.
@@ -15,14 +18,40 @@ To log traces to the Comet LLM Evaluation platform using the Python SDK, you wil
pip install opik
```
-Once the SDK is installed, you can log traces to using one our Comet's integration, function decorators or manually.
+To finish setting up the SDK, you will need to configure it with your Opik API key or the path to your local deployment using the `opik configure` command, which will walk you through the setup process:
+
+<Tabs>
+  <TabItem value="python" label="Python">
+
+```python
+import opik
+opik.configure(use_local=False)
+```
:::tip
-Opik has a number of integrations for popular LLM frameworks like LangChain or OpenAI, checkout a full list of integrations in the [integrations](/tracing/integrations/overview.md) section.
+If you are self-hosting the platform, simply set the `use_local` flag to `True` in the `opik.configure` method.
:::
+
+  </TabItem>
+  <TabItem value="cli" label="CLI">
+
+```bash
+opik configure
+```
+:::tip
+If you are self-hosting the platform, simply use the `opik configure --use_local` command.
+:::
+
+  </TabItem>
+</Tabs>
+
+Once the SDK is installed and configured, you can log traces using one of our integrations, function decorators or manually.
+
+
## Log using function decorators
+Using Opik's function decorators is the easiest way to add Opik logging to your existing LLM application. We recommend using this method in conjunction with one of our [integrations](/tracing/integrations/overview.md) for the most seamless experience.
+
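+For example, a minimal sketch of the decorator pattern (`llm_chain` is a hypothetical stand-in for your own code):
+
+```python
+from opik import track
+
+@track
+def llm_chain(user_question: str) -> str:
+    # Call your LLM of choice here; the decorator logs inputs and outputs
+    return f"Echo: {user_question}"
+
+llm_chain("What is Opik?")
+```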
+
### Logging traces and spans
If you are manually defining your LLM chains and not using LangChain for example, you can use the `track` function decorators to track LLM calls:
diff --git a/apps/opik-documentation/documentation/sidebars.ts b/apps/opik-documentation/documentation/sidebars.ts
index 42c6a743e0..8b4e8dec5b 100644
--- a/apps/opik-documentation/documentation/sidebars.ts
+++ b/apps/opik-documentation/documentation/sidebars.ts
@@ -28,7 +28,7 @@ const sidebars: SidebarsConfig = {
type: 'category',
label: 'Integrations',
items: ['tracing/integrations/overview', 'tracing/integrations/langchain', 'tracing/integrations/openai',
- 'tracing/integrations/llama_index', 'tracing/integrations/ragas']
+ 'tracing/integrations/llama_index', 'tracing/integrations/predibase', 'tracing/integrations/ragas']
}],
},
{
@@ -53,9 +53,8 @@ const sidebars: SidebarsConfig = {
type: 'category',
label: 'Cookbooks',
collapsed: false,
- items: ['cookbook/openai', 'cookbook/langchain', 'cookbook/llama-index',
- 'cookbook/evaluate_hallucination_metric', 'cookbook/evaluate_moderation_metric',
- 'cookbook/ragas']
+ items: ['cookbook/openai', 'cookbook/langchain', 'cookbook/llama-index', 'cookbook/predibase',
+ 'cookbook/ragas', 'cookbook/evaluate_hallucination_metric', 'cookbook/evaluate_moderation_metric']
},
],
};
diff --git a/apps/opik-documentation/documentation/static/img/tracing/predibase_opik_trace.png b/apps/opik-documentation/documentation/static/img/tracing/predibase_opik_trace.png
new file mode 100644
index 0000000000..a63424a651
Binary files /dev/null and b/apps/opik-documentation/documentation/static/img/tracing/predibase_opik_trace.png differ
diff --git a/apps/opik-documentation/python-sdk-docs/requirements.txt b/apps/opik-documentation/python-sdk-docs/requirements.txt
index ca1e0e7ee0..cfe4c10d43 100644
--- a/apps/opik-documentation/python-sdk-docs/requirements.txt
+++ b/apps/opik-documentation/python-sdk-docs/requirements.txt
@@ -1,3 +1,4 @@
sphinx-autobuild
sphinx
-furo
\ No newline at end of file
+furo
+sphinx-click
diff --git a/apps/opik-documentation/python-sdk-docs/source/cli.rst b/apps/opik-documentation/python-sdk-docs/source/cli.rst
new file mode 100644
index 0000000000..4c50ac62db
--- /dev/null
+++ b/apps/opik-documentation/python-sdk-docs/source/cli.rst
@@ -0,0 +1,6 @@
+CLI Reference
+=============
+
+.. click:: opik.cli:cli
+ :prog: opik
+ :nested: full
diff --git a/apps/opik-documentation/python-sdk-docs/source/conf.py b/apps/opik-documentation/python-sdk-docs/source/conf.py
index f25f1564a4..125a8d3c85 100644
--- a/apps/opik-documentation/python-sdk-docs/source/conf.py
+++ b/apps/opik-documentation/python-sdk-docs/source/conf.py
@@ -24,6 +24,7 @@
"sphinx.ext.intersphinx",
"sphinx.ext.mathjax",
"sphinx.ext.todo",
+ "sphinx_click.ext",
]
# -- Options for Autodoc --------------------------------------------------------------
diff --git a/apps/opik-documentation/python-sdk-docs/source/configure.rst b/apps/opik-documentation/python-sdk-docs/source/configure.rst
new file mode 100644
index 0000000000..68525f5345
--- /dev/null
+++ b/apps/opik-documentation/python-sdk-docs/source/configure.rst
@@ -0,0 +1,4 @@
+configure
+=========
+
+.. autofunction:: opik.configure
diff --git a/apps/opik-documentation/python-sdk-docs/source/index.rst b/apps/opik-documentation/python-sdk-docs/source/index.rst
index 694a405dc5..188eb52803 100644
--- a/apps/opik-documentation/python-sdk-docs/source/index.rst
+++ b/apps/opik-documentation/python-sdk-docs/source/index.rst
@@ -23,15 +23,19 @@ To get started with the package, you can install it using pip::
pip install opik
-To finish configuring the Opik Python SDK, you will need to set the environment variables:
+To finish configuring the Opik Python SDK, we recommend running the `opik configure` command from the command line:
-- If you are using the Comet managed Opik platform:
+.. code-block:: bash
- - `OPIK_API_KEY`: The API key to the Opik platform.
- - `OPIK_WORKSPACE`: The workspace to log traces to, this is often the same as your Opik username.
-- If you are using a self-hosted Opik platform:
+ opik configure
- - `OPIK_BASE_URL`: The base URL of the Opik platform.
+You can also call the configure function from the Python SDK:
+
+.. code-block:: python
+
+ import opik
+
+ opik.configure(use_local=False)
=============
Using the SDK
@@ -139,6 +143,7 @@ You can learn more about the `opik` python SDK in the following sections:
Opik
track
+ configure
opik_context/index
.. toctree::
@@ -176,6 +181,12 @@ You can learn more about the `opik` python SDK in the following sections:
Objects/UsageDict.rst
+.. toctree::
+ :maxdepth: 1
+ :caption: Command Line Interface
+
+ cli
+
.. toctree::
:caption: Documentation Guides
:maxdepth: 1
diff --git a/apps/opik-frontend/src/api/datasets/useExperimentItemDeleteMutation.ts b/apps/opik-frontend/src/api/datasets/useExperimentItemBatchDeleteMutation.ts
similarity index 80%
rename from apps/opik-frontend/src/api/datasets/useExperimentItemDeleteMutation.ts
rename to apps/opik-frontend/src/api/datasets/useExperimentItemBatchDeleteMutation.ts
index 005707999c..ba74cea255 100644
--- a/apps/opik-frontend/src/api/datasets/useExperimentItemDeleteMutation.ts
+++ b/apps/opik-frontend/src/api/datasets/useExperimentItemBatchDeleteMutation.ts
@@ -3,16 +3,16 @@ import get from "lodash/get";
import { useToast } from "@/components/ui/use-toast";
import api, { EXPERIMENTS_REST_ENDPOINT } from "@/api/api";
-type UseExperimentItemDeleteMutationParams = {
+type UseExperimentItemBatchDeleteMutationParams = {
ids: string[];
};
-const useExperimentItemDeleteMutation = () => {
+const useExperimentItemBatchDeleteMutation = () => {
const queryClient = useQueryClient();
const { toast } = useToast();
return useMutation({
- mutationFn: async ({ ids }: UseExperimentItemDeleteMutationParams) => {
+ mutationFn: async ({ ids }: UseExperimentItemBatchDeleteMutationParams) => {
const { data } = await api.post(
`${EXPERIMENTS_REST_ENDPOINT}items/delete`,
{
@@ -44,4 +44,4 @@ const useExperimentItemDeleteMutation = () => {
});
};
-export default useExperimentItemDeleteMutation;
+export default useExperimentItemBatchDeleteMutation;
diff --git a/apps/opik-frontend/src/api/traces/useTraceBatchDeleteMutation.ts b/apps/opik-frontend/src/api/traces/useTraceBatchDeleteMutation.ts
new file mode 100644
index 0000000000..931f1fd42a
--- /dev/null
+++ b/apps/opik-frontend/src/api/traces/useTraceBatchDeleteMutation.ts
@@ -0,0 +1,52 @@
+import { useMutation, useQueryClient } from "@tanstack/react-query";
+import get from "lodash/get";
+import { useToast } from "@/components/ui/use-toast";
+import api, { TRACES_REST_ENDPOINT } from "@/api/api";
+
+type UseTraceBatchDeleteMutationParams = {
+ ids: string[];
+ projectId: string;
+};
+
+const useTraceBatchDeleteMutation = () => {
+ const queryClient = useQueryClient();
+ const { toast } = useToast();
+
+ return useMutation({
+ mutationFn: async ({ ids }: UseTraceBatchDeleteMutationParams) => {
+ const { data } = await api.post(`${TRACES_REST_ENDPOINT}delete`, {
+ ids: ids,
+ });
+ return data;
+ },
+ onError: (error) => {
+ const message = get(
+ error,
+ ["response", "data", "message"],
+ error.message,
+ );
+
+ toast({
+ title: "Error",
+ description: message,
+ variant: "destructive",
+ });
+ },
+ onSettled: (data, error, variables) => {
+ queryClient.invalidateQueries({
+ queryKey: ["spans", { projectId: variables.projectId }],
+ });
+ queryClient.invalidateQueries({ queryKey: ["compare-experiments"] });
+ queryClient.invalidateQueries({
+ queryKey: [
+ "traces",
+ {
+ projectId: variables.projectId,
+ },
+ ],
+ });
+ },
+ });
+};
+
+export default useTraceBatchDeleteMutation;
diff --git a/apps/opik-frontend/src/components/pages/CompareExperimentsPage/CompareExperimentsPanel/CompareExperimentsViewer.tsx b/apps/opik-frontend/src/components/pages/CompareExperimentsPage/CompareExperimentsPanel/CompareExperimentsViewer.tsx
index ba257a2c3c..0ae22dc926 100644
--- a/apps/opik-frontend/src/components/pages/CompareExperimentsPage/CompareExperimentsPanel/CompareExperimentsViewer.tsx
+++ b/apps/opik-frontend/src/components/pages/CompareExperimentsPage/CompareExperimentsPanel/CompareExperimentsViewer.tsx
@@ -22,6 +22,7 @@ type CompareExperimentsViewerProps = {
const CompareExperimentsViewer: React.FunctionComponent<
CompareExperimentsViewerProps
> = ({ experimentItem, openTrace }) => {
+ const isTraceExist = traceExist(experimentItem);
const experimentId = experimentItem.experiment_id;
const { data } = useExperimentById(
{
@@ -47,11 +48,12 @@ const CompareExperimentsViewer: React.FunctionComponent<
};
const renderContent = () => {
- if (!traceExist(experimentItem)) {
+ if (!isTraceExist) {
return (
);
}
@@ -64,28 +66,33 @@ const CompareExperimentsViewer: React.FunctionComponent<
};
return (
-