Update copyright year in LICENSE file
chandralegend committed Apr 30, 2024
1 parent 2ab7bc5 commit c63c9e2
Showing 2 changed files with 24 additions and 28 deletions.
2 changes: 1 addition & 1 deletion LICENSE
@@ -1,6 +1,6 @@
MIT License

Copyright (c) 2023 Chandra Irugalbandara
Copyright (c) 2024 Jaseci Labs

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
50 changes: 23 additions & 27 deletions README.md
@@ -1,25 +1,25 @@
# SLaM Tool : *S*mall *La*nguage *M*odel Evaluation Tool
# SLaM Tool: *S*mall *La*nguage *M*odel Evaluation Tool

[![Query Engine Tests](https://github.com/Jaseci-Labs/slam/actions/workflows/query_engine_test.yml/badge.svg)](https://github.com/Jaseci-Labs/slam/actions/workflows/query_engine_test.yml)
[![SLaM App Tests](https://github.com/Jaseci-Labs/slam/actions/workflows/app_test.yml/badge.svg)](https://github.com/Jaseci-Labs/slam/actions/workflows/app_test.yml)

SLaM Tool is a helper tool to evaluate the performance of Large Language Models (LLMs) for your personal use cases with the help of Human Evaluation and Automatic Evaluation. You can deploy the application on your local machine or use Docker to generate responses for a given prompt with different LLMs (Proprietary or OpenSource), and then evaluate the responses with the help of human evaluators or automated methods.
SLaM Tool is an advanced application designed to assess the performance of Large Language Models (LLMs) for your specific use cases. It employs both Human Evaluation and Automatic Evaluation techniques, enabling you to deploy the application locally or via Docker. With SLaM Tool, you can generate responses for a given prompt using various LLMs (proprietary or open-source), and subsequently evaluate those responses through human evaluators or automated methods.

## Features
## Key Features

- **Admin Panel**: Set up the Human Evaluation UI and manage the human evaluators.
- **Realtime Insights and Analytics**: Get insights and analytics on the performance of the LLMs.
- **Human Evaluation**: Evaluate the responses of the LLMs with the help of human evaluators.
- **Automatic Evaluation**: Evaluate the responses of the LLMs with the help of LLMs and using embedding similarity.
- **Multiple Model Support**: Generate responses for a given prompt with different LLMs (Proprietary or OpenSource(Ollama)).
- **Admin Panel**: Set up and manage the Human Evaluation UI, as well as oversee human evaluators.
- **Real-time Insights and Analytics**: Gain valuable insights and analytics on the performance of the LLMs under evaluation.
- **Human Evaluation**: Leverage human evaluators to assess the responses generated by the LLMs.
- **Automatic Evaluation**: Employ LLMs and embedding similarity techniques for automated response evaluation.
- **Multiple Model Support**: Generate responses for a given prompt using a diverse range of LLMs (proprietary or open-source, such as Ollama).

## Installation

First Clone the Repository:
First, clone the repository:

```bash
git clone https://github.com/Jaseci-Labs/slam.git && cd slam
```
```

### Prerequisites

@@ -41,7 +41,7 @@ git clone https://github.com/Jaseci-Labs/slam.git && cd slam
```bash
docker run -p 8501:8501 -e SLAM_ADMIN_USERNAME=<user_name> -e SLAM_ADMIN_PASSWORD=<password> slam/slam-app:latest
```

3. Open your browser and go to `http://localhost:8501`
3. Open your browser and navigate to `http://localhost:8501`
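Note: the step that builds the `slam/slam-app:latest` image is collapsed in this view. If the image is not already available locally, a plausible build command (an assumption, presuming a Dockerfile at the repository root) would be:

```bash
# Assumed build step: create the slam/slam-app:latest image from the repository's Dockerfile
docker build -t slam/slam-app:latest .
```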

#### Using Local Installation

@@ -68,11 +68,10 @@ git clone https://github.com/Jaseci-Labs/slam.git && cd slam
```bash
streamlit run app.py
```

5. Open your browser and go to `http://localhost:8501`

5. Open your browser and navigate to `http://localhost:8501`

### With the Query Engine and Ollama
Notice: Make Sure you are running in an environment with GPU
Notice: Ensure you are running in an environment with GPU support.
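A quick way to confirm that a GPU is visible to your environment (assuming NVIDIA drivers are installed) is:

```bash
# Should list at least one GPU; if this command fails, GPU support is not set up
nvidia-smi
```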

#### Using Docker Compose (Recommended)

@@ -81,19 +80,19 @@ Notice: Make Sure you are running in an environment with GPU
```bash
docker compose up -d --build
```

2. Open your browser and go to `http://localhost:8501`
2. Open your browser and navigate to `http://localhost:8501`
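Optionally, you can verify that the services started cleanly with the standard Docker Compose commands:

```bash
docker compose ps        # show the status of the running services
docker compose logs -f   # follow the logs of all services; Ctrl+C to stop
```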

#### Using Local Installation

Follow the Steps above to install the app and then follow the steps below to install the Query Engine and Ollama.
Follow the steps above to install the app, and then proceed with the steps below to install the Query Engine and Ollama.
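The Ollama installation itself is not shown here; a minimal sketch of a typical local setup (assuming Linux and Ollama's official install script; the model name is only an example) is:

```bash
# Install Ollama, start the server, and pull a model to evaluate
curl -fsSL https://ollama.com/install.sh | sh
ollama serve &
ollama pull llama2   # example model; substitute whichever SLM/LLM you want to evaluate
```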

### For Response Generation & Automatic Evaluation (Optional)

For a streamlined experience, SLAM offers the option to leverage LLMs and SLMs for response generation and automated evaluation.

Open a new terminal window and navigate to the root directory of the SLAM repository.

1. Create a seperate virtual environment (Recommended):
1. Create a separate virtual environment (Recommended):

```bash
cd engine
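# The rest of this step is collapsed in the diff view. A plausible continuation
# (an assumption; the environment name `slam-engine` comes from the FAQ below):
python -m venv slam-engine
source slam-engine/bin/activate
pip install -r requirements.txt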
```
@@ -125,7 +124,7 @@ Open a new terminal window and navigate to the root directory of the SLAM repository.
```bash
export OPENAI_API_KEY=<your_api_key>
```
if you have a remote ollama server, set the server url:
If you have a remote Ollama server, set the server URL:
```bash
export OLLAMA_SERVER_URL=http://<remote_server_ip>:11434/
```
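If you use a remote Ollama server, one quick check that it is reachable is to list its models via the Ollama HTTP API:

```bash
# Lists the models available on the remote Ollama server (default API port 11434)
curl http://<remote_server_ip>:11434/api/tags
```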
@@ -137,7 +136,7 @@ Open a new terminal window and navigate to the root directory of the SLAM repository.
- [How to use SLaM for Automatic Evaluation](docs/tutorials/automatic_eval.md)
- [LLM as an Evaluator](docs/tutorials/automatic_eval.md#llm-as-an-evaluator)
- [Using Semantic Similarity to Evaluate Responses](docs/tutorials/automatic_eval.md#using-semantic-similarity-to-evaluate-responses)
- [How to Get Realtime Insights and Analytics from your Evaluations](docs/tutorials/insights_analytics.md)
- [How to Get Real-time Insights and Analytics from your Evaluations](docs/tutorials/insights_analytics.md)

## Tips and Tricks

@@ -181,19 +180,16 @@ To load your backups, follow these simple steps:
4. **Refresh and View**
- After the upload process is complete, click the "Refresh" button to see the updated diagrams and visualizations.

## Frequently Asked Questions

## FAQ

1. When Trying to run `ollama serve` I get an error `Error: listen tcp :11434: bind: address already in use`
1. **Error: "listen tcp :11434: bind: address already in use" when trying to run `ollama serve`**
- This error occurs when port 11434 is already in use. To resolve it, stop the process holding the port (for example with `sudo systemctl stop ollama`) and then run `ollama serve` again; a short diagnostic sketch follows this list.

2. When trying to run the Query Engine, I get an error `Error: No module named 'jac'`
- This error occurs when the `jaclang` package is not installed. To resolve this issue, first makesure you are in the `slam-engine` environment and and retry installing the requirements using `pip install -r engine/requirements.txt`.
2. **Error: "No module named 'jac'" when trying to run the Query Engine**
- This error occurs when the `jaclang` package is not installed. To resolve this issue, first, make sure you are in the `slam-engine` environment, and then retry installing the requirements using `pip install -r engine/requirements.txt`.

3. If you have any other questions, please feel free to reach out to us through the [Issues](https://github.com/Jaseci-Labs/slam/issues) section.
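For the port conflict in item 1, a short diagnostic sketch (assuming standard Linux tooling) is:

```bash
# See which process currently holds port 11434
sudo lsof -i :11434
# If it is the system-managed Ollama service, stop it and start your own instance
sudo systemctl stop ollama
ollama serve
```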



## Contributing

We welcome contributions to enhance SLAM's capabilities. Please review the [CONTRIBUTING.md](CONTRIBUTING.md) file for our code of conduct and the process for submitting pull requests. We appreciate your interest in contributing to SLAM and look forward to your valuable contributions.
We welcome contributions to enhance SLAM's capabilities. Please review the [CONTRIBUTING.md](CONTRIBUTING.md) file for our code of conduct and the process for submitting pull requests. We appreciate your interest in contributing to SLAM and look forward to your valuable contributions.
