
Local model support for OpenAI compatible API providers like Ollama and LM Studio (LLM and embeddings) #1699

Workflow file for this run

name: Lint
on: [pull_request]
jobs:
  lint:
    # Use the matrix value rather than hardcoding the runner,
    # so the strategy matrix below actually takes effect.
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest]
    steps:
      - uses: actions/checkout@v3
      - uses: psf/black@stable
        with:
          options: "--check --verbose"
          src: "."
          jupyter: true
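The same check the workflow runs in CI can be reproduced locally before pushing. A minimal sketch, assuming you run it from the repository root and have pip available (the `black[jupyter]` extra matches the workflow's `jupyter: true` setting):

```shell
# Install Black with Jupyter-notebook support, matching the CI configuration.
pip install 'black[jupyter]'

# --check reports files that would be reformatted without modifying them;
# a non-zero exit status is what fails the CI job.
black --check --verbose .
```

If the command exits non-zero, running `black .` (without `--check`) applies the formatting in place, after which the lint job should pass.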