A Node.js CLI that uses Ollama and LM Studio models (Llava, Gemma, Llama etc.) to intelligently rename files by their contents
Updated Aug 15, 2024 - JavaScript
Efficient visual programming for AI language models
visionOS examples: Spatial Computing Accelerators for Apple Vision Pro
LLMX: Easiest 3rd party Local LLM UI for the web!
Serverless single HTML page access to an OpenAI API compatible Local LLM
Techniques for Using LLMs Effectively: An Introduction to Prompt Engineering, Retrieval Augmented Generation, and Toolformer
An AI chatbot that talks to people in VR Chat.
Browser extension that generates image alternate text, using GPT-4o or an LM Studio server.
A Ukrainian-language Telegram chatbot powered by artificial intelligence. The project uses LM Studio as a server for Meta's LLAMA3 8B large language model, which processes user messages and generates the chatbot's responses.
This repository hosts a web-based chat application that serves models such as Mistral and Llama through LM Studio, with a Gradio interface. It maintains conversation history for a continuous, coherent chat experience akin to ChatGPT or Claude.
Quill is a cutting-edge fullstack SaaS platform built from scratch using Next.js 13.5, tRPC, TypeScript, Prisma, and Tailwind. It features a beautiful landing and pricing page, real-time streaming API responses, and robust authentication via Kinde, along with modern UI components, optimistic updates, and seamless data fetching.
Solve complex problems by intelligently orchestrating subagents using a local LLM, embeddings, and DuckDuckGo search
Handy tools for working with audio locally.
Automate the batching and execution of prompts.
RAG with LM Studio, local LLMs, and scientific PDF text extraction
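Several of the projects above talk to LM Studio through its OpenAI-compatible local server (by default at http://localhost:1234/v1). A minimal sketch of such a client, assuming LM Studio is running with a model loaded; the base URL and model name here are assumptions that depend on your local setup:

```python
import json
import urllib.request

# Default address of LM Studio's local server; configurable in the LM Studio UI.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt, model="local-model", temperature=0.7):
    """Build an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def chat(prompt):
    """Send the prompt to the local server and return the reply text."""
    payload = build_chat_request(prompt)
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Summarize this file in one line."))
```

Because the server mirrors the OpenAI chat-completions format, the same request shape works whether the loaded model is Llava, Gemma, Llama, or Mistral.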