Plugin that creates a ChromaDB vector database to work with LM Studio running in server mode!
Updated May 27, 2024 - Python
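For illustration, here is a pure-Python sketch of the similarity lookup a vector database such as ChromaDB performs under the hood. It uses toy hand-written embeddings and no ChromaDB dependency; in the actual plugin, embeddings would come from a model served by LM Studio.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, docs, k=1):
    """Return the k documents whose embeddings are most similar to the query."""
    scored = sorted(
        docs,
        key=lambda d: cosine_similarity(query_vec, d["embedding"]),
        reverse=True,
    )
    return [d["text"] for d in scored[:k]]

# Toy corpus with made-up 3-dimensional embeddings.
docs = [
    {"text": "cats purr", "embedding": [1.0, 0.0, 0.1]},
    {"text": "stock markets", "embedding": [0.0, 1.0, 0.0]},
]
print(top_k([0.9, 0.1, 0.0], docs, k=1))  # → ['cats purr']
```

A real vector database replaces the linear scan with an approximate nearest-neighbor index, but the retrieval contract is the same: embed the query, rank stored documents by similarity, return the top matches.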
visionOS examples ⸺ Spatial Computing Accelerators for Apple Vision Pro
LLMX: Easiest 3rd-party Local LLM UI for the web!
This repository hosts a web-based chat application that uses AI models served through LM Studio, such as Mistral and Llama, as well as OpenAI, via a Gradio interface. It maintains conversation history for a continuous, coherent chat experience akin to ChatGPT or Claude.
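Maintaining conversation history like this typically means keeping an ordered list of role-tagged messages and trimming the oldest turns as the context fills up. A minimal sketch (the helper name and trimming policy here are hypothetical, not from the repository):

```python
def append_turn(history, role, content, max_turns=20):
    """Append a chat message and trim old turns, preserving a leading system prompt."""
    history.append({"role": role, "content": content})
    # Keep the system message (if any) pinned; trim only user/assistant turns.
    system = history[:1] if history and history[0]["role"] == "system" else []
    rest = history[len(system):]
    if len(rest) > max_turns:
        rest = rest[-max_turns:]
    return system + rest

history = [{"role": "system", "content": "You are helpful."}]
history = append_turn(history, "user", "Hello")
history = append_turn(history, "assistant", "Hi! How can I help?")
```

On each request, the full `history` list is sent to the model, which is what makes the conversation feel continuous across turns.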
Serverless single HTML page access to an OpenAI API compatible Local LLM
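Because LM Studio's server mode exposes an OpenAI-compatible API, a page or script only needs to POST a standard chat-completion payload to the local endpoint. A stdlib-only sketch of building such a request (the default base URL of `http://localhost:1234/v1` is assumed; the port is configurable in LM Studio, and the `model` value is a placeholder):

```python
import json
import urllib.request

# Assumed LM Studio default; adjust host/port to match your server settings.
API_URL = "http://localhost:1234/v1/chat/completions"

def build_request(messages, model="local-model", temperature=0.7):
    """Construct (but do not send) an OpenAI-style chat-completion request."""
    payload = {"model": model, "messages": messages, "temperature": temperature}
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request([{"role": "user", "content": "Hello"}])
# Against a running LM Studio server, the call would look like:
#   with urllib.request.urlopen(req) as resp:
#       reply = json.load(resp)["choices"][0]["message"]["content"]
```

The same request shape works from a single HTML page via `fetch`, which is what makes a serverless client possible.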
Quill is a cutting-edge fullstack SaaS platform built from scratch using Next.js 13.5, tRPC, TypeScript, Prisma, and Tailwind. It features a beautiful landing and pricing page, real-time streaming API responses, and robust authentication via Kinde. With modern UI components, optimistic updates, and seamless data fetching.
Automate the batching and execution of prompts.
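Batched prompt execution reduces to splitting the prompt list into fixed-size chunks and running each chunk through an executor. A minimal sketch (the function names are illustrative; a real executor would call the local LLM API):

```python
def batch_prompts(prompts, batch_size):
    """Split a list of prompts into fixed-size batches."""
    return [prompts[i:i + batch_size] for i in range(0, len(prompts), batch_size)]

def run_batches(prompts, batch_size, execute):
    """Execute prompts batch by batch with a caller-supplied `execute` function."""
    results = []
    for batch in batch_prompts(prompts, batch_size):
        results.extend(execute(batch))
    return results

# Stand-in executor for demonstration; swap in a real LLM call per batch.
out = run_batches(["a", "b", "c"], 2, lambda batch: [p.upper() for p in batch])
print(out)  # → ['A', 'B', 'C']
```

Keeping the executor as a parameter makes the batching logic reusable whether prompts go to LM Studio, another local server, or a test double.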
Solve complex problems by intelligently orchestrating subagents using a local LLM, embeddings, and DuckDuckGo search.