🤖 A collection of practical AI repos, tools, websites, papers, and tutorials. A practical treasure chest of AI 💎
A high-throughput and memory-efficient inference and serving engine for LLMs
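vLLM's memory efficiency comes largely from its paged KV cache (PagedAttention): instead of reserving one contiguous buffer per request, the cache is split into fixed-size blocks handed out on demand. A minimal sketch of that allocation idea, in plain Python with illustrative names (not vLLM's real API):

```python
# Toy paged KV-cache allocator: blocks are granted only as a sequence
# actually grows, and returned to the free pool when it finishes.
class PagedKVCache:
    def __init__(self, num_blocks, block_size):
        self.block_size = block_size
        self.free_blocks = list(range(num_blocks))
        self.tables = {}   # sequence id -> list of physical block ids
        self.lengths = {}  # sequence id -> number of cached tokens

    def append_token(self, seq_id):
        n = self.lengths.get(seq_id, 0)
        if n % self.block_size == 0:        # current block full, or none yet
            if not self.free_blocks:
                raise MemoryError("KV cache exhausted")
            self.tables.setdefault(seq_id, []).append(self.free_blocks.pop())
        self.lengths[seq_id] = n + 1

    def release(self, seq_id):
        # Finished sequences return their blocks to the shared pool.
        self.free_blocks.extend(self.tables.pop(seq_id, []))
        self.lengths.pop(seq_id, None)

cache = PagedKVCache(num_blocks=4, block_size=2)
for _ in range(3):
    cache.append_token("req-0")  # 3 tokens fit in 2 blocks of size 2
print(len(cache.tables["req-0"]), len(cache.free_blocks))  # 2 2
```

Because unused block capacity is never reserved up front, many concurrent sequences can share one GPU's cache memory; the real engine adds block sharing and attention kernels on top of this bookkeeping.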
A full-stack, generative-AI-powered mobile application that helps people control the activities and sensors of their smart IoT home systems.
Tools for merging pretrained large language models.
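The simplest merge method such tools support is linear interpolation of parameters. A toy sketch of that idea, assuming checkpoints represented as plain dicts of parameter lists (real tools like mergekit operate on tensors and offer many more methods, e.g. SLERP and TIES):

```python
# Hypothetical linear merge: element-wise blend of two checkpoints,
# merged = alpha * A + (1 - alpha) * B, applied parameter by parameter.
def linear_merge(state_a, state_b, alpha=0.5):
    merged = {}
    for name, wa in state_a.items():
        wb = state_b[name]  # assumes both models share an architecture
        merged[name] = [alpha * a + (1 - alpha) * b for a, b in zip(wa, wb)]
    return merged

# Two tiny "checkpoints" with one parameter tensor each:
model_a = {"layer0.weight": [1.0, 2.0, 3.0]}
model_b = {"layer0.weight": [3.0, 4.0, 5.0]}
print(linear_merge(model_a, model_b, alpha=0.5)["layer0.weight"])  # [2.0, 3.0, 4.0]
```

Merging only works between models with identical shapes for every parameter, which is why these tools target fine-tunes of a common base model.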
🤖 The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. A drop-in replacement for OpenAI that runs on consumer-grade hardware, with no GPU required. Runs gguf, transformers, diffusers, and many other model architectures. It can generate text, audio, video, and images, and also offers voice cloning capabilities.
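"Drop-in replacement" means the server speaks the OpenAI wire format, so an existing client only needs its base URL pointed at the local endpoint. The sketch below builds the JSON body such a client would POST to `/v1/chat/completions`; the model name is a placeholder, not a default shipped by any particular server:

```python
import json

# Build an OpenAI-style chat completion request body. Any server that
# implements this schema (LocalAI among them) can accept it.
def chat_completion_request(model, user_message, temperature=0.7):
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "temperature": temperature,
    })

body = chat_completion_request("local-model", "Hello!")
print(json.loads(body)["messages"][0]["role"])  # user
```

Because the request and response schemas match OpenAI's, swapping providers is a configuration change rather than a code change.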
MinimalChat is a lightweight, open-source chat application that allows you to interact with various large language models.
By utilizing GPT4All-CLI, developers can effortlessly tap into the power of GPT4All and LLaMa without delving into the library's intricacies. Simply install the CLI tool, and you're prepared to explore the fascinating world of large language models directly from your command line!
Low-code framework for building custom LLMs, neural networks, and other AI models
🤘 The TT-NN operator library and the TT-Metalium low-level kernel programming model.
Run generative AI models on the Sophgo BM1684X.
WebAssembly binding for llama.cpp - Enabling in-browser LLM inference