This repo contains a list of channels and sources for learning about LLMs
Updated May 23, 2024
LlamaIndex is a data framework for your LLM applications
An open-source container orchestration engine for running AI workloads in any cloud or data center. https://discord.gg/u8SmfwPpMd
Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
Python client library for improving your LLM app accuracy
Open source data anonymization and synthetic data orchestration for developers. Create high fidelity synthetic data and sync it across your environments.
Implementation for the different ML tasks on Kaggle platform with GPUs.
The open-source serverless GPU container runtime.
Scalable and flexible workflow orchestration platform that seamlessly unifies data, ML and analytics stacks.
This project aims to categorize retail products from their images: a MobileNetV2 model fine-tuned on 18K retail product images across 9 categories. Deployed with Flask and containerized with Docker.
Low-code framework for building custom LLMs, neural networks, and other AI models
WebUI for Fine-Tuning and Self-hosting of Open-Source Large Language Models for Coding
H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs. Documentation: https://h2oai.github.io/h2o-llmstudio/
RAG (Retrieval Augmented Generation) Framework for building modular, open source applications for production by TrueFoundry
OneTrainer is a one-stop solution for all your stable diffusion training needs.
Finetune Llama 3, Mistral & Gemma LLMs 2-5x faster with 80% less memory
Multi-LoRA inference server that scales to 1000s of fine-tuned LLMs
Run open-source LLMs, such as Llama 2 and Mistral, as OpenAI-compatible API endpoints in the cloud.
AI Greetings & Wishes Generator - educational group project as part of the course on neural networks (NLP) at ITMO
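Several of the tools above (Unsloth, the multi-LoRA inference server, the various fine-tuning GUIs) are built around LoRA, which adapts a frozen weight matrix W by adding a trainable low-rank product scaled by alpha/r. A minimal plain-Python sketch of that update rule, for illustration only (not the API of any tool listed here):

```python
# LoRA (Low-Rank Adaptation) idea: instead of updating the full weight
# matrix W (d_out x d_in), train two small matrices B (d_out x r) and
# A (r x d_in) with r much smaller than d_out and d_in. The effective
# weight is W' = W + (alpha / r) * (B @ A).

def matmul(X, Y):
    """Multiply two matrices represented as lists of rows."""
    inner, cols = len(Y), len(Y[0])
    return [[sum(X[i][k] * Y[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(len(X))]

def lora_weight(W, A, B, alpha, r):
    """Return the adapted weight W + (alpha / r) * (B @ A)."""
    delta = matmul(B, A)
    return [[W[i][j] + (alpha / r) * delta[i][j]
             for j in range(len(W[0]))] for i in range(len(W))]

# Toy example: 2x2 base weight with a rank-1 adapter.
W = [[1.0, 0.0], [0.0, 1.0]]
B = [[1.0], [2.0]]   # d_out x r
A = [[0.5, 0.5]]     # r x d_in
print(lora_weight(W, A, B, alpha=1.0, r=1))  # [[1.5, 0.5], [1.0, 2.0]]
```

Because only B and A are trained, the memory savings the tools above advertise come from the far smaller number of trainable parameters, and many independent (B, A) adapter pairs can share one frozen base model at inference time.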