🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Scalable and user-friendly neural 🧠 forecasting algorithms.
A high-throughput and memory-efficient inference and serving engine for LLMs
A PyTorch implementation of the original Transformer.
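The core operation of the original Transformer is scaled dot-product attention from "Attention Is All You Need". A minimal NumPy sketch (function names here are illustrative, not taken from any listed repository):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)  # pairwise query-key similarities
    weights = softmax(scores, axis=-1)              # rows sum to 1
    return weights @ v, weights

# Toy example: 3 positions, model dimension 4.
rng = np.random.default_rng(0)
q = rng.standard_normal((3, 4))
k = rng.standard_normal((3, 4))
v = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(q, k, v)
```

The full model stacks this inside multi-head attention with residual connections and feed-forward layers, but this single function is the building block the paper introduces.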
This repository contains a reading list of papers on Time Series Forecasting/Prediction (TSF) and Spatio-Temporal Forecasting/Prediction (STF). These papers are mainly categorized according to the type of model.
Implementations of Deep Learning Techniques
A framework for few-shot evaluation of language models.
🏆 A ranked list of awesome machine learning Python libraries. Updated weekly.
🔧 A Kotlin coroutine wrapper around Media3's Transformer API.
Large Language Model Text Generation Inference
This repository contains notebooks and Python implementations of language models trained on a simple name dataset, ranging from basic naive implementations to Transformers. More notebooks and files will be added regularly.
animal2vec: A self-supervised transformer for rare-event raw audio input
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
OpenMMLab Semantic Segmentation Toolbox and Benchmark.
Frontend Mentor Blog preview Card Challenge
Official PyTorch implementation of the paper "ParCo: Part-Coordinating Text-to-Motion Synthesis": http://arxiv.org/abs/2403.18512
QuillGPT is a PyTorch implementation of the GPT decoder block based on the architecture from the "Attention Is All You Need" paper by Vaswani et al. Additionally, this repository contains two pre-trained models — Shakespearean GPT and Harpoon GPT — a Streamlit playground, a containerized FastAPI microservice, and training and inference scripts and notebooks.
Easy-to-use Speech Toolkit including Self-Supervised Learning model, SOTA/Streaming ASR with punctuation, Streaming TTS with text frontend, Speaker Verification System, End-to-End Speech Translation and Keyword Spotting. Won NAACL2022 Best Demo Award.
An offline CPU-first memory-scarce chat application to perform RAG on your corpus of data. Powered by OpenChat and CTranslate2.