PyTorch/XLA integration with JetStream (https://github.com/google/JetStream) for LLM inference. Updated May 31, 2024 (Python).
A compilation of the best multi-agent papers
[ICLR 2024] AGILE3D: Attention Guided Interactive Multi-object 3D Segmentation
Scenic: A Jax Library for Computer Vision Research and Beyond
🚀🚀🚀 A collection of some awesome public YOLO object detection series projects.
A collection of memory efficient attention operators implemented in the Triton language.
A PyTorch library for all things Reinforcement Learning (RL) for Combinatorial Optimization (CO)
Keras beit, caformer, CMT, CoAtNet, convnext, davit, dino, efficientdet, edgenext, efficientformer, efficientnet, eva, fasternet, fastervit, fastvit, flexivit, gcvit, ghostnet, gpvit, hornet, hiera, iformer, inceptionnext, lcnet, levit, maxvit, mobilevit, moganet, nat, nfnets, pvt, swin, tinynet, tinyvit, uniformer, volo, vanillanet, yolor, yolov7, yolov8, yolox, gpt2, llama2, alias kecam
QuillGPT is a PyTorch implementation of the GPT decoder block based on the architecture from the "Attention Is All You Need" paper by Vaswani et al. Additionally, this repository contains two pre-trained models (Shakespearean GPT and Harpoon GPT), a Streamlit playground, a containerized FastAPI microservice, and training and inference scripts and notebooks.
A Baby Llama model
[ICML 2024] Outlier-Efficient Hopfield Layers for Large Transformer-Based Models
Simple character level Transformer
PyTorch implementations of various token mixers (attention mechanisms, MLPs, etc.) for understanding computer vision papers and other tasks.
A Jax-based library for designing and training transformer models from scratch.
AdaViT | PyTorch Lightning, Python
The official repo for [IJCAI'24] "LeMeViT: Efficient Vision Transformer with Learnable Meta Tokens for Remote Sensing Image Interpretation"
Investigate possibilities for Vision Transformers with multiscale grids