# mixture-of-experts

Here are 101 public repositories matching this topic...

The idea for the best LLM currently possible came to me while watching a YouTube video on GaLore, the "sequel" to LoRA, and realizing how groundbreaking that technique is. I had been daydreaming about pretraining my own model; this (probably impossible to implement) concept is a refined version of that model.

  • Updated May 31, 2024
