gpt-2
Here are 742 public repositories matching this topic...
This repository contains demos I made with the Transformers library by HuggingFace.
Updated Jun 11, 2024 - Jupyter Notebook
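The demos above center on generation with GPT-2-style models, whose core loop is autoregressive decoding: feed the sequence in, pick a next token, append, repeat. A minimal greedy-decoding sketch with an invented toy stand-in for the model (`toy_logits` and all names here are illustrative, not from the repository):

```python
import numpy as np

def generate_greedy(next_token_logits, prompt, max_new_tokens, eos_id=None):
    """Autoregressive greedy decoding: repeatedly pick the most likely
    next token and append it to the running sequence."""
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        logits = next_token_logits(tokens)   # shape: (vocab_size,)
        next_id = int(np.argmax(logits))
        tokens.append(next_id)
        if eos_id is not None and next_id == eos_id:
            break
    return tokens

def toy_logits(tokens, vocab_size=5):
    """Toy 'model': always prefers token (last_token + 1) mod vocab_size."""
    logits = np.zeros(vocab_size)
    logits[(tokens[-1] + 1) % vocab_size] = 1.0
    return logits

print(generate_greedy(toy_logits, [0], 4))  # [0, 1, 2, 3, 4]
```

Real GPT-2 decoding typically adds sampling (temperature, top-k) instead of pure argmax, but the loop structure is the same.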
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Updated Jun 11, 2024 - Rust
Fine-tuned GPT-2 transformer model for fake detection
Updated Jun 11, 2024 - Jupyter Notebook
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
Updated Jun 11, 2024 - Python
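The entry above hinges on one idea: replacing attention with a decaying linear recurrence, so each generated token updates a fixed-size state in O(1) instead of re-attending over the whole context. A deliberately simplified, scalar-per-channel sketch of that style of "WKV" recurrence (hypothetical names and a reduced formula for illustration, not the project's actual implementation):

```python
import numpy as np

def rwkv_wkv(k, v, w, u):
    """Simplified per-channel RWKV-style recurrence (scalar channels).
    Keeps two running sums (numerator a, denominator b), so each step
    is O(1) -- this is what gives RWKV fast, RNN-style inference.
    k, v: arrays of shape (T,); w: decay rate >= 0; u: bonus applied
    to the current token."""
    a, b = 0.0, 0.0                     # running weighted sums
    out = np.empty_like(v, dtype=float)
    for t in range(len(k)):
        # current token gets an extra 'u' bonus; history has decayed
        num = a + np.exp(u + k[t]) * v[t]
        den = b + np.exp(u + k[t])
        out[t] = num / den
        # fold the current token into the state, decayed by e^{-w}
        a = np.exp(-w) * (a + np.exp(k[t]) * v[t])
        b = np.exp(-w) * (b + np.exp(k[t]))
    return out
```

Because every output is a positively-weighted average of past values, each `out[t]` stays within the range of the `v` seen so far; the first output is exactly `v[0]`.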
A simple GPT model for practice based on NanoGPT
Updated Jun 9, 2024 - Python
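NanoGPT-style models are stacks of blocks built around causal self-attention, where each position may attend only to itself and earlier positions. An illustrative single-head sketch in NumPy (hypothetical weight names, not the repository's code):

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention as used in GPT-style blocks.
    x: (T, d) token embeddings; Wq/Wk/Wv: (d, d) projections."""
    T, _ = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)  # future positions
    scores[mask] = -np.inf                            # masked out
    # numerically stable softmax over each row
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
W = [rng.normal(size=(d, d)) for _ in range(3)]
y = causal_self_attention(x, *W)
```

The mask is what makes training parallelizable: all positions are computed at once, yet position t never sees tokens after t, so perturbing a later token leaves earlier outputs unchanged.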
Developed an end-to-end app to generate cartoon image descriptions.
Updated Jun 8, 2024 - Jupyter Notebook
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Updated Jun 8, 2024 - Python
Annotations of interesting ML papers I read.
Updated Jun 7, 2024
Evaluating language models based on their strategic game-playing capabilities using chess as a benchmark.
Updated Jun 6, 2024 - Jupyter Notebook
Visual Studio Code client for Tabnine. https://marketplace.visualstudio.com/items?itemName=TabNine.tabnine-vscode
Updated Jun 10, 2024 - TypeScript
🛸 Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
Updated Jun 5, 2024 - Python
GPT-based protein language model for PTM site prediction
Updated Jun 4, 2024 - Jupyter Notebook
A Python-based chatbot project built on autogen and tinygrad, using advanced agents for dynamic conversations and function orchestration to extend traditional chatbot capabilities.
Updated Jun 3, 2024 - Jupyter Notebook
Tabnine Autocomplete AI: JavaScript, Python, TypeScript, PHP, C/C++, HTML/CSS, Go, Java, Ruby, C#, Rust, SQL, Bash, Kotlin, Julia, Lua, OCaml, Perl, Haskell, React
Updated Jun 2, 2024 - Python
Auto-regressive causal language model for molecule (SMILES) and reaction template (SMARTS) generation based on the Hugging Face implementation of OpenAI's GPT-2 transformer decoder model
Updated Jun 1, 2024 - Jupyter Notebook
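For generative chemistry like the molecule-generation entry above, the sampling temperature is the usual knob for trading validity against diversity of the generated SMILES strings. A minimal, generic sketch of temperature sampling over next-token logits (hypothetical names, not the project's code):

```python
import numpy as np

def sample_with_temperature(logits, temperature, rng):
    """Sample a token id from logits after temperature scaling.
    temperature < 1 sharpens the distribution (more conservative output);
    temperature > 1 flattens it (more diverse output)."""
    scaled = np.asarray(logits, dtype=float) / temperature
    probs = np.exp(scaled - scaled.max())   # stable softmax
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

rng = np.random.default_rng(0)
token = sample_with_temperature([0.1, 2.0, 0.3], temperature=0.5, rng=rng)
```

As temperature approaches 0 this degenerates to greedy argmax decoding; very high temperatures approach uniform sampling over the vocabulary.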
This project uses GPT-2 to generate realistic movie reviews from the IMDb dataset. By preprocessing data and fine-tuning the model, we achieved human-like text quality. The model's reviews were evaluated for coherence and diversity, showcasing GPT-2's potential in automated text generation.
Updated Jun 1, 2024 - Jupyter Notebook