Use GPT-2 for Text-generation
Updated Apr 22, 2023 - Jupyter Notebook
This repository contains NLP transfer-learning projects with deployment and UI integration.
A GPT-2 variant that caps the maximum number of training iterations. (Meant to be integrated with another GPT-2 project I'm developing.)
Poster submitted by Pseudo Lab for ACH2021
Template for using GPT-2 for AI message generation in discord.py bots
A small application to test some functionality of OpenAI's Generative Pre-trained Transformer (GPT-2) model.
Text-generation software that creates a story based on user input.
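Story generation of the kind these projects describe can be reproduced in a few lines. The sketch below assumes the Hugging Face `transformers` library is installed; the `gpt2` model weights are downloaded on first use, and the prompt text is an illustrative placeholder.

```python
# Minimal sketch: generate a story continuation from a user prompt
# with the pretrained GPT-2 model via the transformers pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Once upon a time, a robot learned to"  # stand-in for user input
story = generator(prompt, max_new_tokens=40, do_sample=True)[0]["generated_text"]
print(story)
```

By default the pipeline returns the prompt followed by the sampled continuation; `do_sample=True` gives varied stories, while omitting it produces greedy, deterministic output.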
This repository aims to showcase the potential of machine learning models in generating Python code from user input. The project uses the GitHub API to collect Python repositories, preprocesses the data, trains a GPT-2 language model, and generates Python code with the trained model.
A GPT-2 model built from scratch in PyTorch (inspired by Andrej Karpathy).
This repository contains a transformer implementation in PyTorch, as taught by Andrej Karpathy.
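The core of any from-scratch GPT-2 reimplementation like the two above is causal self-attention: each token attends only to earlier positions. A minimal single-head NumPy sketch (illustrative weight matrices, not the actual repos' code):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # stable softmax
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention, the core op in each GPT-2 block."""
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv           # project to queries/keys/values
    scores = q @ k.T / np.sqrt(d)              # scaled dot-product scores
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -1e9                        # causal mask: hide future tokens
    return softmax(scores) @ v                 # weighted sum of values

rng = np.random.default_rng(0)
T, d = 5, 8                                    # toy sequence length and width
x = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (5, 8)
```

Because of the mask, the first token can attend only to itself, so its output is exactly its own value vector; GPT-2 proper adds multiple heads, learned projections, residual connections, and layer norm around this operation.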