Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Text matching using several deep models.
Generating English Rock lyrics using BERT
Quickly fine-tune language models for your downstream NLP tasks.
CS747 - Foundations Of Intelligent Learning Agents (FILA) Course Project
Semantic Textual Similarity between two documents
French-to-English machine translation: a natural language processing (NLP) Transformer model from "Attention Is All You Need"
Detection of MBTI personality type with NLP and deep learning
Universal Transforming Geometric Network for protein structure prediction.
Code for the paper "NABU - Multilingual Graph-based Neural RDF Verbalizer"
Repository for Homework 3 of the Neural Networks and Deep Learning course @ UniPD (second version): a Transformer for text generation.
Deep Learning Course Assignment on Image Captioning and Machine Translation using LSTMs
Unbiased toxicity detection from comments
Topic modeling using a combination of BERT and LDA.
Objective: predict whether a customer will return in the next month. Techniques used: XGBoost, logistic regression, an attention-based LSTM neural network, and a self-attention-based Transformer neural network.
Julia experimentation using sequence-based NLP models
A PyTorch implementation of TopicTransformer for language modeling
A visualization method for multi-head attention (MHA) trained on time-series data, intended to improve human interpretability
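The projects above share the same building block, a Transformer encoder. For reference only, here is a minimal, generic sketch of that pattern using PyTorch's built-in nn.TransformerEncoder; the model, hyperparameters, and classification head are illustrative assumptions and are not taken from any repository listed.

```python
# Minimal transformer-encoder sketch (illustrative only, not code from any listed repo).
import torch
import torch.nn as nn


class EncoderClassifier(nn.Module):
    """Toy text classifier: token embeddings -> transformer encoder -> mean pool -> linear head."""

    def __init__(self, vocab_size=10000, d_model=128, nhead=4, num_layers=2, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, token_ids, padding_mask=None):
        # token_ids: (batch, seq_len); padding_mask: True where a position is padding.
        x = self.embed(token_ids)
        x = self.encoder(x, src_key_padding_mask=padding_mask)
        return self.head(x.mean(dim=1))  # mean-pool over the sequence dimension


if __name__ == "__main__":
    model = EncoderClassifier()
    tokens = torch.randint(0, 10000, (8, 32))  # fake batch: 8 sequences of 32 token ids
    logits = model(tokens)
    print(logits.shape)  # torch.Size([8, 2])
```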