Bachelor Thesis comparing two Relation-Inference Datasets
Updated Sep 25, 2017 - Python
Fine-tune a language model for the task of natural language inference.
Mitigating a language model's over-confidence in NLI predictions on Multi-NLI hypotheses with randomized word order, using PAWS (paraphrase) and Winogrande (anaphora).
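The word-order perturbation behind such probes can be sketched in a few lines: shuffle a hypothesis's words while keeping its label, then check whether the model's confidence drops. This is an illustrative sketch, not the repository's actual code; whitespace tokenization is a simplifying assumption.

```python
import random

def shuffle_word_order(hypothesis, rng=None):
    """Return the hypothesis with its words in a random order (NLI label unchanged).

    Whitespace tokenization is an assumption made for illustration only.
    """
    rng = rng or random.Random()
    words = hypothesis.split()
    rng.shuffle(words)
    return " ".join(words)
```

Feeding such shuffled hypotheses to an NLI model reveals whether its predictions rely on word order or merely on lexical overlap.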
We augmented an existing BERT Tiny Transformer network, designed for training on the Google Natural Questions (NQ) dataset, to randomly replace some of the tokens in a question with synonyms. The idea comes from the data-augmentation process used in computer-vision pipelines. This experiment directly tackles the concepts of Natural Language Inference and…
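The synonym-replacement augmentation described above can be sketched as follows. The synonym table and the replacement probability `p` are illustrative assumptions (a real pipeline might draw synonyms from WordNet); this is not the repository's actual implementation.

```python
import random

# Illustrative synonym table -- an assumption for this sketch,
# not the actual synonym source used by the repository.
SYNONYMS = {
    "big": ["large", "huge"],
    "quick": ["fast", "rapid"],
    "film": ["movie"],
}

def augment_question(tokens, p=0.3, rng=None):
    """Randomly replace some question tokens with a synonym.

    Each token that has synonyms is replaced with probability p;
    all other tokens pass through unchanged.
    """
    rng = rng or random.Random()
    out = []
    for tok in tokens:
        choices = SYNONYMS.get(tok.lower())
        if choices and rng.random() < p:
            out.append(rng.choice(choices))
        else:
            out.append(tok)
    return out
```

As with image augmentation, the label is preserved while the surface form varies, which can reduce over-fitting to specific wordings.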
Natural Language Inference, SNLI dataset
Re-implementation of BiMPM (Bilateral Multi-Perspective Matching for Natural Language Sentences).
This repository is an implementation of the paper "Robust Natural Language Inference Models with Example Forgetting", which studies the importance of forgettable examples in building robust NLI models.
XNLIeu: a dataset for cross-lingual NLI in Basque
Assignments and projects from the interpretable natural language processing course offered at the University of Tehran.
This repository is intended to help understand the attention mechanism for classification tasks. The task used here for illustration is Recognizing Textual Entailment, a Natural Language Inference task.
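The core of attention-based classification is attention-weighted pooling: score each token vector against a query, softmax the scores, and average the vectors by those weights. The pure-Python sketch below illustrates the mechanism only; the function names and the scaled dot-product scoring are assumptions for this example, not the repository's code.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(token_vecs, query):
    """Pool token vectors into one vector via scaled dot-product attention.

    Returns the pooled vector and the attention weights, which sum to 1
    and can be inspected to see which tokens the model attends to.
    """
    scale = math.sqrt(len(query))
    scores = [sum(q * t for q, t in zip(query, vec)) / scale
              for vec in token_vecs]
    weights = softmax(scores)
    dim = len(token_vecs[0])
    pooled = [sum(w * vec[d] for w, vec in zip(weights, token_vecs))
              for d in range(dim)]
    return pooled, weights
```

For entailment classification, the pooled vector would feed a classifier head, while the weights provide the per-token attention map that such repositories visualize.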
Fine-tune transformers with NLI data
This repository contains the data, models, analysis scripts, experiment materials, and environments used for my MA thesis.
📃⚖️🙅 Pipeline to detect contradictions in policy documents using NLP and transformers.
The objective here is to study the plausibility of attention mechanisms on a natural language inference (NLI) task in a transformer (BERT) architecture.