
SFAVEL: Unsupervised Pretraining for Fact Verification by Language Model Distillation

This is the official implementation of the ICLR 2024 paper "Unsupervised Pretraining for Fact Verification by Language Model Distillation".

Code is coming soon, stay tuned!

Citation

@inproceedings{bazaga2024unsupervised,
  title={Unsupervised Pretraining for Fact Verification by Language Model Distillation},
  author={Adrián Bazaga and Pietro Liò and Gos Micklem},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=1mjsP8RYAw}
}

Contact

For feedback, questions, or press inquiries, please contact Adrián Bazaga.