
Analysis of classifiers robust to noisy labels

Paper: arXiv:2106.00274

Code: Open In Colab

We explore contemporary robust classification algorithms for overcoming class-dependent label noise: Forward correction, Importance Reweighting, and T-revision. The classifiers are trained and evaluated on data with class-conditional random label noise, while the final test data is clean. We demonstrate methods for estimating the transition matrix in order to obtain better classifier performance when working with noisy data. We apply deep learning to three datasets and derive an end-to-end analysis, with unknown noise, on the CIFAR dataset from scratch. We analyse the effectiveness and robustness of each classifier and compare the results of each experiment using top-1 accuracy as our criterion.
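The core pipeline described above — estimate the transition matrix T, then train against the noisy labels with a corrected loss — can be sketched in a few lines of NumPy. This is an illustrative sketch of the standard formulations, not the repository's implementation: the anchor-point estimator and the function names below are our own, and a real training loop would use a deep-learning framework rather than raw NumPy.

```python
import numpy as np

def softmax(logits):
    # Numerically stable row-wise softmax.
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def estimate_T(noisy_posteriors):
    """Anchor-point estimate of the transition matrix.

    For each clean class i, take the instance the noisy-label classifier
    is most confident about for class i and use its full noisy posterior
    as row i of T, where T[i, j] = P(noisy label = j | clean label = i).
    """
    num_classes = noisy_posteriors.shape[1]
    T = np.empty((num_classes, num_classes))
    for i in range(num_classes):
        anchor = np.argmax(noisy_posteriors[:, i])  # most confident class-i instance
        T[i] = noisy_posteriors[anchor]
    return T

def forward_loss(logits, noisy_labels, T):
    """Forward correction: map the model's clean-class posteriors through T
    and take cross-entropy against the observed noisy labels."""
    noisy_probs = softmax(logits) @ T           # implied noisy-label posteriors
    n = len(noisy_labels)
    return -np.mean(np.log(noisy_probs[np.arange(n), noisy_labels] + 1e-12))
```

With T equal to the identity this reduces to ordinary cross-entropy; T-revision then goes further by refining the estimated T with a learned slack variable during training.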

Authors

Alex Díaz
Damian Steele

Cite

@misc{díaz2021analysis,
      title={Analysis of classifiers robust to noisy labels}, 
      author={Alex Díaz and Damian Steele},
      year={2021},
      eprint={2106.00274},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}
