Adaptive Divergence for Rapid Adversarial Optimization

This repository contains the experiments for the study "Adaptive Divergence for Rapid Adversarial Optimization".

Installation

This repository relies on several third-party libraries.

Among the non-default packages, the PythiaMill library requires manual installation; please follow the instructions in its repository.

The remaining packages are available from the default pip repository, and the required versions are specified in setup.py.
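For example, the Python dependencies can be installed with pip (a sketch assuming a standard setuptools layout; PythiaMill still has to be installed separately, as noted above):

```bash
git clone https://github.com/HSE-LAMBDA/rapid-ao.git
cd rapid-ao
pip install -e .  # installs the versions required by setup.py
```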

Experiments

Jupyter notebooks with the experiments described in the paper can be found in the notebooks/ directory:

  • AD-<task name>-<method name>.ipynb --- notebooks profiling adaptive divergences on the synthetic tasks;
  • BO-XOR-GBDT.ipynb --- the experiment with Bayesian Optimization over GBDT-based adaptive divergences on one of the synthetic tasks (see the sketch after this list);
  • BO-PythiaTuneMC-Cat.ipynb --- tuning Pythia hyper-parameters with Bayesian Optimization and CatBoost-based adaptive divergences;
  • plot-AVO.ipynb --- visualization of the AVO results.
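As a rough illustration of what a GBDT-based adaptive divergence measures, the sketch below trains a gradient-boosting classifier to separate two samples and turns its held-out cross-entropy into a Jensen-Shannon divergence proxy. This is a minimal sketch of the general classifier-based technique, not the repository's implementation; the helper jsd_proxy and its parameters are hypothetical.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

def jsd_proxy(sample_p, sample_q, n_estimators=100, seed=0):
    """Hypothetical helper: classifier-based Jensen-Shannon divergence proxy.

    Trains a GBDT to distinguish sample_p (label 1) from sample_q (label 0)
    and returns log 2 minus the held-out binary cross-entropy, which
    lower-bounds JSD(P, Q).
    """
    X = np.vstack([sample_p, sample_q])
    y = np.concatenate([np.ones(len(sample_p)), np.zeros(len(sample_q))])
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.5, stratify=y, random_state=seed)
    clf = GradientBoostingClassifier(
        n_estimators=n_estimators, random_state=seed).fit(X_tr, y_tr)
    proba = clf.predict_proba(X_te)[:, 1]
    # log_loss uses the natural logarithm, matching the log 2 constant.
    return np.log(2.0) - log_loss(y_te, proba)

# Example: two nearly identical Gaussian samples give a proxy close to zero.
rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, size=(2000, 2))
b = rng.normal(0.1, 1.0, size=(2000, 2))
print(jsd_proxy(a, b))
```

Loosely speaking, raising the classifier capacity (here, n_estimators) tightens the proxy toward the true JSD, while lowering it yields a smoother, cheaper surrogate --- the trade-off that adaptive divergences navigate.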

Code for the experiments involving AVO (Adversarial Variational Optimization) can be found in experiments/AVO.py.

Note: inside the package, adaptive divergences may be referred to as 'pseudo-Jensen-Shannon divergences' or 'pJSD'.
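For context, such classifier-based quantities rest on the standard identity linking the optimal discriminator to the Jensen-Shannon divergence (a well-known relation, not a definition taken from this repository):

$$
\mathrm{JSD}(P, Q) = \log 2 + \tfrac{1}{2}\,\max_{D}\Big(\mathbb{E}_{x \sim P}[\log D(x)] + \mathbb{E}_{x \sim Q}\big[\log\big(1 - D(x)\big)\big]\Big)
$$

Restricting the maximization to a fixed classifier family (for example, GBDTs with a limited number of trees) turns the equality into a lower bound, which is presumably the sense in which these divergences are 'pseudo'.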
