Code for implementing **V**ariational **E**ntropy **R**egularized **A**pproximate maximum likelihood (VERA). Contains scripts for training VERA and for using VERA in [JEM](https://github.com/wgrathwohl/JEM) training. Code is also available for training semi-supervised models on tabular data, mode-counting experiments, and tractable likelihood model experiments.
For more info on me and my work, please check out my [website](http://www.cs.toronto.edu/~wgrathwohl/), [twitter](https://twitter.com/wgrathwohl), or [Google Scholar](https://scholar.google.ca/citations?user=ZbClz98AAAAJ&hl=en).
### Hyperparameters
A brief explanation of hyperparameters that can be set from flags and their names in the paper.
- `--clf_weight` Classification weight (`\alpha` in the paper)
- `--pg_control` Gradient norm penalty (`\gamma` in the paper)
- `--ent_weight` Entropy regularization weight (`\lambda` in the paper)
- `--clf_ent_weight` Classification entropy weight (`\beta` in the paper)
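As a rough sketch of how these weights might enter the training objective (the structure and term names below are illustrative assumptions, not the repository's actual code — the real loss composition lives in `train.py`):

```python
# Hedged sketch: one plausible way the flag weights combine into the
# training losses. All function and argument names here are hypothetical.

def ebm_loss(ml_loss, clf_loss, grad_penalty, clf_ent,
             clf_weight=1.0, pg_control=0.1, clf_ent_weight=0.0):
    """Combine the approximate maximum-likelihood term with auxiliaries.

    clf_weight     -> alpha  (--clf_weight)
    pg_control     -> gamma  (--pg_control)
    clf_ent_weight -> beta   (--clf_ent_weight)
    """
    return (ml_loss
            + clf_weight * clf_loss
            + pg_control * grad_penalty
            + clf_ent_weight * clf_ent)


def generator_loss(model_logdensity, entropy_estimate, ent_weight=1.0):
    """Generator ascends model density plus a lambda-weighted entropy
    term (ent_weight -> lambda, --ent_weight); negated for minimization."""
    return -(model_logdensity + ent_weight * entropy_estimate)
```

Setting a weight to zero simply drops the corresponding term, which matches how the flags default to plain unsupervised training.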
An explanation of flags for different modes of training. Without any of these flags, an unsupervised VERA model will be trained.
- `--clf_only` For training a classifier on its own, i.e. without an EBM as in JEM.
- `--jem` Do JEM training.
- `--labels_per_class` If this is greater than zero, use this many labels per class for semi-supervised learning. If zero (default), do full-label training.
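Assuming `train.py` is the entry point (as in the training command shown in this README), the mode flags might be combined like this. These invocations are illustrative only; dataset choices and label counts are placeholders, not the paper's settings.

```shell
# Unsupervised VERA (no mode flag):
python train.py --dataset cifar10

# JEM training:
python train.py --dataset cifar10 --jem

# Semi-supervised JEM with 10 labels per class (placeholder count):
python train.py --dataset cifar10 --jem --labels_per_class 10

# Classifier-only baseline, no EBM:
python train.py --dataset cifar10 --clf_only
```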
To train a CIFAR10/CIFAR100 JEM model as in the paper, run:
```bash
python train.py --dataset DATASET  # cifar10 or cifar100
```