Code that reproduces the experiments in the technical report:
G. Papamakarios, Comparison of Modern Stochastic Optimization Algorithms, Technical report, University of Edinburgh, 2014. [pdf] [bibtex]
The experiments benchmark four optimization algorithms on two convex problems. The algorithms are:
- Batch gradient descent
- Stochastic gradient descent
- Semi-stochastic gradient descent
- Stochastic average gradient
And the tasks are:
- Logistic regression on synthetic data
- Softmax regression on the MNIST dataset of handwritten digits
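
For orientation, here is a minimal, generic sketch of one pass of stochastic gradient descent on the binary logistic regression objective. It illustrates the kind of update being benchmarked and is not the implementation in the `opt` folder; the data `X`, labels `y`, step size `eta`, and initial weights `w` are made up for the example.

```matlab
% Generic SGD sketch for binary logistic regression (illustration only;
% the actual implementations live in the opt folder).
N = 1000; D = 10;                       % example sizes
X = randn(N, D);                        % example datapoints, one per row
y = double(rand(N, 1) > 0.5);           % example labels in {0, 1}
w = zeros(D, 1);                        % initial weights
eta = 0.1;                              % step size (example value)
sigmoid = @(a) 1 ./ (1 + exp(-a));

for i = randperm(N)                     % one pass over shuffled datapoints
    x = X(i, :)';                       % single datapoint, D x 1
    g = (sigmoid(w' * x) - y(i)) * x;   % gradient of the log loss at this point
    w = w - eta * g;                    % stochastic gradient step
end
```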
- First, run `install.m` to add all necessary paths to MATLAB's path. Then all scripts and functions in this folder will become executable. (A sketch of what such a script typically does is given after this list.)
- Run `gen_synth_data.m` to generate a synthetic dataset for logistic regression. Modify parameters `N` and `D` to change the number of datapoints and dimensions respectively.
- Run `benchmark_logistic_synth.m` to benchmark all algorithms on the synthetic dataset. Results are written in the `results` folder. (A full example session is sketched after this list.)
- Download the following files from the MNIST website:
  - `train-images-idx3-ubyte.gz`
  - `train-labels-idx1-ubyte.gz`
  - `t10k-images-idx3-ubyte.gz`
  - `t10k-labels-idx1-ubyte.gz`
- Unzip them, place them in the folder `data/mnist`, and run `prepare_mnist_data.m`. (MATLAB can do the unzipping itself; see the sketch after this list.)
- Run `benchmark_softmax_mnist.m` to benchmark all algorithms on MNIST. Results are written in the `results` folder.
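
The contents of `install.m` are not reproduced here, but a path-setup script of this kind typically amounts to something like the following (a sketch using the standard MATLAB path functions; see the actual file for the authoritative version):

```matlab
% Sketch of a typical path-setup script (what install.m plausibly does).
addpath(genpath(pwd));   % add this folder and all its subfolders to the path
```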
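
Putting the synthetic-data steps together, a session at the MATLAB prompt would look roughly like this (the `N` and `D` values mentioned are placeholders; edit them inside `gen_synth_data.m`):

```matlab
% Benchmark on synthetic data, run from the repository root.
install                      % add all folders to MATLAB's path
% Optionally edit N (datapoints) and D (dimensions) in gen_synth_data.m,
% e.g. N = 10000; D = 100;   (placeholder values)
gen_synth_data               % generate the synthetic dataset
benchmark_logistic_synth     % run all four algorithms; output goes to results
```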
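
For the MNIST steps, MATLAB's built-in `gunzip` can unpack the downloaded files directly into `data/mnist`. A sketch, assuming the four `.gz` files sit in the current folder:

```matlab
% Unpack the MNIST files into data/mnist, then run the benchmark.
files = {'train-images-idx3-ubyte.gz', 'train-labels-idx1-ubyte.gz', ...
         't10k-images-idx3-ubyte.gz', 't10k-labels-idx1-ubyte.gz'};
for k = 1:numel(files)
    gunzip(files{k}, 'data/mnist');   % extract each file into data/mnist
end
prepare_mnist_data                    % prepare the raw files for the benchmarks
benchmark_softmax_mnist               % run all four algorithms; output goes to results
```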
The repository is organized as follows:

- `install.m`: the script you need to run before you do anything else.
- `opt`: contains implementations of the four optimization algorithms: GD, SGD, S2GD and SAG.
- `data`: contains scripts for generating synthetic data and preparing the MNIST data. These datasets are needed for the benchmarks.
- `benchmarks`: contains scripts that run the experiments. Datasets must have been generated first. These scripts save and plot results.
- `util`: some utility functions used throughout the project.