Alec Radford’s animations for optimization algorithms

Alec Radford has created some great animations comparing optimization algorithms (SGD, Momentum, NAG, Adagrad, Adadelta, and RMSprop; unfortunately no Adam) on low-dimensional problems. Also check out his presentation on RNNs.
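
For reference, the per-parameter update rules behind those optimizers fit in a few lines of NumPy. The sketch below is a minimal illustration, not the code behind the animations, and the hyperparameter defaults (learning rate, momentum, decay rates, epsilon) are illustrative choices only.

```python
import numpy as np

# Minimal per-parameter update rules for the optimizers compared in the
# animations. Hyperparameters (lr, mu, rho, eps) are illustrative defaults,
# not the settings used for the animations.

def sgd(x, grad, state, lr=0.01):
    # Vanilla SGD: step directly down the (minibatch) gradient.
    return x - lr * grad(x), state

def momentum(x, grad, state, lr=0.01, mu=0.9):
    # Classical momentum: accumulate a velocity that smooths noisy gradients.
    v = mu * state.get("v", 0.0) - lr * grad(x)
    state["v"] = v
    return x + v, state

def nag(x, grad, state, lr=0.01, mu=0.9):
    # Nesterov accelerated gradient: evaluate the gradient at the look-ahead point.
    v_prev = state.get("v", 0.0)
    v = mu * v_prev - lr * grad(x + mu * v_prev)
    state["v"] = v
    return x + v, state

def adagrad(x, grad, state, lr=0.1, eps=1e-8):
    # Adagrad: divide by the root of the accumulated squared gradients.
    g = grad(x)
    state["cache"] = state.get("cache", 0.0) + g ** 2
    return x - lr * g / (np.sqrt(state["cache"]) + eps), state

def rmsprop(x, grad, state, lr=0.01, rho=0.9, eps=1e-8):
    # RMSprop: like Adagrad, but with a decaying (leaky) average of squared gradients.
    g = grad(x)
    state["cache"] = rho * state.get("cache", 0.0) + (1 - rho) * g ** 2
    return x - lr * g / (np.sqrt(state["cache"]) + eps), state

def adadelta(x, grad, state, rho=0.95, eps=1e-6):
    # Adadelta: rescale by the ratio of running RMS of past updates to gradients.
    g = grad(x)
    state["Eg"] = rho * state.get("Eg", 0.0) + (1 - rho) * g ** 2
    dx = -np.sqrt(state.get("Edx", 0.0) + eps) / np.sqrt(state["Eg"] + eps) * g
    state["Edx"] = rho * state.get("Edx", 0.0) + (1 - rho) * dx ** 2
    return x + dx, state

# Example: RMSprop steps on f(x) = x**2 (gradient 2x).
x, state = 5.0, {}
for _ in range(1000):
    x, state = rmsprop(x, lambda x: 2 * x, state)
print(x)  # ends up close to the minimum at x = 0
```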

Noisy moons: “This is logistic regression on the noisy moons dataset from sklearn, which shows the smoothing effects of momentum-based techniques (which also result in overshooting and correction). The error surface is visualized as an average over the whole dataset empirically, but the trajectories show the dynamics of minibatches on noisy data. The bottom chart is an accuracy plot.”
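
As a rough sketch of that setup, here is logistic regression on sklearn's make_moons trained with minibatch SGD plus momentum. The noise level, batch size, learning rate, and momentum coefficient are assumptions rather than the animation's settings; the point is just that the minibatch gradients are noisy and the momentum term smooths (and occasionally overshoots) them.

```python
import numpy as np
from scipy.special import expit
from sklearn.datasets import make_moons

# Noisy two-class data; the noise level and sample count here are guesses.
X, y = make_moons(n_samples=500, noise=0.3, random_state=0)
X = np.hstack([X, np.ones((len(X), 1))])  # append a bias column

w = np.zeros(X.shape[1])
v = np.zeros_like(w)
lr, mu, batch = 0.1, 0.9, 32
rng = np.random.default_rng(0)

for step in range(2000):
    idx = rng.choice(len(y), size=batch, replace=False)
    p = expit(X[idx] @ w)                   # predicted probabilities
    grad = X[idx].T @ (p - y[idx]) / batch  # noisy minibatch gradient of the log loss
    v = mu * v - lr * grad                  # momentum averages out the minibatch noise...
    w = w + v                               # ...but can also overshoot and swing back

acc = np.mean((expit(X @ w) > 0.5) == y)    # the quantity shown in the accuracy plot
print(f"training accuracy: {acc:.3f}")
```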

Beale’s function: “Due to the large initial gradient, velocity-based techniques shoot off and bounce around – adagrad almost goes unstable for the same reason. Algos that scale gradients/step sizes like adadelta and RMSprop proceed more like accelerated SGD and handle large gradients with more stability.”
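
A small numeric sketch of that scaling effect: at an assumed steep starting point on Beale's function, a raw gradient (or velocity) step is enormous, while an RMSprop-style step, which divides the gradient by a running RMS of its own magnitude, stays bounded. The starting point and learning rates below are illustrative, not the animation's values.

```python
import numpy as np

def beale_grad(p):
    # Gradient of Beale's function
    # f(x, y) = (1.5 - x + xy)^2 + (2.25 - x + xy^2)^2 + (2.625 - x + xy^3)^2
    x, y = p
    t1 = 1.5 - x + x * y
    t2 = 2.25 - x + x * y ** 2
    t3 = 2.625 - x + x * y ** 3
    dx = 2 * t1 * (y - 1) + 2 * t2 * (y ** 2 - 1) + 2 * t3 * (y ** 3 - 1)
    dy = 2 * t1 * x + 2 * t2 * (2 * x * y) + 2 * t3 * (3 * x * y ** 2)
    return np.array([dx, dy])

p0 = np.array([3.0, 4.0])   # an assumed starting point with a very steep gradient
g = beale_grad(p0)
lr, rho, eps = 0.01, 0.9, 1e-8

# Plain gradient/velocity step: proportional to the raw gradient, so it is huge here.
raw_step = lr * g

# RMSprop-style step: dividing by the running RMS of the gradient bounds the
# very first step regardless of how large |g| is.
cache = (1 - rho) * g ** 2
scaled_step = lr * g / (np.sqrt(cache) + eps)

print("gradient norm:   ", np.linalg.norm(g))
print("raw step norm:   ", np.linalg.norm(raw_step))
print("scaled step norm:", np.linalg.norm(scaled_step))
```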
