fmin_adam - Matlab implementation of the Adam stochastic gradient descent optimisation algorithm

  •    Matlab

fmin_adam is a Matlab implementation of the Adam optimisation algorithm (gradient descent with adaptive learning rates for each individual parameter, plus momentum) from Kingma and Ba [1]. It maintains running estimates of the first and second moments of the gradient independently for each parameter. Adam is designed for stochastic gradient descent problems, i.e. when only small batches of data are used to estimate the gradient on each iteration, or when stochastic dropout regularisation is used [2].
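
For illustration, the core per-parameter update can be sketched in a few lines of Matlab. This is a minimal, self-contained sketch of the Adam update rule from [1] applied to a toy quadratic objective, not the fmin_adam interface itself; the hyperparameter values are the defaults suggested in [1] and the objective is invented for the example.

    % Sketch of the Adam update rule [1] on a toy quadratic objective.
    % Hyperparameter defaults as suggested in Kingma and Ba [1].
    alpha = 0.001; beta1 = 0.9; beta2 = 0.999; epsilon = 1e-8;

    gradFun = @(x) 2 * (x - 3);   % gradient of the toy objective f(x) = (x - 3).^2

    x = zeros(2, 1);              % parameter vector
    m = zeros(size(x));           % first-moment (mean) estimate of the gradient
    v = zeros(size(x));           % second-moment (uncentred variance) estimate

    for t = 1:5000
        g = gradFun(x);                       % (stochastic) gradient estimate
        m = beta1 * m + (1 - beta1) * g;      % update biased first-moment estimate
        v = beta2 * v + (1 - beta2) * g.^2;   % update biased second-moment estimate
        mHat = m / (1 - beta1^t);             % bias-corrected first moment
        vHat = v / (1 - beta2^t);             % bias-corrected second moment
        x = x - alpha * mHat ./ (sqrt(vHat) + epsilon);  % per-parameter step
    end

    disp(x)   % converges towards the minimiser [3; 3]

Because sqrt(vHat) scales each step element-wise, every parameter effectively receives its own learning rate, which is what makes Adam robust when gradient magnitudes vary across parameters or between mini-batches.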