Adam stochastic gradient descent optimization

version 1.0.0.0 (109 KB) by Dylan Muir
MATLAB implementation of the Adam stochastic gradient descent optimisation algorithm

11 Downloads

Updated 16 Aug 2017

View license on GitHub

`fmin_adam` is an implementation of the Adam optimisation algorithm (gradient descent with adaptive per-parameter learning rates and momentum) from Kingma and Ba [1]. Adam is designed for stochastic gradient descent problems, i.e. when only small minibatches of data are used to estimate the gradient on each iteration, or when stochastic dropout regularisation is used [2].
See the GitHub repository for examples:
https://github.com/DylanMuir/fmin_adam
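
For reference, the core Adam update described in [1] can be sketched in a few lines of plain MATLAB. This is an illustrative toy example (minimising sum(x.^2) with a hand-coded gradient and the default hyperparameters from the paper), not the internals of `fmin_adam` itself:

```matlab
% Illustrative sketch of the Adam update rule from [1] on a toy problem.
% Not the packaged fmin_adam function; variable names are only for exposition.
stepSize = 0.001; beta1 = 0.9; beta2 = 0.999; epsilon = 1e-8;

x = randn(10, 1);        % initial parameters
m = zeros(size(x));      % first moment estimate (momentum term)
v = zeros(size(x));      % second moment estimate (per-parameter scaling)

for t = 1:1000
   g = 2 * x;                                          % gradient of sum(x.^2)
   m = beta1 * m + (1 - beta1) * g;                    % biased first moment
   v = beta2 * v + (1 - beta2) * g.^2;                 % biased second moment
   mHat = m / (1 - beta1^t);                           % bias correction
   vHat = v / (1 - beta2^t);
   x = x - stepSize * mHat ./ (sqrt(vHat) + epsilon);  % parameter update
end
```

The division by `sqrt(vHat)` is what gives each parameter its own effective learning rate, while `m` provides the momentum term mentioned above.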

Usage:
`[x, fval, exitflag, output] = fmin_adam(fun, x0 <, stepSize, beta1, beta2, epsilon, nEpochSize, options>)`

See the function help for a detailed reference. The GitHub repository has a couple of worked examples.
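
As a rough sketch of the calling convention, the following minimises a simple quadratic. It assumes the objective handle returns the cost and, as a second output, its analytical gradient, and that the step size can be passed positionally as in the usage line above; see the function help and the repository examples for the authoritative interface.

```matlab
% Minimal sketch (assumption: fun returns [cost, gradient] for the current
% parameter vector, and stepSize may be supplied positionally).
fun = @(x) deal(sum(x.^2), 2 * x);   % toy quadratic cost and its gradient

x0 = randn(5, 1);                    % initial parameter estimate
stepSize = 0.01;                     % Adam step size (learning rate)

[x, fval, exitflag, output] = fmin_adam(fun, x0, stepSize);
```

For the stochastic case described above, the repository examples illustrate how the objective can be evaluated over a different minibatch of data on each iteration.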

References:
[1] Diederik P. Kingma and Jimmy Ba. "Adam: A Method for Stochastic Optimization." ICLR 2015. [https://arxiv.org/abs/1412.6980](https://arxiv.org/abs/1412.6980)

[2] Geoffrey E. Hinton, Nitish Srivastava, Alex Krizhevsky, Ilya Sutskever, and Ruslan R. Salakhutdinov. "Improving neural networks by preventing co-adaptation of feature detectors." arXiv preprint arXiv:1207.0580, 2012. [https://arxiv.org/abs/1207.0580](https://arxiv.org/abs/1207.0580)

Cite As

Dylan Muir (2020). Adam stochastic gradient descent optimization (https://github.com/DylanMuir/fmin_adam), GitHub. Retrieved .

Comments and Ratings (1)

This implementation attempts to reject "bad steps". I believe this is wrong.

Updates

1.0.0.0

Updated description

1.0.0.0

Updated title

MATLAB Release Compatibility
Created with R2016b
Compatible with any release
Platform Compatibility
Windows macOS Linux