pydbm.optimization.optparams package

Submodules

pydbm.optimization.optparams.adam module

class pydbm.optimization.optparams.adam.Adam

Bases: pydbm.optimization.opt_params.OptParams

Adam.

References

  • Kingma, D. P., & Ba, J. (2014). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980.
optimize()

Return the result of this optimization.

Override.

Parameters:
  • params_dict – list of parameters.
  • grads_list – list of gradients.
  • learning_rate – Learning rate.
Returns:

list of optimized parameters.
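To illustrate what this optimize() computes, below is a minimal sketch of the Adam update rule from the referenced paper (Kingma & Ba, 2014). The function name adam_step, the state dict layout, and the default hyperparameters are illustrative assumptions for this sketch, not pydbm's internal implementation.

```python
import math

def adam_step(params, grads, state, learning_rate=0.001,
              beta_1=0.9, beta_2=0.999, epsilon=1e-08):
    """One Adam update over parallel lists of parameters and gradients.

    `state` carries the moment estimates between calls; initialize it as
    {"t": 0, "m": [0.0] * n, "v": [0.0] * n} for n parameters.
    (Hypothetical helper, not part of pydbm.)
    """
    state["t"] += 1
    t = state["t"]
    new_params = []
    for i, (p, g) in enumerate(zip(params, grads)):
        # Exponential moving averages of the gradient and its square.
        state["m"][i] = beta_1 * state["m"][i] + (1.0 - beta_1) * g
        state["v"][i] = beta_2 * state["v"][i] + (1.0 - beta_2) * g * g
        # Bias correction compensates for the zero-initialized moments.
        m_hat = state["m"][i] / (1.0 - beta_1 ** t)
        v_hat = state["v"][i] / (1.0 - beta_2 ** t)
        new_params.append(p - learning_rate * m_hat / (math.sqrt(v_hat) + epsilon))
    return new_params
```

On the first step with a constant gradient, the bias correction makes the effective step size equal to the learning rate, which is why Adam moves at roughly learning_rate per step early in training.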

pydbm.optimization.optparams.sgd module

class pydbm.optimization.optparams.sgd.SGD

Bases: pydbm.optimization.opt_params.OptParams

Stochastic Gradient Descent.

References

  • Kingma, D. P., & Ba, J. (2014). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980.
optimize()

Return the result of this optimization.

Override.

Parameters:
  • params_dict – list of parameters.
  • grads_list – list of gradients.
  • learning_rate – Learning rate.
Returns:

list of optimized parameters.
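For comparison with Adam above, a minimal sketch of the plain stochastic gradient descent update. The function name sgd_step is an illustrative assumption for this sketch, not pydbm's internal implementation.

```python
def sgd_step(params, grads, learning_rate=0.01):
    """One vanilla SGD update: move each parameter against its gradient.

    (Hypothetical helper, not part of pydbm.)
    """
    return [p - learning_rate * g for p, g in zip(params, grads)]
```

Unlike Adam, this update keeps no per-parameter state, so every step scales directly with the raw gradient magnitude and the learning rate.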

Module contents