pydbm.optimization package

Submodules

pydbm.optimization.opt_params module

class pydbm.optimization.opt_params.OptParams

Bases: object

Abstract class of optimization functions.

References

  • Kingma, D. P., & Ba, J. (2014). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980.
  • Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. (2014). Dropout: a simple way to prevent neural networks from overfitting. The Journal of Machine Learning Research, 15(1), 1929-1958.
  • Zaremba, W., Sutskever, I., & Vinyals, O. (2014). Recurrent neural network regularization. arXiv preprint arXiv:1409.2329.
constrain_weight()

Regularization of the weight matrix: repeatedly multiply the weight matrix by 0.9 until $\sum_{j=0}^{n} w_{ji}^2 < weight\_limit$.

Parameters:weight_arr – weight matrix.
Returns:the constrained weight matrix.
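Reading the formula as a per-column constraint on $\sum_{j} w_{ji}^2$, a minimal sketch could look like the following. The standalone-function form is an assumption for illustration; in the library this is a method that reads weight_limit from the instance, and the sketch assumes a float-valued 2D weight_arr and weight_limit > 0.

    import numpy as np

    def constrain_weight(weight_arr, weight_limit):
        # Sketch only: shrink any column whose squared norm
        # sum_{j} w_{ji}^2 meets or exceeds weight_limit by
        # repeatedly multiplying it by 0.9, per the formula above.
        squared_norms = np.nansum(np.square(weight_arr), axis=0)
        while np.any(squared_norms >= weight_limit):
            over_limit = squared_norms >= weight_limit
            weight_arr[:, over_limit] *= 0.9  # modifies weight_arr in place
            squared_norms = np.nansum(np.square(weight_arr), axis=0)
        return weight_arr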
dropout()

Dropout: randomly set the states of units to zero during training (Srivastava et al., 2014).

Parameters:activity_arr – The state of units.
Returns:The state of units after dropout.
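A minimal sketch of the dropout operation the references describe follows. The free-function form and the rng argument are assumptions; in the library, dropout_rate is read from the instance.

    import numpy as np

    def dropout(activity_arr, dropout_rate, rng=None):
        # Sketch only: each unit is kept with probability
        # 1 - dropout_rate and zeroed otherwise
        # (Srivastava et al., 2014).
        if dropout_rate == 0.0:
            return activity_arr
        rng = np.random.default_rng() if rng is None else rng
        mask = rng.binomial(n=1, p=1.0 - dropout_rate, size=activity_arr.shape)
        return activity_arr * mask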
dropout_rate

getter

get_dropout_rate()

getter

get_weight_limit()

getter

optimize()

Return the result of the concrete optimization function.

Parameters:
  • params_dict – list of parameters.
  • grads_arr – np.ndarray of gradients.
  • learning_rate – Learning rate.
Returns:list of optimized parameters.
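To illustrate the contract, a hypothetical concrete subclass implementing plain stochastic gradient descent might look like the sketch below. SGDSketch is not part of the library; the parameter names simply follow the signature above, and only the abstract optimize() is shown.

    from pydbm.optimization.opt_params import OptParams


    class SGDSketch(OptParams):
        # Hypothetical subclass, for illustration only: update each
        # parameter by subtracting its gradient scaled by the
        # learning rate, and return the optimized parameters.
        def optimize(self, params_dict, grads_arr, learning_rate):
            return [
                param - learning_rate * grad
                for param, grad in zip(params_dict, grads_arr)
            ]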

set_dropout_rate()

setter

set_weight_limit()

setter

weight_limit

getter

Module contents