pydbm.approximation.rtrbmcd package
Submodules
pydbm.approximation.rtrbmcd.lstm_rt_rbm_cd module
class pydbm.approximation.rtrbmcd.lstm_rt_rbm_cd.LSTMRTRBMCD
Bases: pydbm.approximation.rt_rbm_cd.RTRBMCD

LSTM-RTRBM based on Contrastive Divergence.

Conceptually, the positive phase is to the negative phase what waking is to sleeping.

The LSTM-RTRBM model integrates the ability of the LSTM to memorize and retrieve useful history information with the advantage of the RBM in modeling high-dimensional data (Lyu, Q., Wu, Z., Zhu, J., & Meng, H., 2015, June). Like the RTRBM, the LSTM-RTRBM also has recurrent hidden units.
Parameters:
- graph.weights_arr – $W$ (connection between $v^{(t)}$ and $h^{(t)}$)
- graph.visible_bias_arr – $b_v$ (bias in the visible layer)
- graph.hidden_bias_arr – $b_h$ (bias in the hidden layer)
- graph.rnn_hidden_weights_arr – $W'$ (connection between $h^{(t-1)}$ and $b_h^{(t)}$)
- graph.rbm_hidden_weights_arr – $W_{R}$ (connection between $h^{(t-1)}$ and $h^{(t)}$)
- graph.hat_hidden_activity_arr – $\hat{h}^{(t)}$ (hidden units of the RNN)
- graph.hidden_activity_arr_list – $\hat{h}^{(t)}$ for $t = 0, 1, \ldots$
- graph.v_hat_weights_arr – $W_2$ (connection between $v^{(t)}$ and $\hat{h}^{(t)}$)
- graph.hat_weights_arr – $W_3$ (connection between $\hat{h}^{(t-1)}$ and $\hat{h}^{(t)}$)
- graph.rnn_hidden_bias_arr – $b_{\hat{h}}$ (bias of the RNN hidden layer)

The RNN hidden state is updated as

$$\hat{h}^{(t)} = \sigma\left(W_2 v^{(t)} + W_3 \hat{h}^{(t-1)} + b_{\hat{h}}\right)$$

where $\sigma$ is the logistic sigmoid.
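As a rough illustration of this update rule, a minimal NumPy sketch follows. The array names mirror the graph.* attributes above, but the dimensions, the random initialization, and the choice of sigmoid are illustrative assumptions, not pydbm's actual implementation:

    import numpy as np

    def sigmoid(x):
        # Logistic sigmoid, assumed to be the activation in the formula above.
        return 1.0 / (1.0 + np.exp(-x))

    # Assumed toy dimensions: 5 visible units, 3 RNN hidden units.
    v_dim, h_dim = 5, 3
    rng = np.random.default_rng(0)

    v_hat_weights_arr = rng.standard_normal((h_dim, v_dim))  # W_2
    hat_weights_arr = rng.standard_normal((h_dim, h_dim))    # W_3
    rnn_hidden_bias_arr = np.zeros(h_dim)                    # b_hat

    v_t = rng.standard_normal(v_dim)   # v^{(t)}: visible activity at step t
    hat_h_prev = np.zeros(h_dim)       # hat{h}^{(t-1)}: previous RNN state

    # hat{h}^{(t)} = sigmoid(W_2 v^{(t)} + W_3 hat{h}^{(t-1)} + b_hat)
    hat_h_t = sigmoid(
        v_hat_weights_arr @ v_t
        + hat_weights_arr @ hat_h_prev
        + rnn_hidden_bias_arr
    )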
References
- Boulanger-Lewandowski, N., Bengio, Y., & Vincent, P. (2012). Modeling temporal dependencies in high-dimensional sequences: Application to polyphonic music generation and transcription. arXiv preprint arXiv:1206.6392.
- Lyu, Q., Wu, Z., Zhu, J., & Meng, H. (2015, June). Modelling High-Dimensional Sequences with LSTM-RTRBM: Application to Polyphonic Music Generation. In IJCAI (pp. 4138-4139).
- Lyu, Q., Wu, Z., & Zhu, J. (2015, October). Polyphonic music modelling with LSTM-RTRBM. In Proceedings of the 23rd ACM international conference on Multimedia (pp. 991-994). ACM.
- Sutskever, I., Hinton, G. E., & Taylor, G. W. (2009). The recurrent temporal restricted Boltzmann machine. In Advances in Neural Information Processing Systems (pp. 1601-1608).
back_propagation
Details of the backpropagation through time algorithm.
Override.
compute_loss
Compute the loss.
Parameters:
- batch_observed_arr – np.ndarray of observed data points.
- inferenced_arr – np.ndarray of reconstructed feature points.
Returns: The loss.
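The docstring does not specify the form of the loss. As one plausible reading, a mean-squared reconstruction error over the batch would look like the sketch below; both the function name and the MSE choice are assumptions, not pydbm's actual computation:

    import numpy as np

    def compute_loss_sketch(batch_observed_arr, inferenced_arr):
        # Hypothetical reconstruction loss: mean squared error between the
        # observed data points and the reconstructed feature points. The MSE
        # choice is an assumption; pydbm may compute a different quantity.
        return np.mean((batch_observed_arr - inferenced_arr) ** 2)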
memorize_activity
Memorize activity.
Override.
Parameters:
- observed_data_arr – Observed data points in the positive phase.
- negative_visible_activity_arr – Visible activity in the negative phase.
rnn_learn
Learning for the RNN.
Parameters: observed_data_list – Observed data points.
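To make the role of observed_data_list concrete, here is a hedged sketch of unrolling the recurrence above over a sequence of observed frames, collecting the per-step states as in graph.hidden_activity_arr_list. It reuses the hypothetical update from the earlier sketch and is not pydbm's actual rnn_learn:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def unroll_rnn(observed_data_list, W2, W3, b_hat, h_dim):
        # Roll the RNN state hat{h}^{(t)} forward over a sequence of observed
        # frames, keeping every intermediate state
        # (cf. graph.hidden_activity_arr_list).
        hat_h = np.zeros(h_dim)
        hidden_activity_arr_list = []
        for v_t in observed_data_list:
            hat_h = sigmoid(W2 @ v_t + W3 @ hat_h + b_hat)
            hidden_activity_arr_list.append(hat_h)
        return hidden_activity_arr_list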
pydbm.approximation.rtrbmcd.rnn_rbm_cd module
class pydbm.approximation.rtrbmcd.rnn_rbm_cd.RNNRBMCD
Bases: pydbm.approximation.rt_rbm_cd.RTRBMCD

Recurrent Neural Network Restricted Boltzmann Machine (RNN-RBM) based on Contrastive Divergence.

Conceptually, the positive phase is to the negative phase what waking is to sleeping.

The RTRBM can be understood as a sequence of conditional RBMs whose parameters are the output of a deterministic RNN, with the constraint that the hidden units must both describe the conditional distributions and convey temporal information. This constraint can be lifted by combining a full RNN with distinct hidden units. The RNN-RBM (Boulanger-Lewandowski, N., et al., 2012), a more structured extension of the RTRBM, therefore has its own dedicated hidden units.
Parameters:
- graph.weights_arr – $W$ (connection between $v^{(t)}$ and $h^{(t)}$)
- graph.visible_bias_arr – $b_v$ (bias in the visible layer)
- graph.hidden_bias_arr – $b_h$ (bias in the hidden layer)
- graph.rnn_hidden_weights_arr – $W'$ (connection between $h^{(t-1)}$ and $b_h^{(t)}$)
- graph.rnn_visible_weights_arr – $W''$ (connection between $h^{(t-1)}$ and $b_v^{(t)}$)
- graph.hat_hidden_activity_arr – $\hat{h}^{(t)}$ (hidden units of the RNN)
- graph.pre_hidden_activity_arr_list – $\hat{h}^{(t)}$ for $t = 0, 1, \ldots$
- graph.v_hat_weights_arr – $W_2$ (connection between $v^{(t)}$ and $\hat{h}^{(t)}$)
- graph.hat_weights_arr – $W_3$ (connection between $\hat{h}^{(t-1)}$ and $\hat{h}^{(t)}$)
- graph.rnn_hidden_bias_arr – $b_{\hat{h}}$ (bias of the RNN hidden layer)

The RNN hidden state follows the same update as in the LSTM-RTRBM:

$$\hat{h}^{(t)} = \sigma\left(W_2 v^{(t)} + W_3 \hat{h}^{(t-1)} + b_{\hat{h}}\right)$$
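What distinguishes the RNN-RBM parameterization here is that both biases of the conditional RBM become time-dependent through $W'$ and $W''$. Below is a hedged NumPy sketch of those dynamic biases, following Boulanger-Lewandowski et al. (2012); the variable names and dimensions are illustrative assumptions:

    import numpy as np

    # Assumed toy dimensions: 5 visible units, 4 RBM hidden units,
    # 3 RNN hidden units.
    v_dim, h_dim, rnn_dim = 5, 4, 3
    rng = np.random.default_rng(1)

    visible_bias_arr = np.zeros(v_dim)                               # b_v (static)
    hidden_bias_arr = np.zeros(h_dim)                                # b_h (static)
    rnn_hidden_weights_arr = rng.standard_normal((h_dim, rnn_dim))   # W'
    rnn_visible_weights_arr = rng.standard_normal((v_dim, rnn_dim))  # W''

    state_prev = rng.standard_normal(rnn_dim)  # RNN state from step t-1

    # Time-dependent biases of the conditional RBM at step t, as in
    # Boulanger-Lewandowski et al. (2012):
    #   b_h^{(t)} = b_h + W'  @ state_prev
    #   b_v^{(t)} = b_v + W'' @ state_prev
    b_h_t = hidden_bias_arr + rnn_hidden_weights_arr @ state_prev
    b_v_t = visible_bias_arr + rnn_visible_weights_arr @ state_prev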
References
- Boulanger-Lewandowski, N., Bengio, Y., & Vincent, P. (2012). Modeling temporal dependencies in high-dimensional sequences: Application to polyphonic music generation and transcription. arXiv preprint arXiv:1206.6392.
- Lyu, Q., Wu, Z., Zhu, J., & Meng, H. (2015, June). Modelling High-Dimensional Sequences with LSTM-RTRBM: Application to Polyphonic Music Generation. In IJCAI (pp. 4138-4139).
- Lyu, Q., Wu, Z., & Zhu, J. (2015, October). Polyphonic music modelling with LSTM-RTRBM. In Proceedings of the 23rd ACM international conference on Multimedia (pp. 991-994). ACM.
- Sutskever, I., Hinton, G. E., & Taylor, G. W. (2009). The recurrent temporal restricted Boltzmann machine. In Advances in Neural Information Processing Systems (pp. 1601-1608).
back_propagation
Details of the backpropagation through time algorithm.
Override.
compute_loss
Compute the loss.
Parameters:
- batch_observed_arr – np.ndarray of observed data points.
- inferenced_arr – np.ndarray of reconstructed feature points.
Returns: The loss.
memorize_activity
Memorize activity.
Override.
Parameters:
- observed_data_arr – Observed data points in the positive phase.
- negative_visible_activity_arr – Visible activity in the negative phase.
rnn_learn
Learning for the RNN.
Parameters: observed_data_list – Observed data points.