# pydbm.dbm.builders package¶

## pydbm.dbm.builders.dbm_multi_layer_builder module¶

class pydbm.dbm.builders.dbm_multi_layer_builder.DBMMultiLayerBuilder

Concrete Builder in the Builder Pattern.

Composes three restricted Boltzmann machines (RBMs) to build a deep Boltzmann machine (DBM).

As is well known, a DBM is composed of layers of RBMs stacked on top of each other (Salakhutdinov, R., & Hinton, G. E. 2009). This model is a structural expansion of Deep Belief Networks (DBN), which are among the earliest models of deep learning (Le Roux, N., & Bengio, Y. 2008). Like an RBM, a DBN places nodes in layers; however, only the uppermost layer is composed of undirected edges, while the others consist of directed edges.
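The build protocol documented below (`visible_neuron_part`, `feature_neuron_part`, `hidden_neuron_part`, `get_result`) can be sketched with a minimal, self-contained toy. This is not the pydbm implementation; `Layer` and `ToyDBMBuilder` are hypothetical stand-ins that only mirror the method names on this page.

```python
# Hypothetical stand-ins illustrating the Builder pattern; these are NOT
# pydbm classes -- only the method names mirror this page's documentation.

class Layer:
    def __init__(self, role, neuron_count, activating_function):
        self.role = role                  # "visible", "feature", or "hidden"
        self.neuron_count = neuron_count
        self.activating_function = activating_function

class ToyDBMBuilder:
    """Concrete Builder: collects layers, then pairs them into stacked RBMs."""

    def __init__(self):
        self.layer_list = []

    def visible_neuron_part(self, activating_function, neuron_count):
        self.layer_list.append(Layer("visible", neuron_count, activating_function))

    def feature_neuron_part(self, activating_function_list, neuron_count_list):
        # Intermediate layers: hidden relative to the layer below,
        # virtual visible relative to the layer above.
        for f, n in zip(activating_function_list, neuron_count_list):
            self.layer_list.append(Layer("feature", n, f))

    def hidden_neuron_part(self, activating_function, neuron_count):
        self.layer_list.append(Layer("hidden", neuron_count, activating_function))

    def get_result(self):
        # Each adjacent pair of layers forms one RBM.
        return [
            (self.layer_list[i], self.layer_list[i + 1])
            for i in range(len(self.layer_list) - 1)
        ]

sigmoid = lambda x: 1.0 / (1.0 + 2.718281828 ** -x)  # placeholder activation
builder = ToyDBMBuilder()
builder.visible_neuron_part(sigmoid, 784)
builder.feature_neuron_part([sigmoid], [100])
builder.hidden_neuron_part(sigmoid, 10)
rbm_list = builder.get_result()  # two stacked RBMs: 784-100 and 100-10
```

The point of the pattern: the caller configures layer sizes and activation functions through the builder's interface, and `get_result` returns the composed RBMs.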

References

• https://github.com/chimera0/accel-brain-code/blob/master/Deep-Learning-by-means-of-Design-Pattern/demo/demo_stacked_auto_encoder.ipynb
• Ackley, D. H., Hinton, G. E., & Sejnowski, T. J. (1985). A learning algorithm for Boltzmann machines. Cognitive science, 9(1), 147-169.
• Hinton, G. E. (2002). Training products of experts by minimizing contrastive divergence. Neural computation, 14(8), 1771-1800.
• Le Roux, N., & Bengio, Y. (2008). Representational power of restricted Boltzmann machines and deep belief networks. Neural computation, 20(6), 1631-1649.
• Salakhutdinov, R., & Hinton, G. E. (2009). Deep Boltzmann machines. In International Conference on Artificial Intelligence and Statistics (pp. 448-455).
attenuate_epoch

getter

feature_neuron_part

Build neurons for feature points in a virtual visible layer.

Build neurons in n layers.

In association with the (n-1)-th layer, the object activates as neurons in a hidden layer. On the other hand, in association with the (n+1)-th layer, it activates as neurons in a virtual visible layer.

Parameters:

• activating_function_list – The list of activation functions.
• neuron_count_list – The list of the numbers of neurons.

get_attenuate_epoch

getter

get_learning_attenuate_rate

getter

get_learning_rate

getter

get_result

Return the built restricted Boltzmann machines.

Returns: The list of restricted Boltzmann machines.

graph_part

Build a complete bipartite graph.

Parameters:

• approximate_interface_list – The list of function approximations.
• scale – Scale of the parameters initialized by ParamsInitializer.
• params_initializer – is-a ParamsInitializer.
• params_dict – dict of parameters, other than size, to be input to ParamsInitializer.sample_f.

hidden_neuron_part

Build neurons in the hidden layer.

Parameters:

• activating_function – Activation function.
• neuron_count – The number of neurons.

learning_attenuate_rate

getter

learning_rate

getter

set_attenuate_epoch

setter

set_learning_attenuate_rate

setter

set_learning_rate

setter

visible_neuron_part

Build neurons in the visible layer.

Parameters:

• activating_function – Activation function.
• neuron_count – The number of neurons.


## pydbm.dbm.builders.lstm_rt_rbm_simple_builder module¶

class pydbm.dbm.builders.lstm_rt_rbm_simple_builder.LSTMRTRBMSimpleBuilder

Concrete Builder in the Builder Pattern.

Composes restricted Boltzmann machines to build an LSTM-RTRBM.

The LSTM-RTRBM model integrates the ability of LSTM to memorize and retrieve useful history information with the advantage of RBM in high-dimensional data modelling (Lyu, Q., Wu, Z., Zhu, J., & Meng, H. 2015, June). Like the RTRBM, the LSTM-RTRBM also has recurrent hidden units.
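To make "recurrent hidden units" concrete, here is a scalar sketch of a single LSTM step, the mechanism that lets the model memorize and retrieve history. The weight names in `w` are illustrative; this is standard LSTM arithmetic, not the pydbm implementation.

```python
import math

def lstm_step(x, h_prev, c_prev, w):
    # One scalar LSTM step. The gates decide how much history the cell
    # state keeps (forget gate), admits (input gate), and exposes (output gate).
    sig = lambda a: 1.0 / (1.0 + math.exp(-a))
    i = sig(w["wi"] * x + w["ui"] * h_prev + w["bi"])        # input gate
    f = sig(w["wf"] * x + w["uf"] * h_prev + w["bf"])        # forget gate
    o = sig(w["wo"] * x + w["uo"] * h_prev + w["bo"])        # output gate
    g = math.tanh(w["wg"] * x + w["ug"] * h_prev + w["bg"])  # candidate value
    c = f * c_prev + i * g      # cell state: the memorized history
    h = o * math.tanh(c)        # recurrent hidden state passed to the next step
    return h, c
```

In the LSTM-RTRBM, such recurrent hidden states carry history information forward, alongside the RBM's role of modelling the high-dimensional observation at each step.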

References

• Boulanger-Lewandowski, N., Bengio, Y., & Vincent, P. (2012). Modeling temporal dependencies in high-dimensional sequences: Application to polyphonic music generation and transcription. arXiv preprint arXiv:1206.6392.
• Lyu, Q., Wu, Z., Zhu, J., & Meng, H. (2015, June). Modelling High-Dimensional Sequences with LSTM-RTRBM: Application to Polyphonic Music Generation. In IJCAI (pp. 4138-4139).
• Lyu, Q., Wu, Z., & Zhu, J. (2015, October). Polyphonic music modelling with LSTM-RTRBM. In Proceedings of the 23rd ACM international conference on Multimedia (pp. 991-994). ACM.
• Sutskever, I., Hinton, G. E., & Taylor, G. W. (2009). The recurrent temporal restricted boltzmann machine. In Advances in Neural Information Processing Systems (pp. 1601-1608).
attenuate_epoch

getter

get_attenuate_epoch

getter

get_learning_attenuate_rate

getter

get_learning_rate

getter

get_result

Return the built restricted Boltzmann machines.

Returns: The list of restricted Boltzmann machines.

graph_part

Build an RNNRBM graph.

Parameters:

• approximate_interface – The function approximation.
• scale – Scale of the parameters initialized by ParamsInitializer.
• params_initializer – is-a ParamsInitializer.
• params_dict – dict of parameters, other than size, to be input to ParamsInitializer.sample_f.

hidden_neuron_part

Build neurons in the hidden layer.

Parameters:

• activating_function – Activation function.
• neuron_count – The number of neurons.

learning_attenuate_rate

getter

learning_rate

getter

rnn_neuron_part

Build neurons for the RNN.

Parameters:

• rnn_activating_function – Activation function.

set_attenuate_epoch

setter

set_learning_attenuate_rate

setter

set_learning_rate

setter

visible_neuron_part

Build neurons in the visible layer.

Parameters:

• activating_function – Activation function.
• neuron_count – The number of neurons.


## pydbm.dbm.builders.rnn_rbm_simple_builder module¶

class pydbm.dbm.builders.rnn_rbm_simple_builder.RNNRBMSimpleBuilder

Concrete Builder in the Builder Pattern.

Composes restricted Boltzmann machines to build an RNN-RBM.

The RTRBM can be understood as a sequence of conditional RBMs whose parameters are the output of a deterministic RNN, with the constraint that the hidden units must describe the conditional distributions and convey temporal information. This constraint can be lifted by combining a full RNN with distinct hidden units.

The RNN-RBM (Boulanger-Lewandowski, N., et al. 2012), a further structural expansion of the RTRBM, adds such distinct hidden units.
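The idea that the conditional RBMs' parameters are the output of a deterministic RNN can be sketched as follows. All names (`rnnrbm_biases`, the keys of `params`) are illustrative scalar stand-ins, not pydbm's API.

```python
import math

def rnnrbm_biases(v_t, u_prev, params):
    # Deterministic RNN update (scalar toy; real models use matrices).
    u_t = math.tanh(params["wuu"] * u_prev + params["wvu"] * v_t + params["bu"])
    # The conditional RBM's biases at this step are generated from u_t,
    # so the RBM's distribution is conditioned on the sequence history.
    b_v = params["bv"] + params["wuv"] * u_t
    b_h = params["bh"] + params["wuh"] * u_t
    return u_t, b_v, b_h
```

Because `u_t` is computed deterministically from the observations, the RNN's hidden units are distinct from the RBM's stochastic hidden units, which is exactly the constraint-lifting described above.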

References

• Boulanger-Lewandowski, N., Bengio, Y., & Vincent, P. (2012). Modeling temporal dependencies in high-dimensional sequences: Application to polyphonic music generation and transcription. arXiv preprint arXiv:1206.6392.
• Lyu, Q., Wu, Z., Zhu, J., & Meng, H. (2015, June). Modelling High-Dimensional Sequences with LSTM-RTRBM: Application to Polyphonic Music Generation. In IJCAI (pp. 4138-4139).
• Lyu, Q., Wu, Z., & Zhu, J. (2015, October). Polyphonic music modelling with LSTM-RTRBM. In Proceedings of the 23rd ACM international conference on Multimedia (pp. 991-994). ACM.
• Sutskever, I., Hinton, G. E., & Taylor, G. W. (2009). The recurrent temporal restricted boltzmann machine. In Advances in Neural Information Processing Systems (pp. 1601-1608).
attenuate_epoch

getter

get_attenuate_epoch

getter

get_learning_attenuate_rate

getter

get_learning_rate

getter

get_result

Return the built restricted Boltzmann machines.

Returns: The list of restricted Boltzmann machines.

graph_part

Build an RNNRBM graph.

Parameters:

• approximate_interface – The function approximation.
• scale – Scale of the parameters initialized by ParamsInitializer.
• params_initializer – is-a ParamsInitializer.
• params_dict – dict of parameters, other than size, to be input to ParamsInitializer.sample_f.

hidden_neuron_part

Build neurons in the hidden layer.

Parameters:

• activating_function – Activation function.
• neuron_count – The number of neurons.

learning_attenuate_rate

getter

learning_rate

getter

rnn_neuron_part

Build neurons for the RNN.

Parameters:

• rnn_activating_function – Activation function.

set_attenuate_epoch

setter

set_learning_attenuate_rate

setter

set_learning_rate

setter

visible_neuron_part

Build neurons in the visible layer.

Parameters:

• activating_function – Activation function.
• neuron_count – The number of neurons.


## pydbm.dbm.builders.rt_rbm_simple_builder module¶

class pydbm.dbm.builders.rt_rbm_simple_builder.RTRBMSimpleBuilder

Concrete Builder in the Builder Pattern.

Composes restricted Boltzmann machines to build an RTRBM.

The RTRBM (Sutskever, I., et al. 2009) is a probabilistic time-series model that can be viewed as a temporal stack of RBMs, where each RBM receives a contextual hidden state from the previous RBM and uses it to modulate its hidden units' biases.
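The bias modulation described above amounts to shifting each hidden unit's bias by a weighted sum of the previous step's hidden state. A tiny self-contained sketch; the function name and toy list-of-lists shapes are illustrative, not pydbm's API.

```python
def rtrbm_hidden_bias(b_h, w_hh, h_prev):
    # b_h_t[j] = b_h[j] + sum_k w_hh[j][k] * h_prev[k]:
    # the previous RBM's hidden state shifts each hidden unit's bias,
    # which is how temporal context enters the current RBM.
    return [
        b + sum(w * h for w, h in zip(w_row, h_prev))
        for b, w_row in zip(b_h, w_hh)
    ]
```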

References

• Boulanger-Lewandowski, N., Bengio, Y., & Vincent, P. (2012). Modeling temporal dependencies in high-dimensional sequences: Application to polyphonic music generation and transcription. arXiv preprint arXiv:1206.6392.
• Lyu, Q., Wu, Z., Zhu, J., & Meng, H. (2015, June). Modelling High-Dimensional Sequences with LSTM-RTRBM: Application to Polyphonic Music Generation. In IJCAI (pp. 4138-4139).
• Lyu, Q., Wu, Z., & Zhu, J. (2015, October). Polyphonic music modelling with LSTM-RTRBM. In Proceedings of the 23rd ACM international conference on Multimedia (pp. 991-994). ACM.
• Sutskever, I., Hinton, G. E., & Taylor, G. W. (2009). The recurrent temporal restricted boltzmann machine. In Advances in Neural Information Processing Systems (pp. 1601-1608).
attenuate_epoch

getter

get_attenuate_epoch

getter

get_learning_attenuate_rate

getter

get_learning_rate

getter

get_result

Return the built restricted Boltzmann machines.

Returns: The list of restricted Boltzmann machines.

graph_part

Build an RTRBM graph.

Parameters:

• approximate_interface – The function approximation.
• scale – Scale of the parameters initialized by ParamsInitializer.
• params_initializer – is-a ParamsInitializer.
• params_dict – dict of parameters, other than size, to be input to ParamsInitializer.sample_f.

hidden_neuron_part

Build neurons in the hidden layer.

Parameters:

• activating_function – Activation function.
• neuron_count – The number of neurons.

learning_attenuate_rate

getter

learning_rate

getter

rnn_neuron_part

Build neurons for the RNN.

Parameters:

• rnn_activating_function – Activation function.

set_attenuate_epoch

setter

set_learning_attenuate_rate

setter

set_learning_rate

setter

visible_neuron_part

Build neurons in the visible layer.

Parameters:

• activating_function – Activation function.
• neuron_count – The number of neurons.