pydbm.dbm package

Submodules

pydbm.dbm.dbm_director module

class pydbm.dbm.dbm_director.DBMDirector

Bases: object

The Director in Builder Pattern.

Compose restricted Boltzmann machines to build an object of a deep Boltzmann machine.

As is well known, DBM is composed of layers of RBMs stacked on top of each other (Salakhutdinov, R., & Hinton, G. E., 2009). This model is a structural expansion of the Deep Belief Network (DBN), which is known as one of the earliest models of deep learning (Le Roux, N., & Bengio, Y., 2008). Like an RBM, a DBN places nodes in layers; however, only its uppermost layer is composed of undirected edges, while the other layers consist of directed edges.

References

  • https://github.com/chimera0/accel-brain-code/blob/master/Deep-Learning-by-means-of-Design-Pattern/demo/demo_stacked_auto_encoder.ipynb
  • Ackley, D. H., Hinton, G. E., & Sejnowski, T. J. (1985). A learning algorithm for Boltzmann machines. Cognitive science, 9(1), 147-169.
  • Hinton, G. E. (2002). Training products of experts by minimizing contrastive divergence. Neural computation, 14(8), 1771-1800.
  • Le Roux, N., & Bengio, Y. (2008). Representational power of restricted Boltzmann machines and deep belief networks. Neural computation, 20(6), 1631-1649.
  • Salakhutdinov, R., & Hinton, G. E. (2009). Deep Boltzmann machines. In International Conference on Artificial Intelligence and Statistics (pp. 448-455).
dbm_construct

Build a deep Boltzmann machine.

Parameters:
  • neuron_assign_list – The number of units (neurons) in each layer.
  • activating_function_list – The list of activation functions.
  • approximate_interface_list – The list of function approximation interfaces.
  • scale – Scale of the parameters initialized by params_initializer.
  • params_initializer – is-a ParamsInitializer.
  • params_dict – dict of parameters other than size to be input to ParamsInitializer.sample_f.
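
A minimal usage sketch follows. It is illustrative, not canonical: the module paths of the builder, activation, and approximation classes, and the director's constructor signature, are assumptions based on this package's naming conventions and public demos.

    from pydbm.dbm.dbm_director import DBMDirector
    # Assumed module paths; adjust to the installed version of pydbm.
    from pydbm.dbm.builders.dbm_multi_layer_builder import DBMMultiLayerBuilder
    from pydbm.activation.logistic_function import LogisticFunction
    from pydbm.approximation.contrastive_divergence import ContrastiveDivergence

    # Compose a three-layer DBM: 100 visible, 50 hidden, 25 hidden units.
    director = DBMDirector(DBMMultiLayerBuilder())  # constructor signature assumed
    director.dbm_construct(
        neuron_assign_list=[100, 50, 25],
        activating_function_list=[
            LogisticFunction(), LogisticFunction(), LogisticFunction()
        ],  # one activation per layer
        approximate_interface_list=[
            ContrastiveDivergence(), ContrastiveDivergence()
        ]   # one approximator per RBM link
    )
    rbm_list = director.rbm_list  # the composed, stacked RBMs
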
get_rbm_list

getter

rbm_list

getter

set_rbm_list

setter

pydbm.dbm.deep_boltzmann_machine module

class pydbm.dbm.deep_boltzmann_machine.DeepBoltzmannMachine

Bases: object

The Client in Builder Pattern for building Deep Boltzmann Machines.

As is well known, DBM is composed of layers of RBMs stacked on top of each other (Salakhutdinov, R., & Hinton, G. E., 2009). This model is a structural expansion of the Deep Belief Network (DBN), which is known as one of the earliest models of deep learning (Le Roux, N., & Bengio, Y., 2008). Like an RBM, a DBN places nodes in layers; however, only its uppermost layer is composed of undirected edges, while the other layers consist of directed edges.

References

  • https://github.com/chimera0/accel-brain-code/blob/master/Deep-Learning-by-means-of-Design-Pattern/demo/demo_stacked_auto_encoder.ipynb
  • Ackley, D. H., Hinton, G. E., & Sejnowski, T. J. (1985). A learning algorithm for Boltzmann machines. Cognitive science, 9(1), 147-169.
  • Hinton, G. E. (2002). Training products of experts by minimizing contrastive divergence. Neural computation, 14(8), 1771-1800.
  • Le Roux, N., & Bengio, Y. (2008). Representational power of restricted Boltzmann machines and deep belief networks. Neural computation, 20(6), 1631-1649.
  • Salakhutdinov, R., & Hinton, G. E. (2009). Deep Boltzmann machines. In International Conference on Artificial Intelligence and Statistics (pp. 448-455).
get_feature_point

Extract the feature points.

Parameters: layer_number – The index of the layer. For instance, in a three-layer machine, 0 is the visible layer, 1 is the middle (hidden) layer, and 2 is the top hidden layer.
Returns: The np.ndarray of feature points.
get_hidden_activity_arr_list

Extract the activities of neurons in each hidden layer.

Returns: The list of activity arrays.
get_hidden_bias_arr_list

Extract the biases in each hidden layer.

Returns: The list of bias arrays.
get_rbm_list
get_reconstruct_error_arr

Extract the reconstruction error rate.

Returns: The np.ndarray.
get_visible_activity_arr_list

Extract the activities of neurons in each visible layer.

Returns: The list of activity arrays.
get_visible_bias_arr_list

Extract the biases in each visible layer.

Returns: The list of bias arrays.
get_visible_point

Extract the reconstructed visible data points.

Parameters: layer_number – The index of the layer. For instance, in a three-layer machine, 0 is the visible layer, 1 is the middle (hidden) layer, and 2 is the top hidden layer.
Returns: The np.ndarray of visible data points.
get_weight_arr_list

Extract the weights of each link.

Returns: The list of weights.
learn

Learning.

Parameters:
  • observed_data_arr – The np.ndarray of observed data points.
  • training_count – The number of training iterations.
  • batch_size – Batch size in learning.
  • r_batch_size

    Batch size in inferencing. If this value is 0, the inferencing is recursive learning. If this value is more than 0, the inferencing is mini-batch recursive learning. If this value is -1, the inferencing is not recursive learning.

    If you do not want to execute mini-batch training, set batch_size to -1. r_batch_size also controls mini-batch training, but it is referred to only in inference and reconstruction; if it is more than 0, the inferencing is a kind of recursive learning with mini-batch training.
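
A minimal end-to-end sketch follows, relying on the methods documented here; the constructor signature and the builder/activation/approximation module paths are assumptions based on this package's public demos.

    import numpy as np

    from pydbm.dbm.deep_boltzmann_machine import DeepBoltzmannMachine
    # Assumed module paths; adjust to the installed version of pydbm.
    from pydbm.dbm.builders.dbm_multi_layer_builder import DBMMultiLayerBuilder
    from pydbm.activation.logistic_function import LogisticFunction
    from pydbm.approximation.contrastive_divergence import ContrastiveDivergence

    observed_data_arr = np.random.uniform(size=(1000, 100))  # toy observed data

    # Constructor signature assumed: (builder, units, activations, approximators, ...).
    dbm = DeepBoltzmannMachine(
        DBMMultiLayerBuilder(),
        [observed_data_arr.shape[1], 10, observed_data_arr.shape[1]],
        [LogisticFunction(), LogisticFunction(), LogisticFunction()],
        [ContrastiveDivergence(), ContrastiveDivergence()],
        learning_rate=0.05
    )
    dbm.learn(
        observed_data_arr,
        training_count=1,
        batch_size=200,
        r_batch_size=-1  # -1: no recursive learning in inferencing (see above)
    )
    feature_point_arr = dbm.get_feature_point(layer_number=1)  # middle layer
    reconstruct_error_arr = dbm.get_reconstruct_error_arr()
    dbm.save_pre_learned_params(dir_path=None, file_name="dbm")
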

rbm_list
save_pre_learned_params

Save pre-learned parameters.

Parameters:
  • dir_path – Path of the directory. If None, the file is saved in the current directory.
  • file_name – The naming rule of files. If None, this value is dbm.
set_rbm_list

pydbm.dbm.recurrent_temporal_rbm module

class pydbm.dbm.recurrent_temporal_rbm.RecurrentTemporalRBM

Bases: object

The Client in Builder Pattern for building RTRBM.

The RTRBM (Sutskever, I., et al., 2009) is a probabilistic time-series model which can be viewed as a temporal stack of RBMs, where each RBM has a contextual hidden state that is received from the previous RBM and is used to modulate the bias of its hidden units.
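
Schematically, writing r^(t) for the mean-field hidden state at time t, W for the visible-to-hidden weights, W' for the recurrent weights, and sigma for the logistic sigmoid, the bias modulation can be sketched as follows (notation ours, after Sutskever et al., 2009):

    \hat{b}_h^{(t)} = b_h + W' r^{(t-1)}, \qquad
    r^{(t)} = \sigma\left( W v^{(t)} + \hat{b}_h^{(t)} \right)
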

References

  • Boulanger-Lewandowski, N., Bengio, Y., & Vincent, P. (2012). Modeling temporal dependencies in high-dimensional sequences: Application to polyphonic music generation and transcription. arXiv preprint arXiv:1206.6392.
  • Lyu, Q., Wu, Z., Zhu, J., & Meng, H. (2015, June). Modelling High-Dimensional Sequences with LSTM-RTRBM: Application to Polyphonic Music Generation. In IJCAI (pp. 4138-4139).
  • Lyu, Q., Wu, Z., & Zhu, J. (2015, October). Polyphonic music modelling with LSTM-RTRBM. In Proceedings of the 23rd ACM international conference on Multimedia (pp. 991-994). ACM.
  • Sutskever, I., Hinton, G. E., & Taylor, G. W. (2009). The recurrent temporal restricted boltzmann machine. In Advances in Neural Information Processing Systems (pp. 1601-1608).
get_rbm

getter

inference

Inferencing and recursive learning.

Parameters:
  • test_arr – np.ndarray of test data points.
  • training_count – The number of training iterations.
  • batch_size – Batch size.
  • r_batch_size – Batch size for recursive learning.
Returns: The np.ndarray of the inferenced result.

learn

Learning.

Parameters:
  • observed_arr – np.ndarray of observed data points.
  • training_count – The number of training iterations.
  • batch_size – Batch size.
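
A minimal sketch of learning followed by inferencing. The constructor arguments and the 3-D (sample, time step, feature) data layout are assumptions; only the methods above are documented on this page.

    import numpy as np

    from pydbm.dbm.recurrent_temporal_rbm import RecurrentTemporalRBM

    # Toy sequences: (sample, time step, feature); layout assumed.
    observed_arr = np.random.uniform(size=(100, 20, 10))
    test_arr = np.random.uniform(size=(10, 20, 10))

    # Constructor arguments assumed; adjust to the installed version of pydbm.
    rt_rbm = RecurrentTemporalRBM(visible_num=10, hidden_num=50)
    rt_rbm.learn(observed_arr, training_count=1, batch_size=20)
    inferenced_arr = rt_rbm.inference(
        test_arr,
        training_count=1,
        batch_size=20,
        r_batch_size=-1
    )
    rt_rbm.save_pre_learn_params("pre_learned_rtrbm.npz")  # hypothetical file path
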
rbm

getter

save_pre_learn_params

Save pre-learned parameters.

Parameters:file_path – Stored file path.
set_rbm

setter

pydbm.dbm.restricted_boltzmann_machines module

class pydbm.dbm.restricted_boltzmann_machines.RestrictedBoltzmannMachine

Bases: object

Restricted Boltzmann Machine.

According to graph theory, the structure of an RBM corresponds to a complete bipartite graph, a special kind of bipartite graph in which every node in the visible layer is connected to every node in the hidden layer. Based on statistical mechanics and thermodynamics (Ackley, D. H., Hinton, G. E., & Sejnowski, T. J., 1985), the state of this structure can be reflected by the energy function.
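
For reference, with visible units v, hidden units h, biases b and c, and weight matrix W, the standard RBM energy function takes the following form (notation ours, after Ackley et al., 1985):

    E(v, h) = -\sum_i b_i v_i - \sum_j c_j h_j - \sum_{i,j} v_i W_{ij} h_j
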

References

  • Ackley, D. H., Hinton, G. E., & Sejnowski, T. J. (1985). A learning algorithm for Boltzmann machines. Cognitive science, 9(1), 147-169.
  • Hinton, G. E. (2002). Training products of experts by minimizing contrastive divergence. Neural computation, 14(8), 1771-1800.
  • Le Roux, N., & Bengio, Y. (2008). Representational power of restricted Boltzmann machines and deep belief networks. Neural computation, 20(6), 1631-1649.
approximate_inferencing

Inferencing and recursive learning with function approximation.

Parameters:
  • observed_data_arr – The array of observed data points.
  • traning_count – The number of training iterations.
  • r_batch_size

    Batch size in inferencing. If this value is 0, the inferencing is recursive learning. If this value is more than 0, the inferencing is mini-batch recursive learning. If this value is -1, the inferencing is not recursive learning.

    If you do not want to execute mini-batch training, set batch_size to -1. r_batch_size also controls mini-batch training, but it is referred to only in inference and reconstruction; if it is more than 0, the inferencing is a kind of recursive learning with mini-batch training.

approximate_learning

Learning with function approximation.

Parameters:
  • observed_data_arr – The array of observed data points.
  • traning_count – The number of training iterations.
  • batch_size – Batch size.
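
A minimal sketch of both methods on an RBM composed by DBMDirector. Module paths are assumptions, as in the earlier sketches, and the parameter name traning_count is spelled exactly as documented above.

    import numpy as np

    from pydbm.dbm.dbm_director import DBMDirector
    # Assumed module paths; adjust to the installed version of pydbm.
    from pydbm.dbm.builders.dbm_multi_layer_builder import DBMMultiLayerBuilder
    from pydbm.activation.logistic_function import LogisticFunction
    from pydbm.approximation.contrastive_divergence import ContrastiveDivergence

    director = DBMDirector(DBMMultiLayerBuilder())
    director.dbm_construct(
        neuron_assign_list=[100, 50, 100],
        activating_function_list=[
            LogisticFunction(), LogisticFunction(), LogisticFunction()
        ],
        approximate_interface_list=[
            ContrastiveDivergence(), ContrastiveDivergence()
        ]
    )
    rbm = director.rbm_list[0]  # the first stacked RBM

    observed_data_arr = np.random.uniform(size=(200, 100))  # toy data

    # Parameter names spelled as documented (note `traning_count`).
    rbm.approximate_learning(observed_data_arr, traning_count=1, batch_size=50)
    rbm.approximate_inferencing(observed_data_arr, traning_count=1, r_batch_size=-1)
    reconstruct_error_list = rbm.get_reconstruct_error_list()
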
get_graph

getter of graph

get_reconstruct_error_list

Extract the reconstruction errors.

Returns: The list of reconstruction errors.
graph

getter of graph

set_read_only

setter of graph

pydbm.dbm.rtrbm_director module

class pydbm.dbm.rtrbm_director.RTRBMDirector

Bases: object

The Director in Builder Pattern.

Compose an RTRBM, RNN-RBM, or LSTM-RTRBM to build an object of a restricted Boltzmann machine.

The RTRBM (Sutskever, I., et al., 2009) is a probabilistic time-series model which can be viewed as a temporal stack of RBMs, where each RBM has a contextual hidden state that is received from the previous RBM and is used to modulate the bias of its hidden units.

References

  • Boulanger-Lewandowski, N., Bengio, Y., & Vincent, P. (2012). Modeling temporal dependencies in high-dimensional sequences: Application to polyphonic music generation and transcription. arXiv preprint arXiv:1206.6392.
  • Lyu, Q., Wu, Z., Zhu, J., & Meng, H. (2015, June). Modelling High-Dimensional Sequences with LSTM-RTRBM: Application to Polyphonic Music Generation. In IJCAI (pp. 4138-4139).
  • Lyu, Q., Wu, Z., & Zhu, J. (2015, October). Polyphonic music modelling with LSTM-RTRBM. In Proceedings of the 23rd ACM international conference on Multimedia (pp. 991-994). ACM.
  • Sutskever, I., Hinton, G. E., & Taylor, G. W. (2009). The recurrent temporal restricted boltzmann machine. In Advances in Neural Information Processing Systems (pp. 1601-1608).
get_rbm

getter

rbm

getter

rtrbm_construct

Build an RTRBM, RNN-RBM, or LSTM-RTRBM.

Parameters:
  • visible_num – The number of units in visible layer.
  • hidden_num – The number of units in hidden layer.
  • visible_activating_function – The activation function in visible layer.
  • hidden_activating_function – The activation function in hidden layer.
  • approximate_interface – The function approximation.
  • learning_rate – Learning rate.
  • learning_attenuate_rate – Attenuate the learning_rate by a factor of this value every attenuate_epoch.
  • attenuate_epoch – Attenuate the learning_rate by a factor of learning_attenuate_rate every attenuate_epoch.
  • scale – Scale of the parameters initialized by params_initializer.
  • params_initializer – is-a ParamsInitializer.
  • params_dict – dict of parameters other than size to be input to ParamsInitializer.sample_f.
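
A minimal composition sketch. The concrete builder and approximation classes (RTRBMSimpleBuilder, RTRBMCD) and their module paths are assumptions based on this package's naming conventions.

    from pydbm.dbm.rtrbm_director import RTRBMDirector
    # Assumed module paths; adjust to the installed version of pydbm.
    from pydbm.dbm.builders.rt_rbm_simple_builder import RTRBMSimpleBuilder
    from pydbm.activation.logistic_function import LogisticFunction
    from pydbm.approximation.rt_rbm_cd import RTRBMCD

    director = RTRBMDirector(RTRBMSimpleBuilder())  # constructor signature assumed
    director.rtrbm_construct(
        visible_num=10,
        hidden_num=50,
        visible_activating_function=LogisticFunction(),
        hidden_activating_function=LogisticFunction(),
        approximate_interface=RTRBMCD(),
        learning_rate=1e-05
    )
    rt_rbm = director.rbm  # the composed RTRBM (see the rbm getter above)
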
set_rbm

setter

Module contents