pydbm.approximation package

Submodules

pydbm.approximation.contrastive_divergence module

class pydbm.approximation.contrastive_divergence.ContrastiveDivergence

Bases: pydbm.approximation.interface.approximate_interface.ApproximateInterface

Contrastive Divergence.

Conceptually, the positive phase is to the negative phase what waking is to sleeping.

approximate_inference()

Inference with function approximation.

Parameters:
  • graph – Graph of neurons.
  • learning_rate – Learning rate.
  • observed_data_arr – Observed data points.
  • training_count – Number of training iterations.
  • r_batch_size – Batch size. If this value is 0, inference is performed as recursive learning; if it is greater than 0, as mini-batch recursive learning; if it is -1, recursive learning is not performed. (A short sketch restating these modes follows this method description.)
Returns:

Graph of neurons.
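
The three r_batch_size modes reduce to a simple branch on the value. The sketch below only restates the parameter description above; the helper function and its labels are illustrative and are not part of pydbm's API.

    def select_inference_mode(r_batch_size: int) -> str:
        # 0            -> recursive learning
        # greater than 0 -> mini-batch recursive learning
        # negative (e.g. -1) -> no recursive learning
        if r_batch_size == 0:
            return "recursive learning"
        elif r_batch_size > 0:
            return "mini-batch recursive learning"
        return "not recursive learning"

    for size in (0, 20, -1):
        print(size, "->", select_inference_mode(size))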

approximate_learn()

Learning with function approximation.

Parameters:
  • graph – Graph of neurons.
  • learning_rate – Learning rate.
  • observed_data_arr – Observed data points.
  • training_count – Number of training iterations.
  • batch_size – Batch size (0 means mini-batch learning is not used).
Returns:

Graph of neurons.

get_reconstruct_error_list()

Getter for the list of reconstruction errors.

reconstruct_error_list

Getter property.

set_readonly()

Setter.
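
The waking/sleeping analogy above corresponds to the positive and negative phases of Contrastive Divergence. The following self-contained NumPy sketch shows one possible CD-1 loop for a small Bernoulli RBM: the positive ("waking") phase is driven by the observed data, the negative ("sleeping") phase by the reconstruction. It illustrates the general technique only; the toy dimensions, the plain gradient step, and the array names are assumptions, not pydbm's internal implementation, which operates on its graph objects via approximate_learn().

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy Bernoulli RBM: 6 visible units, 4 hidden units (illustrative sizes).
    v_dim, h_dim = 6, 4
    weights_arr = 0.01 * rng.standard_normal((v_dim, h_dim))
    visible_bias_arr = np.zeros(v_dim)
    hidden_bias_arr = np.zeros(h_dim)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    observed_data_arr = rng.integers(0, 2, size=(10, v_dim)).astype(float)
    learning_rate = 0.05

    for _ in range(100):  # training_count
        # Positive ("waking") phase: hidden activity driven by the data.
        pos_hidden = sigmoid(observed_data_arr @ weights_arr + hidden_bias_arr)
        pos_assoc = observed_data_arr.T @ pos_hidden

        # Negative ("sleeping") phase: reconstruct visibles, re-infer hiddens.
        recon_visible = sigmoid(pos_hidden @ weights_arr.T + visible_bias_arr)
        neg_hidden = sigmoid(recon_visible @ weights_arr + hidden_bias_arr)
        neg_assoc = recon_visible.T @ neg_hidden

        # CD-1 gradient step.
        n = observed_data_arr.shape[0]
        weights_arr += learning_rate * (pos_assoc - neg_assoc) / n
        visible_bias_arr += learning_rate * (observed_data_arr - recon_visible).mean(axis=0)
        hidden_bias_arr += learning_rate * (pos_hidden - neg_hidden).mean(axis=0)

    print("final reconstruction error:",
          np.mean((observed_data_arr - recon_visible) ** 2))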

pydbm.approximation.rt_rbm_cd module

class pydbm.approximation.rt_rbm_cd.RTRBMCD

Bases: pydbm.approximation.interface.approximate_interface.ApproximateInterface

Recurrent Temporal Restricted Boltzmann Machines based on Contrastive Divergence.

Conceptually, the positive phase is to the negative phase what waking is to sleeping.

Parameters:
  • graph.weights_arr – $W$ (Connection between $v^{(t)}$ and $h^{(t)}$)
  • graph.visible_bias_arr – $b_v$ (Bias in visible layer)
  • graph.hidden_bias_arr – $b_h$ (Bias in hidden layer)
  • graph.rnn_hidden_weights_arr – $W'$ (Connection between $h^{(t-1)}$ and $b_h^{(t)}$)
  • graph.rnn_visible_weights_arr – $W''$ (Connection between $h^{(t-1)}$ and $b_v^{(t)}$)
  • graph.hat_hidden_activity_arr – $\hat{h}^{(t)}$ (Hidden activity of the RNN part)
  • graph.pre_hidden_activity_arr – $\hat{h}^{(t-1)}$
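
With these parameters, the biases at time $t$ are conditioned on the previous hidden activity $\hat{h}^{(t-1)}$ stored in graph.pre_hidden_activity_arr. A common way to write this relation for RTRBMs is sketched below; the exact computation inside RTRBMCD may differ in detail.

    $b_h^{(t)} = b_h + W' \hat{h}^{(t-1)}$
    $b_v^{(t)} = b_v + W'' \hat{h}^{(t-1)}$
    $\hat{h}^{(t)} = \sigma(W^{\top} v^{(t)} + b_h^{(t)})$

where $\sigma$ denotes the activation function, e.g. the logistic sigmoid.
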
approximate_inference()

Inference with function approximation.

Parameters:
  • graph – Graph of neurons.
  • learning_rate – Learning rate.
  • observed_data_arr – Observed data points.
  • training_count – Number of training iterations.
  • r_batch_size – Batch size. If this value is 0, inference is performed as recursive learning; if it is greater than 0, as mini-batch recursive learning; if it is -1, recursive learning is not performed.
Returns:

Graph of neurons.

approximate_learn()

Learning with function approximation.

Parameters:
  • graph – Graph of neurons.
  • learning_rate – Learning rate.
  • observed_data_arr – Observed data points.
  • training_count – Number of training iterations.
  • batch_size – Batch size (0 means mini-batch learning is not used).
Returns:

Graph of neurons.

back_propagation()

Details of the backpropagation through time algorithm.

batch_size = 0
batch_step = 0
get_opt_params()

Getter for opt_params.

get_reconstrct_error_list()

Getter for the list of reconstruction errors.

graph = None
learning_rate = 0.5
memorize_activity()

Memorize activity.

Parameters:
  • observed_data_arr – Observed data points in the positive phase.
  • negative_visible_activity_arr – Visible activity in the negative phase.
negative_visible_activity_arr = None
opt_params

Getter property.

r_batch_size = 0
r_batch_step = 0
reconstruct_error_list

Getter property.

rnn_learn()

Learning for the RNN.

Parameters: observed_data_list – Observed data points.
set_readonly()

Setter.

wake_sleep_inference()

Sleeping, waking, and inference.

Parameters: observed_data_arr – Feature points.
wake_sleep_learn()

Waking, sleeping, and learning.

This assumes that the activation function and the weight operations use common settings.

Binary activity is not supported.

Parameters: observed_data_list – Observed data points.
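
To make the role of the recurrent weights concrete, the following self-contained NumPy sketch runs one CD-1 step per time step, with biases conditioned on the previous hidden activity, and then memorizes that activity for the next step. It is only an illustration of the technique under the parameterization listed above: it omits the bias and recurrent-weight updates handled by back_propagation(), and the toy dimensions and array names are assumptions, not pydbm's implementation.

    import numpy as np

    rng = np.random.default_rng(0)
    v_dim, h_dim = 6, 4

    # Toy parameters mirroring the graph attributes listed for RTRBMCD.
    weights_arr = 0.01 * rng.standard_normal((v_dim, h_dim))              # W
    visible_bias_arr = np.zeros(v_dim)                                    # b_v
    hidden_bias_arr = np.zeros(h_dim)                                     # b_h
    rnn_hidden_weights_arr = 0.01 * rng.standard_normal((h_dim, h_dim))   # W'
    rnn_visible_weights_arr = 0.01 * rng.standard_normal((h_dim, v_dim))  # W''

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    observed_seq = rng.integers(0, 2, size=(5, v_dim)).astype(float)  # v^{(1..T)}
    pre_hidden_activity_arr = np.zeros(h_dim)                         # \hat{h}^{(0)}
    learning_rate = 0.05

    for v_t in observed_seq:
        # Biases at time t conditioned on \hat{h}^{(t-1)}.
        hidden_bias_t = hidden_bias_arr + pre_hidden_activity_arr @ rnn_hidden_weights_arr
        visible_bias_t = visible_bias_arr + pre_hidden_activity_arr @ rnn_visible_weights_arr

        # One CD-1 ("waking"/"sleeping") step with the time-dependent biases.
        pos_hidden = sigmoid(v_t @ weights_arr + hidden_bias_t)
        recon_visible = sigmoid(pos_hidden @ weights_arr.T + visible_bias_t)  # negative phase
        neg_hidden = sigmoid(recon_visible @ weights_arr + hidden_bias_t)
        weights_arr += learning_rate * (np.outer(v_t, pos_hidden)
                                        - np.outer(recon_visible, neg_hidden))

        # Memorize the hidden activity for the next time step (\hat{h}^{(t)}).
        pre_hidden_activity_arr = pos_hidden

    print("reconstruction error:", np.mean((observed_seq[-1] - recon_visible) ** 2))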

pydbm.approximation.shape_bm_cd module

class pydbm.approximation.shape_bm_cd.ShapeBMCD

Bases: pydbm.approximation.interface.approximate_interface.ApproximateInterface

Contrastive Divergence for the Shape Boltzmann Machine (Shape-BM).

Conceptually, the positive phase is to the negative phase what waking is to sleeping.

approximate_inference()

Inference with function approximation.

Parameters:
  • graph – Graph of neurons.
  • learning_rate – Learning rate.
  • observed_data_arr – Observed data points.
  • training_count – Number of training iterations.
  • r_batch_size – Batch size. If this value is 0, inference is performed as recursive learning; if it is greater than 0, as mini-batch recursive learning; if it is -1, recursive learning is not performed.
Returns:

Graph of neurons.

approximate_learn()

Learning with function approximation.

Parameters:
  • graph – Graph of neurons.
  • learning_rate – Learning rate.
  • observed_data_arr – Observed data points.
  • training_count – Number of training iterations.
  • batch_size – Batch size (0 means mini-batch learning is not used).
Returns:

Graph of neurons.

get_reconstrct_error_list()

Getter for the list of reconstruction errors.

reconstruct_error_list

Getter property.

set_readonly()

Setter.
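
Each of the three classes in this package exposes a reconstruct_error_list through its getters. The error metric recorded by pydbm is not documented here; the sketch below shows one plausible choice, a mean squared error between the observed data and the negative-phase reconstruction, with hypothetical values purely for illustration.

    import numpy as np

    def reconstruction_error(observed_data_arr: np.ndarray,
                             reconstructed_arr: np.ndarray) -> float:
        # Mean squared error between observed data and its reconstruction.
        return float(np.mean((observed_data_arr - reconstructed_arr) ** 2))

    observed = np.array([1.0, 0.0, 1.0, 1.0])
    reconstruct_error_list = []
    # Stand-in reconstructions, as they might improve over training steps.
    for reconstructed in (np.array([0.5, 0.5, 0.5, 0.5]),
                          np.array([0.8, 0.3, 0.7, 0.6]),
                          np.array([0.9, 0.1, 0.9, 0.9])):
        reconstruct_error_list.append(reconstruction_error(observed, reconstructed))

    print(reconstruct_error_list)  # the error shrinks as the reconstructions improve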

Module contents