pydbm.rnn package

Submodules

pydbm.rnn.encoder_decoder_controller module

class pydbm.rnn.encoder_decoder_controller.EncoderDecoderController

Bases: object

Encoder/Decoder based on LSTM networks.

This library provides an Encoder/Decoder based on LSTM, a reconstruction model that makes it possible to extract series features embedded in deeper layers. The LSTM encoder learns a fixed-length vector representation of the observed time-series data points, and the LSTM decoder uses this representation to reconstruct the time-series from the current hidden state and the value inferred at the previous time step.

One interesting application example is the Encoder/Decoder for Anomaly Detection (EncDec-AD) paradigm (Malhotra, P., et al. 2016). This reconstruction model learns to reconstruct normal time-series behavior, and thereafter uses the reconstruction error to detect anomalies. Malhotra, P., et al. (2016) showed that the EncDec-AD paradigm is robust and can detect anomalies from predictable, unpredictable, periodic, aperiodic, and quasi-periodic time-series. Further, they showed that the paradigm can detect anomalies in short time-series (length as small as 30) as well as long time-series (length as large as 500).
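
The following is a hypothetical usage sketch for anomaly detection in the EncDec-AD style. It assumes that controller is an already constructed EncoderDecoderController; the construction itself (encoder and decoder LSTMModel instances, loss function, optimizer, and hyperparameters) is omitted because it depends on the library version, and all array shapes are illustrative. Only methods documented below are used.

    import numpy as np

    # Illustrative shapes: (batch size, sequence length, dimension).
    normal_arr = np.random.normal(size=(100, 30, 5))   # "normal" training series
    test_arr = np.random.normal(size=(10, 30, 5))      # unseen series to reconstruct

    # Learn to reconstruct normal time-series behavior (auto-encoder style:
    # the target equals the observed input; see learn below).
    controller.learn(normal_arr, normal_arr)

    # Reconstruct unseen series. Per the documentation of inference below,
    # a tuple of (reconstruction, hidden state, RNN state) is returned.
    # Depending on the version, initial hidden_activity_arr / rnn_activity_arr
    # may also need to be passed explicitly.
    reconstructed_arr, hidden_arr, rnn_arr = controller.inference(test_arr)

    # Series features embedded in the hidden layer.
    feature_arr = controller.get_feature_points()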

References

  • Malhotra, P., Ramakrishnan, A., Anand, G., Vig, L., Agarwal, P., & Shroff, G. (2016). LSTM-based encoder-decoder for multi-sensor anomaly detection. arXiv preprint arXiv:1607.00148.

back_propagation()

Back propagation.

Parameters:
  • pred_arr – np.ndarray of predicted data points from the decoder.
  • delta_output_arr – Delta.
Returns:

Tuple data. - decoder’s list of gradients, - encoder’s np.ndarray of Delta, - encoder’s list of gradients.

decoder

getter

encoder

getter

get_decoder()

getter

get_encoder()

getter

get_feature_points()

Extract the activities in the hidden layer and reset them, on the assumption that this method is called once per cycle over the time-series instances.

Returns: The array-like or sparse matrix of feature points.
get_reconstruction_error()

Extract the reconstruction error computed during inference (see the thresholding sketch below).

Returns: The array-like or sparse matrix of reconstruction errors.
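
Following the EncDec-AD paradigm described in the class docstring, the reconstruction error returned by this method can be turned into an anomaly decision by thresholding. A minimal sketch follows, continuing the usage sketch above; the aggregation over time steps and the percentile threshold are illustrative assumptions, not part of the library.

    import numpy as np

    # Per-sample reconstruction error extracted after inference.
    error_arr = controller.get_reconstruction_error()

    # Aggregate the squared error per sample and flag samples whose score
    # exceeds a threshold chosen on held-out normal data (the 95th
    # percentile here is purely illustrative).
    reduce_axes = tuple(range(1, error_arr.ndim))
    anomaly_score_arr = np.mean(np.square(error_arr), axis=reduce_axes)
    threshold = np.percentile(anomaly_score_arr, 95)
    is_anomaly_arr = anomaly_score_arr > threshold
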
get_verificatable_result()

getter

inference()

Infer the feature points to reconstruct the time-series.

Override.

Parameters:
  • observed_arr – Array-like or sparse matrix of observed data points.
  • hidden_activity_arr – Array-like or sparse matrix of the state in the hidden layer.
  • rnn_activity_arr – Array-like or sparse matrix of the state in the RNN.
Returns:

Tuple data. - Array-like or sparse matrix of reconstructed instances of the time-series, - Array-like or sparse matrix of the state in the hidden layer, - Array-like or sparse matrix of the state in the RNN.

learn()

Learn the observed data points to build a vector representation of the input time-series.

Override.

Parameters:
  • observed_arr – Array-like or sparse matrix of observed data points.
  • target_arr – Array-like or sparse matrix of target data points. To learn as an auto-encoder, this value must be None or equal to observed_arr (see the short sketch below).
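
A short sketch of the auto-encoder setting, assuming controller and an observed np.ndarray observed_arr as in the usage sketch above:

    # Auto-encoder style learning: the target equals the observed input.
    controller.learn(observed_arr, observed_arr)

    # Equivalently, per the parameter description above, the target may be None.
    controller.learn(observed_arr, None)
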
learn_generated()

Learn features generated by FeatureGenerator.

Parameters: feature_generator – is-a FeatureGenerator.
load_pre_learned_params()

Load pre-learned parameters.

To load pre-learned parameters together with stacked graphs, call the stack_graph method and set up the graphs before calling this method.

Parameters: dir_path – Directory path.
optimize()

Optimization, using the gradients obtained by back propagation.

Parameters:
  • decoder_grads_list – decoder’s list of gradients.
  • encoder_grads_list – encoder’s list of gradients.
  • learning_rate – Learning rate.
  • epoch – Current epoch.
save_pre_learned_params()

Save pre-learned parameters.

Parameters: dir_path – Directory path. If None, the file is saved in the current directory. (A save/load round-trip sketch follows.)
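
A hypothetical save/load round trip using only save_pre_learned_params and load_pre_learned_params (documented above); the directory path is illustrative.

    # Persist the learned parameters to a directory of your choice.
    controller.save_pre_learned_params("pre_learned/")

    # Later, restore them into a controller with the same architecture.
    # If graphs are stacked, call stack_graph and set up the graphs first,
    # as noted under load_pre_learned_params above.
    controller.load_pre_learned_params("pre_learned/")
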
set_readonly()

setter

set_verificatable_result()

setter

verificatable_result

getter

pydbm.rnn.facade_encoder_decoder module

class pydbm.rnn.facade_encoder_decoder.FacadeEncoderDecoder

Bases: object

Facade for casual users of the Encoder/Decoder based on LSTM networks.

This library provides an Encoder/Decoder based on LSTM, a reconstruction model that makes it possible to extract series features embedded in deeper layers. The LSTM encoder learns a fixed-length vector representation of the observed time-series data points, and the LSTM decoder uses this representation to reconstruct the time-series from the current hidden state and the value inferred at the previous time step.

One interesting application example is the Encoder/Decoder for Anomaly Detection (EncDec-AD) paradigm (Malhotra, P., et al. 2016). This reconstruction model learns to reconstruct normal time-series behavior, and thereafter uses the reconstruction error to detect anomalies. Malhotra, P., et al. (2016) showed that the EncDec-AD paradigm is robust and can detect anomalies from predictable, unpredictable, periodic, aperiodic, and quasi-periodic time-series. Further, they showed that the paradigm can detect anomalies in short time-series (length as small as 30) as well as long time-series (length as large as 500).

References

  • Malhotra, P., Ramakrishnan, A., Anand, G., Vig, L., Agarwal, P., & Shroff, G. (2016). LSTM-based encoder-decoder for multi-sensor anomaly detection. arXiv preprint arXiv:1607.00148.

get_feature_points()

Extract the activities in the hidden layer and reset them, on the assumption that this method is called once per cycle over the time-series instances.

Returns: The array-like or sparse matrix of feature points.
get_reconstruction_error()

Extract the reconstruction error computed during inference.

Returns: The array-like or sparse matrix of reconstruction errors.
infernece()

Infer the feature points to reconstruct the time-series.

Parameters:
  • observed_arr – Array-like or sparse matrix of observed data points.
  • hidden_activity_arr – Array-like or sparse matrix of the state in the hidden layer.
  • rnn_activity_arr – Array-like or sparse matrix of the state in the RNN.
Returns:

Tuple data. - Array-like or sparse matrix of reconstructed instances of the time-series, - Array-like or sparse matrix of the state in the hidden layer, - Array-like or sparse matrix of the state in the RNN.

learn()

Learn the observed data points to build a vector representation of the input time-series.

Parameters:
  • observed_arr – Array-like or sparse matrix of observed data points.
  • target_arr – Array-like or sparse matrix of target data points. To learn as an auto-encoder, this value must be None or equal to observed_arr.
save_pre_learned_params()

Save pre-learned parameters.

Parameters:
  • encoder_file_path – File path for the encoder’s pre-learned parameters.
  • decoder_file_path – File path for the decoder’s pre-learned parameters (see the sketch below).
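
A minimal sketch, assuming facade_encoder_decoder is an already constructed FacadeEncoderDecoder; the file paths (including the .npz extension) are illustrative assumptions.

    # Persist the encoder's and decoder's pre-learned parameters to separate files.
    facade_encoder_decoder.save_pre_learned_params(
        encoder_file_path="encoder.npz",
        decoder_file_path="decoder.npz",
    )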

pydbm.rnn.lstm_model module

class pydbm.rnn.lstm_model.LSTMModel

Bases: pydbm.rnn.interface.reconstructable_model.ReconstructableModel

Long short-term memory (LSTM) networks.

Long Short-Term Memory (LSTM) networks, a special RNN structure, have proven stable and powerful for modeling long-range dependencies.

The key point of this structural expansion is the memory cell, which essentially acts as an accumulator of state information. Every time observed data points arrive as new information at the LSTM’s input gate, that information is accumulated into the cell if the input gate is activated. The past state of the cell can be forgotten in this process if the LSTM’s forget gate is on. Whether the latest cell output is propagated to the final state is further controlled by the output gate.
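
To make the gating mechanism concrete, the following is a minimal NumPy sketch of a single LSTM step. It is a conceptual illustration only, not this class's implementation; the weight layout and shapes are assumptions. The hidden_forward_propagate method documented below performs the corresponding forward pass in the LSTM gate.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x, h_prev, c_prev, W, U, b):
        """One LSTM step. W: (dim, 4*hidden), U: (hidden, 4*hidden), b: (4*hidden,)."""
        z = x @ W + h_prev @ U + b
        n = z.shape[-1] // 4
        i = sigmoid(z[..., :n])           # input gate: admit new information
        f = sigmoid(z[..., n:2 * n])      # forget gate: keep or drop the past cell state
        o = sigmoid(z[..., 2 * n:3 * n])  # output gate: expose the cell to the hidden state
        g = np.tanh(z[..., 3 * n:])       # candidate information accumulated into the cell
        c = f * c_prev + i * g            # memory cell acts as an accumulator
        h = o * np.tanh(c)                # hidden state propagated to the next time step
        return h, c

    # Example usage with assumed sizes (dimension 5, hidden size 8).
    rng = np.random.default_rng(0)
    x = rng.normal(size=(1, 5))
    h0, c0 = np.zeros((1, 8)), np.zeros((1, 8))
    W, U, b = rng.normal(size=(5, 32)), rng.normal(size=(8, 32)), np.zeros(32)
    h1, c1 = lstm_step(x, h0, c0, W, U, b)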

References

  • Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., & Bengio, Y. (2014). Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078.
  • Malhotra, P., Ramakrishnan, A., Anand, G., Vig, L., Agarwal, P., & Shroff, G. (2016). LSTM-based encoder-decoder for multi-sensor anomaly detection. arXiv preprint arXiv:1607.00148.
  • Zaremba, W., Sutskever, I., & Vinyals, O. (2014). Recurrent neural network regularization. arXiv preprint arXiv:1409.2329.
back_propagation()

Back propagation.

Parameters:
  • pred_arr – np.ndarray of predicted data points.
  • delta_output_arr – Delta.
Returns:

Tuple data. - np.ndarray of Delta, - list of gradients.

forward_propagation()

Forward propagation.

Parameters: batch_observed_arr – Array-like or sparse matrix of observed data points.
Returns: Array-like or sparse matrix of predicted data points.
get_feature_points()

Extract the activities in the hidden layer and reset them, on the assumption that this method is called once per cycle over the time-series instances.

Returns: The list of array-like or sparse matrices of feature points or virtual visible observed data points.
get_graph()

getter

get_opt_params()

getter

get_verificatable_result()

getter

graph

getter

hidden_back_propagate()

Back propagation in hidden layer.

Parameters: delta_output_arr – Delta.
Returns: Tuple data. - np.ndarray of Delta, - list of gradients.
hidden_forward_propagate()

Forward propagation in LSTM gate.

Parameters: observed_arr – np.ndarray of observed data points.
Returns: Predicted data points.
inference()

Infer the feature points to reconstruct the time-series.

Override.

Parameters:
  • observed_arr – Array-like or sparse matrix of observed data points.
  • hidden_activity_arr – Array-like or sparse matrix of the state in the hidden layer.
  • rnn_activity_arr – Array-like or sparse matrix of the state in the RNN.
Returns:

Tuple data. - Array-like or sparse matrix of reconstructed instances of the time-series, - Array-like or sparse matrix of the state in the hidden layer, - Array-like or sparse matrix of the state in the RNN.

learn()

Learn the observed data points to build a vector representation of the input time-series.

Override.

Parameters:
  • observed_arr – Array-like or sparse matrix of observed data points.
  • target_arr – Array-like or sparse matrix of target data points. To learn as an auto-encoder, this value must be None or equal to observed_arr.
load_pre_learned_params()

Load pre-learned parameters.

Parameters:
  • dir_name – Directory path. If None, the file is loaded from the current directory.
  • file_name – File name.
lstm_backward()

Back propagation in LSTM gate.

Parameters:
  • delta_hidden_arr – Delta from output layer to hidden layer.
  • delta_rnn_arr – Delta in LSTM gate.
  • cycle – Current cycle (time step).
Returns:

Tuple data. - Delta from the hidden layer to the input layer, - Delta in the hidden layer at the previous time, - Delta in the LSTM gate at the previous time, - list of gradients.

opt_params

getter

optimize()

Optimization.

Parameters:
  • grads_list – list of gradients (see the illustrative sketch below).
  • learning_rate – Learning rate.
  • epoch – Current epoch.
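
As a conceptual illustration of what such a step consumes (a list of gradients plus a learning rate), a plain gradient-descent update is sketched below. This is not the library's implementation; the actual update rule is governed by the optimizer held in opt_params, documented above.

    # Vanilla gradient descent over parallel lists of parameters and gradients;
    # purely illustrative, not this class's update rule.
    def sgd_update(params_list, grads_list, learning_rate):
        return [param - learning_rate * grad
                for param, grad in zip(params_list, grads_list)]
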
output_back_propagate()

Back propagation in output layer.

Parameters:
  • pred_arr – np.ndarray of predicted data points.
  • delta_output_arr – Delta.
Returns:

Tuple data. - np.ndarray of Delta, - list of gradients.

output_forward_propagate()

Forward propagation in output layer.

Parameters: pred_arr – np.ndarray of predicted data points.
Returns: np.ndarray of propagated data points.
save_pre_learned_params()

Save pre-learned parameters.

Parameters:
  • dir_name – Directory path. If None, the file is saved in the current directory.
  • file_name – File name (see the sketch below).
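
A minimal sketch, assuming lstm_model is an already constructed LSTMModel; the directory and file names (including the .npz extension) are illustrative assumptions.

    # Persist this model's pre-learned parameters.
    lstm_model.save_pre_learned_params(dir_name="pre_learned/", file_name="lstm.npz")

    # Restore them later (see load_pre_learned_params above).
    lstm_model.load_pre_learned_params(dir_name="pre_learned/", file_name="lstm.npz")
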
set_graph()

setter

set_opt_params()

setter

set_verificatable_result()

setter

verificatable_result

getter

Module contents