pydbm.synapse.recurrenttemporalgraph.lstmgraph package

Submodules

pydbm.synapse.recurrenttemporalgraph.lstmgraph.attention_lstm_graph module
class pydbm.synapse.recurrenttemporalgraph.lstmgraph.attention_lstm_graph.AttentionLSTMGraph

Bases: pydbm.synapse.recurrenttemporalgraph.lstm_graph.LSTMGraph

Attention mechanism based on Long short-term memory (LSTM) networks.

To support transfer learning, this object is-a Synapse which can be delegated to an AttentionLSTMModel.
attention_output_weight_arr

Getter property for the attention output weight array.
create_rnn_cells

Create RNN cells for an LSTMModel.

Parameters:
- input_neuron_count – The number of units in the input layer.
- hidden_neuron_count – The number of units in the hidden layer.
- output_neuron_count – The number of units in the output layer.
- scale – Scale of the parameters sampled by the ParamsInitializer.
- params_initializer – is-a ParamsInitializer.
- params_dict – dict of parameters, other than size, to be input to the function ParamsInitializer.sample_f.
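The interplay of scale, params_initializer, and params_dict can be sketched as follows. This is an illustrative stand-in, not pydbm's actual implementation: the class GaussianParamsInitializer, the function create_rnn_cells_sketch, and the weight shapes shown are assumptions made for demonstration.

```python
import numpy as np


class GaussianParamsInitializer:
    """Illustrative stand-in for a ParamsInitializer:
    samples an array of a given size via `sample_f`."""

    def sample_f(self, size, loc=0.0, scale_param=1.0):
        # `size` is supplied internally by the caller; all other
        # keyword arguments are forwarded from `params_dict`.
        return np.random.normal(loc=loc, scale=scale_param, size=size)


def create_rnn_cells_sketch(
    input_neuron_count,
    hidden_neuron_count,
    scale=1.0,
    params_initializer=None,
    params_dict=None,
):
    # Hypothetical sketch: each weight matrix is drawn by the
    # initializer's `sample_f` and then multiplied by `scale`,
    # mirroring the parameters documented above.
    params_dict = params_dict if params_dict is not None else {}
    w_input = params_initializer.sample_f(
        size=(input_neuron_count, hidden_neuron_count), **params_dict
    ) * scale
    w_hidden = params_initializer.sample_f(
        size=(hidden_neuron_count, hidden_neuron_count), **params_dict
    ) * scale
    return w_input, w_hidden


w_i, w_h = create_rnn_cells_sketch(
    3, 4, scale=0.1, params_initializer=GaussianParamsInitializer()
)
print(w_i.shape, w_h.shape)  # (3, 4) (4, 4)
```

Passing `params_dict={"loc": 0.0, "scale_param": 0.01}` would forward those keyword arguments to `sample_f`, which is the role the `params_dict` parameter plays above.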
get_attention_output_weight_arr

Getter for attention_output_weight_arr.

set_attention_output_weight_arr

Setter for attention_output_weight_arr.
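The pairing of get_attention_output_weight_arr and set_attention_output_weight_arr with the attention_output_weight_arr property follows Python's `property(getter, setter)` idiom. The sketch below shows that pattern in miniature; AttentionGraphSketch and its type check are assumptions for illustration, not pydbm's actual source.

```python
import numpy as np


class AttentionGraphSketch:
    """Hypothetical sketch of a getter/setter-backed property,
    in the style of attention_output_weight_arr."""

    def __init__(self):
        self.__attention_output_weight_arr = None

    def get_attention_output_weight_arr(self):
        # getter: return the privately held array.
        return self.__attention_output_weight_arr

    def set_attention_output_weight_arr(self, value):
        # setter: reject anything that is not an np.ndarray
        # (an assumed guard, for demonstration).
        if not isinstance(value, np.ndarray):
            raise TypeError(
                "attention_output_weight_arr must be an np.ndarray."
            )
        self.__attention_output_weight_arr = value

    # Expose the pair as a single attribute-style property.
    attention_output_weight_arr = property(
        get_attention_output_weight_arr, set_attention_output_weight_arr
    )


graph = AttentionGraphSketch()
graph.attention_output_weight_arr = np.zeros((4, 4))
print(graph.attention_output_weight_arr.shape)  # (4, 4)
```

Reading `graph.attention_output_weight_arr` dispatches to the getter and assigning to it dispatches to the setter, which is why the documentation lists the property alongside its explicit get/set methods.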