# pydbm.activation.interface package¶

## pydbm.activation.interface.activating_function_interface module¶

class pydbm.activation.interface.activating_function_interface.ActivatingFunctionInterface

Bases: object

Abstract class for building activation functions.

Two distinctions are introduced in this class design.

The first is the distinction between an activate in forward propagation and a derivative in back propagation. These two kinds of methods enable the implementation of learning algorithms such as stochastic gradient descent, in relation to neural network theory.

The second distinction corresponds to the presence or absence of memory retention. In activate and derivative, the memories of propagated data points are stored for computing delta. On the other hand, in forward and backward, these memories are not stored.

Methods that perform forward and back propagation independently of the recording needed for delta calculations are particularly useful for models such as ConvolutionalAutoEncoder, which performs deconvolution as transposition.
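The two distinctions above can be illustrated with a minimal sketch. The SigmoidFunction class below is a hypothetical concrete subclass, not pydbm's own implementation; only the four method names mirror the documented interface.

```python
import numpy as np

# A minimal sketch of the interface design, assuming a hypothetical
# sigmoid-based subclass; the concrete class is illustrative only.
class SigmoidFunction:

    def __init__(self):
        # Memory of propagated data points, retained for delta computation.
        self.__memory_list = []

    def activate(self, x):
        ''' Activate in forward propagation, retaining memory. '''
        y = 1.0 / (1.0 + np.exp(-x))
        self.__memory_list.append(y)
        return y

    def derivative(self, y):
        ''' Derivative in back propagation, consuming retained memory. '''
        activated_arr = self.__memory_list.pop()
        return y * activated_arr * (1.0 - activated_arr)

    def forward(self, x):
        ''' Forward propagation without memory retention. '''
        return 1.0 / (1.0 + np.exp(-x))

    def backward(self, y):
        ''' Back propagation without memory retention; here the delta
            passes through unchanged, since no memory is available. '''
        return y
```

The memory-free forward and backward pair is what lets a model like ConvolutionalAutoEncoder reuse an activation during transposed operations without corrupting the deltas recorded during ordinary training.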

activate

Activate and extract feature points in forward propagation.

Parameters: x – np.ndarray of observed data points.

Returns: np.ndarray of the activated feature points.
backward

Back propagation without retaining memory for the delta computation.

Parameters: y – np.ndarray of delta.

Returns: The result.
batch_norm

getter

derivative

Derivative and extract delta in back propagation.

Parameters: y – np.ndarray of delta.

Returns: np.ndarray of delta.
forward

Forward propagation without retaining memory of the activation.

Parameters: x – np.ndarray of observed data points.

Returns: The result.
get_batch_norm

getter

set_batch_norm

setter
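The batch_norm attribute above is backed by get_batch_norm and set_batch_norm in Python's property pattern. A minimal sketch, assuming a hypothetical class name and stored value for illustration:

```python
# A minimal sketch of the getter/setter property pattern implied above;
# the class name and stored value are illustrative assumptions.
class ActivatingFunction:

    def get_batch_norm(self):
        ''' getter for the batch normalization object. '''
        return self.__batch_norm

    def set_batch_norm(self, value):
        ''' setter for the batch normalization object. '''
        self.__batch_norm = value

    batch_norm = property(get_batch_norm, set_batch_norm)
```

Accessing instance.batch_norm then delegates to the getter and setter, so subclasses can attach or swap a batch normalization object through either interface.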