pydbm.cnn.convolutionalneuralnetwork.convolutionalautoencoder package

Submodules

pydbm.cnn.convolutionalneuralnetwork.convolutionalautoencoder.contractive_convolutional_auto_encoder module

class pydbm.cnn.convolutionalneuralnetwork.convolutionalautoencoder.contractive_convolutional_auto_encoder.ContractiveConvolutionalAutoEncoder

Bases: pydbm.cnn.convolutionalneuralnetwork.convolutional_auto_encoder.ConvolutionalAutoEncoder

Contractive Convolutional Auto-Encoder which is-a ConvolutionalNeuralNetwork.

The First-Order Contractive Auto-Encoder (Rifai, S., et al., 2011) performs representation learning by adding a penalty term to the classical reconstruction cost function. This penalty term corresponds to the Frobenius norm of the Jacobian matrix of the encoder activations with respect to the input, and results in a localized space contraction which in turn yields robust features in the activation layer.

Analogously, the Contractive Convolutional Auto-Encoder calculates the same penalty term, but differs in that a deconvolution operation intervenes instead of the inner product.

Note that it is only an intuitive application in this library.
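The penalty term described above can be sketched as follows. This is a minimal NumPy illustration for a dense sigmoid encoder, not the library's convolutional implementation: for h = sigmoid(Wx + b), the Frobenius norm of the encoder Jacobian reduces to a closed form in h and W. The function name and signature are illustrative, not pydbm's API.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def contractive_penalty(x, weight, bias, penalty_lambda=1e-4):
    """Squared Frobenius norm of the encoder Jacobian, for a sigmoid
    dense encoder h = sigmoid(x @ weight.T + bias).

    Since dh_j/dx_i = h_j * (1 - h_j) * W_ji, the norm factorizes as
    ||J||_F^2 = sum_j (h_j * (1 - h_j))^2 * sum_i W_ji^2.
    """
    h = sigmoid(x @ weight.T + bias)       # (batch, hidden)
    grad_sq = (h * (1.0 - h)) ** 2         # derivative of sigmoid, squared
    w_sq = np.sum(weight ** 2, axis=1)     # (hidden,) row norms of W
    return penalty_lambda * np.sum(grad_sq * w_sq)
```

The total cost is then the reconstruction loss plus this penalty, with `penalty_lambda` playing the role of the positive hyperparameter exposed by `penalty_lambda` below.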

References

  • Kamyshanska, H., & Memisevic, R. (2014). The potential energy of an autoencoder. IEEE transactions on pattern analysis and machine intelligence, 37(6), 1261-1273.
  • Rifai, S., Vincent, P., Muller, X., Glorot, X., & Bengio, Y. (2011, June). Contractive auto-encoders: Explicit invariance during feature extraction. In Proceedings of the 28th International Conference on International Conference on Machine Learning (pp. 833-840). Omnipress.
  • Rifai, S., Mesnil, G., Vincent, P., Muller, X., Bengio, Y., Dauphin, Y., & Glorot, X. (2011, September). Higher order contractive auto-encoder. In Joint European Conference on Machine Learning and Knowledge Discovery in Databases (pp. 645-660). Springer, Berlin, Heidelberg.
forward_propagation

Forward propagation in CNN.

Parameters: img_arr – np.ndarray of image file array.
Returns: Propagated np.ndarray.
get_penalty_lambda

getter for the positive hyperparameter that controls the strength of the regularization.

penalty_lambda

getter for the positive hyperparameter that controls the strength of the regularization.

set_penalty_lambda

setter for the positive hyperparameter that controls the strength of the regularization.

pydbm.cnn.convolutionalneuralnetwork.convolutionalautoencoder.convolutional_ladder_networks module

class pydbm.cnn.convolutionalneuralnetwork.convolutionalautoencoder.convolutional_ladder_networks.ConvolutionalLadderNetworks

Bases: pydbm.cnn.convolutionalneuralnetwork.convolutional_auto_encoder.ConvolutionalAutoEncoder

Ladder Networks with a Stacked Convolutional Auto-Encoder.

References

  • Bengio, Y., Lamblin, P., Popovici, D., & Larochelle, H. (2007). Greedy layer-wise training of deep networks. In Advances in neural information processing systems (pp. 153-160).
  • Dumoulin, V., & Visin, F. (2016). A guide to convolution arithmetic for deep learning. arXiv preprint arXiv:1603.07285.
  • Erhan, D., Bengio, Y., Courville, A., Manzagol, P. A., Vincent, P., & Bengio, S. (2010). Why does unsupervised pre-training help deep learning?. Journal of Machine Learning Research, 11(Feb), 625-660.
  • Erhan, D., Courville, A., & Bengio, Y. (2010). Understanding representations learned in deep architectures. Département d'Informatique et de Recherche Opérationnelle, University of Montreal, QC, Canada, Tech. Rep, 1355, 1.
  • Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep learning (adaptive computation and machine learning series). Adaptive Computation and Machine Learning series, 800.
  • Masci, J., Meier, U., Cireşan, D., & Schmidhuber, J. (2011, June). Stacked convolutional auto-encoders for hierarchical feature extraction. In International Conference on Artificial Neural Networks (pp. 52-59). Springer, Berlin, Heidelberg.
  • Rasmus, A., Berglund, M., Honkala, M., Valpola, H., & Raiko, T. (2015). Semi-supervised learning with ladder networks. In Advances in neural information processing systems (pp. 3546-3554).
  • Valpola, H. (2015). From neural PCA to deep unsupervised learning. In Advances in Independent Component Analysis and Learning Machines (pp. 143-171). Academic Press.
alpha_loss_arr

getter

back_propagation

Back propagation in CNN.

Override.

Parameters: Delta.
Returns: Delta.
compute_alpha_loss

Compute the denoising loss, weighted by alpha.

Returns:loss.
compute_mu_loss

Compute the mu loss, weighted by mu.

Returns:loss.
compute_sigma_loss

Compute the sigma loss, weighted by sigma.

Returns:loss.
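The three weighted penalty terms above can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the library's implementation: the alpha loss is taken as the ladder network's layer-wise denoising cost, and the mu and sigma losses are assumed to penalize the mean and variance of the hidden activations. All function names and signatures here are hypothetical.

```python
import numpy as np

def compute_alpha_loss(z_list, z_hat_list, alpha_list):
    """Denoising loss: alpha-weighted squared error between each clean
    layer activation z and its denoised reconstruction z_hat."""
    return sum(
        alpha * np.mean((z - z_hat) ** 2)
        for z, z_hat, alpha in zip(z_list, z_hat_list, alpha_list)
    )

def compute_mu_loss(z_list, mu=1e-3):
    """Assumed form: penalize nonzero mean activation in each layer."""
    return mu * sum(np.mean(z) ** 2 for z in z_list)

def compute_sigma_loss(z_list, sigma=1e-3):
    """Assumed form: penalize each layer's variance deviating from 1."""
    return sigma * sum((np.var(z) - 1.0) ** 2 for z in z_list)
```

In training, these terms are added to the supervised (or reconstruction) loss, so labeled and unlabeled examples can contribute to the same objective.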
extract_feature_points_arr

Extract feature points.

Returns:np.ndarray of feature points in hidden layer which means the encoded data.
forward_propagation

Forward propagation in Convolutional Auto-Encoder.

Override.

Parameters: img_arr – np.ndarray of image file array.
Returns: Propagated np.ndarray.
get_alpha_loss_arr

getter

get_mu_loss_arr

getter

get_sigma_loss_arr

getter

learn

Learn.

Parameters:
  • observed_arr – np.ndarray of observed data points.
  • target_arr – np.ndarray of labeled data. If None, the function of this CNN model is equivalent to a Convolutional Auto-Encoder.
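The branching on target_arr can be sketched as follows. This is a hypothetical wrapper, not pydbm's training loop; it only assumes that the model exposes the forward_propagation and back_propagation methods documented on this page.

```python
import numpy as np

def learn_step(model, observed_arr, target_arr=None):
    """One illustrative training step: with no labels, the observed data
    becomes its own target, i.e. the model behaves as a Convolutional
    Auto-Encoder learning to reconstruct its input."""
    if target_arr is None:
        target_arr = observed_arr      # self-supervised reconstruction
    pred_arr = model.forward_propagation(observed_arr)
    delta_arr = pred_arr - target_arr  # error signal to back-propagate
    model.back_propagation(delta_arr)
    return delta_arr
```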
learn_generated

Learn features generated by FeatureGenerator.

Parameters:feature_generator – is-a FeatureGenerator.
mu_loss_arr

getter

optimize

Back propagation.

Parameters:
  • learning_rate – Learning rate.
  • epoch – Current epoch.
set_readonly

setter

sigma_loss_arr

getter

pydbm.cnn.convolutionalneuralnetwork.convolutionalautoencoder.repelling_convolutional_auto_encoder module

class pydbm.cnn.convolutionalneuralnetwork.convolutionalautoencoder.repelling_convolutional_auto_encoder.RepellingConvolutionalAutoEncoder

Bases: pydbm.cnn.convolutionalneuralnetwork.convolutional_auto_encoder.ConvolutionalAutoEncoder

Repelling Convolutional Auto-Encoder which is-a ConvolutionalNeuralNetwork.

This Convolutional Auto-Encoder calculates the Repelling regularizer (Zhao, J., et al., 2016) as a penalty term.

Note that it is only an intuitive application in this library.
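The Repelling regularizer, also called the pulling-away term in Zhao et al. (2016), can be sketched as follows: the mean squared cosine similarity between every pair of distinct hidden representations in a batch, which pushes encoded samples away from each other. This is a minimal NumPy illustration with a hypothetical function name, not the library's implementation.

```python
import numpy as np

def repelling_regularizer(feature_arr):
    """Pulling-away term over a batch of hidden representations.

    feature_arr: np.ndarray of shape (batch, dim).  Each row is
    L2-normalized, so the Gram matrix holds cosine similarities; the
    diagonal (self-similarity) is excluded from the average.
    """
    s = feature_arr / np.linalg.norm(feature_arr, axis=1, keepdims=True)
    gram = s @ s.T                        # pairwise cosine similarities
    n = s.shape[0]
    off_diag = gram ** 2 - np.eye(n)      # remove i == j terms (each is 1)
    return np.sum(off_diag) / (n * (n - 1))
```

The term is 0 for mutually orthogonal representations and 1 when all representations collapse onto one direction, which is why it discourages the encoder from mapping different inputs to the same code.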

References

  • Zhao, J., Mathieu, M., & LeCun, Y. (2016). Energy-based generative adversarial network. arXiv preprint arXiv:1609.03126.
back_propagation

Back propagation in CNN.

Override.

Parameters: Delta.
Returns: Delta.
forward_propagation

Forward propagation in CNN.

Parameters: img_arr – np.ndarray of image file array.
Returns: Propagated np.ndarray.

Module contents