pygan package

Submodules

pygan.discriminative_model module

class pygan.discriminative_model.DiscriminativeModel[source]

Bases: object

Discriminator which discriminates true from fake.

feature_matching_backward(grad_arr)[source]

Back propagation in only first or intermediate layer for so-called Feature matching.

Parameters: grad_arr – np.ndarray of gradients.
Returns: np.ndarray of delta.
feature_matching_forward(observed_arr)[source]

Forward propagation in only first or intermediate layer for so-called Feature matching.

Parameters: observed_arr – np.ndarray of observed data points.
Returns: np.ndarray of outputs.
inference(observed_arr)[source]

Infer whether the observed data points are drawn from the true distribution.

Parameters: observed_arr – np.ndarray of observed data points.
Returns: np.ndarray of inferenced results, ranging from 0 (fake) to 1 (true).
learn(grad_arr, fix_opt_flag=False)[source]

Update this Discriminator by ascending its stochastic gradient.

Parameters:
  • grad_arr – np.ndarray of gradients.
  • fix_opt_flag – If True, optimization in this model will be skipped and its parameters will not be updated.
Returns:

np.ndarray of delta or gradients.
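
As an illustration of this interface, the following is a minimal sketch of a concrete subclass: a hypothetical single-layer logistic discriminator written with NumPy. The class name, weight initialization, and learning rate are assumptions for illustration only; pygan’s concrete discriminators are backed by full neural networks.

    import numpy as np
    from pygan.discriminative_model import DiscriminativeModel

    class LogisticDiscriminativeModel(DiscriminativeModel):
        '''Hypothetical single-layer logistic discriminator (illustration only).'''

        def __init__(self, input_dim, learning_rate=1e-05):
            # Weight matrix mapping observed data points to a single logit.
            self.__weight_arr = np.random.normal(size=(input_dim, 1)) * 0.01
            self.__learning_rate = learning_rate

        def inference(self, observed_arr):
            # Infer the probability that each observed data point is true.
            self.__observed_arr = observed_arr
            logit_arr = np.dot(observed_arr, self.__weight_arr)
            return 1.0 / (1.0 + np.exp(-logit_arr))

        def learn(self, grad_arr, fix_opt_flag=False):
            # Ascend the stochastic gradient unless the optimization is fixed.
            delta_arr = np.dot(grad_arr, self.__weight_arr.T)
            if fix_opt_flag is False:
                self.__weight_arr += self.__learning_rate * np.dot(self.__observed_arr.T, grad_arr)
            return delta_arr

        def feature_matching_forward(self, observed_arr):
            # Forward propagation in only the first (and here only) layer.
            return np.dot(observed_arr, self.__weight_arr)

        def feature_matching_backward(self, grad_arr):
            # Back propagation in only the first (and here only) layer.
            return np.dot(grad_arr, self.__weight_arr.T)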

pygan.feature_matching module

class pygan.feature_matching.FeatureMatching(lambda1=1.0, lambda2=0.0, computable_loss=None)[source]

Bases: object

Value function with Feature matching, which addresses the instability of GANs by specifying a new objective for the generator that prevents it from overtraining on the current discriminator (Salimans, T., et al., 2016).

“Instead of directly maximizing the output of the discriminator, the new objective requires the generator to generate data that matches the statistics of the real data, where we use the discriminator only to specify the statistics that we think are worth matching.” (Salimans, T., et al., 2016, p. 2)

References

  • Salimans, T., Goodfellow, I., Zaremba, W., Cheung, V., Radford, A., & Chen, X. (2016). Improved techniques for training GANs. In Advances in neural information processing systems (pp. 2234-2242).
  • Yang, L. C., Chou, S. Y., & Yang, Y. H. (2017). MidiNet: A convolutional generative adversarial network for symbolic-domain music generation. arXiv preprint arXiv:1703.10847.
computable_loss

getter

compute_delta(true_sampler, discriminative_model, generated_arr)[source]

Compute generator’s reward.

Parameters:
  • true_sampler – Sampler which draws samples from the true distribution.
  • discriminative_model – Discriminator which discriminates true from fake.
  • generated_arr – np.ndarray of generated data points.
Returns:

np.ndarray of gradients.

get_computable_loss()[source]

getter

get_loss_arr()[source]

getter

get_true_arr()[source]

getter

loss_arr

getter

set_readonly(value)[source]

setter

true_arr

getter
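
The sketch below shows one way this class might be called for a single generator update. The helper function and its defaults are assumptions for illustration; true_sampler, discriminative_model, and generative_model stand for concrete subclasses of the interfaces documented on this page.

    from pygan.feature_matching import FeatureMatching

    def feature_matching_update(true_sampler, discriminative_model, generative_model,
                                lambda1=1.0, lambda2=0.0):
        # Hypothetical helper: one generator update driven by Feature matching.
        feature_matching = FeatureMatching(lambda1=lambda1, lambda2=lambda2)
        generated_arr = generative_model.draw()
        # Gradient signal computed from matched intermediate-layer statistics.
        delta_arr = feature_matching.compute_delta(
            true_sampler=true_sampler,
            discriminative_model=discriminative_model,
            generated_arr=generated_arr
        )
        generative_model.learn(grad_arr=delta_arr)
        # Feature-matching losses observed so far.
        return feature_matching.loss_arr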

pygan.gans_value_function module

class pygan.gans_value_function.GANsValueFunction[source]

Bases: object

The interface to compute rewards.

compute_discriminator_reward(true_posterior_arr, generated_posterior_arr)[source]

Compute discriminator’s reward.

Parameters:
  • true_posterior_arr – np.ndarray of true posterior inferenced by the discriminator.
  • generated_posterior_arr – np.ndarray of fake posterior inferenced by the discriminator.
Returns:

np.ndarray of gradients.

compute_generator_reward(generated_posterior_arr)[source]

Compute generator’s reward.

Parameters: generated_posterior_arr – np.ndarray of fake posterior inferenced by the discriminator.
Returns: np.ndarray of gradients.
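
A minimal sketch of a concrete value function is shown below, assuming the standard minimax objective V(D, G) = E[log D(x)] + E[log(1 − D(G(z)))]. The class name and the small epsilon used for numerical stability are illustrative assumptions, not pygan’s shipped implementation.

    import numpy as np
    from pygan.gans_value_function import GANsValueFunction

    class MiniMaxValueFunction(GANsValueFunction):
        '''Hypothetical value function for the standard minimax objective.'''

        def compute_discriminator_reward(self, true_posterior_arr, generated_posterior_arr):
            # Reward signal for the discriminator: log D(x) + log(1 - D(G(z))).
            return np.log(true_posterior_arr + 1e-08) + np.log(1 - generated_posterior_arr + 1e-08)

        def compute_generator_reward(self, generated_posterior_arr):
            # Non-saturating reward signal for the generator: log D(G(z)).
            return np.log(generated_posterior_arr + 1e-08)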

pygan.generative_adversarial_networks module

class pygan.generative_adversarial_networks.GenerativeAdversarialNetworks(gans_value_function=None, feature_matching=False)[source]

Bases: object

The controller for Generative Adversarial Networks (GANs).

extract_logs_tuple()[source]

Extract update logs data.

Returns:
  • list of probabilities inferenced by the discriminator (mean) in the discriminator’s update turn.
  • list of probabilities inferenced by the discriminator (mean) in the generator’s update turn.
Return type: tuple
feature_matching

getter

get_feature_matching()[source]

getter

set_readonly(value)[source]

setter

train(true_sampler, generative_model, discriminative_model, iter_n=100, k_step=10)[source]

Train.

Parameters:
  • true_sampler – Sampler which draws samples from the true distribution.
  • generative_model – Generator which draws samples from the fake distribution.
  • discriminative_model – Discriminator which discriminates true from fake.
  • iter_n – The number of training iterations.
  • k_step – The number of learning steps for the discriminative_model per iteration.
Returns:

Tuple data:
  • trained Generator, which is-a GenerativeModel.
  • trained Discriminator, which is-a DiscriminativeModel.
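
For orientation, the following is a minimal sketch of driving the controller end to end with the documented defaults. The wrapper function is hypothetical; true_sampler, generative_model, and discriminative_model stand for any concrete subclasses of the interfaces documented on this page.

    from pygan.generative_adversarial_networks import GenerativeAdversarialNetworks

    def train_gan(true_sampler, generative_model, discriminative_model):
        # Hypothetical wrapper around the documented train() call.
        GAN = GenerativeAdversarialNetworks()
        generative_model, discriminative_model = GAN.train(
            true_sampler=true_sampler,
            generative_model=generative_model,
            discriminative_model=discriminative_model,
            iter_n=100,
            k_step=10
        )
        # Mean probabilities logged during each player's update turn.
        d_logs_list, g_logs_list = GAN.extract_logs_tuple()
        return generative_model, discriminative_model, d_logs_list, g_logs_list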

train_discriminator(k_step, true_sampler, generative_model, discriminative_model, d_logs_list)[source]

Train the discriminator.

Parameters:
  • k_step – The number of learning steps for the discriminative_model per iteration.
  • true_sampler – Sampler which draws samples from the true distribution.
  • generative_model – Generator which draws samples from the fake distribution.
  • discriminative_model – Discriminator which discriminates true from fake.
  • d_logs_list – list of probabilities inferenced by the discriminator (mean) in the discriminator’s update turn.
Returns:

Tuple data:
  • Discriminator which discriminates true from fake.
  • list of probabilities inferenced by the discriminator (mean) in the discriminator’s update turn.

train_generator(true_sampler, generative_model, discriminative_model, g_logs_list)[source]

Train the generator.

Parameters:
  • true_sampler – Sampler which draws samples from the true distribution.
  • generative_model – Generator which draws samples from the fake distribution.
  • discriminative_model – Discriminator which discriminates true from fake.
  • g_logs_list – list of probabilities inferenced by the discriminator (mean) in the generator’s update turn.
Returns:

Tuple data:
  • Generator which draws samples from the fake distribution.
  • list of probabilities inferenced by the discriminator (mean) in the generator’s update turn.

pygan.generative_model module

class pygan.generative_model.GenerativeModel[source]

Bases: object

Sampler which draws samples from the fake distribution.

draw()[source]

Draws samples from the fake distribution.

Returns: np.ndarray of samples.
get_noise_sampler()[source]

getter

learn(grad_arr)[source]

Update this Generator by ascending its stochastic gradient.

Parameters: grad_arr – np.ndarray of gradients.
Returns: np.ndarray of delta or gradients.
noise_sampler

getter

set_noise_sampler(value)[source]

setter

switch_inferencing_mode(inferencing_mode=True)[source]

Set inferencing mode in relation to concrete regularizations.

Parameters: inferencing_mode – Whether to switch into inferencing mode.
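
As an illustration of this interface, below is a minimal sketch of a concrete subclass: a hypothetical one-layer linear generator written with NumPy. The class name, shapes, and learning rate are assumptions; pygan’s concrete generators are backed by full neural networks.

    import numpy as np
    from pygan.generative_model import GenerativeModel

    class LinearGenerativeModel(GenerativeModel):
        '''Hypothetical one-layer linear generator (illustration only).'''

        def __init__(self, noise_sampler, noise_dim=100, output_dim=100, learning_rate=1e-05):
            # NoiseSampler which generates samples from the noise prior.
            self.__noise_sampler = noise_sampler
            self.__weight_arr = np.random.normal(size=(noise_dim, output_dim)) * 0.01
            self.__learning_rate = learning_rate
            self.__inferencing_mode = False

        def draw(self):
            # Draw samples from the fake distribution by transforming noise.
            self.__noise_arr = self.__noise_sampler.generate()
            return np.dot(self.__noise_arr, self.__weight_arr)

        def learn(self, grad_arr):
            # Ascend the stochastic gradient with respect to the last drawn noise.
            self.__weight_arr += self.__learning_rate * np.dot(self.__noise_arr.T, grad_arr)
            return np.dot(grad_arr, self.__weight_arr.T)

        def switch_inferencing_mode(self, inferencing_mode=True):
            # This toy model has no regularizations; only record the flag.
            self.__inferencing_mode = inferencing_mode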

pygan.noise_sampler module

class pygan.noise_sampler.NoiseSampler[source]

Bases: object

Generate samples based on the noise prior.

generate()[source]

Generate noise samples.

Returns: np.ndarray of samples.
get_noise_sampler()[source]

getter for a NoiseSampler.

noise_sampler

getter for a NoiseSampler.

set_noise_sampler(value)[source]

setter for a NoiseSampler.
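
A minimal sketch of a concrete subclass, assuming a uniform noise prior; the class name, interval, and output shape are illustrative assumptions.

    import numpy as np
    from pygan.noise_sampler import NoiseSampler

    class UniformNoiseSampler(NoiseSampler):
        '''Hypothetical noise sampler which draws from a uniform prior.'''

        def __init__(self, low=-1.0, high=1.0, output_shape=(20, 100)):
            self.__low = low
            self.__high = high
            self.__output_shape = output_shape

        def generate(self):
            # Generate noise samples with the configured shape.
            return np.random.uniform(low=self.__low, high=self.__high, size=self.__output_shape)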

pygan.true_sampler module

class pygan.true_sampler.TrueSampler[source]

Bases: object

Sampler which draws samples from the true distribution.

draw()[source]

Draws samples from the true distribution.

Returns: np.ndarray of samples.
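
A minimal sketch of a concrete subclass that draws mini-batches from an in-memory array; the class name and batch size are illustrative assumptions.

    import numpy as np
    from pygan.true_sampler import TrueSampler

    class ArrayTrueSampler(TrueSampler):
        '''Hypothetical sampler which draws mini-batches from an in-memory np.ndarray.'''

        def __init__(self, data_arr, batch_size=20):
            self.__data_arr = data_arr
            self.__batch_size = batch_size

        def draw(self):
            # Draw a random mini-batch of observed data points.
            row_arr = np.random.randint(0, self.__data_arr.shape[0], size=self.__batch_size)
            return self.__data_arr[row_arr]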

Module contents