Added BatchNormalization layers and LeakyReLU activation function#3

Open
sebaspv wants to merge 1 commit into phreeza:master from sebaspv:master

Conversation

@sebaspv sebaspv commented Oct 19, 2020

Added BatchNormalization layers to both the encoder and the decoder, since batch normalization speeds up training, and added the LeakyReLU activation function to the decoder, since it works better than ReLU in the discriminator case.
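For context, here is a minimal sketch (not the actual diff from this PR) of a Keras-style encoder/decoder showing where such layers are typically inserted. The layer sizes, function names, and dimensions below are illustrative assumptions, not code from phreeza's repository:

```python
# Illustrative sketch only: where BatchNormalization and LeakyReLU
# would typically be placed in a Keras encoder/decoder pair.
# All sizes and names here are assumptions, not taken from this PR's diff.
from tensorflow.keras import layers, models

def build_encoder(input_dim=784, latent_dim=32):
    inp = layers.Input(shape=(input_dim,))
    x = layers.Dense(256)(inp)
    x = layers.BatchNormalization()(x)  # normalizes activations for faster, more stable training
    x = layers.Activation("relu")(x)
    z = layers.Dense(latent_dim)(x)
    return models.Model(inp, z, name="encoder")

def build_decoder(latent_dim=32, output_dim=784):
    z_in = layers.Input(shape=(latent_dim,))
    x = layers.Dense(256)(z_in)
    x = layers.BatchNormalization()(x)  # same normalization benefit on the decoder side
    x = layers.LeakyReLU(0.2)(x)        # keeps a small gradient for negative inputs, avoiding dead units
    out = layers.Dense(output_dim, activation="sigmoid")(x)
    return models.Model(z_in, out, name="decoder")
```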
