Implementing Leaky Relu, parametric and other forms of Relu #322
Open
Labels: enhancement (Feature requests and improvements), feat / layers (Weights layers, transforms, combinators, wrappers)
Description
I am working on an implementation of LeakyRelu and would like some input on how to go about it. There are two options:
- A separate layer for each ReLU variation, e.g. LeakyRelu, ParamRelu, etc.
- One single Relu layer that takes optional parameters covering the variations. (This would greatly reduce duplicated code, but it also reduces the discoverability of the variants unless end users spend some time in the documentation.)
Keras and PyTorch both have separate layers for each ReLU variation, but I am inclined more towards a single Relu layer with the right parameters. What would you suggest?
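For what it's worth, the single-layer option can be quite compact, since all the ReLU variants differ only in how they treat negative inputs. A minimal NumPy sketch (the `negative_slope` parameter name is borrowed from PyTorch's LeakyReLU and is only an assumption here, not this library's API):

```python
import numpy as np

def relu(x, negative_slope=0.0):
    """Generalized ReLU.

    negative_slope=0.0  -> standard ReLU
    negative_slope=0.01 -> Leaky ReLU (small fixed slope)
    learned slope array -> Parametric ReLU (PReLU)
    """
    return np.where(x >= 0, x, negative_slope * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # standard ReLU: negatives clamped to 0
print(relu(x, 0.01))  # Leaky ReLU: negatives scaled by 0.01
```

The trade-off is exactly as described above: one forward function covers all variants, but users have to read the docstring to discover that LeakyRelu and PReLU exist at all.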