
Implementing Leaky Relu, parametric and other forms of Relu #322

@naveenjafer

Description


I am working on an implementation of LeakyRelu and would like some input on how to approach it. There are two options:

  1. A separate layer for each ReLU variation: LeakyRelu, ParamRelu, etc.
  2. A single Relu layer that takes optional parameters and implements the variations. (This would greatly reduce duplicated code, but it also reduces the visibility of each variant to end users unless they spend some time in the documentation.)

Keras and PyTorch have separate layers for each ReLU variation, but I am inclined towards a single Relu layer with the right parameters. What would you suggest?
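To make option 2 concrete, here is a minimal NumPy sketch of how a single parameterized activation could cover all three variants (the function name and signature are illustrative only, not Thinc's actual API):

```python
import numpy as np

def relu(x, alpha=0.0):
    """Generalized ReLU.

    alpha=0.0        -> plain ReLU
    small fixed alpha (e.g. 0.01) -> Leaky ReLU
    learned per-channel alpha     -> Parametric ReLU (PReLU)
    """
    # Pass positive inputs through; scale negative inputs by alpha.
    return np.where(x >= 0, x, alpha * x)
```

Because `alpha` broadcasts, the same forward pass handles a scalar slope (Leaky ReLU) or a per-channel array of learned slopes (PReLU); the only real difference between the variants is whether `alpha` is a constant or a trainable parameter.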

Metadata



    Labels

    enhancement (Feature requests and improvements), feat / layers (Weights layers, transforms, combinators, wrappers)
