Feature Request: Activation Functions

Now that PerceptiLabs has moved to TensorFlow 2, please could we have the full range of activation functions that TF2 provides, as described here?

Currently we seem to have only None, Sigmoid, ReLU, Tanh, and perhaps a couple more.

Rationale:

  • Items that already exist in TF should be easy to expose in PerceptiLabs
  • Swish (x * sigmoid(x)) is reported to outperform ReLU in many documented, standard cases, and it is always nice to have the best tools for the job (see the sketch after this list)
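
For reference, TF2 already ships swish as a built-in; a quick sketch (assuming TF >= 2.2, where swish was added to tf.keras.activations):

```python
import tensorflow as tf

# swish is simply x * sigmoid(x); TF >= 2.2 ships it built in.
x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

built_in = tf.keras.activations.swish(x)
by_hand = x * tf.sigmoid(x)

print(built_in.numpy())  # matches by_hand.numpy()
```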

I also note that it may be possible to define custom activation functions, as described here on StackOverflow.
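
In plain TF2/Keras (outside PerceptiLabs), the usual pattern from answers like that one is to pass any callable as an activation, or register it under a string name; a minimal sketch, using a made-up scaled_tanh function purely as the example:

```python
import tensorflow as tf

def scaled_tanh(x):
    # Hypothetical custom activation, for illustration only: 2 * tanh(x)
    return 2.0 * tf.tanh(x)

# Keras accepts any callable directly as an activation:
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation=scaled_tanh),
    tf.keras.layers.Dense(1),
])

# Or register it so layers can refer to it by the string 'scaled_tanh':
tf.keras.utils.get_custom_objects()['scaled_tanh'] = scaled_tanh
layer = tf.keras.layers.Dense(32, activation='scaled_tanh')
```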

How would we do that in PerceptiLabs?

Thanks for the suggestion @JulianSMoore!
I’ll add it to our backlog; it should be a fairly easy one to hammer out.

As for the custom activation functions, it should be possible to create those in a component’s custom code, if I’m not mistaken?
The only downside is that you can only reach it from the component you introduce it to, but that will be fixed in the future with a global code editor as well.
