Now that PerceptiLabs has moved to TensorFlow 2, could we please have the full range of activation functions that TF2 provides, as described here?
Currently we seem to have only None, Sigmoid, ReLU, Tanh, and perhaps a couple more.
Rationale:
- Activation functions that already exist in TF should be easy to expose
- swish (x * sigmoid(x)) is reported to outperform ReLU in many documented, standard cases, and it is always nice to have the best tools for the job (see the sketch after this list)
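
For reference, here is a minimal sketch of what I mean, assuming TF >= 2.0 where swish is already built in as tf.nn.swish, so exposing it should be cheap:

```python
import tensorflow as tf

x = tf.linspace(-5.0, 5.0, 11)

# swish(x) = x * sigmoid(x), built into TF 2.x as tf.nn.swish
manual = x * tf.sigmoid(x)
builtin = tf.nn.swish(x)

# Both should agree to floating-point precision
print(tf.reduce_max(tf.abs(manual - builtin)).numpy())  # ~0.0
```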
I also note that it may be possible to define custom activation functions, as described here on StackOverflow.
How would we do that in PerceptiLabs?
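
For context, this is how I understand it works in plain TF2/Keras (a sketch under my assumptions; the activation itself, scaled_tanh, is just a hypothetical example, and whether PerceptiLabs exposes a similar hook is exactly what I am asking):

```python
import tensorflow as tf

# In plain Keras, a custom activation is just a tensor -> tensor function
def scaled_tanh(x):
    # Hypothetical example activation: a scaled tanh
    return 1.7159 * tf.tanh(2.0 / 3.0 * x)

# Option 1: pass the callable directly to a layer
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation=scaled_tanh, input_shape=(10,)),
    tf.keras.layers.Dense(1),
])

# Option 2: register it under a string name so layers can refer to it
tf.keras.utils.get_custom_objects()["scaled_tanh"] = scaled_tanh
layer = tf.keras.layers.Dense(64, activation="scaled_tanh")
```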