Hi Kamil,

Generally speaking, you use sigmoid when you want the output of a given layer to be between 0 and 1. Tanh squashes the output to be between -1 and 1, and ReLU clamps negative inputs to zero, so its output is always >= 0.
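Here is a minimal sketch of the three functions (assuming NumPy), just to make the output ranges concrete:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes any real input into (-1, 1)
    return np.tanh(x)

def relu(x):
    # Zeroes out negatives, passes positives through, so output >= 0
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(x))  # all values between 0 and 1
print(tanh(x))     # all values between -1 and 1
print(relu(x))     # [0.  0.  0.  0.5 2. ]
```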

There is a great lecture series on neural networks at Stanford. Lecture 5 in that series has a section on the different kinds of activation functions and when to use them: http://cs231n.stanford.edu/slides/winter1516_lecture5.pdf. Hopefully that is helpful!
