Generally speaking, you use sigmoid when you want the output of a layer to lie between 0 and 1. Tanh squashes the output to between -1 and 1, and ReLU clamps negative inputs to zero, so its output is always >= 0.
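To make the ranges concrete, here is a quick sketch of the three functions using NumPy (the function names are just for illustration):

```python
import numpy as np

def sigmoid(x):
    # Maps any real input into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Maps any real input into the open interval (-1, 1)
    return np.tanh(x)

def relu(x):
    # Zeroes out negative inputs; output is always >= 0
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # all values in (0, 1)
print(tanh(x))     # all values in (-1, 1)
print(relu(x))     # negatives become 0, positives pass through
```

Evaluating at a few points (e.g. sigmoid(0) = 0.5, tanh(0) = 0, relu(-2) = 0) makes the difference between the ranges easy to see.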
There is a great lecture series on neural networks at Stanford. Lecture 5 in that series has a section on the different kinds of activation functions and when to use them: http://cs231n.stanford.edu/slides/winter1516_lecture5.pdf . Hopefully that is helpful!