Implementation of the sigmoid function for neural network activation.
The sigmoid is a common activation function for neural networks and a
reasonable default if you don't know which one to pick.
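A minimal sketch of the function described above (the exact implementation in this repo may differ; the numerically stable split into two branches is my own addition to avoid overflow for large negative inputs):

```python
import math

def sigmoid(x: float) -> float:
    """Sigmoid activation: 1 / (1 + e^-x), returns a value in (0, 1)."""
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    # For large negative x, exp(-x) would overflow; use the
    # algebraically equivalent form e^x / (1 + e^x) instead.
    z = math.exp(x)
    return z / (1.0 + z)
```

For example, `sigmoid(0.0)` is exactly 0.5, and inputs beyond roughly ±6 already land within about 0.0025 of the 0/1 bounds.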
Note: the sigmoid function only returns values strictly between 0 and 1, and it
takes an extreme input to get close to those bounds. I may be doing something
majorly wrong, but I'd recommend expecting outputs in the range 0.25-0.75.
If you are expecting boolean outputs, I'd treat 0.25 as false and 0.75 as true,
or something like that. If you are expecting floating-point output, I'd map the
range you care about onto 0.25-0.75. This seems to work well for me, but I'm no
expert in neural networks or maths.