I am reading some texts for deep learning, but everyone seems to take it as self-explanatory. :)
My understanding is that the activation function receives Wx + b (equal to, let's say, 1.745) as its input and returns some value. What does this value represent? Is it just a new "weight"? Is it something else?
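For instance, here is a minimal sketch of what I mean (the sigmoid is just one example activation, and the inputs, weights, and bias are made-up numbers chosen so the weighted sum comes out to 1.745):

```python
import math

def neuron(x, w, b):
    """One neuron: weighted sum of inputs, then an activation function."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b  # here z = 1.745
    return 1 / (1 + math.exp(-z))                 # sigmoid squashes z into (0, 1)

# 1.2*0.5 + 0.9*1.0 + 0.245 = 1.745, so the activation gets 1.745 as input
out = neuron([0.5, 1.0], [1.2, 0.9], 0.245)
print(out)  # roughly 0.851 -- this is the value I am asking about
```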
"Neurons that fire together wire together"?
From TensorFlow homepage: "Next, the network is asked to solve a problem, which it attempts to do over and over, each time strengthening the connections that lead to success and diminishing those that lead to failure."
Are you looking for more technical details or an abstract concept?
After reading more material I realized that it is just a signal derived from the input (e.g. maybe a probability), and the next layer knows how to deal with it.
Thanks :)
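A toy sketch of that idea, with made-up weights and sigmoid as an arbitrary example activation: each activation output of the first layer simply becomes an input to the second layer, which treats it like any other number.

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

x = [0.5, 1.0]

# Layer 1: two neurons, each producing one activation value in (0, 1)
W1, b1 = [[1.2, 0.9], [0.4, -0.3]], [0.245, 0.1]
h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b) for row, b in zip(W1, b1)]

# Layer 2: consumes h exactly as layer 1 consumed x -- no special handling
W2, b2 = [0.7, -1.1], 0.05
y = sigmoid(sum(w * hi for w, hi in zip(W2, h)) + b2)
print(h, y)
```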
In neuroscience terms it means that the threshold for an action potential (the name of that process) has been passed (around -55 mV); the neuron then fires and an electrical impulse propagates.