Activation Function
A function used to transform the activation level of a unit (neuron) into an output signal is known as an Activation Function.
In computational networks, the activation function of a node defines the output of that node given an input or set of inputs.
In biologically inspired neural networks, the activation function is usually an abstraction representing the rate of action potential firing in the cell. In its simplest form, this function is binary: either the neuron is firing or it is not. The function looks like φ(vᵢ) = U(vᵢ), where U is the Heaviside step function.
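As a minimal sketch of this binary activation (assuming NumPy; the weights and inputs below are made up purely for illustration):

import numpy as np

def heaviside_activation(v):
    # Binary (Heaviside) activation: fire (1) if the activation
    # level v is non-negative, otherwise stay silent (0).
    return np.where(v >= 0, 1, 0)

weights = np.array([0.5, -0.3, 0.8])  # assumed weights, illustration only
inputs = np.array([1.0, 2.0, 0.5])
v = np.dot(weights, inputs)           # weighted sum (activation level)
print(heaviside_activation(v))        # 1 -> the neuron fires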
The activation function answers a single question: “Some of the input switches are turned on. Shall we turn on the output switch?”
The activation function is often confused with a non-linear transformation like a sigmoid or softmax.
In fact, when you want an activation, you compute the sigmoid first (since it is continuously differentiable) and then send the result to the activation step, which checks whether the sigmoid output is higher than its activation threshold.
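A minimal sketch of that two-step view (assuming NumPy; the threshold of 0.5 is an assumed value, not a universal one):

import numpy as np

def sigmoid(v):
    # Continuously differentiable squashing function.
    return 1.0 / (1.0 + np.exp(-v))

def activate(v, threshold=0.5):
    # Decision step: squash with sigmoid, then compare the
    # result against the activation threshold.
    return 1 if sigmoid(v) > threshold else 0

print(activate(0.3))   # sigmoid(0.3) ~ 0.57 > 0.5 -> 1
print(activate(-1.0))  # sigmoid(-1.0) ~ 0.27 < 0.5 -> 0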
The activation function is a decision-making function that determines the presence of a particular feature.
Zero means the neuron says the feature is not present.
One means the neuron says the feature is present.
The reason you find continuous activation functions is optimisation. The perceptron is intuitive: we train a network by adjusting the weights in small increments and seeing what happens.
The problem with the step function is that small changes in the weights are not reflected in the activation value, because it can only swing between 0 and 1.
That is why sigmoid functions were introduced: they are differentiable. In fact, all modern activation functions are continuous and differentiable.
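A small sketch of why differentiability matters (assuming NumPy): a tiny change in the weighted sum produces a proportional, measurable change in the output, which is exactly what gradient-based training needs.

import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def sigmoid_grad(v):
    # Derivative of the sigmoid: s * (1 - s).
    s = sigmoid(v)
    return s * (1.0 - s)

# Unlike the step function, which stays flat and then jumps,
# the sigmoid responds smoothly to a small nudge eps.
v, eps = 0.3, 0.01
print(sigmoid(v + eps) - sigmoid(v))  # ~ 0.00244
print(eps * sigmoid_grad(v))          # ~ 0.00244, matches to first order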
Thus, the activation function is a decision function with some non-linearity in it, and one needs non-linear decision functions.
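One way to see why non-linearity is needed (a toy sketch; the layer shapes below are assumed for illustration): without a non-linear activation between them, stacked linear layers collapse into a single linear layer, so depth adds no expressive power.

import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))  # assumed first-layer weights
W2 = rng.normal(size=(2, 4))  # assumed second-layer weights
x = rng.normal(size=3)

# W2 @ (W1 @ x) is exactly (W2 @ W1) @ x -- one linear map.
print(np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x))  # True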
Types of Activation functions:
1. Threshold Function: A threshold (hard-limiter) activation function is either a binary type or a bipolar type.
The binary threshold function can be represented as:

f(v) = 1 if v ≥ 0
f(v) = 0 if v < 0

i.e. it outputs 1 if the weighted sum v is non-negative and 0 if the weighted sum is negative.
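A short sketch of both hard-limiter variants (assuming NumPy; treating exactly zero as "on" is a convention, not a rule):

import numpy as np

def binary_threshold(v):
    # Binary hard-limiter: 1 for non-negative input, else 0.
    return np.where(v >= 0, 1, 0)

def bipolar_threshold(v):
    # Bipolar hard-limiter: +1 for non-negative input, else -1.
    return np.where(v >= 0, 1, -1)

v = np.array([-1.5, 0.0, 2.3])
print(binary_threshold(v))   # [0 1 1]
print(bipolar_threshold(v))  # [-1  1  1]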