comp.ai.neural-nets FAQ, Part 2 of 7: Learning
Section - How to avoid overflow in the logistic function?



Top Document: comp.ai.neural-nets FAQ, Part 2 of 7: Learning
Previous Document: Why use activation functions?
Next Document: What is a softmax activation function?

The formula for the logistic activation function is often written as: 

   netoutput = 1 / (1+exp(-netinput));

But if you program it in this simple form, the exponential can overflow:
when netinput is a large negative number, exp(-netinput) exceeds the
largest representable floating-point value. To avoid overflow, you can
do this: 

   if (netinput < -45) netoutput = 0;
   else if (netinput > 45) netoutput = 1;
   else netoutput = 1 / (1+exp(-netinput));

The constant 45 will work for double precision on all machines that I know
of, but there may be some bizarre machines where it will require some
adjustment. (The cutoff is safe because exp(-45) is about 2.9e-20, far
below double-precision machine epsilon, so clipping there changes the
output by a negligible amount.) Other activation functions can be handled
similarly. 

User Contributions:

Aug 22, 2014 @ 9:09 am
I'm writing Matlab code for face recognition. Feature extraction is done with PCA and the classifier is a back-propagation neural network. I have a strange problem: every time I run my code it gives a different output, although I'm using the same BPNN parameters (momentum, learning rate, number of epochs, goal, etc.) and of course the same input. What is the problem? Is there anything in a back-propagation neural network that randomizes the output?



Send corrections/additions to the FAQ Maintainer:
saswss@unx.sas.com (Warren Sarle)





Last Update March 27 2014 @ 02:11 PM