comp.ai.neural-nets FAQ, Part 7 of 7: Hardware
Section - What about Genetic Algorithms?

There are a number of definitions of GA (Genetic Algorithm). A possible one
is

  A GA is an optimization program
  that starts with
  a population of encoded procedures,       (Creation of Life :-> )
  mutates them stochastically,              (Get cancer or so :-> )
  and uses a selection process              (Darwinism)
  to prefer the mutants with high fitness
  and perhaps a recombination process       (Make babies :-> )
  to combine properties of (preferably) the successful mutants.
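
For concreteness, here is one way that loop can be sketched in Python. The
bit-string encoding, the parameter defaults, and the toy "one-max" fitness
function (count the 1 bits) are illustrative assumptions, not taken from any
particular GA package:

   import random

   def genetic_algorithm(fitness, length=20, pop_size=50, generations=100,
                         mutation_rate=0.02):
       # Creation: a random initial population of encoded candidates
       pop = [[random.randint(0, 1) for _ in range(length)]
              for _ in range(pop_size)]
       for _ in range(generations):
           # Selection (Darwinism): the fitter half become parents
           pop.sort(key=fitness, reverse=True)
           parents = pop[:pop_size // 2]
           children = []
           while len(parents) + len(children) < pop_size:
               # Recombination: one-point crossover of two random parents
               a, b = random.sample(parents, 2)
               cut = random.randrange(1, length)
               child = a[:cut] + b[cut:]
               # Mutation: flip each bit with a small probability
               children.append([bit ^ 1 if random.random() < mutation_rate
                                else bit for bit in child])
           pop = parents + children
       return max(pop, key=fitness)

   # Toy "one-max" problem: the fittest string is all ones
   best = genetic_algorithm(fitness=sum)
   print(best, sum(best))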

Genetic algorithms are just a special case of the more general idea of
"evolutionary computation". There is a newsgroup that is dedicated to the
field of evolutionary computation called comp.ai.genetic. It has a detailed
FAQ posting which, for instance, explains the terms "Genetic Algorithm",
"Evolutionary Programming", "Evolution Strategy", "Classifier System", and
"Genetic Programming". That FAQ also contains lots of pointers to relevant
literature, software, and other sources of information. Please see the
comp.ai.genetic FAQ for further details.

For an entertaining introduction to evolutionary training of neural nets,
see: 

   David Fogel (2001), Blondie24: Playing at the Edge of AI, Morgan Kaufmann
   Publishers, ISBN: 1558607838 

There are other books and papers by Fogel and his colleagues listed under 
"Checkers/Draughts" in the "Games, sports, gambling" section above. 

For an extensive review, see: 

   Yao, X. (1999), "Evolving Artificial Neural Networks," Proceedings of the
   IEEE, 87, 1423-1447, http://www.cs.bham.ac.uk/~xin/journal_papers.html 

Here are some other on-line papers about evolutionary training of NNs: 

 o Backprop+GA: http://geneura.ugr.es/~pedro/G-Prop.htm 

 o LVQ+GA: http://geneura.ugr.es/g-lvq/g-lvq.html 

 o Very long chromosomes: 
   ftp://archive.cis.ohio-state.edu/pub/neuroprose/korning.nnga.ps.Z 
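
The common thread in the papers above is that an evolutionary search is used
instead of (or together with) gradient descent to set the network's weights.
As a toy illustration only, and not the method of any of the papers listed,
the following Python/NumPy sketch evolves the nine weights of a 2-2-1 network
on the XOR problem using mutation and selection without crossover; all names,
sizes, and parameter settings here are made-up assumptions:

   import numpy as np

   rng = np.random.default_rng(0)

   # Toy task: XOR; the 9 weights of a 2-2-1 net form one real-valued chromosome
   X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
   y = np.array([0.0, 1.0, 1.0, 0.0])

   def forward(w, X):
       W1, b1 = w[0:4].reshape(2, 2), w[4:6]        # hidden weights and biases
       W2, b2 = w[6:8], w[8]                        # output weights and bias
       h = np.tanh(X @ W1 + b1)
       return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output unit

   def fitness(w):
       return -np.mean((forward(w, X) - y) ** 2)    # negative MSE: higher is better

   pop = rng.normal(scale=1.0, size=(50, 9))        # random initial population
   for generation in range(500):
       scores = np.array([fitness(w) for w in pop])
       parents = pop[np.argsort(scores)[-10:]]      # selection: keep the 10 fittest
       # Mutation: children are copies of random parents plus Gaussian noise
       children = (parents[rng.integers(0, 10, size=40)]
                   + rng.normal(scale=0.3, size=(40, 9)))
       pop = np.vstack([parents, children])

   best = max(pop, key=fitness)
   print(np.round(forward(best, X), 2))             # ideally close to [0 1 1 0]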

More URLs on genetic algorithms and NNs: 

 o Omri Weisman and Ziv Pollack's web page on "Neural Network Using Genetic
   Algorithms" at http://www.cs.bgu.ac.il/~omri/NNUGA/ 

 o Christoph M. Friedrich's web page on Evolutionary algorithms and
   Artificial Neural Networks has a bibliography and links to researchers at
   http://www.tussy.uni-wh.de/~chris/gann/gann.html 

 o Andrew Gray's Hybrid Systems FAQ at the University of Otago at 
   http://divcom.otago.ac.nz:800/COM/INFOSCI/SMRL/people/andrew/publications/faq/hybrid/hybrid.htm

 o Differential Evolution: http://www.icsi.berkeley.edu/~storn/code.html 

For general information on GAs, try the links at 
http://www.shef.ac.uk/~gaipp/galinks.html and http://www.cs.unibo.it/~gaioni


Send corrections/additions to the FAQ Maintainer:
saswss@unx.sas.com (Warren Sarle)




