comp.ai.neural-nets FAQ, Part 2 of 7: Learning
Section - What is ART?



Top Document: comp.ai.neural-nets FAQ, Part 2 of 7: Learning
Previous Document: How to measure importance of inputs?
Next Document: What is PNN?

ART stands for "Adaptive Resonance Theory", invented by Stephen Grossberg in
1976. ART encompasses a wide variety of neural networks based explicitly on
neurophysiology. ART networks are defined algorithmically in terms of
detailed differential equations intended as plausible models of biological
neurons. In practice, ART networks are implemented using analytical
solutions or approximations to these differential equations. 

ART comes in several flavors, both supervised and unsupervised. As discussed
by Moore (1988), the unsupervised ARTs are basically similar to many
iterative clustering algorithms in which each case is processed by: 

1. finding the "nearest" cluster seed (AKA prototype or template) to that
   case 
2. updating that cluster seed to be "closer" to the case 

where "nearest" and "closer" can be defined in hundreds of different ways.
In ART, the framework is modified slightly by introducing the concept of
"resonance" so that each case is processed by: 

1. finding the "nearest" cluster seed that "resonates" with the case 
2. updating that cluster seed to be "closer" to the case 

"Resonance" is just a matter of being within a certain threshold of a second
similarity measure. A crucial feature of ART is that if no seed resonates
with the case, a new cluster is created as in Hartigan's (1975) leader
algorithm. This feature is said to solve the "stability-plasticity dilemma"
(see "Sequential Learning, Catastrophic Interference, and the
Stability-Plasticity Dilemma").
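The two-step loop above, plus the leader-style creation of new clusters, can be
sketched in a few lines of Python. This is only an illustration of the general
scheme (nearest seed, resonance test, update or create), not an implementation
of any particular ART network; the Euclidean distance, the learning rate, and
the function name are arbitrary choices made here.

```python
import math

def art_like_cluster(cases, vigilance=1.0, rate=0.5):
    """One pass over the data: move each case's nearest resonating seed
    toward the case, or create a new seed if nothing resonates."""
    seeds, labels = [], []
    for x in cases:
        # 1. find the "nearest" cluster seed to this case
        best, best_d = None, float("inf")
        for i, s in enumerate(seeds):
            d = math.dist(x, s)
            if d < best_d:
                best, best_d = i, d
        if best is not None and best_d < vigilance:
            # resonance: 2. update that seed to be "closer" to the case
            seeds[best] = tuple(si + rate * (xi - si)
                                for si, xi in zip(seeds[best], x))
            labels.append(best)
        else:
            # no seed resonates: start a new cluster (leader algorithm)
            seeds.append(tuple(x))
            labels.append(len(seeds) - 1)
    return seeds, labels
```

For example, with vigilance 1.0 the cases (0,0) and (0.1,0) fall into one
cluster and (5,5) starts a second one.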

ART has its own jargon. For example, data are called an "arbitrary sequence
of input patterns". The current training case is stored in "short term
memory" and cluster seeds are "long term memory". A cluster is a "maximally
compressed pattern recognition code". The two stages of finding the nearest
seed to the input are performed by an "Attentional Subsystem" and an
"Orienting Subsystem", the latter of which performs "hypothesis testing",
which simply refers to the comparison with the vigilance threshold, not to
hypothesis testing in the statistical sense. "Stable learning" means that
the algorithm converges. So the often-repeated claim that ART algorithms are
"capable of rapid stable learning of recognition codes in response to
arbitrary sequences of input patterns" merely means that ART algorithms are
clustering algorithms that converge; it does not mean, as one might naively
assume, that the clusters are insensitive to the sequence in which the
training patterns are presented--quite the opposite is true. 
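This order sensitivity is easy to demonstrate with a minimal leader-style
loop (a deliberate simplification for illustration, not any specific ART
network; the halfway update and the vigilance value are arbitrary choices):
the same three points, presented forward and in reverse, end up in different
clusters.

```python
import math

def leader_cluster(cases, vigilance=1.0):
    """Leader-style clustering: join the nearest seed within the
    vigilance threshold (moving it halfway toward the case), else
    create a new seed."""
    seeds = []
    for x in cases:
        near = min(seeds, key=lambda s: math.dist(x, s), default=None)
        if near is not None and math.dist(x, near) < vigilance:
            seeds[seeds.index(near)] = tuple(
                (a + b) / 2 for a, b in zip(near, x))
        else:
            seeds.append(tuple(x))
    return seeds

data = [(0.0,), (0.9,), (1.7,)]
forward = leader_cluster(data)          # 0.9 joins the cluster seeded at 0.0
backward = leader_cluster(data[::-1])   # 0.9 joins the cluster seeded at 1.7
```

Both passes end with two clusters, but the middle point 0.9 lands in a
different cluster each time, so the final seeds differ.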

There are various supervised ART algorithms that are named with the suffix
"MAP", as in Fuzzy ARTMAP. These algorithms cluster both the inputs and
targets and associate the two sets of clusters. The effect is somewhat
similar to counterpropagation. The main disadvantage of most ARTMAP
algorithms is that they have no mechanism to avoid overfitting and hence
should not be used with noisy data (Williamson, 1995). 
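The flavor of this input/target association can be conveyed with a crude
sketch: inputs are clustered leader-style, each cluster stores a target label,
and a case may only join a resonating cluster whose label matches its own
target. This is a hypothetical simplification for illustration only, not Fuzzy
ARTMAP (which also clusters the targets and uses "match tracking" to adjust
vigilance), and it inherits the overfitting problem noted above.

```python
import math

def artmap_like(cases, targets, vigilance=1.0):
    """Cluster inputs leader-style, but only let a case join a
    resonating cluster whose stored label matches the case's target;
    otherwise create a new labeled cluster."""
    seeds, labels = [], []            # one label per cluster seed
    for x, t in zip(cases, targets):
        # candidate seeds must resonate AND be label-consistent
        best, best_d = None, vigilance
        for i, s in enumerate(seeds):
            d = math.dist(x, s)
            if d < best_d and labels[i] == t:
                best, best_d = i, d
        if best is not None:
            # move the winning seed halfway toward the case
            seeds[best] = tuple((a + b) / 2 for a, b in zip(seeds[best], x))
        else:
            seeds.append(tuple(x))
            labels.append(t)
    return seeds, labels

def predict(x, seeds, labels):
    # classify by the label of the nearest stored cluster seed
    return labels[min(range(len(seeds)),
                      key=lambda i: math.dist(x, seeds[i]))]
```

For example, training on two well-separated labeled groups yields one cluster
per label, and new points are classified by their nearest cluster seed.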

For more information, see the ART FAQ at http://www.wi.leidenuniv.nl/art/
and the "ART Headquarters" at Boston University, http://cns-web.bu.edu/. For
a statistical view of ART, see Sarle (1995). 

For C software, see the ART Gallery at 
http://cns-web.bu.edu/pub/laliden/WWW/nnet.frame.html 

References: 

   Carpenter, G.A., Grossberg, S. (1996), "Learning, Categorization, Rule
   Formation, and Prediction by Fuzzy Neural Networks," in Chen, C.H., ed.
   (1996) Fuzzy Logic and Neural Network Handbook, NY: McGraw-Hill, pp.
   1.3-1.45. 

   Hartigan, J.A. (1975), Clustering Algorithms, NY: Wiley. 

   Kasuba, T. (1993), "Simplified Fuzzy ARTMAP," AI Expert, 8, 18-25. 

   Moore, B. (1988), "ART 1 and Pattern Clustering," in Touretzky, D.,
   Hinton, G. and Sejnowski, T., eds., Proceedings of the 1988
   Connectionist Models Summer School, 174-185, San Mateo, CA: Morgan
   Kaufmann. 

   Sarle, W.S. (1995), "Why Statisticians Should Not FART," 
   ftp://ftp.sas.com/pub/neural/fart.txt 

   Williamson, J.R. (1995), "Gaussian ARTMAP: A Neural Network for Fast
   Incremental Learning of Noisy Multidimensional Maps," Technical Report
   CAS/CNS-95-003, Boston University, Center of Adaptive Systems and
   Department of Cognitive and Neural Systems. 

Send corrections/additions to the FAQ Maintainer:
saswss@unx.sas.com (Warren Sarle)
Last Update March 27 2014 @ 02:11 PM