comp.ai.neural-nets FAQ, Part 4 of 7: Books, data, etc.


Copyright 1997, 1998, 1999, 2000, 2001, 2002 by Warren S. Sarle, Cary, NC,
USA. Reviews provided by other authors as cited below are copyrighted by
those authors, who by submitting the reviews for the FAQ give permission for
the review to be reproduced as part of the FAQ in any of the ways specified
in part 1 of the FAQ. 

This is part 4 (of 7) of a monthly posting to the Usenet newsgroup
comp.ai.neural-nets. See part 1 of this posting for full information about
what the FAQ covers and how it may be distributed.

========== Questions ========== 
********************************

Part 1: Introduction
Part 2: Learning
Part 3: Generalization
Part 4: Books, data, etc.

   Books and articles about Neural Networks?
      The Best
         The best of the best
         The best popular introduction to NNs
         The best introductory book for business executives
         The best elementary textbooks
         The best books on using and programming NNs
         The best intermediate textbooks on NNs
         The best advanced textbook covering NNs
         The best book on neurofuzzy systems
         The best comparison of NNs with other classification methods
      Other notable books
         Introductory
         Bayesian learning
         Biological learning and neurophysiology
         Collections
         Combining networks
         Connectionism
         Feedforward networks
         Fuzzy logic and neurofuzzy systems
         General (including SVMs and Fuzzy Logic)
         History
         Knowledge, rules, and expert systems
         Learning theory
         Object oriented programming
         On-line and incremental learning
         Optimization
         Pulsed/Spiking networks
         Recurrent
         Reinforcement learning
         Speech recognition
         Statistics
         Time-series forecasting
         Unsupervised learning
      Books for the Beginner
      Not-quite-so-introductory Literature
      Books with Source Code (C, C++)
      The Worst
   Journals and magazines about Neural Networks?
   Conferences and Workshops on Neural Networks?
   Neural Network Associations?
   Mailing lists, BBS, CD-ROM?
   How to benchmark learning methods?
   Databases for experimentation with NNs?
      UCI machine learning database
      UCI KDD Archive
      The neural-bench Benchmark collection
      Proben1
      Delve: Data for Evaluating Learning in Valid Experiments
      Bilkent University Function Approximation Repository
      NIST special databases of the National Institute of Standards and
      Technology
      CEDAR CD-ROM 1: Database of Handwritten Cities, States, ZIP Codes,
      Digits, and Alphabetic Characters
      AI-CD-ROM
      Time series
      Financial data
      USENIX Faces
      Linguistic Data Consortium
      Otago Speech Corpus
      Astronomical Time Series
      Miscellaneous Images
      StatLib

Part 5: Free software
Part 6: Commercial software
Part 7: Hardware and miscellaneous

User Contributions:

1
Majid Maqbool
Sep 27, 2024 @ 5:05 am
PDP++ is a neural-network simulation system written in C++, developed as an advanced version of the original PDP software from McClelland and Rumelhart's "Explorations in Parallel Distributed Processing Handbook" (1987). The software is designed for both novice users and researchers, providing flexibility and power in cognitive neuroscience studies. Featured in Randall C. O'Reilly and Yuko Munakata's "Computational Explorations in Cognitive Neuroscience" (2000), PDP++ supports a wide range of algorithms. These include feedforward and recurrent error backpropagation, with continuous and real-time models such as Almeida-Pineda. It also incorporates constraint satisfaction algorithms like Boltzmann Machines, Hopfield networks, and mean-field networks, as well as self-organizing learning algorithms, including Self-organizing Maps (SOM) and Hebbian learning. Additionally, it supports mixtures-of-experts models and the Leabra algorithm, which combines error-driven and Hebbian learning with k-Winners-Take-All inhibitory competition. PDP++ is a comprehensive tool for exploring neural network models in cognitive neuroscience.
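
For illustration only, here is a short, self-contained C++ program (not taken
from PDP++ or from the FAQ) sketching the plain Hebbian weight update that the
comment above mentions: delta w[i][j] = lrate * pre[i] * post[j]. The variable
names and numeric values are invented for the example.

   // Illustrative sketch only -- not PDP++ source code.
   // Plain Hebbian rule: connections between co-active units are strengthened.
   #include <cstddef>
   #include <cstdio>
   #include <vector>

   int main() {
       const double lrate = 0.1;                    // learning rate (assumed value)
       std::vector<double> pre  = {1.0, 0.0, 1.0};  // presynaptic (sending) activations
       std::vector<double> post = {0.5, 1.0};       // postsynaptic (receiving) activations

       // Weight matrix from each sending unit to each receiving unit, starting at zero.
       std::vector<std::vector<double>> w(pre.size(),
                                          std::vector<double>(post.size(), 0.0));

       // Hebbian update: delta w[i][j] = lrate * pre[i] * post[j]
       for (std::size_t i = 0; i < pre.size(); ++i)
           for (std::size_t j = 0; j < post.size(); ++j)
               w[i][j] += lrate * pre[i] * post[j];

       // Print the resulting weights.
       for (std::size_t i = 0; i < pre.size(); ++i)
           for (std::size_t j = 0; j < post.size(); ++j)
               std::printf("w[%zu][%zu] = %.3f\n", i, j, w[i][j]);
       return 0;
   }

Error-driven rules such as backpropagation, also listed above, instead adjust
each weight in proportion to its effect on an output error rather than to the
raw correlation of activities.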


Send corrections/additions to the FAQ Maintainer:
saswss@unx.sas.com (Warren Sarle)




