1. International Neural Network Society (INNS)
++++++++++++++++++++++++++++++++++++++++++++++

   INNS membership includes subscription to "Neural Networks", the
   official journal of the society. Membership is $55 for non-students
   and $45 for students per year. Address: INNS Membership, P.O. Box
   491166, Ft. Washington, MD 20749.

2. International Student Society for Neural Networks (ISSNNets)
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

   Membership is $5 per year. Address: ISSNNet, Inc., P.O. Box 15661,
   Boston, MA 02215, USA.

3. Women In Neural Network Research and Technology (WINNERS)
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

   Address: WINNERS, c/o Judith Dayhoff, 11141 Georgia Ave., Suite 206,
   Wheaton, MD 20902. Phone: 301-933-9000.

4. European Neural Network Society (ENNS)
+++++++++++++++++++++++++++++++++++++++++

   ENNS membership includes subscription to "Neural Networks", the
   official journal of the society. Membership is currently (1994)
   50 UK pounds (35 UK pounds for students) per year. Address: ENNS
   Membership, Centre for Neural Networks, King's College London,
   Strand, London WC2R 2LS, United Kingdom.

5. Japanese Neural Network Society (JNNS)
+++++++++++++++++++++++++++++++++++++++++

   Address: Japanese Neural Network Society; Department of Engineering,
   Tamagawa University; 6-1-1, Tamagawa Gakuen, Machida City, Tokyo;
   194 JAPAN. Phone: +81 427 28 3457, Fax: +81 427 28 3597.

6. Association des Connexionnistes en THese (ACTH)
++++++++++++++++++++++++++++++++++++++++++++++++++

   The French student association for neural networks. Membership is
   100 FF per year. Activities: newsletter, annual conference, list of
   members, electronic forum; journal "Valgo" (ISSN 1243-4825).
   WWW page: http://www.supelec-rennes.fr/acth/welcome.html
   Contact: acth@loria.fr

7. Neurosciences et Sciences de l'Ingenieur (NSI)
+++++++++++++++++++++++++++++++++++++++++++++++++

   Biology and computer science. Activity: annual conference.
   Address: NSI - TIRF / INPG, 46 avenue Felix Viallet,
   38031 Grenoble Cedex, FRANCE.

8. IEEE Neural Networks Council
+++++++++++++++++++++++++++++++

   Web page at http://www.ieee.org/nnc

9. SNN (Foundation for Neural Networks)
+++++++++++++++++++++++++++++++++++++++

   The Foundation for Neural Networks (SNN) is a university-based
   non-profit organization that stimulates basic and applied research
   on neural networks in the Netherlands. Every year SNN organizes a
   symposium on neural networks. See http://www.mbfys.kun.nl/SNN/.

You can find nice lists of NN societies in the WWW at
http://www.emsl.pnl.gov:2080/proj/neuron/neural/societies.html and at
http://www.ieee.org:80/nnc/research/othernnsoc.html.

PDP++ is a neural-network simulation system written in C++, developed as an advanced version of the original PDP software from McClelland and Rumelhart's "Explorations in Parallel Distributed Processing Handbook" (1987). The software is designed for both novice users and researchers, providing flexibility and power for cognitive neuroscience studies, and is featured in Randall C. O'Reilly and Yuko Munakata's "Computational Explorations in Cognitive Neuroscience" (2000). PDP++ supports a wide range of algorithms: feedforward and recurrent error backpropagation, including continuous, real-time variants such as the Almeida-Pineda algorithm; constraint-satisfaction algorithms such as Boltzmann machines, Hopfield networks, and mean-field networks; self-organizing learning algorithms, including Self-Organizing Maps (SOM) and Hebbian learning; mixtures-of-experts models; and the Leabra algorithm, which combines error-driven and Hebbian learning with k-Winners-Take-All (kWTA) inhibitory competition.
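
For readers unfamiliar with the Leabra-style combination mentioned above, the
following toy Python sketch illustrates how a simple Hebbian weight update can
be paired with k-Winners-Take-All inhibitory competition. It is not PDP++ code
and does not include Leabra's error-driven component; all function names and
parameter values are hypothetical choices for illustration only.

   # Toy sketch (assumed names/values, not PDP++): Hebbian learning + kWTA.
   import numpy as np

   def kwta(net_input, k):
       """Keep the k most strongly driven units active (1), silence the rest (0)."""
       act = np.zeros_like(net_input)
       winners = np.argsort(net_input)[-k:]   # indices of the k largest net inputs
       act[winners] = 1.0
       return act

   def hebbian_update(weights, pre, post, lrate=0.01):
       """Simple Hebbian rule: strengthen weights between co-active units."""
       return weights + lrate * np.outer(post, pre)

   rng = np.random.default_rng(0)
   weights = rng.normal(scale=0.1, size=(10, 20))   # 20 inputs -> 10 hidden units
   x = rng.random(20)                               # one input pattern

   net = weights @ x                  # net input to each hidden unit
   y = kwta(net, k=3)                 # inhibitory competition: 3 winners stay active
   weights = hebbian_update(weights, x, y)          # Hebbian learning on the winners

In this sketch the kWTA step enforces sparse activity, and only the winning
units have their incoming weights strengthened, which is the general idea
behind combining inhibitory competition with Hebbian learning.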