Abstract
What makes people smarter than computers? These volumes by a pioneering neurocomputing group suggest that the answer lies in the massively parallel architecture of the human mind. They describe a new theory of cognition called connectionism, which challenges the idea of symbolic computation that has traditionally been at the center of debate in theoretical discussions about the mind. The authors' theory assumes that the mind is composed of a great number of elementary units connected in a neural network. Mental processes are interactions among these units, which excite and inhibit one another in parallel rather than through sequential operations. In this framework, knowledge can no longer be thought of as stored in localized structures; instead, it consists of the connections between pairs of units distributed throughout the network. Volume 1 lays the foundations of this theory of parallel distributed processing, while Volume 2 applies it to specific issues in cognitive science and neuroscience, with chapters describing models of aspects of perception, memory, language, and thought.
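As a rough illustration of the computational picture the abstract describes, the sketch below shows many simple units whose excitatory and inhibitory connections update in parallel, with knowledge carried by the weights rather than by any single unit. This is a minimal sketch in Python/NumPy; the unit count, random weights, and logistic squashing rule are illustrative assumptions, not any specific model from the volumes.

```python
import numpy as np

# Illustrative sketch of a parallel distributed processing update.
# (The unit count, weight distribution, and logistic squashing
# function are assumptions for demonstration, not the book's models.)

rng = np.random.default_rng(0)

n_units = 8                                   # elementary processing units
W = rng.normal(0.0, 0.5, (n_units, n_units))  # signed connection weights:
np.fill_diagonal(W, 0.0)                      # positive = excitatory, negative = inhibitory
a = rng.uniform(0.0, 1.0, n_units)            # current activation of each unit

def step(a, W):
    """One synchronous update: every unit integrates the weighted
    activity of all the others in parallel, then squashes the result."""
    net = W @ a                               # net input: excitation minus inhibition
    return 1.0 / (1.0 + np.exp(-net))         # logistic squashing into (0, 1)

for _ in range(20):                           # iterate toward a stable pattern
    a = step(a, W)

print(a.round(3))                             # the settled activation pattern; the
                                              # "knowledge" lives in W, not in any unit
```

Iterating the update settles the network into a stable activation pattern; changing W, not any individual unit, changes what the network "knows", which is the sense in which knowledge is distributed across the connections.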
Publication Info
- Year: 1986
- Type: Book
- Citations: 15,204
- Access: Closed
Identifiers
- DOI: 10.7551/mitpress/5236.001.0001