This book tries to answer the question posed by Minsky at the beginning of The Society of Mind: "to explain the mind, we have to show how minds are built from mindless stuff, from parts that are much smaller and simpler than anything we'd consider smart." The author argues that cognition should not be rooted in innate rules and primitives but rather grounded in human memory. More specifically, he suggests viewing linguistic comprehension as a time-constrained process: a race to build an interpretation in short-term memory.
After reviewing existing psychological and computational approaches to text understanding and concluding that they generally rely on self-validating primitives, the author abandons this objectivist, normative approach to meaning and develops a set of requirements for a grounded cognitive architecture. He then explains how such an architecture must avoid all epistemological commitments, be tractable with respect to both space and time, and, most importantly, account for the diachronic and non-deterministic nature of comprehension: a text may or may not lead to an interpretation for a particular reader, and a single reader may associate it with several interpretations over time.
Throughout the remainder of the book, the author demonstrates that rules for all major facets of comprehension -- syntax, reference resolution, quantification, lexical and structural disambiguation, inference and subject matter -- can be expressed in terms of the simple mechanistic computing elements of a massively parallel network modeling memory. These elements, called knowledge units, work in a limited amount of time and have the ability not only to recognize but also to build the structures that make up an interpretation.
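As a rough illustration of what such recognizer/builder elements might look like, the Python sketch below is a minimal rendering of the general idea, not the author's implementation: the names (KnowledgeUnit, race, step, deadline) are hypothetical, and the scheme simply lets units that match their cues within a time budget post new structures into a shared working memory.

```python
from dataclasses import dataclass
from typing import FrozenSet, Optional, Set

@dataclass
class KnowledgeUnit:
    """Hypothetical recognizer/builder node (illustrative only)."""
    name: str
    cues: FrozenSet[str]   # features the unit watches for in working memory
    builds: str            # label of the structure it posts when it fires
    deadline: int          # cycles the unit may keep trying before it drops out
    age: int = 0

    def step(self, working_memory: Set[str]) -> Optional[str]:
        """One parallel cycle: fire if all cues are present before the deadline."""
        self.age += 1
        if self.age > self.deadline:
            return None                      # too slow: this reading loses the race
        if self.cues <= working_memory:
            return self.builds               # cues recognized: build the structure
        return None

def race(units, working_memory: Set[str], max_cycles: int = 5) -> Set[str]:
    """Run all units in lock-step cycles; the interpretation is whatever
    structures get built before time runs out."""
    interpretation: Set[str] = set()
    for _ in range(max_cycles):
        built = {u.step(working_memory) for u in units} - {None}
        interpretation |= built
        working_memory |= built              # new structures become cues for later units
    return interpretation

# Example: two readings of "bank" race; the one whose cues arrive in time wins.
units = [
    KnowledgeUnit("bank-financial", frozenset({"bank", "money"}), "BANK/institution", deadline=3),
    KnowledgeUnit("bank-river", frozenset({"bank", "water"}), "BANK/riverside", deadline=3),
]
print(race(units, {"bank", "money"}))   # {'BANK/institution'}
```

Under this toy scheme, whether an interpretation is built at all, and which one, depends on what reaches working memory before the deadline, mirroring the non-deterministic, time-constrained picture of comprehension the book argues for.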
Designed as a main text for graduate courses, this volume is essential reading in cognitive science, artificial intelligence, memory modeling, text understanding, computational linguistics, and natural language understanding. Other areas of application are schema matching, hermeneutics, local connectionism, and text linguistics. With its extensive bibliography, the book is also valuable as supplemental reading for introductory undergraduate courses in cognitive science and computational linguistics.
Time-constrained Memory: A Reader-based Approach To Text Comprehension
Product Details
| ISBN-13: | 9781317780106 |
| --- | --- |
| Publisher: | Taylor & Francis |
| Publication date: | 02/04/2014 |
| Sold by: | Barnes & Noble |
| Format: | eBook |
| Pages: | 432 |
| File size: | 19 MB |