Holographic Reduced Representation

Distributed Representation for Cognitive Structures

Tony A. Plate

While neuroscientists have had success in identifying brain regions and in analyzing individual neurons, how neurons combine to encode information at the intermediate scale is still poorly understood. This book proposes a method of representing information in a computer that is suited to modeling how the brain processes information. Holographic Reduced Representations (HRRs) are introduced here to model how the brain distributes each piece of information among thousands of neurons. It had previously been thought that the grammatical structure of a language could not be encoded practically in a distributed representation, but HRRs overcome the problems of earlier proposals. This work thus has implications for psychology, neuroscience, linguistics, computer science, and engineering.
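
The core operations behind HRRs, covered in chapter 3 of the contents below, are circular convolution, which binds two vectors into a single vector of the same dimensionality, and circular correlation, which approximately inverts the binding. As a rough sketch only (this is not code from the book; the names bind and unbind and the example roles and fillers are illustrative assumptions), the following Python/NumPy fragment encodes a simple role-filler structure and decodes one filler:

  import numpy as np

  rng = np.random.default_rng(0)
  n = 1024  # HRRs use high-dimensional vectors with i.i.d. N(0, 1/n) elements

  def bind(a, b):    # circular convolution (computed via FFT) binds two vectors
      return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

  def unbind(a, c):  # circular correlation approximately inverts the binding
      return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(c)))

  eat, agent, obj, mark, fish = rng.normal(0, 1 / np.sqrt(n), (5, n))

  # Encode "Mark eats the fish" as a superposition of role-filler bindings
  s = eat + bind(agent, mark) + bind(obj, fish)

  # Decoding the agent role yields a noisy vector closest to `mark`
  noisy = unbind(agent, s)
  for name, v in [("mark", mark), ("fish", fish), ("eat", eat)]:
      print(name, np.dot(noisy, v))

Because decoding is approximate, the recovered vector must be matched against the set of known vectors by dot-product similarity, which is why the contents devote a section to clean-up memories (3.3).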

Tony A. Plate is a research scientist at Black Mesa Capital in Santa Fe, New Mexico.

Contents

  • Preface
  • 1 Introduction
    • 1.1 Representations, descriptions, and implementations
    • 1.2 Connectionist issues
    • 1.3 Representational issues
    • 1.4 Connectionist representations
    • 1.5 Reduced Descriptions
    • 1.6 Book outline
  • 2 Review of connectionist and distributed memory models
    • 2.1 Localist connectionist models
    • 2.2 Distributed connectionist models for simple structure
    • 2.3 Connectionist models that learn to process language
    • 2.4 Distributed connectionist models for explicit representation of structure
    • 2.5 Learning distributed representations
  • 3 Holographic Reduced Representation
    • 3.1 Circular convolution and correlation
    • 3.2 Superposition Memories
    • 3.3 The need for clean-up memories
    • 3.4 Representing complex structure
    • 3.5 Constraints on the vectors and the representation of features and tokens
    • 3.6 Mathematical Properties
    • 3.7 Faster comparison
    • 3.8 The capacity of convolution memories and HRRs
    • 3.9 Convolution-based memories versus matrix-based memories
    • 3.10 An example of encoding and decoding HRRs
    • 3.11 Discussion
    • 3.12 Summary
  • 4 HRRs in the frequency domain
    • 4.1 Circular Vectors
    • 4.2 HRR operations with circular vectors
    • 4.3 Relationship to binary spatter codes
    • 4.4 Comparison with standard system
  • 5 Using convolution-based storage in systems that learn
    • 5.1 Trajectory-association
    • 5.2 Encoding and decoding systems for trajectory-associated sequences
    • 5.3 Trajectory-association decoding in a recurrent network
    • 5.4 Simple Recurrent Networks
    • 5.5 Training and productive capacity results
    • 5.6 Trajectories in continuous space
    • 5.7 Hierarchical HRNs
    • 5.8 Discussion
    • 5.9 Conclusion
  • 6 Estimating analogical similarity
    • 6.1 An experiment with shape configurations
    • 6.2 Models of analogy processing
    • 6.3 Analogies between hierarchical structures
    • 6.4 Why HRR similarity reflects structural similarity
    • 6.5 Contextualized HRRs
    • 6.6 Interpretations of an analogy
    • 6.7 Discussion
    • 6.8 Conclusion
  • 7 Discussion
    • 7.1 Holistic processing
    • 7.2 Chunks and the organization of long-term memory
    • 7.3 Convolution, tensor products, associative operators, conjunctive coding, and structure
    • 7.4 Implementation in neural tissue
    • 7.5 Weaknesses of HRRs
    • 7.6 Conclusion
  • Appendix A Means and variances of similarities between bindings
  • Appendix B The capacity of a superposition memory
    • B.1 Scaling properties of a simple superposition memory
    • B.2 A superposition memory with similarity among the vectors
    • B.3 Limitations of superposition
  • Appendix C A lower bound for the capacity of superposition memories
  • Appendix D The capacity of convolution-based associative memories
    • D.1 A memory for paired-associates with no similarity among vectors
    • D.2 A memory for paired-associates with some similarity among vectors
    • D.3 Comments on the variable-binding memory with similarity
  • Appendix E A lower bound for the capacity of convolution memories
  • Appendix F Means and variances of a signal
  • Appendix G The effect of normalization on dot-products
    • G.1 Means and variances of dot-products of vectors with varying similarity
    • G.2 Means and variances of dot-products of convolution expressions
  • Appendix H HRRs with circular vectors
    • H.1 Means and variances of dot-products of vectors with varying similarity
    • H.2 Means and variances of dot-products for similarity and decoding
    • H.3 Results on analogical similarity estimation
  • Appendix I Arithmetic tables: an example of HRRs with many items in memory
    • I.1 The objects and relations
    • I.2 Queries to the memory
    • I.3 Using fast estimates of the dot-product
  • References
  • Subject Index
  • Author Index

4/1/2003

ISBN (Paperback): 1575864304 (9781575864303)
ISBN (Cloth): 1575864290 (9781575864297)
ISBN (Electronic): 157586939X (9781575869391)

Subject: Cognitive science


Distributed by the University of Chicago Press

pubs@csli.stanford.edu