4.6 Article

On separating long- and short-term memories in hyperdimensional computing

Related references

Note: only a subset of the references is listed here; download the original article for complete reference information.
Article Computer Science, Artificial Intelligence

Variable Binding for Sparse Distributed Representations: Theory and Applications

Edward Paxon Frady et al.

Summary: Variable binding is essential for symbolic reasoning and cognition, yet implementing it in connectionist models has long been a challenge. Vector symbolic architectures (VSAs) offer a natural solution: dimensionality-preserving binding lets complex hierarchical structures be represented without growing the vector dimension. This study explores symbolic reasoning with sparse distributed representations and proposes dimensionality-preserving binding methods for them. Experiments demonstrate the effectiveness of block-local circular convolution binding with sparse block-codes, which achieves performance comparable to that of classical VSAs.

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2023)
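As context for the binding operation discussed above, here is a minimal sketch of classical circular-convolution binding and unbinding (the dense, HRR-style operation, not the block-local sparse variant the paper proposes); dimension and random seed are illustrative choices:

```python
import math
import random

def circular_convolution(a, b):
    """Bind two vectors: c[k] = sum_i a[i] * b[(k - i) mod n]."""
    n = len(a)
    return [sum(a[i] * b[(k - i) % n] for i in range(n)) for k in range(n)]

def circular_correlation(c, a):
    """Approximate unbinding: recover a noisy copy of b from c = a (*) b."""
    n = len(c)
    return [sum(a[j] * c[(k + j) % n] for j in range(n)) for k in range(n)]

def cosine(u, v):
    dot = sum(x * y for x, y in zip(u, v))
    return dot / (math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(y * y for y in v)))

random.seed(0)
n = 512
a = [random.gauss(0, 1 / math.sqrt(n)) for _ in range(n)]
b = [random.gauss(0, 1 / math.sqrt(n)) for _ in range(n)]
c = circular_convolution(a, b)   # bound vector has the same dimension as a and b
b_hat = circular_correlation(c, a)
# The recovered vector resembles b far more than it resembles the unrelated a.
print(cosine(b_hat, b), cosine(b_hat, a))
```

Note that the bound vector `c` keeps the dimension `n` of its inputs, which is exactly the dimensionality-preserving property the summary refers to.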

Article Computer Science, Artificial Intelligence

A comparison of vector symbolic architectures

Kenny Schlegel et al.

Summary: Vector symbolic architectures combine a high-dimensional vector space with a set of operators for symbolic computation. The paper surveys the available implementations, discusses how their vector spaces and operators differ, and experimentally compares their performance on tasks such as visual place recognition and language recognition. The goal is to support the development and selection of an appropriate VSA for a given task.

ARTIFICIAL INTELLIGENCE REVIEW (2022)

Article Computer Science, Artificial Intelligence

Cellular Automata Can Reduce Memory Requirements of Collective-State Computing

Denis Kleyko et al.

Summary: This article optimizes collective-state computing models that rely on random representations by exploiting a space-time tradeoff: the CA90 elementary cellular automaton expands short random seeds into long pseudo-random vectors on demand, trading computation for memory. The randomization behavior of CA90 and its use for expanding representations are studied, showing performance similar to that of traditional collective-state models that store randomly generated patterns in memory.

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2022)
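The rule-90 expansion behind this space-time tradeoff is simple to sketch: each cell's next state is the XOR of its two neighbors, and successive states of a short seed are concatenated into a long vector. The seed length, step count, and concatenation scheme below are illustrative, not the paper's exact configuration:

```python
import random

def ca90_step(state):
    """One step of elementary cellular automaton rule 90 with cyclic
    boundary: each cell becomes the XOR of its left and right neighbors."""
    n = len(state)
    return [state[(i - 1) % n] ^ state[(i + 1) % n] for i in range(n)]

def expand(seed, steps):
    """Rematerialize a long pseudo-random binary vector by concatenating
    successive CA90 iterations of a short seed (only the seed is stored)."""
    out, state = list(seed), list(seed)
    for _ in range(steps):
        state = ca90_step(state)
        out.extend(state)
    return out

random.seed(1)
seed = [random.randint(0, 1) for _ in range(64)]
hv = expand(seed, 7)   # 64-bit stored seed -> 512-bit hypervector
print(len(hv))         # 512
```

Because the expansion is deterministic, only the short seed needs to be kept in memory; the long vector can be regenerated whenever it is needed.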

Proceedings Paper Computer Science, Artificial Intelligence

Computing on Functions Using Randomized Vector Representations (in brief)

E. Paxon Frady et al.

Summary: This paper generalizes vector space models for symbolic processing to function spaces, introducing a new encoding and computing framework called Vector Function Architecture (VFA). The study shows that fractional power encoding (FPE) yields VFAs for computing with band-limited functions. Applications of VFA models to image recognition, density estimation, and nonlinear regression are demonstrated, highlighting the potential of VFAs for artificial intelligence.

PROCEEDINGS OF THE 2022 ANNUAL NEURO-INSPIRED COMPUTATIONAL ELEMENTS CONFERENCE (NICE 2022) (2022)
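A minimal sketch of fractional power encoding with FHRR-style random unit phasors: a scalar x is encoded by raising each phasor elementwise to the power x, and the inner-product similarity between encodings then depends only on the difference of the encoded scalars. The dimension and uniform phase distribution are illustrative assumptions:

```python
import cmath
import math
import random

random.seed(0)
n = 1024
# Base vector: random unit phasors with phases uniform in (-pi, pi].
phases = [random.uniform(-math.pi, math.pi) for _ in range(n)]

def fpe(x):
    """Fractional power encoding of scalar x: each phasor raised to power x."""
    return [cmath.exp(1j * p * x) for p in phases]

def sim(u, v):
    """Normalized inner product; its real part acts as a shift-invariant kernel."""
    return sum((a * b.conjugate()).real for a, b in zip(u, v)) / len(u)

print(sim(fpe(1.0), fpe(1.0)))   # exactly 1: identical encodings
# Similarity decays smoothly as the encoded scalars move apart.
print(sim(fpe(1.0), fpe(1.1)), sim(fpe(1.0), fpe(2.0)))
```

With uniform phases the expected kernel is a sinc function of the scalar difference, which is what makes FPE suitable for representing band-limited functions.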

Article Computer Science, Artificial Intelligence

Hyperseed: Unsupervised Learning With Vector Symbolic Architectures

Evgeny Osipov et al.

Summary: Motivated by recent innovations in biologically inspired neuromorphic hardware, this article introduces Hyperseed, a novel unsupervised machine learning algorithm that draws on the principles of vector symbolic architectures (VSAs) for fast learning of topological feature maps from unlabeled data. The algorithm is expressed within the Fourier holographic reduced representations (FHRR) model and is empirically evaluated on synthetic datasets and benchmark use cases, demonstrating its capabilities and its suitability for neuromorphic hardware.

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2022)

Article Engineering, Electrical & Electronic

A 5 μW Standard Cell Memory-Based Configurable Hyperdimensional Computing Accelerator for Always-on Smart Sensing

Manuel Eggimann et al.

Summary: Hyperdimensional computing (HDC) is a brain-inspired computing paradigm based on high-dimensional holistic vector representations, and it is gaining attention for embedded smart sensing. A programmable, all-digital CMOS implementation of a fully autonomous HDC accelerator is proposed for energy-constrained sensor nodes, achieving extremely low power consumption and improved energy efficiency over existing architectures. The architecture includes novel hardware-friendly embodiments of common HDC algorithmic primitives, yielding a technology-scaled area reduction, and its fully configurable datapath provides flexibility.

IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS (2021)
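One of the common HDC primitives mentioned above, binarized bundling, amounts to a componentwise majority vote over binary hypervectors. A software sketch (not the paper's hardware embodiment; vector dimension and item count are illustrative):

```python
import random

def bundle(hvs):
    """Binarized bundling: componentwise majority vote over binary
    hypervectors (an odd count avoids ties)."""
    k = len(hvs)
    return [1 if 2 * sum(col) > k else 0 for col in zip(*hvs)]

def hamming_sim(u, v):
    """Fraction of matching components; ~0.5 for unrelated random vectors."""
    return sum(a == b for a, b in zip(u, v)) / len(u)

random.seed(0)
n = 1000
items = [[random.randint(0, 1) for _ in range(n)] for _ in range(5)]
memory = bundle(items)
probe = [random.randint(0, 1) for _ in range(n)]
# Each bundled item stays measurably closer to the bundle than a random probe.
print(hamming_sim(memory, items[0]), hamming_sim(memory, probe))
```

The appeal for hardware is that majority voting over bits needs only counters and a comparator per component, with no multipliers.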

Article Computer Science, Artificial Intelligence

A Theoretical Perspective on Hyperdimensional Computing

Anthony Thomas et al.

JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH (2021)

Proceedings Paper Computer Science, Artificial Intelligence

Generalized Learning Vector Quantization for Classification in Randomized Neural Networks and Hyperdimensional Computing

Cameron Diao et al.

Summary: Random vector functional link (RVFL) networks are popular for edge-device applications due to their simplicity and training efficiency. The authors propose a modified RVFL network with a generalized learning vector quantization (GLVQ) classifier that achieves state-of-the-art accuracy while significantly reducing computational cost, even with a limited number of training iterations.

2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) (2021)

Article Computer Science, Hardware & Architecture

Hardware Optimizations of Dense Binary Hyperdimensional Computing: Rematerialization of Hypervectors, Binarized Bundling, and Combinational Associative Memory

Manuel Schmuck et al.

ACM JOURNAL ON EMERGING TECHNOLOGIES IN COMPUTING SYSTEMS (2019)

Article Computer Science, Artificial Intelligence

An Introduction to Hyperdimensional Computing for Robotics

Peer Neubert et al.

KÜNSTLICHE INTELLIGENZ (2019)

Article Computer Science, Artificial Intelligence

A Theory of Sequence Indexing and Working Memory in Recurrent Neural Networks

E. Paxon Frady et al.

NEURAL COMPUTATION (2018)

Article Neurosciences

Optimal Degrees of Synaptic Connectivity

Ashok Litwin-Kumar et al.

NEURON (2017)

Article Multidisciplinary Sciences

A neural algorithm for a fundamental computing problem

Sanjoy Dasgupta et al.

SCIENCE (2017)

Article Computer Science, Artificial Intelligence

Holographic Graph Neuron: A Bioinspired Architecture for Pattern Processing

Denis Kleyko et al.

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2017)

Proceedings Paper Computer Science, Hardware & Architecture

Modification of Holographic Graph Neuron using Sparse Distributed Representations

Denis Kleyko et al.

7TH ANNUAL INTERNATIONAL CONFERENCE ON BIOLOGICALLY INSPIRED COGNITIVE ARCHITECTURES (BICA 2016) (2016)

Article Computer Science, Artificial Intelligence

Vector space architecture for emergent interoperability of systems by learning from demonstration

Blerim Emruli et al.

BIOLOGICALLY INSPIRED COGNITIVE ARCHITECTURES (2015)

Article Computer Science, Artificial Intelligence

Representing Objects, Relations, and Sequences

Stephen I. Gallant et al.

NEURAL COMPUTATION (2013)

Article Computer Science, Artificial Intelligence

Building a world model with structure-sensitive sparse binary distributed representations

Dmitri A. Rachkovskij et al.

BIOLOGICALLY INSPIRED COGNITIVE ARCHITECTURES (2013)

Article Computer Science, Artificial Intelligence

Collective communication for dense sensing environments

Predrag Jakimovski et al.

JOURNAL OF AMBIENT INTELLIGENCE AND SMART ENVIRONMENTS (2012)

Article Computer Science, Artificial Intelligence

Some approaches to analogical mapping with structure-sensitive distributed representations

DA Rachkovskij

JOURNAL OF EXPERIMENTAL & THEORETICAL ARTIFICIAL INTELLIGENCE (2004)

Article Computer Science, Artificial Intelligence

Representation and processing of structures with binary sparse distributed codes

DA Rachkovskij

IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING (2001)