
Locality-Based Encoder and Model Quantization for Efficient Hyper-Dimensional Computing



Brain-inspired hyper-dimensional (HD) computing is a new computing paradigm that emulates neuronal activity in high-dimensional space. The first step in HD computing is to map each data point into high-dimensional space (e.g., 10,000 dimensions), which requires thousands of operations for each element of data in the original domain. Encoding alone takes about 80% of the training execution time. In this article, we propose ReHD, an entire rework of encoding, training, and inference in HD computing for a more hardware-friendly implementation. ReHD includes a fully binary encoding module for HD computing that enables energy-efficient, high-accuracy classification. Our encoding module, based on random projection with a predictable memory access pattern, can be implemented efficiently in hardware. ReHD is the first HD-based approach that projects data at a 1:1 ratio to the original data and enables all training/inference computation to be performed on binary hypervectors. After the optimizations ReHD adds to the encoding process, retraining and inference become the energy-intensive part of HD computing. To resolve this, we additionally propose model quantization: a novel method of storing class hypervectors using n bits, where n ranges from 1 to 32, rather than at full 32-bit precision, which allows fine-grained tuning of the tradeoff between energy efficiency and accuracy. To further improve ReHD efficiency, we developed an online dimension-reduction approach that removes insignificant hypervector dimensions during training.
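The two core ideas above — binarized random-projection encoding and n-bit storage of class hypervectors — can be sketched in NumPy. This is an illustrative sketch only, not the paper's exact ReHD algorithm: the feature count `F`, the +/-1 projection matrix, and the uniform `quantize` scheme are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 10_000  # hypervector dimensionality (the abstract's example size)
F = 64      # number of input features (hypothetical, for illustration)

# Random projection matrix with +/-1 entries; ReHD's actual encoder also
# enforces a predictable memory access pattern, which this sketch omits.
projection = rng.choice([-1, 1], size=(D, F)).astype(np.int8)

def encode(x: np.ndarray) -> np.ndarray:
    """Map a feature vector to a binary (0/1) hypervector by projecting
    with the random matrix and binarizing on the sign."""
    return (projection @ x >= 0).astype(np.uint8)

def quantize(class_hv: np.ndarray, n_bits: int) -> np.ndarray:
    """Store a class hypervector with n bits per dimension (1..32)
    instead of full 32-bit precision, via uniform min/max scaling."""
    lo, hi = class_hv.min(), class_hv.max()
    levels = (1 << n_bits) - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return np.round((class_hv - lo) / scale).astype(np.int32)

x = rng.random(F).astype(np.float32)
hv = encode(x)  # binary hypervector of shape (D,)
```

Lower `n_bits` shrinks the model's memory footprint linearly, which is the energy/accuracy knob the abstract describes.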

Keywords: hyper-dimensional computing; model quantization

Journal Title: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Year Published: 2022
