As a human-cortex-inspired computing model, hierarchical temporal memory (HTM) has shown great promise in sequence learning and has been applied to various time-series applications. HTM uses a combination of columns and neurons to learn the temporal patterns within a sequence. However, the conventional HTM model compresses the input into only two column states, active and inactive, and uses a fixed learning strategy. This simplicity limits the representational capability of HTM and ignores the impact of active columns on learning the temporal context. To address these issues, we propose a new HTM algorithm based on activation intensity. By introducing a column activation intensity, more useful and fine-grained information from the input is retained for sequence learning. Furthermore, a self-adaptive nonlinear learning strategy is proposed in which the synaptic connections are dynamically adjusted according to the activation intensity of the columns. Extensive experiments are carried out on two real-world time-series datasets. Compared to conventional HTM and LSTM models, our method achieves higher accuracy with lower time overhead.
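The core idea, graded column activations that also scale the learning step, can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the array sizes, the top-k winner selection, and the squared-intensity scaling are all assumptions standing in for the paper's self-adaptive nonlinear strategy, whose exact form the abstract does not specify.

```python
import numpy as np

rng = np.random.default_rng(0)

N_INPUT, N_COLUMNS, N_WINNERS = 32, 16, 4

# Random weights as a stand-in for proximal synaptic permanences.
weights = rng.random((N_COLUMNS, N_INPUT))

def activate(x):
    """Return winning columns and a graded activation intensity,
    instead of the conventional binary active/inactive state."""
    overlap = weights @ x                        # raw overlap score per column
    winners = np.argsort(overlap)[-N_WINNERS:]   # top-k columns become active
    intensity = overlap[winners] / (overlap[winners].max() + 1e-9)
    return winners, intensity

def learn(x, winners, intensity, base_lr=0.05):
    """Adjust synapses of active columns, scaled by their intensity.

    The squared-intensity factor is an assumed nonlinearity: columns
    that respond more strongly to the input learn faster.
    """
    for col, s in zip(winners, intensity):
        lr = base_lr * s**2
        weights[col] += lr * (x - weights[col])
        np.clip(weights[col], 0.0, 1.0, out=weights[col])

# Usage: one binary input vector, one activation + learning step.
x = (rng.random(N_INPUT) > 0.7).astype(float)
winners, intensity = activate(x)
learn(x, winners, intensity)
```

In this sketch the intensity is normalized to the strongest winner, so the best-matching column takes the full base learning rate while weaker winners take smaller, nonlinearly attenuated steps.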