Abstract Capturing long-term dependencies is a core task in sequence learning, and imitating the way humans memorize is a promising direction. Existing algorithms can split a sequence into fixed-length segments according to prior knowledge, but the appropriate segmentation depends on the context and is difficult to fix before network training. In this paper, we therefore propose a variant of the segmented-memory neural network that can segment a sequence into arbitrary lengths and then cascade the segments via a binarized mask over memory slots. To optimize the network, we theoretically derive an optimal mask and apply it in a novel training scheme based on a sparsity regularizer. In experiments, we conduct ablation analyses and evaluations on several algorithmic and classification tasks, comparing against a number of models, including a variant of the proposed network optimized with a lasso regularizer. Both fixed- and variable-length sequences are tested, and the results under different criteria demonstrate the superiority of the proposed method.
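Since the abstract gives no implementation details, the following is a minimal, hypothetical sketch of the core idea as described: a soft mask over memory slots whose binarization marks segment boundaries, trained with an L1 (lasso) sparsity penalty so that only a few slots act as boundaries. All names (`soft_mask`, `lasso_penalty`) and values are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def soft_mask(logits):
    # Sigmoid maps unconstrained per-slot logits into (0, 1); thresholding
    # at 0.5 binarizes the mask into hard segment boundaries.
    return 1.0 / (1.0 + np.exp(-logits))

def lasso_penalty(mask, lam=0.01):
    # L1 (lasso) regularizer: penalizes the total mask mass so the learned
    # mask stays sparse, i.e. only a few slots close a segment.
    return lam * np.sum(np.abs(mask))

# Hypothetical logits for five memory slots; large positive values
# correspond to slots that should end a segment.
logits = np.array([-4.0, -4.0, 4.0, -4.0, 4.0])
mask = soft_mask(logits)
boundaries = (mask > 0.5).astype(int)  # binarized mask: 1 = segment boundary
print(boundaries.tolist())             # [0, 0, 1, 0, 1]
```

Here the sequence would be cut after the third and fifth slots, giving variable-length segments rather than a fixed, pre-assigned length.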