Abstract A hybrid learning algorithm suitable for hardware implementation of multilayer neural networks is proposed. Although backpropagation is a powerful learning method for multilayer neural networks, its hardware implementation is difficult due to the complexity of the neural synapses and of the operations involved in error backpropagation. We propose a learning algorithm whose performance is comparable to that of backpropagation but which is easier to implement in hardware for on-chip learning of multilayer neural networks. In the proposed algorithm, a multilayer neural network is trained with a hybrid of the gradient-based delta rule and a stochastic algorithm called Random Weight Change: the output-layer parameters are learned with the delta rule, while the inner-layer parameters are learned with Random Weight Change, so the overall network is trained without error backpropagation. Experimental results are presented showing that the proposed hybrid rule performs better than either of its constituent learning algorithms alone and comparably to backpropagation on the benchmark MNIST dataset. A hardware architecture illustrating the ease of implementing the proposed learning rule in analog hardware, compared with the backpropagation algorithm, is also presented.
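The abstract gives no pseudocode, so the following is a minimal sketch of what the hybrid update could look like on a toy two-layer network, assuming sigmoid activations, a squared-error loss, a single training pattern, and illustrative hyperparameters (eta, delta); all identifiers are hypothetical, and the Random Weight Change step follows its standard formulation (perturb each hidden weight by a fixed +/-delta and keep the perturbation direction only while the error decreases).

import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network; sizes and hyperparameters are illustrative.
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0.0, 0.5, (n_hid, n_in))   # hidden layer: trained by Random Weight Change
W2 = rng.normal(0.0, 0.5, (n_out, n_hid))  # output layer: trained by the delta rule

eta = 0.1     # delta-rule learning rate (assumed value)
delta = 0.01  # fixed RWC perturbation magnitude (assumed value)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    h = sigmoid(W1 @ x)   # hidden activations
    y = sigmoid(W2 @ h)   # network output
    return h, y

x = rng.random(n_in)        # dummy input pattern
t = np.array([1.0, 0.0])    # dummy target

# Initial random perturbation directions for the hidden layer.
dW1 = delta * rng.choice([-1.0, 1.0], W1.shape)
_, y = forward(x)
prev_err = np.sum((t - y) ** 2)

for step in range(1000):
    # Random Weight Change: apply the current perturbation to the hidden layer.
    W1 += dW1
    # Delta rule on the output layer: the update uses only the output error
    # and the hidden activations, so no error signal is propagated backwards.
    h, y = forward(x)
    W2 += eta * np.outer((t - y) * y * (1.0 - y), h)
    # Keep the RWC perturbation direction only if the overall error decreased;
    # otherwise draw fresh random signs.
    _, y = forward(x)
    err = np.sum((t - y) ** 2)
    if err >= prev_err:
        dW1 = delta * rng.choice([-1.0, 1.0], W1.shape)
    prev_err = err

In this sketch the only global signal the hidden layer needs is a comparison of two scalar error values, which is consistent with the abstract's claim that the hybrid rule avoids the per-synapse error-backpropagation circuitry that makes backpropagation hard to implement in analog hardware.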