Randomized shallow/deep neural networks with closed-form solutions avoid the shortcomings of neural networks trained via back propagation (BP). The ensemble deep random vector functional link (edRVFL) network leverages the strengths of two growing fields, i.e., deep learning and ensemble learning. However, the edRVFL model does not consider the geometrical relationships within the data while calculating the final output parameters of each layer, where each layer is treated as a base model. In the literature, graph embedded frameworks have been successfully used to describe the geometrical relationships within data. In this paper, we propose an extended graph embedded RVFL (EGERVFL) model that, unlike the standard RVFL, employs both intrinsic and penalty subspace learning (SL) criteria under the graph embedded framework in its optimization process to calculate the model's output parameters. The proposed shallow EGERVFL model has only a single hidden layer and hence limited representation learning capability. Therefore, we further develop an ensemble deep EGERVFL (edEGERVFL) model that can be considered a variant of the edRVFL model. Unlike edRVFL, the proposed edEGERVFL model solves a graph embedded optimization problem in each layer and hence achieves better generalization performance than the edRVFL model. We evaluate the proposed approaches for the diagnosis of Alzheimer's disease and, furthermore, on UCI datasets. The experimental results demonstrate that the proposed models outperform the baseline models. The source code of the proposed models is available at https://github.com/mtanveer1/.
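To make the idea concrete, the sketch below illustrates, under stated assumptions, how a graph-embedded RVFL layer of this kind could compute its output weights in closed form: random hidden features with direct links, intrinsic and penalty graphs built from class labels, and a ridge-regularized solve augmented by a graph term. The function names, the class-based graph construction, and the specific combined regularizer are illustrative assumptions, not the authors' exact EGERVFL/edEGERVFL formulation; consult the linked repository for the actual criterion.

```python
import numpy as np

def rvfl_features(X, n_hidden=100, seed=0):
    """Random hidden features with direct links, as in a standard RVFL layer."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = np.tanh(X @ W + b)            # randomized (untrained) hidden activations
    return np.hstack([X, H])          # concatenate direct links and hidden features

def class_graph_laplacians(y):
    """Illustrative intrinsic/penalty graphs: connect same-class pairs (intrinsic)
    and different-class pairs (penalty), then form their Laplacians."""
    y = np.asarray(y)
    same = (y[:, None] == y[None, :]).astype(float)
    diff = 1.0 - same
    np.fill_diagonal(same, 0.0)
    L_int = np.diag(same.sum(1)) - same
    L_pen = np.diag(diff.sum(1)) - diff
    return L_int, L_pen

def fit_output_weights(D, Y, L_int, L_pen, lam=1e-2, gamma=1e-3, mu=1.0):
    """Closed-form output weights with a ridge term plus a graph-embedding term.
    The combined regularizer gamma * D^T (L_int - mu * L_pen) D is an assumed
    formulation used only for illustration."""
    n_feat = D.shape[1]
    G = D.T @ (L_int - mu * L_pen) @ D
    A = D.T @ D + lam * np.eye(n_feat) + gamma * G
    return np.linalg.solve(A, D.T @ Y)

# Toy usage with one-hot targets.
X = np.random.randn(60, 8)
y = np.random.randint(0, 2, size=60)
Y = np.eye(2)[y]
D = rvfl_features(X, n_hidden=32)
L_int, L_pen = class_graph_laplacians(y)
beta = fit_output_weights(D, Y, L_int, L_pen)
pred = (D @ beta).argmax(1)
```

In an ensemble deep variant, a solve of this form would be repeated per layer, with each layer's predictions combined (e.g., by averaging or voting) at the end; the per-layer solve shown here is the assumed building block.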