Multinomial logistic regression (MLR) is widely used in fields such as face recognition and text classification. However, standard MLR does not address the problem of data redundancy: in multi-class classification, different classes often share many similar features, which can cause the affected classes to be misclassified. Since data redundancy is a common phenomenon in many fields, this paper proposes a maximal uncorrelated MLR (MUMLR) classification model to handle redundancy in multi-class classification. The main idea is to reduce the weight given to similar features and retain more of the discriminative information in the data by adding an uncorrelated regularization term. In addition, we use the Cauchy–Buniakowsky–Schwarz inequality to scale the original objective function into a convex function and solve it with the Adam optimization method. The main advantages are as follows: on data with more redundant information, the classification performance of the proposed algorithm exceeds that of state-of-the-art algorithms; moreover, we show that the proposed regularization can also be applied to neural networks, where it achieves good results.
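The abstract does not give the exact form of the uncorrelated regularizer, so the following is only a minimal illustrative sketch of the general idea: multinomial logistic regression whose class weight vectors are penalized for being correlated, trained with Adam. The penalty used here (off-diagonal entries of the normalized Gram matrix of the weight rows), the hyperparameter `lam`, and the synthetic redundant data are all assumptions for illustration, not the paper's MUMLR formulation.

```python
# Sketch: MLR with a decorrelation penalty on the class weight matrix, trained with Adam.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Synthetic multi-class data: n samples, d features (half of them redundant copies), k classes.
n, d, k = 512, 20, 5
X = torch.randn(n, d)
X[:, 10:] = X[:, :10] + 0.1 * torch.randn(n, 10)   # duplicated features -> data redundancy
y = torch.randint(0, k, (n,))

W = (0.01 * torch.randn(k, d)).requires_grad_()     # class weight vectors
b = torch.zeros(k, requires_grad=True)
opt = torch.optim.Adam([W, b], lr=0.05)
lam = 0.1                                           # strength of the decorrelation term (assumed)

for step in range(300):
    opt.zero_grad()
    logits = X @ W.T + b
    ce = F.cross_entropy(logits, y)                 # standard MLR (softmax) loss

    # Assumed "uncorrelated" regularizer: push the normalized Gram matrix of the
    # class weight vectors toward the identity, i.e. make the rows of W mutually
    # uncorrelated so that redundant, shared features receive less weight.
    Wn = F.normalize(W, dim=1)
    gram = Wn @ Wn.T
    decor = (gram - torch.eye(k)).pow(2).sum()

    loss = ce + lam * decor
    loss.backward()
    opt.step()

print(f"final cross-entropy: {ce.item():.3f}, decorrelation penalty: {decor.item():.3f}")
```

The same penalty term could, in the spirit of the abstract's last claim, be added to the loss of a neural network's final linear layer; only the `decor` term and the parameter list passed to Adam would change.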
               