Support vector machines (SVMs) are powerful learning methods that have been popular among machine learning researchers due to their strong performance on both classification and regression problems. However, the traditional SVM, which uses the Hinge Loss, cannot handle class-imbalance problems because it applies the same loss weight to each class. Recently, the Focal Loss has been widely used in deep learning to address imbalanced datasets, and its effectiveness has attracted attention in many fields, such as object detection and semantic segmentation. Inspired by the Focal Loss, we reconstruct the Hinge Loss with the scaling factor of the Focal Loss, yielding the FH Loss, which not only handles class-imbalance problems but also preserves the distinctive properties of the Hinge Loss. Owing to the difficulty of trading off positive and negative accuracy in imbalanced classification, the FH Loss pays more attention to the minority class and to misclassified instances, improving the accuracy of each class and thereby reducing the influence of imbalance. In addition, because solving an SVM with the FH Loss is difficult, we propose an improved model with a modified FH Loss, called Adaptive FH-SVM. The algorithm solves the optimization problem iteratively and adaptively updates the FH Loss of each instance. Experimental results on 31 binary imbalanced datasets demonstrate the effectiveness of our proposed method.
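The abstract does not give the exact form of the FH Loss, but the construction it describes — scaling the hinge term by a focal-style modulating factor — can be sketched as follows. Everything here is an illustrative assumption: the scoring function `f`, the sigmoid-based pseudo-probability `p`, and the focusing parameter `gamma` are choices made for this sketch, not the paper's definitions.

```python
import numpy as np

def hinge_loss(y, f):
    # Standard hinge loss for labels y in {-1, +1} and decision value f:
    # max(0, 1 - y*f). Zero once the instance is beyond the margin.
    return np.maximum(0.0, 1.0 - y * f)

def fh_loss(y, f, gamma=2.0):
    # Hypothetical FH-style loss: hinge loss scaled by a focal-style
    # factor (1 - p)^gamma, where p is a sigmoid pseudo-probability of
    # correct classification. Well-classified instances (large y*f) get
    # a small factor, so the loss concentrates on hard and misclassified
    # instances, mirroring the behavior the abstract attributes to FH Loss.
    p = 1.0 / (1.0 + np.exp(-y * f))
    return (1.0 - p) ** gamma * hinge_loss(y, f)
```

Under this sketch, a correctly classified instance beyond the margin (e.g. `y = 1`, `f = 2`) incurs zero loss exactly as with the plain hinge, while a misclassified instance (e.g. `y = 1`, `f = -1`) is penalized but less steeply than the raw hinge value, with `gamma` controlling how strongly easy instances are down-weighted.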