In many applications, selecting the optimal features is a difficult task. Numerous optimization problems, e.g., the feature selection (FS) problem, have been solved using optimization algorithms. In this paper, the most discriminating features were chosen using a new chaotic gradient-based optimizer (CGBO) that combines chaotic maps with the search iterations of the gradient-based optimizer (GBO). Ten chaotic maps were utilized to update the parameters, avoid local optima and premature convergence, accelerate convergence, and enhance the efficiency of GBO. In wrapper FS approaches, a classifier is used to evaluate candidate feature subsets; the proposed CGBO uses the k-nearest neighbor classifier within its objective function for the FS classification process. Ten datasets from the UCI machine learning repository were used to validate CGBO. In the experiments, CGBO outperformed five other metaheuristic algorithms: particle swarm optimization (PSO), the moth flame optimizer (MFO), the sine cosine algorithm (SCA), the salp swarm algorithm (SSA), and GBO. The results demonstrated the capability of CGBO to find the optimal feature subset, maximizing classification performance while efficiently minimizing the number of selected features compared with the other metaheuristic algorithms.
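The abstract describes a wrapper feature-selection setup: a chaotic map drives the optimizer's search, and a k-nearest-neighbor classifier scores each candidate feature subset. The sketch below illustrates that general scheme only; it is not the paper's CGBO. The logistic map (one common chaotic map), the leave-one-out k-NN evaluation, the weighted error-plus-feature-ratio fitness, and all function names (`logistic_map`, `knn_loo_accuracy`, `fitness`, `chaotic_search`) are illustrative assumptions, and the chaotic random search stands in for GBO's actual update equations.

```python
def logistic_map(x, r=4.0):
    # One step of the logistic map, a common chaotic map (assumed choice).
    return r * x * (1.0 - x)

def knn_loo_accuracy(mask, X, y, k=3):
    # Leave-one-out k-NN accuracy using only features where mask[j] == 1.
    idx = [j for j, m in enumerate(mask) if m]
    if not idx:
        return 0.0  # no features selected -> worst possible accuracy
    correct = 0
    for i in range(len(X)):
        dists = sorted(
            (sum((X[i][f] - X[j][f]) ** 2 for f in idx), y[j])
            for j in range(len(X)) if j != i
        )
        votes = [label for _, label in dists[:k]]
        pred = max(set(votes), key=votes.count)  # majority vote
        correct += pred == y[i]
    return correct / len(X)

def fitness(mask, X, y, alpha=0.99, k=3):
    # Typical wrapper-FS fitness (assumed weighting): trade off
    # classification error against the fraction of features kept.
    err = 1.0 - knn_loo_accuracy(mask, X, y, k)
    ratio = sum(mask) / len(mask)
    return alpha * err + (1.0 - alpha) * ratio

def chaotic_search(X, y, n_iters=30, seed=0.7, k=3):
    # Toy chaotic search: the chaotic sequence, not a pseudo-random
    # generator, decides which features enter each candidate mask.
    n_feat = len(X[0])
    c = seed
    best_mask, best_fit = None, float("inf")
    for _ in range(n_iters):
        mask = []
        for _ in range(n_feat):
            c = logistic_map(c)
            mask.append(1 if c > 0.5 else 0)
        f = fitness(mask, X, y, k=k)
        if f < best_fit:
            best_mask, best_fit = mask, f
    return best_mask, best_fit
```

The chaotic sequence replaces the uniform random draws an ordinary stochastic search would use, which is the general mechanism the abstract credits with escaping local optima and premature convergence.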