Feature selection (FS) is an essential technique widely applied in data mining. Recent studies have shown that evolutionary computation (EC) is very promising for FS due to its powerful search capability. However, most existing EC-based FS methods use a fixed-length encoding to represent feature subsets. This inflexible encoding becomes ineffective when high-dimensional data are handled, because it results in a huge search space as well as large training time and memory overhead. In this article, we propose a length-adaptive genetic algorithm with Markov blanket (LAGAM), which adopts a variable-length individual encoding and enables individuals to evolve in their own search spaces. In LAGAM, features are rearranged in decreasing order of their relevance, and an adaptive length-changing operator is introduced, which extends or shortens an individual to guide it toward a better search space. Local search based on the Markov blanket (MB) is embedded to further improve individuals. Experiments are conducted on 12 high-dimensional datasets, and the results reveal that LAGAM outperforms existing methods: it achieves higher classification accuracy while using fewer features.
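The core idea of a variable-length encoding over relevance-ranked features can be illustrated with a minimal sketch. The abstract does not specify LAGAM's exact operators, so the relevance scoring, the decoding scheme, and the `change_length` behaviour below are illustrative assumptions, not the paper's actual method:

```python
import random

random.seed(0)

# Toy relevance scores for 20 features; LAGAM ranks features by relevance
# before evolution (the scoring here is a stand-in, not the paper's measure).
N_FEATURES = 20
relevances = [random.random() for _ in range(N_FEATURES)]
ranked = sorted(range(N_FEATURES), key=lambda i: relevances[i], reverse=True)

def make_individual(length):
    """A length-variable individual: a bit mask over the top-`length` ranked features."""
    return [random.randint(0, 1) for _ in range(length)]

def selected_features(ind):
    """Decode an individual: map its set bits onto the relevance-ranked indices."""
    return [ranked[i] for i, bit in enumerate(ind) if bit]

def change_length(ind, promising, step=2, max_len=N_FEATURES):
    """Adaptive length-changing operator (a guess at the behaviour described
    in the abstract): extend a promising individual's search space with new
    random bits, or shorten a poor one by truncating its tail."""
    if promising:
        return ind + [random.randint(0, 1) for _ in range(min(step, max_len - len(ind)))]
    return ind[:max(1, len(ind) - step)]

ind = make_individual(5)
longer = change_length(ind, promising=True)    # explores 2 more ranked features
shorter = change_length(ind, promising=False)  # retreats to a smaller space
print(len(ind), len(longer), len(shorter))
```

Because each individual only covers a prefix of the ranked feature list, a short individual searches a much smaller space than a fixed-length mask over all features, which is what makes the encoding tractable in high dimensions.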