A new method for clustering functional data is proposed via information maximization. The proposed method learns a probabilistic classifier in an unsupervised manner so that the mutual information (or squared-loss mutual information) between data points and cluster assignments is maximized. A notable advantage of the proposed method is that it involves only continuous optimization of model parameters, which is simpler than discrete optimization of cluster assignments and avoids the disadvantages of generative models. Unlike some existing methods, the proposed method does not require estimating the probability densities of Karhunen-Loève expansion scores under different clusters, nor does it require the common eigenfunction assumption. The empirical performance and applications of the proposed method are demonstrated by simulation studies and real data analyses. In addition, the proposed method allows for out-of-sample clustering, and its performance is comparable to that of some supervised classifiers.
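The sketch below is not the authors' implementation; it is a minimal illustration of the general information-maximization clustering idea applied to functional data, assuming curves observed on a common grid, functional principal component scores as features, and a regularized Shannon mutual information objective (the paper also considers squared-loss mutual information). The helper names `fpc_scores` and `cluster_curves` are hypothetical.

```python
# Minimal sketch of information-maximization clustering for functional data.
# Curves are reduced to principal component scores, and a softmax classifier
# p(c | x) = softmax(W x + b) is fit by maximizing an estimate of the mutual
# information I(X; C) = H(C) - H(C | X), with an l2 penalty on W.
import numpy as np
from scipy.optimize import minimize

def fpc_scores(curves, n_components=3):
    """Project discretized curves onto their leading principal components."""
    centered = curves - curves.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T

def neg_mutual_information(theta, x, n_clusters, lam=1e-2):
    """Negative of the estimated I(X; C) plus an l2 penalty on the weights."""
    n, d = x.shape
    W = theta[: n_clusters * d].reshape(n_clusters, d)
    b = theta[n_clusters * d:]
    logits = x @ W.T + b
    logits -= logits.max(axis=1, keepdims=True)        # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)                  # p(c | x_i)
    marginal = p.mean(axis=0)                          # p(c)
    h_marginal = -np.sum(marginal * np.log(marginal + 1e-12))
    h_conditional = -np.mean(np.sum(p * np.log(p + 1e-12), axis=1))
    return -(h_marginal - h_conditional) + lam * np.sum(W ** 2)

def cluster_curves(curves, n_clusters=2, n_components=3, seed=0):
    """Return hard cluster labels from the fitted probabilistic classifier."""
    x = fpc_scores(curves, n_components)
    rng = np.random.default_rng(seed)
    theta0 = 0.1 * rng.standard_normal(n_clusters * x.shape[1] + n_clusters)
    res = minimize(neg_mutual_information, theta0,
                   args=(x, n_clusters), method="L-BFGS-B")
    W = res.x[: n_clusters * x.shape[1]].reshape(n_clusters, x.shape[1])
    b = res.x[n_clusters * x.shape[1]:]
    return np.argmax(x @ W.T + b, axis=1)

# Example: two groups of noisy sine curves with different amplitudes.
t = np.linspace(0, 1, 50)
curves = np.vstack([a * np.sin(2 * np.pi * t) + 0.1 * np.random.randn(50)
                    for a in [1.0] * 30 + [2.0] * 30])
labels = cluster_curves(curves, n_clusters=2)
```

Because the fitted classifier maps any new curve's scores to cluster probabilities, out-of-sample curves can be assigned to clusters without refitting, which mirrors the out-of-sample clustering property highlighted in the abstract.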