In multi-class discriminant analysis for high-dimension low-sample-size (HDLSS) settings, Fisher's discriminant function cannot be defined, since the sample covariance matrix is singular. For the special case of two-class problems, the naive Bayes rule has been studied, and combined with feature selection, this approach yields good practical results. We show how to extend the naive Bayes rule based on the naive canonical correlation matrix to a general setting for K ≥ 2 classes, and we propose variable ranking and feature selection methods which integrate information from all K−1 eigenvectors. Provided the dimension does not grow too fast, we show that the K−1 sample eigenvectors are consistent estimators of the corresponding population parameters as both the dimension and sample size grow, and we give upper bounds for the misclassification rate. For real and simulated data we illustrate the performance of the new method, which achieves lower error rates and typically selects fewer variables than existing methods.
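The core construction can be sketched numerically: replace the (singular) pooled within-class covariance by its diagonal, form the resulting "naive" analogue of Fisher's eigenproblem, and rank variables by aggregating the loadings of all K−1 leading eigenvectors. The sketch below is an illustrative assumption of how such a pipeline might look; the function names and the squared-loading aggregation rule are our choices, not necessarily the authors' exact procedure.

```python
import numpy as np

def naive_discriminant_directions(X, y):
    """Leading K-1 eigenvectors of D^{-1/2} B D^{-1/2}, where D is the
    diagonal of the pooled within-class covariance and B is the
    between-class scatter matrix (a 'naive' stand-in for Fisher's
    eigenproblem when the full covariance is singular)."""
    classes = np.unique(y)
    K = len(classes)
    n, p = X.shape
    grand_mean = X.mean(axis=0)
    d = np.zeros(p)           # diagonal of pooled within-class covariance
    B = np.zeros((p, p))      # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        d += ((Xc - mc) ** 2).sum(axis=0)
        B += len(Xc) * np.outer(mc - grand_mean, mc - grand_mean)
    d /= (n - K)
    s = 1.0 / np.sqrt(d)      # D^{-1/2}, well-defined even when p >> n
    M = (s[:, None] * B) * s[None, :]
    _, vecs = np.linalg.eigh(M)        # eigenvalues in ascending order
    return vecs[:, ::-1][:, :K - 1]    # top K-1 eigenvectors

def rank_variables(V):
    """Rank variables by total squared loading across all K-1
    eigenvectors, so evidence from every direction is integrated."""
    scores = (V ** 2).sum(axis=1)
    return np.argsort(scores)[::-1]
```

On simulated data where only a couple of coordinates separate the class means, the top-ranked variables returned by `rank_variables` would then be the natural candidates for feature selection before applying the naive Bayes rule.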

This research is a joint work with Mitsuru Tamatani (JSPS & Shimane University) and Inge Koch (University of Adelaide).


Professor Kanta Naito


Shimane University, Japan


Fri, 21/03/2014 - 4:00pm to 5:00pm


OMB-145, Old Main Building, UNSW Kensington Campus