Abstract
Fisher’s Linear Discriminant Analysis (LDA) is a traditional dimensionality reduction method that has proven successful for decades. Numerous variants, such as Kernel Fisher Discriminant Analysis (KFDA), have been proposed to extend LDA’s power to nonlinear discriminants. Though effective, KFDA is computationally expensive: it operates on a kernel matrix whose dimension equals the number of training samples, so its cost grows rapidly with the size of the data set. In this paper, we suggest a novel strategy to reduce the computation for an entire family of KFDAs. Rather than invoking a KFDA on the entire data set, we advocate first reducing the data to a smaller representative subset using a Prototype Reduction Scheme (PRS), and then achieving dimensionality reduction by invoking the KFDA on this reduced set. In this way, data points that contribute little to the dimensionality reduction and classification are eliminated, yielding a significantly smaller kernel matrix, K, without degrading performance. Our experimental results on artificial and real-life data sets demonstrate that the proposed mechanism dramatically reduces the computation time without sacrificing classification accuracy.
The second author was partially supported by NSERC, the Natural Sciences and Engineering Research Council of Canada. This work was generously supported by the Korea Research Foundation Grant funded by the Korean Government (MOEHRD-KRF-2005-042-D00265).
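The abstract describes a two-stage pipeline: first compress the training set with a PRS, then run a KFDA on the surviving prototypes so that the kernel matrix is built over far fewer points. The sketch below is a minimal, hypothetical illustration of that idea in Python, not the authors' implementation: k-means centroids stand in for the PRS (the paper builds on dedicated schemes such as condensing- and LVQ3-style reduction), the KFDA step is realized via the well-known KPCA-plus-LDA route, and the data set, kernel, and parameter values are illustrative assumptions.

```python
# Hypothetical sketch of the PRS-then-KFDA pipeline; names, data set,
# and parameters are illustrative, not taken from the paper.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def reduce_prototypes(X, y, n_per_class=20, seed=0):
    """Stand-in PRS: summarize each class by its k-means centroids.

    The paper uses dedicated PRSs (e.g., condensing or LVQ3-based
    schemes); k-means is used here only to keep the sketch short.
    """
    Xr, yr = [], []
    for c in np.unique(y):
        km = KMeans(n_clusters=n_per_class, n_init=10, random_state=seed)
        km.fit(X[y == c])
        Xr.append(km.cluster_centers_)
        yr.append(np.full(n_per_class, c))
    return np.vstack(Xr), np.concatenate(yr)


X, y = make_moons(n_samples=2000, noise=0.2, random_state=0)

# Stage 1: prototype reduction -- 2000 samples shrink to 40 prototypes,
# so the kernel matrix in the next stage is 40x40 instead of 2000x2000.
Xp, yp = reduce_prototypes(X, y)

# Stage 2: a kernel Fisher discriminant realized as KPCA followed by LDA,
# fitted on the prototypes only.
kpca = KernelPCA(n_components=10, kernel="rbf", gamma=2.0).fit(Xp)
lda = LinearDiscriminantAnalysis().fit(kpca.transform(Xp), yp)

# Evaluate the prototype-trained discriminant on the full data set.
print(f"accuracy with {len(Xp)} prototypes: {lda.score(kpca.transform(X), y):.3f}")
```

The payoff is in the kernel computations: with m prototypes, fitting works with an m×m kernel matrix instead of n×n, and projecting a new point costs O(m) kernel evaluations instead of O(n).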
Keywords
- Training Sample
- Linear Discriminant Analysis
- Nearest Neighbor
- Kernel Matrix
- Kernel Principal Component Analysis
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Kim, S.-W., Oommen, B.J. (2006). On Optimizing Kernel-Based Fisher Discriminant Analysis Using Prototype Reduction Schemes. In: Yeung, D.-Y., Kwok, J.T., Fred, A., Roli, F., de Ridder, D. (eds) Structural, Syntactic, and Statistical Pattern Recognition. SSPR/SPR 2006. Lecture Notes in Computer Science, vol. 4109. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11815921_91
DOI: https://doi.org/10.1007/11815921_91
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-37236-3
Online ISBN: 978-3-540-37241-7