Abstract
Recently, a simple and efficient learning algorithm for single hidden layer feedforward networks (SLFNs), called the extreme learning machine (ELM), has been developed by G.-B. Huang et al. A key strength of the ELM algorithm is that only one parameter, the number of hidden nodes, needs to be determined, while it requires significantly less computational time to train new classifiers and achieves good generalization performance. However, there is no effective method for finding a proper and universal number of hidden nodes. To address this problem, we propose a self-adaptive extreme learning machine (SELM) algorithm. SELM determines the number of hidden nodes self-adaptively and constructs Gaussian functions as the activation functions of the hidden nodes. In this algorithm, rough set theory acts as a pre-treatment step that eliminates redundant attributes from the data sets. Affinity propagation (AP) clustering is then used to self-adaptively determine the number of hidden nodes, while the centers and widths obtained from AP clustering are used to construct the Gaussian functions in the hidden layer of the SLFN. An empirical study of the SELM algorithm on several commonly used classification benchmark problems shows that, compared with the traditional ELM algorithm, the proposed algorithm can find a proper number of hidden nodes and construct compact network classifiers.
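The following is a minimal sketch, in Python, of the pipeline described above, not the authors' reference implementation: AP clustering fixes the number of hidden nodes, the cluster centers and a heuristic width define Gaussian hidden units, and the output weights are obtained by the Moore-Penrose pseudoinverse as in standard ELM. The rough-set attribute-reduction step is assumed to have been applied beforehand, and the width rule is an assumption (the paper's exact rule may differ).

```python
# Sketch of an SELM-style training routine (assumed details noted in comments).
import numpy as np
from numpy.linalg import pinv
from sklearn.cluster import AffinityPropagation


def gaussian_hidden_layer(X, centers, widths):
    """Hidden-layer output matrix H with Gaussian activations."""
    # Squared Euclidean distances between samples and hidden-node centers.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths[None, :] ** 2))


def train_selm(X, T):
    """Train an SLFN whose hidden nodes are set by AP clustering.

    X: (n_samples, n_features) attribute matrix (after rough-set reduction)
    T: (n_samples, n_outputs) target matrix (e.g. one-hot class labels)
    """
    # 1) AP clustering decides the number of hidden nodes self-adaptively.
    ap = AffinityPropagation(random_state=0).fit(X)
    centers = ap.cluster_centers_  # one hidden node per exemplar

    # 2) Width of each hidden node: mean distance of its cluster members to
    #    the center (a heuristic stand-in for the paper's width rule).
    widths = np.array([
        np.linalg.norm(X[ap.labels_ == k] - c, axis=1).mean() + 1e-8
        for k, c in enumerate(centers)
    ])

    # 3) Output weights by the Moore-Penrose pseudoinverse, as in ELM.
    H = gaussian_hidden_layer(X, centers, widths)
    beta = pinv(H) @ T
    return centers, widths, beta


def predict_selm(X, centers, widths, beta):
    return gaussian_hidden_layer(X, centers, widths) @ beta
```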
Acknowledgments
This work is supported by the Fundamental Research Funds for the Central Universities (No. 2015XKMS088).
Ethics declarations
Conflict of Interest
Li Xu, Shifei Ding, Xinzheng Xu and Nan Zhang declare that they have no conflict of interest.
Informed Consent
All procedures followed were in accordance with the ethical standards of the responsible committee on human experimentation (institutional and national) and with the Helsinki Declaration of 1975, as revised in 2008 (5). Additional informed consent was obtained from all patients for which identifying information is included in this article.
Human and Animal Rights
This article does not contain any studies with human or animal subjects performed by any of the authors.
About this article
Cite this article
Xu, L., Ding, S., Xu, X. et al. Self-adaptive Extreme Learning Machine Optimized by Rough Set Theory and Affinity Propagation Clustering. Cogn Comput 8, 720–728 (2016). https://doi.org/10.1007/s12559-016-9409-5