



Self-adaptive Extreme Learning Machine Optimized by Rough Set Theory and Affinity Propagation Clustering


Abstract

The extreme learning machine (ELM), developed by G.-B. Huang et al., is a simple and efficient learning algorithm for single hidden layer feedforward networks (SLFNs). A key strength of ELM is that only one parameter, the number of hidden nodes, needs to be determined, while training time for new classifiers is very low and generalization performance is good. However, there is no effective method for finding a proper and universal number of hidden nodes. To address this problem, we propose a self-adaptive extreme learning machine (SELM) algorithm. SELM determines the number of hidden nodes self-adaptively and constructs Gaussian functions as the activation functions of the hidden nodes. In this algorithm, rough set theory acts as a pre-treatment step that eliminates redundant attributes from the data sets. Affinity propagation clustering (AP clustering) is then used to determine the number of hidden nodes self-adaptively, and the centers and widths obtained from AP clustering are used to construct the Gaussian functions in the hidden layer of the SLFN. An empirical study of SELM on several commonly used classification benchmarks shows that, compared with the traditional ELM algorithm, the proposed algorithm finds a proper number of hidden nodes and constructs compact network classifiers.
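As a rough illustration of the pipeline just described, the sketch below shows how AP clustering can fix the number of hidden nodes and supply Gaussian centres and widths, after which the output weights are solved in the usual ELM fashion with a pseudoinverse. It is a minimal sketch, not the authors' implementation: it assumes scikit-learn's AffinityPropagation, uses a simple mean-distance width heuristic, omits the rough-set attribute-reduction step, and the function names are illustrative only.

```python
# Minimal SELM-style sketch (illustrative, not the authors' implementation).
# Rough-set attribute reduction is omitted; only the AP-clustering + ELM steps
# described in the abstract are shown.
import numpy as np
from sklearn.cluster import AffinityPropagation

def fit_selm(X, T):
    """X: (n_samples, n_features) inputs; T: (n_samples, n_outputs) one-hot targets."""
    # 1) AP clustering chooses the exemplars itself, so the number of hidden
    #    nodes is determined self-adaptively rather than set by hand.
    ap = AffinityPropagation(random_state=0).fit(X)
    centers = ap.cluster_centers_          # one Gaussian centre per exemplar
    # 2) Width of each Gaussian: mean distance of the cluster's members to its
    #    centre (a plausible heuristic; the paper's exact rule may differ).
    widths = np.array([
        np.mean(np.linalg.norm(X[ap.labels_ == k] - c, axis=1)) + 1e-8
        for k, c in enumerate(centers)
    ])
    # 3) Hidden-layer output matrix H with Gaussian activations.
    H = _hidden(X, centers, widths)
    # 4) ELM step: output weights via the Moore-Penrose pseudoinverse.
    beta = np.linalg.pinv(H) @ T
    return centers, widths, beta

def _hidden(X, centers, widths):
    # Squared distances from every sample to every centre, then Gaussian kernel.
    d2 = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) ** 2
    return np.exp(-d2 / (2.0 * widths ** 2))

def predict_selm(X, centers, widths, beta):
    return _hidden(X, centers, widths) @ beta   # class = argmax over columns
```

In this reading, the hidden-layer size is simply the number of exemplars AP clustering returns, so no hidden-node count has to be tuned by hand; classification takes the argmax over the columns of predict_selm's output.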


References

  1. Xu X-Z, Ding S-F, Shi Z-Z, Zhu H. Optimizing radial basis function neural network based on rough set and AP clustering algorithm. J Zhejiang Univ Sci A. 2012;13(2):131–8.

  2. He Q, Shang T-F, Zhuang F-Z, Shi Z-Z. Parallel extreme learning machine for regression based on MapReduce. Neurocomputing. 2013;102(2):52–8.

  3. Xie F-Y, Bovik AC. Automatic segmentation of dermoscopy images using self-generating neural networks seeded by genetic algorithm. Pattern Recogn. 2013;46(3):1012–9.

  4. Ding S-F, Jia W-K, Su C-Y, et al. Research of neural network algorithm based on factor analysis and cluster analysis. Neural Comput Appl. 2011;20(2):297–302.

  5. Razavi S, Tolson BA. A new formulation for feedforward neural networks. IEEE Trans Neural Netw. 2011;22(10):1588–98.

  6. Kimura D, Nii M, Yamaguchi T, Takahashi Y, Yumoto T. Fuzzy nonlinear regression analysis using fuzzified neural networks for fault diagnosis of chemical plants. J Adv Comput Intell Intell Inf. 2011;15(3):336–44.

  7. Huang G-B, Zhu Q-Y, Siew C-K. Extreme learning machine: a new learning scheme of feedforward neural networks. In: Proceedings of the international joint conference on neural networks (IJCNN 2004), Budapest, Hungary; 2004. p. 985–90.

  8. Huang G-B, Zhu Q-Y, Siew C-K. Extreme learning machine: theory and applications. Neurocomputing. 2006;70(1–3):489–501.

  9. Huang G-B, Chen L, Siew C-K. Universal approximation using incremental constructive feedforward networks with random hidden nodes. IEEE Trans Neural Netw. 2006;17(4):879–92.

  10. Huang G-B. An insight into extreme learning machines: random neurons, random features and kernels. Cogn Comput. 2014;6(3):376–90.

  11. Ding S-F, Xu X-Z, Nie R. Extreme learning machine and its applications. Neural Comput Appl. 2014;25(3):549–56.

  12. Huang G, Huang G-B, Song S-J, You K-Y. Trends in extreme learning machines: a review. Neural Netw. 2015;61:32–48.

  13. Kasun L-L-C, Zhou H-M, Huang G-B, Vong C-M. Representational learning with extreme learning machine for big data. IEEE Intell Syst. 2013;28(6):31–4.

  14. Bai Z, Huang G-B, Wang D-W, Wang H, Westover M-B. Sparse extreme learning machine for classification. IEEE Trans Cybern. 2014;44(10):1858–70.

  15. Huang G, Song S-J, Gupta J-N-D, Wu C. Semi-supervised and unsupervised extreme learning machines. IEEE Trans Cybern. 2014;44(12):2405–17.

  16. Lin S-B, Liu X, Fang J, Xu Z-B. Is extreme learning machine feasible? A theoretical assessment (part II). IEEE Trans Neural Netw Learn Syst. 2015;26(1):21–34.

  17. Rong H-J, Ong Y-S, Tan A-H, Zhu Z-X. A fast pruned-extreme learning machine for classification problem. Neurocomputing. 2008;72(1–3):359–66.

  18. Huang G-B, Li M-B, Chen L, Siew C-K. Incremental extreme learning machine with fully complex hidden nodes. Neurocomputing. 2008;71(4–6):576–83.

  19. Huang G-B, Chen L. Convex incremental extreme learning machine. Neurocomputing. 2007;70(16–18):3056–62.

  20. Huang G-B, Chen L. Enhanced random search based incremental extreme learning machine. Neurocomputing. 2008;71(16–18):3460–8.

  21. Pawlak Z. Rough sets. Int J Comput Inf Sci. 1982;11:341–56.

  22. Cao Y, Chen X-H, Wu DD, Mo M. Early warning of enterprise decline in a life cycle using neural networks and rough set theory. Expert Syst Appl. 2011;38(6):6424–9.

  23. Cheng J-H, Chen H-P, Lin Y-M. A hybrid forecast marketing timing model based on probabilistic neural network, rough set and C4.5. Expert Syst Appl. 2010;37(3):1814–20.

  24. Frey BJ, Dueck D. Clustering by passing messages between data points. Science. 2007;315(5814):972–6.


Acknowledgments

This work is supported by the Fundamental Research Funds for the Central Universities (No. 2015XKMS088).

Author information

Corresponding author

Correspondence to Shifei Ding.

Ethics declarations

Conflict of Interest

Li Xu, Shifei Ding, Xinzheng Xu and Nan Zhang declare that they have no conflict of interest.

Informed Consent

All procedures followed were in accordance with the ethical standards of the responsible committee on human experimentation (institutional and national) and with the Helsinki Declaration of 1975, as revised in 2008. Additional informed consent was obtained from all patients for whom identifying information is included in this article.

Human and Animal Rights

This article does not contain any studies with human or animal subjects performed by any of the authors.



Cite this article

Xu, L., Ding, S., Xu, X. et al. Self-adaptive Extreme Learning Machine Optimized by Rough Set Theory and Affinity Propagation Clustering. Cogn Comput 8, 720–728 (2016). https://doi.org/10.1007/s12559-016-9409-5

