
Combining Bagging, Boosting and Dagging for Classification Problems

  • Conference paper
Knowledge-Based Intelligent Information and Engineering Systems (KES 2007)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4693)

Abstract

Bagging, boosting and dagging are well-known re-sampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers; whereas bagging draws bootstrap samples, dagging trains each base classifier on a disjoint, stratified subsample of the training data. Boosting algorithms are considered stronger than bagging and dagging on noise-free data; however, there are strong empirical indications that bagging and dagging are much more robust than boosting in noisy settings. For this reason, in this work we build an ensemble that combines bagging, boosting and dagging sub-ensembles, each containing eight sub-classifiers, through a voting methodology. We compared the proposed technique with plain bagging, boosting and dagging ensembles of 25 sub-classifiers, as well as with other well-known combining methods, on standard benchmark datasets, and it achieved better accuracy in most cases.
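
A minimal sketch of the abstract's combination scheme, assuming scikit-learn 1.2 or later: scikit-learn provides bagging, AdaBoost and voting meta-estimators but no dagging, so dagging is approximated below by drawing a one-eighth subsample without replacement for each member; the decision-tree base learners and the benchmark dataset are illustrative assumptions, not the authors' exact experimental setup.

```python
# Hedged sketch of the paper's voting scheme: three sub-ensembles
# (bagging, boosting, dagging-style) of eight sub-classifiers each,
# combined by majority vote. Assumes scikit-learn >= 1.2.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              VotingClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

bagging = BaggingClassifier(
    estimator=DecisionTreeClassifier(random_state=0),
    n_estimators=8, random_state=0)

boosting = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),  # decision stumps
    n_estimators=8, random_state=0)

# scikit-learn ships no dagging implementation; sampling 1/8 of the data
# without replacement per member approximates dagging's disjoint folds.
dagging = BaggingClassifier(
    estimator=DecisionTreeClassifier(random_state=0),
    n_estimators=8, max_samples=1.0 / 8, bootstrap=False, random_state=0)

# Majority vote over the three sub-ensembles (24 sub-classifiers in total).
combined = VotingClassifier(
    estimators=[("bag", bagging), ("boost", boosting), ("dag", dagging)],
    voting="hard")

X, y = load_breast_cancer(return_X_y=True)  # stand-in benchmark dataset
print("10-fold CV accuracy: %.3f" % cross_val_score(combined, X, y, cv=10).mean())
```

With voting="hard", each eight-member sub-ensemble casts a single vote, mirroring the paper's majority-vote combination that is compared against the plain 25-member bagging, boosting and dagging ensembles.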




Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kotsianti, S.B., Kanellopoulos, D. (2007). Combining Bagging, Boosting and Dagging for Classification Problems. In: Apolloni, B., Howlett, R.J., Jain, L. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2007. Lecture Notes in Computer Science (LNAI), vol. 4693. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74827-4_62

  • DOI: https://doi.org/10.1007/978-3-540-74827-4_62

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74826-7

  • Online ISBN: 978-3-540-74827-4

  • eBook Packages: Computer Science (R0)
