Bagging Model Trees for Classification Problems

  • Conference paper
Advances in Informatics (PCI 2005)

Part of the book series: Lecture Notes in Computer Science ((LNTCS,volume 3746))


Abstract

Structurally, a model tree is a regression method that takes the form of a decision tree with linear regression functions, rather than terminal class values, at its leaves. In this study, model trees are coupled with bagging to solve classification problems. To apply this regression technique to classification, we consider the conditional class probability function and seek a model-tree approximation to it. During classification, the class whose model tree produces the greatest approximated probability is chosen as the predicted class. We compared the proposed technique with other well-known ensembles of decision trees on standard benchmark datasets, and its performance was superior in most cases.
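The scheme the abstract describes can be sketched as follows: fit one bagged regression learner per class on the 0/1 class-indicator target (an observed sample of the conditional class probability), then predict the class whose model yields the largest approximated probability. This is a minimal sketch under an assumption: scikit-learn has no M5-style model-tree learner with linear models at the leaves, so plain regression trees inside `BaggingRegressor` stand in for the paper's bagged model trees.

```python
# Sketch of per-class probability approximation with bagged regression trees.
# Assumption: DecisionTreeRegressor (BaggingRegressor's default base learner)
# substitutes for an M5 model tree, which scikit-learn does not provide.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingRegressor


class BaggedTreesPerClassClassifier:
    def __init__(self, n_estimators=10, random_state=0):
        self.n_estimators = n_estimators
        self.random_state = random_state

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_ = []
        for c in self.classes_:
            # Regress on the 0/1 indicator of class c, i.e. an observed
            # sample of the conditional class probability P(y = c | x).
            model = BaggingRegressor(
                n_estimators=self.n_estimators,
                random_state=self.random_state,
            )
            model.fit(X, (y == c).astype(float))
            self.models_.append(model)
        return self

    def predict(self, X):
        # Pick the class whose bagged tree outputs the greatest
        # approximated probability value.
        scores = np.column_stack([m.predict(X) for m in self.models_])
        return self.classes_[np.argmax(scores, axis=1)]


X, y = load_iris(return_X_y=True)
clf = BaggedTreesPerClassClassifier().fit(X, y)
acc = (clf.predict(X) == y).mean()
```

Note that the per-class scores are averaged tree outputs, not calibrated probabilities; only their argmax is used, so no normalization across classes is needed.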




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kotsiantis, S.B., Tsekouras, G.E., Pintelas, P.E. (2005). Bagging Model Trees for Classification Problems. In: Bozanis, P., Houstis, E.N. (eds) Advances in Informatics. PCI 2005. Lecture Notes in Computer Science, vol 3746. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11573036_31


  • DOI: https://doi.org/10.1007/11573036_31

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-29673-7

  • Online ISBN: 978-3-540-32091-3

  • eBook Packages: Computer Science (R0)
