Abstract
Structurally, a model tree is a regression method that takes the form of a decision tree with linear regression functions, rather than terminal class values, at its leaves. In this study, model trees are coupled with bagging to solve classification problems. To apply this regression technique to classification, we consider the conditional class probability function and seek a model-tree approximation to it. During classification, the class whose model tree generates the greatest approximated probability value is chosen as the predicted class. A comparison with other well-known ensembles of decision trees on standard benchmark datasets shows that the proposed technique performs better in most cases.
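The scheme in the abstract can be sketched as follows. This is not the authors' implementation (which uses full M5-style model trees): as a stand-in, each "model tree" here is a hypothetical depth-1 tree with one split and a linear model in each leaf. For each class, a bagged ensemble of such trees is fit to the 0/1 class-indicator target, approximating the conditional class probability; prediction takes the argmax over the per-class ensembles. All function names and the toy dataset are illustrative assumptions.

```python
import random
from statistics import mean

def linear_fit(xs, ys):
    """Least-squares line y = a*x + b; falls back to a constant if xs are degenerate."""
    mx, my = mean(xs), mean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    if sxx == 0:
        return 0.0, my
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return a, my - a * mx

def fit_stub_model_tree(xs, ys):
    """Depth-1 stand-in for a model tree: one split on x, a linear model per leaf."""
    best = None
    for t in sorted(set(xs))[1:]:
        lx = [x for x in xs if x < t]
        ly = [y for x, y in zip(xs, ys) if x < t]
        rx = [x for x in xs if x >= t]
        ry = [y for x, y in zip(xs, ys) if x >= t]
        lf, rf = linear_fit(lx, ly), linear_fit(rx, ry)
        sse = sum((lf[0] * x + lf[1] - y) ** 2 for x, y in zip(lx, ly)) \
            + sum((rf[0] * x + rf[1] - y) ** 2 for x, y in zip(rx, ry))
        if best is None or sse < best[0]:
            best = (sse, t, lf, rf)
    if best is None:  # all x values identical in this sample: constant prediction
        c = mean(ys)
        return lambda x: c
    _, t, lf, rf = best
    return lambda x: (lf[0] * x + lf[1]) if x < t else (rf[0] * x + rf[1])

def bagged_trees(xs, ys, n_trees=25, rng=None):
    """Bagging: fit one stub model tree per bootstrap sample, average predictions."""
    rng = rng or random.Random(0)
    n = len(xs)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]
        trees.append(fit_stub_model_tree([xs[i] for i in idx], [ys[i] for i in idx]))
    return lambda x: mean(t(x) for t in trees)

def fit_classifier(xs, labels, n_trees=25):
    """One bagged regression ensemble per class, fit to the 0/1 class indicator;
    classification picks the class with the largest approximated probability."""
    classes = sorted(set(labels))
    ens = {}
    for i, c in enumerate(classes):
        indicator = [1.0 if l == c else 0.0 for l in labels]
        ens[c] = bagged_trees(xs, indicator, n_trees, random.Random(i))
    return lambda x: max(classes, key=lambda c: ens[c](x))

# Toy 1-D problem: class 'a' below 5, class 'b' at or above 5.
xs = [i * 0.5 for i in range(20)]
labels = ['a' if x < 5 else 'b' for x in xs]
clf = fit_classifier(xs, labels)
print(clf(1.0), clf(9.0))
```

The key design point, as in the paper, is that a regression learner is reused for classification by regressing on class-membership indicators, so the ensemble's averaged output for each class behaves like an estimate of the conditional class probability.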
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Kotsiantis, S.B., Tsekouras, G.E., Pintelas, P.E. (2005). Bagging Model Trees for Classification Problems. In: Bozanis, P., Houstis, E.N. (eds) Advances in Informatics. PCI 2005. Lecture Notes in Computer Science, vol 3746. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11573036_31
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-29673-7
Online ISBN: 978-3-540-32091-3
eBook Packages: Computer Science (R0)