Abstract
Although neural networks have many appealing properties, there is no systematic way to set up the topology of a neural network or to determine its various learning parameters, so an expert is needed for fine tuning. If neural network applications are to be realised not only in publications but in real life, fine tuning must become unnecessary. We developed a tool called ACMD (Approximation and Classification of Medical Data) that is demonstrated to meet this demand. Moreover, on six medical classification and approximation problems from the PROBEN1 benchmark collection, this approach is shown to even outperform fine-tuned networks.
References
Rumelhart, D. E., Hinton, G. E., & Williams, R. J.: Learning Representations by Back-Propagating Errors. Nature 323 (1986) 533–536
Orr, G. B., Müller, K.-R. (eds.): Neural Networks: Tricks of the Trade. Lecture Notes in Computer Science, Vol. 1524. Springer-Verlag, Berlin Heidelberg New York (1998)
Kwok, T.Y., Yeung, D.Y.: Constructive algorithms for structure learning in feed forward neural networks for regression problems. IEEE Trans. on Neural Networks 8(3) (1997) 630–645
Ash, T.: Dynamic node creation in backpropagation networks. Connection Science 1(4) (1989) 365–375
Fahlman, S.E., Lebiere, C.: The cascade-correlation learning architecture. In: Touretzky, D.S. (ed.): Advances in Neural Information Processing Systems 2, Morgan Kaufmann, CA (1990) 524–532
Anand, R., Mehrotra, K., Mohan, C.K., Ranka, S.: Efficient Classification for Multiclass Problems Using Modular Neural Networks. IEEE Trans. on Neural Networks. 6(1) (1995) 117–124
Linder, R., Wirtz, S., Pöppl, S.J.: Speeding up Backpropagation Learning by the APROP Algorithm. Second International ICSC Symposium on Neural Computation, Proceedings CD, Berlin (2000).
Prechelt, L.: Proben 1 — a set of neural network benchmark problems and benchmarking rules. Technical Report 21/94, Fakultät für Informatik, Universität Karlsruhe. Available via http://ftp.ira.uka.de in directory /pub/papers/techreports/1994 as file 1994-21.ps.Z. (1994)
Linder, R., Pöppl, S.J.: Backprop, RPROP, APROP: Searching for the best learning rule for an electronic nose. Neural Networks in Applications ’99. In: Proc. of the Fourth International Workshop, Magdeburg, Germany, (1999) 69–74
Ng, S.C., Leung, S.H., Luk, A.: Fast Convergent Generalized Back-Propagation Algorithm with Constant Learning Rate. Neural Processing Letters 9 (1999) 13–23
Schraudolph, N.N.: A Fast, Compact Approximation of the Exponential Function. Neural Computation 11 (1999) 853–862
© 2001 Springer-Verlag Berlin Heidelberg
Cite this paper
Linder, R., Pöppl, S.J. (2001). ACMD: A Practical Tool for Automatic Neural Net Based Learning. In: Crespo, J., Maojo, V., Martin, F. (eds) Medical Data Analysis. ISMDA 2001. Lecture Notes in Computer Science, vol 2199. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45497-7_25
Print ISBN: 978-3-540-42734-6
Online ISBN: 978-3-540-45497-7