Abstract
We introduce a nonparametric, cumulant-based statistical approach for detecting linear and nonlinear dependencies in non-stationary time series. Statistical dependence is detected by measuring predictability, which tests the null hypothesis of statistical independence, expressed in Fourier space, via the surrogate method. Predictability is thus defined as a higher-order-cumulant-based significance that discriminates between the original data and a set of scrambled surrogate data corresponding to the null hypothesis of a non-causal relationship between past and present. In this formulation, nonlinear and non-Gaussian temporal dependencies can be detected in time series. Information about predictability can be used, for example, to select regions with visible temporal structure as training data for a neural network predictor. Regions exhibiting only noisy behavior are ignored, thereby avoiding the learning of irrelevant noise, which normally spoils the generalization characteristics of the neural network.
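The surrogate test described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' Fourier-space formulation: it uses a time-domain third-order autocumulant as the discriminating statistic and random shuffling to generate surrogates consistent with the null hypothesis of no causal relation between past and present. The function names and the quadratic-map example are illustrative choices, not taken from the paper.

```python
import random
import statistics

def third_order_cumulant(x, tau1, tau2):
    """Sample third-order autocumulant C3(tau1, tau2) of a series
    (the mean is removed before forming the triple products)."""
    m = statistics.fmean(x)
    xc = [v - m for v in x]
    n = len(xc) - max(tau1, tau2)
    return sum(xc[t] * xc[t + tau1] * xc[t + tau2] for t in range(n)) / n

def predictability_significance(x, tau1=0, tau2=1, n_surrogates=200, seed=0):
    """Significance (in surrogate standard deviations) of the original
    cumulant relative to shuffled surrogates; shuffling destroys any
    temporal structure, realizing the null hypothesis."""
    rng = random.Random(seed)
    c_orig = third_order_cumulant(x, tau1, tau2)
    c_surr = []
    for _ in range(n_surrogates):
        s = list(x)
        rng.shuffle(s)
        c_surr.append(third_order_cumulant(s, tau1, tau2))
    mu = statistics.fmean(c_surr)
    sigma = statistics.stdev(c_surr)
    return abs(c_orig - mu) / sigma

# The quadratic map x -> 1 - 2 x^2 is (to linear statistics) nearly
# uncorrelated, yet its third-order cumulant C3(0, 1) is strongly
# nonzero, so the test flags it as predictable; Gaussian white noise
# yields only a small significance.
x_map = [0.3]
for _ in range(999):
    x_map.append(1.0 - 2.0 * x_map[-1] ** 2)
noise_rng = random.Random(1)
noise = [noise_rng.gauss(0.0, 1.0) for _ in range(1000)]

sig_map = predictability_significance(x_map)
sig_noise = predictability_significance(noise)
print(sig_map, sig_noise)  # map: large significance; noise: small
```

In a data-selection setting, such a significance could be computed over sliding windows, keeping only windows whose significance exceeds a threshold as training material for the network.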
Copyright information
© 1996 Springer-Verlag Berlin Heidelberg
Cite this paper
Deco, G., Schürmann, B. (1996). Nonparametric data selection for improvement of parametric neural learning: A cumulant-surrogate method. In: von der Malsburg, C., von Seelen, W., Vorbrüggen, J.C., Sendhoff, B. (eds) Artificial Neural Networks — ICANN 96. ICANN 1996. Lecture Notes in Computer Science, vol 1112. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-61510-5_24
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-61510-1
Online ISBN: 978-3-540-68684-2