Dynamical systems

From Scholarpedia
James Meiss (2007), Scholarpedia, 2(2):1629. doi:10.4249/scholarpedia.1629, revision #197505

Curator: James Meiss

A dynamical system is a rule for time evolution on a state space.



Introduction

A dynamical system consists of an abstract phase space or state space, whose coordinates describe the state at any instant, and a dynamical rule that specifies the immediate future of all state variables, given only the present values of those same state variables. For example, the state of a pendulum is its angle and angular velocity, and the evolution rule is Newton's equation \(F = ma\ .\)

Mathematically, a dynamical system is described by an initial value problem. The implication is that there is a notion of time and that a state at one time evolves to a state or possibly a collection of states at a later time. Thus states can be ordered by time, and time can be thought of as a single quantity.

Dynamical systems are deterministic if there is a unique consequent to every state, or stochastic or random if there is a probability distribution of possible consequents (the idealized coin toss has two consequents with equal probability for each initial state).

A dynamical system can have discrete or continuous time. A deterministic system with discrete time is defined by a map, \[ x_1 = f(x_0) , \] that gives the state \(x_1 \) resulting from the initial state \(x_0 \) at the next time value. After time \(n\) one has \[ x_n = f^n(x_0) , \] where \(f^n\) is the \(n\)-th iterate of \(f\ .\) A deterministic system with continuous time is defined by a flow, \[ x(t) = \varphi_t(x(0)), \] that gives the state at time \(t\ ,\) given that the state was \(x(0)\) at time 0. A smooth flow can be differentiated with respect to time to give a differential equation, \(dx/dt = X(x)\ .\) The function \(X(x)\) is called a vector field; it gives a vector pointing in the direction of the velocity at every point in phase space.
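As a minimal sketch (not part of the original article; the map \(f\) and the initial state below are arbitrary illustrative choices), iterating a discrete-time map simply means applying \(f\) repeatedly to generate the forward orbit \(x_0, x_1 = f(x_0), x_2 = f^2(x_0), \ldots\)

```python
def iterate(f, x0, n):
    """Return the forward orbit x0, f(x0), ..., f^n(x0) of the map f."""
    orbit = [x0]
    for _ in range(n):
        orbit.append(f(orbit[-1]))
    return orbit

# Illustrative map f(x) = 0.5*x; after n steps the state is 0.5**n * x0.
print(iterate(lambda x: 0.5 * x, 1.0, 5))  # [1.0, 0.5, 0.25, 0.125, 0.0625, 0.03125]
```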

Definition

A dynamical system is a state space \(S\ ,\) a set of times \(T\) and a rule \(R\) for evolution, \(R: S \times T \rightarrow S\) that gives the consequent(s) to a state \(s \in S\ .\) A dynamical system can be considered to be a model describing the temporal evolution of a system.

State space

The state space is a collection of coordinates that describe all that the modeler feels is needed to give a complete description of the system. Given the current state of the system, the evolution rule predicts the next state or states. In addition to the state, which evolves in time, a model may also depend upon parameters that are constant or perhaps known functions of time, for example the mass of bodies in a mechanical model or the birth rate and carrying capacity in a population model.

A state space can be discrete or continuous. For example, the coin toss might be modeled by a state space consisting of two states, heads and tails. Thus the state \(s\) at each time is an element of the set \(\{H,T\}\ .\) A discrete space can also have infinitely many states; for example, a random walk could be restricted to a lattice of points, and the system state is simply which lattice point is currently occupied.
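As a small illustrative sketch (the walk length, step distribution, and seed are hypothetical choices, not from the article), a stochastic system with an infinite discrete state space is the random walk on the integer lattice \(\mathbb{Z}\ ,\) where the state is the currently occupied lattice point.

```python
import random

def random_walk(start=0, steps=10, seed=0):
    """Random walk on the integer lattice Z: each step moves +1 or -1 with equal probability."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(steps):
        state += rng.choice((-1, 1))
        path.append(state)
    return path

print(random_walk())  # time-ordered sequence of occupied lattice points
```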

When the state space is continuous it is often a smooth manifold. In this case it is called the phase space. For example, a simple pendulum is modeled as a rigid rod that is suspended in a vertical gravitational field from a pivot that allows the pendulum to oscillate in a plane. According to Newton, knowledge of the angle of the rod relative to the vertical, \(\theta\ ,\) and the angular velocity, \(\nu = d\theta/dt\ ,\) is sufficient to describe the pendulum's state. Thus the phase space of the pendulum is the collection of possible values of \(\theta\) and \(\nu\ ,\) a two-dimensional manifold. This manifold is the cylinder since \(\theta\) is periodic. In addition to the pendulum's state, the model also depends upon two parameters, the pendulum's length and the strength of gravity.
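As a hedged illustration (the integrator, time step, and parameter values are assumptions, not taken from the article), the pendulum's evolution rule can be written as a vector field on the \((\theta, \nu)\) phase space and stepped forward numerically:

```python
import math

def pendulum_field(theta, nu, g=9.81, length=1.0):
    """Vector field of the simple pendulum: d(theta)/dt = nu, d(nu)/dt = -(g/L) sin(theta)."""
    return nu, -(g / length) * math.sin(theta)

def euler_step(theta, nu, dt=0.001):
    """One explicit Euler step (a crude integrator, used here only for illustration)."""
    dtheta, dnu = pendulum_field(theta, nu)
    return theta + dt * dtheta, nu + dt * dnu

# Evolve the state (theta, nu) = (0.5 rad, 0) for one unit of time.
theta, nu = 0.5, 0.0
for _ in range(1000):
    theta, nu = euler_step(theta, nu)
print(theta, nu)
```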

A phase space can also be infinite dimensional, e.g. a function space. This is the case for dynamics that is modeled by partial differential equations.

Time

Time may also be discrete or continuous or more generally be represented by a topological group. Dynamical systems with discrete time, like the ideal coin toss, have their states evaluated only after certain discrete intervals. In the case of the coin toss, the smooth tumbling and bouncing of the coin is ignored, and its state is only viewed when it has come to equilibrium. Other systems that are often modeled with discrete time include population dynamics (the discreteness referring to subsequent generations) and impacting systems like a billiard where only the state at impact is used. It is common to scale the discrete time interval to one, so the set of allowed times becomes \(T = \mathbb{Z}\) or possibly only nonnegative integers, \(T = \mathbb{N}\ .\) This is convenient even in cases like the billiard where the actual physical time interval between impacts may not be constant.

Figure 1: Henri Poincaré, the father of dynamical systems

Dynamical systems first appeared when Newton introduced the concept of ordinary differential equations (ODEs) into Mechanics. In this case, \(T = \mathbb{R}\ .\) However, Henri Poincaré is the father of the modern, qualitative theory of dynamical systems. He recognized that even differential equations can be viewed as discrete-time systems by strobing, i.e. only recording the solution at a set of discrete times, or by a Poincaré section. This, of course, is required in any computational algorithm and also in any experimental measurement, since it is only possible to measure finitely many values.
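As a minimal sketch of strobing (the vector field \(X(x) = \lambda x\) and the sampling interval \(\Delta\) are hypothetical choices), recording the flow of \(dx/dt = \lambda x\) only at times \(t = n\Delta\) collapses the continuous-time system to the discrete-time map \(x_{n+1} = e^{\lambda\Delta} x_n\ .\)

```python
import math

LAM, DELTA = -0.3, 1.0  # assumed decay rate and strobe interval

def flow(x0, t):
    """Exact flow of dx/dt = LAM * x, i.e. x(t) = exp(LAM * t) * x0."""
    return math.exp(LAM * t) * x0

def strobe_map(x):
    """Discrete-time map obtained by recording the flow every DELTA time units."""
    return flow(x, DELTA)

x, strobed = 1.0, [1.0]
for n in range(5):
    x = strobe_map(x)
    strobed.append(x)
print(strobed)  # matches flow(1.0, n * DELTA) for n = 0, ..., 5
```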

Evolution rule

The evolution rule provides a prediction of the next state or states that follow from the current state space value. An evolution rule is deterministic if each state has a unique consequent, and is stochastic (or "random") if there is more than one possible consequent for a given state.

The forward orbit or trajectory of a state \(s\) is the time-ordered collection of states that follow from \(s\) using the evolution rule. For a deterministic rule with discrete time the forward orbit of \(s_0\) is the sequence \(s_0 , s_1 , s_2 , \ldots\ .\) When both state space and time are continuous, the forward orbit is a curve \(s(t), t \ge 0\ .\)

Deterministic evolution rules are invertible if each state has a unique precedent or preimage. In this case the full orbit of the system is the bi-infinite sequence or curve that starts at \(s_0 \) or \(s(0)\) and extends in both directions of time.

Examples

Maps

A deterministic evolution rule with discrete time and a continuous state space is called a map, \[f: S \rightarrow S\ .\] The evolution is defined by iteration \(s_{t+1} = f(s_t )\ .\) A map can be one-to-one (invertible) or not. Invertible maps can be continuous with continuous inverses (homeomorphisms) or smooth and smoothly invertible (diffeomorphisms).

A simple example is the Logistic map of population dynamics. Here the state space is \(\mathbb{R}^+\ ,\) the nonnegative reals, representing a continuous approximation to a population size. The map is \[\tag{1} f(x) = rx\left( 1 - \frac{x}{K}\right) \]

where \(r\) is the growth rate per individual and \(K\) is the carrying capacity. The map (1) is not invertible since most states in the interval \([0,K]\) have two preimages.
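For concreteness, here is a short sketch iterating the map (1); the values of \(r\ ,\) \(K\ ,\) and the initial population are illustrative assumptions, not taken from the article.

```python
def logistic(x, r=2.5, K=1.0):
    """The logistic map f(x) = r*x*(1 - x/K) of equation (1)."""
    return r * x * (1.0 - x / K)

x = 0.1
for _ in range(50):
    x = logistic(x)
# For r = 2.5 the orbit is attracted to the fixed point x* = K*(1 - 1/r) = 0.6.
print(x)
```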

Flows

A flow is a deterministic dynamical system on a manifold \(M\) that is continuously differentiable with respect to time. It is defined by a function \[\varphi : \mathbb{R}\times M \to M\ ,\] so that the orbit is given by \[\tag{2} x(t) = \varphi_t(x(0)) \]

Flows obey the properties

  • Identity: \(\varphi_0(x) = x\)
  • Group: \(\varphi_{t+s}(x) = \varphi_t(\varphi_s(x))\)
  • Differentiability: \(\left.\frac{d}{dt} \varphi_t(x)\right|_{t=0} = X(x)\)

The second property is known as the group property; it expresses the concept that the dynamics can be restarted at any point \(x(s)\) along its trajectory to get the same result \(x(t+s)\) as flowing forward for time \(t+s\) from \(x(0)\ .\) The last property, differentiability, defines a vector field \(X\) that is associated with any flow. A consequence of the group and differentiability properties is that the orbits of a flow are solutions of the ordinary differential equation \[ \frac{d}{dt} x = X(x) \]
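As a sketch (the specific vector field \(X(x) = -x\) is an assumed example, chosen because its flow is known exactly), the flow \(\varphi_t(x) = e^{-t}x\) of \(dx/dt = -x\) satisfies the identity, group, and differentiability properties:

```python
import math

def phi(t, x):
    """Flow of the linear vector field X(x) = -x: phi_t(x) = exp(-t) * x."""
    return math.exp(-t) * x

x0, t, s = 2.0, 0.7, 1.3
assert abs(phi(0.0, x0) - x0) < 1e-12                    # identity: phi_0(x) = x
assert abs(phi(t + s, x0) - phi(t, phi(s, x0))) < 1e-12  # group: phi_{t+s} = phi_t o phi_s

# Differentiability: (d/dt) phi_t(x) at t = 0 approximates X(x0) = -x0.
h = 1e-6
print((phi(h, x0) - phi(0.0, x0)) / h)  # close to -2.0
```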

It is convenient to define the dynamics associated with differential equations through the flow concept because the issues of existence and uniqueness of the solutions of the ODE can then be avoided: the orbits of a flow are unique (only one orbit passes through each point in \(M\)) and exist for all time. This is not true generally for ODEs.

A semi-flow is a flow defined only for nonnegative values of time. Semi-flows commonly arise for partial differential equations.

Iterated function system

A stochastic evolution with discrete time but continuous phase space is an iterated function system. In this case there is a collection of functions \(f_\alpha\) indexed by parameters \(\alpha\ .\) The evolution is random with the next state \(s_{t+1} = f_\alpha (s_t )\) where \(\alpha\) is selected from a probability distribution.

Iterated function systems can generate interesting dynamics even when the functions are contraction maps. In this case the orbits are often attracted to some fractal set.
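A standard illustration (not drawn from the article; the three maps and the number of iterations are assumptions) is the "chaos game": three affine contractions toward the vertices of a triangle, one chosen at random at each step, whose orbit accumulates on the Sierpinski triangle.

```python
import random

# Three contraction maps f_a(p) = (p + v_a) / 2, each pulling toward a triangle vertex v_a.
VERTICES = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]

def ifs_orbit(steps=10000, seed=0):
    """Iterate the IFS: at each step apply a randomly chosen contraction to the current state."""
    rng = random.Random(seed)
    x, y = 0.5, 0.5
    points = []
    for _ in range(steps):
        vx, vy = rng.choice(VERTICES)
        x, y = (x + vx) / 2.0, (y + vy) / 2.0
        points.append((x, y))
    return points

pts = ifs_orbit()
print(len(pts), pts[-1])  # the points accumulate on the Sierpinski triangle, a fractal set
```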

Cellular automata

A dynamical system with a deterministic rule, discrete time and discrete state space is a cellular automaton. The evolution rule assigns a new state to a cell as a function of the old state of that cell and finitely many of its neighbors. The rule, expressed relative to each cell's neighborhood, is the same for every cell.

An example is Conway's Game of Life, where there is a square grid on the plane and each cell can assume two states, alive or dead (with only finitely many live cells at any time).
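As a small sketch (the sparse-set representation and the "blinker" initial pattern are illustrative choices), one synchronous update of the Game of Life applies the same neighborhood rule to every cell:

```python
def life_step(live):
    """One update of Conway's Game of Life; `live` is the (finite) set of (row, col) live cells."""
    counts = {}
    for (r, c) in live:
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if (dr, dc) != (0, 0):
                    cell = (r + dr, c + dc)
                    counts[cell] = counts.get(cell, 0) + 1
    # A cell is alive next step if it has 3 live neighbors, or 2 and is currently alive.
    return {cell for cell, n in counts.items() if n == 3 or (n == 2 and cell in live)}

# A "blinker": three live cells in a row oscillate with period 2.
state = {(0, -1), (0, 0), (0, 1)}
print(sorted(life_step(state)))  # [(-1, 0), (0, 0), (1, 0)]
```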

References

  • Alligood, K. T., T. D. Sauer and J. A. Yorke (1997). Chaos. New York, Springer-Verlag.
  • Arrowsmith, D. K. and C. M. Place (1990). An Introduction to Dynamical Systems. Cambridge, Cambridge University Press.
  • Birkhoff, G. D. (1927). Dynamical Systems. New York, Am. Math. Soc.
  • Chicone, C. (1999). Ordinary Differential Equations with Applications. New York, Springer-Verlag.
  • Devaney, R. L. (1986). An Introduction to Chaotic Dynamical Systems. Menlo Park, Benjamin/Cummings.
  • Guckenheimer, J. and P. Holmes (1983). Nonlinear Oscillations, Dynamical Systems, and Bifurcations of Vector Fields. New York, Springer-Verlag.
  • Katok, A. B. and B. Hasselblatt (1999). Introduction to the Modern Theory of Dynamical Systems. Cambridge, Cambridge University Press.
  • Moser, J. K., Ed. (1975). Dynamical Systems Theory and Applications. Springer Lecture Notes in Physics. Berlin, Springer-Verlag.
  • Meiss, J. D. (2017). Differential Dynamical Systems: Revised Edition. Philadelphia, SIAM.
  • Ott, E. (1993). Chaos in Dynamical Systems. Cambridge, Cambridge Univ. Press.
  • Poincaré, H. (1892). Les Méthodes Nouvelles de la Mécanique Céleste. Paris, Gauthier-Villars.
  • Robinson, C. (1999). Dynamical Systems : Stability, Symbolic Dynamics, and Chaos. Boca Raton, Fla., CRC Press.
  • Strogatz, S. (1994). Nonlinear Dynamics and Chaos. Reading, Addison-Wesley.
  • Wiggins, S. (2003). Introduction to Applied Nonlinear Dynamical Systems and Chaos. New York, Springer-Verlag.


See Also

Attractors, Bifurcation, Chaos, Difference Equations, Equilibrium, Fixed Point, Hamiltonian Systems, Mapping, Ordinary Differential Equations, Partial Differential Equations, Periodic Orbit, Phase Space, Phase Portrait, Relaxation Oscillator, Stability, Topological Dynamics
