


Link to original content: http://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1010527
Neural adaptation and fractional dynamics as a window to underlying neural excitability | PLOS Computational Biology

Neural adaptation and fractional dynamics as a window to underlying neural excitability

  • Brian Nils Lundstrom

    Roles Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    lundstrom.brian@mayo.edu

    Affiliation Neurology Department, Mayo Clinic, Rochester, Minnesota, United States of America

  • Thomas J. Richner

    Roles Formal analysis, Writing – review & editing

    Affiliation Neurology Department, Mayo Clinic, Rochester, Minnesota, United States of America

Correction

12 Jun 2023: Lundstrom BN, Richner TJ (2023) Correction: Neural adaptation and fractional dynamics as a window to underlying neural excitability. PLOS Computational Biology 19(6): e1011220. https://doi.org/10.1371/journal.pcbi.1011220 View correction

Abstract

The relationship between macroscale electrophysiological recordings and the dynamics of underlying neural activity remains unclear. We have previously shown that low frequency EEG activity (<1 Hz) is decreased at the seizure onset zone (SOZ), while higher frequency activity (1–50 Hz) is increased. These changes result in power spectral densities (PSDs) with flattened slopes near the SOZ, which are assumed to be areas of increased excitability. We wanted to understand possible mechanisms underlying PSD changes in brain regions of increased excitability. We hypothesized that these observations are consistent with changes in adaptation within the neural circuit. We developed a theoretical framework and tested the effect of adaptation mechanisms, such as spike frequency adaptation and synaptic depression, on excitability and PSDs using filter-based neural mass models and conductance-based models. We compared the contribution of single timescale adaptation and multiple timescale adaptation. We found that adaptation with multiple timescales alters the PSDs. Multiple timescales of adaptation can approximate fractional dynamics, a form of calculus related to power laws, history dependence, and non-integer order derivatives. Coupled with input changes, these dynamics changed circuit responses in unexpected ways. Increased input without synaptic depression increases broadband power. However, increased input with synaptic depression may decrease power. The effects of adaptation were most pronounced for low frequency activity (<1 Hz). Increased input combined with a loss of adaptation yielded reduced low frequency activity and increased higher frequency activity, consistent with clinical EEG observations from SOZs. Spike frequency adaptation and synaptic depression, two forms of multiple timescale adaptation, affect low frequency EEG and the slope of PSDs. These neural mechanisms may underlie changes in EEG activity near the SOZ and relate to neural hyperexcitability.
Neural adaptation may be evident in macroscale electrophysiological recordings and provide a window to understanding neural circuit excitability.

Author summary

Electrophysiological recordings such as EEG from the human brain often come from many thousands of neurons or more. It can be difficult to relate recorded activity to characteristics of the underlying neurons and neural circuits. Here, we use a novel theoretical framework and computational neural models to understand how neural adaptation might be evident in human EEG recordings. Neural adaptation includes mechanisms such as spike frequency adaptation and short-term depression that emphasize stimulus changes and help promote stability. Our results suggest that changes in neural adaptation affect EEG signals, especially at low frequencies. Further, adaptation can lead to changes related to fractional derivatives, a kind of calculus with non-integer orders. Neural adaptation may provide a window into understanding specific aspects of neuron excitability even from EEG recordings.

Introduction

Neurons are electrically excitable cells that are a fundamental computational unit of the nervous system [13]. Excitatory and inhibitory neurons interact as neural circuits perform computational tasks. Broadly, excitatory neurons generate action potentials after some stimulus threshold is reached. The rate of action potentials then increases as the stimulus increases, which leads to an increase in downstream excitatory synaptic transmission.

However, multiple negative feedback mechanisms influence excitatory signals. These feedback mechanisms can act to directly decrease the rate of action potentials or can affect synaptic transmission. One example that appears to be present in the vast majority of excitatory neocortical neurons is spike frequency adaptation (SFA) [2,4,5], which is a negative feedback mechanism that slows action potential generation near the cell body at the axon hillock in response to a sustained stimulus (Fig 1). Another example is short-term synaptic depression, a negative feedback mechanism that is a form of synaptic plasticity [6,7]. These feedback mechanisms are typically described by a single timescale, such as an exponential time constant, although it is clearly recognized that there are often multiple timescales present. These timescales are generally thought to encompass a range varying from tens of milliseconds to several minutes. Over this range, multiple timescales of adaptation can approximate fractional dynamics, a form of power law dynamics with history dependence [8,9]. These mechanisms can be considered a form of short-term plasticity and work at a faster timescale than long-term depression and long-term facilitation, which typically are relevant over a scale of many minutes to hours [3].

Fig 1. Spike frequency adaptation (SFA) and short-term synaptic depression (STD) are forms of negative feedback adaptation or gain control.

(a) SFA leads to a decrease in action potential firing rate in response to a continuous stimulus. Examples show single and multiple timescale adaptation. (b) With adaptation, fewer action potentials are generated in noiseless (upper panel) and noisy (lower panel) stimulus conditions. STD decreases postsynaptic membrane potentials in response to (c) fixed presynaptic firing rates and (d) noisy presynaptic firing rates. The membrane potential is normalized such that the mean of the steady-state non-adapted response (black) is one. Results are from Hodgkin-Huxley neuron models and synaptic-depression models with single or multiple timescales of adaptation (see Methods).

https://doi.org/10.1371/journal.pcbi.1010527.g001

Despite the prevalence of negative feedback mechanisms, their effects on the properties of neural circuits are not clearly understood, especially as it relates to response timescales. Here, we define a neural model with pre- and postsynaptic multiple timescale adaptation and describe several emerging adaptation-related neural circuit properties at the level of local field potentials. We consider model results in the context of clinically recorded human neural activity, which has the potential to bridge local field potentials and human responses [10]. Neural mechanisms of adaptation may be evident at the level of macroscale electrophysiological recordings and provide a window to underlying neural dynamics and overall circuit excitability.

Results

Neural adaptation

First, we want to define a general model with which to examine multiple timescale adaptation. We define neural adaptation as processes that emphasize stimulus changes with computational properties similar to spike frequency adaptation (SFA) and synaptic depression (STD) [11,12], which are common forms of negative feedback adaptation. SFA was first described over a century ago by Lord Adrian, who won a Nobel Prize for his discoveries [13]. SFA is a type of high-pass filter or, equivalently, differentiator that emphasizes stimulus change. SFA linearizes the firing rate-input (f-I) curve of neurons [14] and can be approximated in the time domain and Laplace (frequency) domain as (see also S1 Text):

τ da/dt = −a + Fy,  y = γx − ca  ⟺  H(s) = Y(s)/X(s) = γ(τs + 1)/(τs + 1 + cF)  (1)

where y is the firing rate output (assumed to be non-negative), x is the current input, a is the adaptation variable, c and γ are positive gain constants, F governs the amount of adaptation (0 < F < 1), and τ is the time constant of the underlying adaptation mechanism; the effective adaptation time constant, τ/(1 + cF), is smaller than that of the underlying mechanism [15,16,17] (see also Table 1). The variable s is used for Laplace transforms and the s-domain. In these examples, iω can be substituted for s if readers are more familiar with the Fourier transform, which is a special case of the Laplace transform.
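A minimal numerical sketch of single-timescale SFA, assuming the concrete linear form τ da/dt = −a + Fy with output y = γx − ca (one reading consistent with the constants defined above; parameter values are illustrative):

```python
# Single-timescale SFA sketch: tau*da/dt = -a + F*y, y = gamma*x - c*a.
# For a step input the firing rate starts at gamma*x and adapts toward
# gamma*x/(1 + c*F) with effective time constant tau/(1 + c*F).
import numpy as np

def sfa_step_response(x=1.0, g_out=1.0, c=1.0, F=0.5, tau=0.5,
                      dt=1e-3, t_max=5.0):
    """Euler-integrate the SFA model for a step input x; return y(t)."""
    n = int(t_max / dt)
    a = 0.0                                # adaptation variable starts at rest
    y = np.empty(n)
    for i in range(n):
        y[i] = g_out * x - c * a           # output suppressed by adaptation
        a += dt / tau * (-a + F * y[i])    # adaptation driven by the output
    return y

y = sfa_step_response()
print(y[0], y[-1])                         # onset response vs adapted rate
```

With these illustrative values the adapted rate settles at γx/(1 + cF) = 2/3 of the onset response, and the decay is faster than τ itself, illustrating why the effective adaptation time constant is smaller than that of the underlying mechanism.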

Short-term synaptic depression has input-output properties similar to SFA [11]. Specifically, synaptic depression can be modeled with the following nonlinear system of equations:

τ da/dt = 1 − a − Fax,  y = γax  (2)

Due to the nonlinearity of the input x and adaptation variable a in both equations, we assume small fluctuations around a0 and x0 and linearize, which yields the following state space and transfer function equations (see S1 Text):

τ da/dt = −(1 + x0F)a − Fa0x,  y = γ(a0x + x0a)  ⟺  H(s) = Y(s)/X(s) = γa0(τs + 1)/(τs + 1 + x0F)  (3)

where a and x now represent small differences from steady-state values a0 and x0. The equations for SFA and synaptic depression are similar. However, in contrast to SFA, the non-linearity of synaptic depression leads to a gain factor that depends inversely on the steady-state input x0 since a0 = 1/(1 + x0F), which has important consequences.
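As a numerical sanity check on the linearization point, the nonlinear system τ da/dt = 1 − a − Fax can be integrated at a constant input x0; the depression variable should settle at a0 = 1/(1 + Fx0). Parameter values here are illustrative:

```python
# Fixed point of nonlinear synaptic depression: tau*da/dt = 1 - a - F*a*x.
# At constant input x0 the simulation should settle at a0 = 1/(1 + F*x0).

def std_steady_state(x0, F=0.5, tau=0.5, dt=1e-3, t_max=20.0):
    a = 1.0                                    # fully recovered synapse
    for _ in range(int(t_max / dt)):
        a += dt / tau * (1.0 - a - F * a * x0)
    return a

for x0 in (0.5, 1.0, 4.0):
    print(x0, std_steady_state(x0), 1.0 / (1.0 + 0.5 * x0))
```

Note that larger steady inputs x0 leave less available synaptic resource a0, which is the inverse dependence exploited in the analysis below.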

SFA and STD are not the only negative feedback mechanisms in neural circuits. Other negative feedback mechanisms have similar forms, including circuit delays, which can have transfer functions of the form s/(τs + 1). Parallel feedforward networks in which inhibitory inputs lag behind excitatory inputs also lead to temporal differentiation with the same mathematical form [12]. The core feature of these mechanisms is that the inhibitory feedback components lag excitatory components. The result is a derivative-like computation that emphasizes stimulus change and leads to decreasing temporal responsiveness to constant stimuli. When mechanisms such as SFA and STD occur in the same neural circuit, their differentiating properties can be additive [11,18], as expected with linear computations. Generally, multiple adaptive mechanisms, which are high-pass filters, exert negative feedback control, differentiate the input, and emphasize stimulus change. Multiple timescales of adaptive mechanisms can also give rise to history dependence and preserve information about past stimuli.

Understanding multiple timescale adaptation as fractional differentiation

Each differentiating process is described by a single-timescale adaptive process governed by an effective time constant τ. However, each biological instantiation of negative feedback typically has a different timescale such that adaptation effectively has multiple timescales. We do not expect that each timescale will be the same or even similar. In fact, these timescales vary over at least an order of magnitude, such as demonstrated by mechanisms contributing to SFA [19] and STD [6]. Then, how might we understand the computational effects of multiple timescales? We show that multiple timescales of adaptation are related to fractional dynamics, power laws, timescale invariance, and history dependence.

While single timescale processes may lie at one end of the timescale continuum, power law processes can be seen as lying at the other. Power laws can be thought of as including an infinite number of timescales. In addition, power laws are scale invariant and no single timescale is preferred. Power law responses and multiple timescales of adaptation have long been observed in sensory processes [4,5] and may improve coding efficiency [20,21]. The relationship between power laws and single exponential timescales can be seen by arranging the definition of the gamma function with x = λt [5]:

t^(−k) = (1/Γ(k)) ∫_0^∞ λ^(k−1) e^(−λt) dλ

The integrand is comprised of weights λ^(k−1) for the exponential decay processes governed by rate λ. In other words, power law decay can be thought of as composed of many exponential decay processes with properly balanced and varying timescales 1/λ. As described as early as the work of Lord Adrian [13], decay processes are an example of neural adaptation in response to a step stimulus, or Heaviside step function. Practically, when these timescales occur in the internal variable of the feedback mechanism (e.g. Eq (1)), as few as three timescales can approximate a scale-invariant power law process over at least an order of magnitude [16,18].

We want to show the relationship between time domain and frequency domain representations of power laws, specifically that fractional derivatives and power laws are intimately connected. It is helpful to notice that if instead we arrange the gamma function with x = st, integrate with respect to t, and let k = −α, we obtain:

s^α = (1/Γ(−α)) ∫_0^∞ t^(−(α+1)) e^(−st) dt

where it is immediately evident that s^α is, up to the constant 1/Γ(−α), the Laplace transform of t^(−(α+1)). We see that the power of s is related to the power law exponent. Regarding adaptation, we can also see that s^(α−1) is the Laplace transform pair of t^(−α)/Γ(1 − α). Thus, if the transfer function is s^α (which we will see implies fractional dynamics) and the input is a step stimulus 1/s, or Heaviside function (which is the integral of the delta function), then a power law decay proportional to t^(−α) is the output [8]. Thus, power law responses to step stimuli imply fractional dynamics.
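The transform pair used in this argument can be spot-checked numerically: for 0 < α < 1, the Laplace transform of t^(−α) is Γ(1 − α)s^(α−1). A trapezoid rule on a log-spaced grid handles the integrable singularity at t = 0 (values of α and s are arbitrary illustrations):

```python
# Numerical check that integral_0^inf t^(-alpha) * e^(-s*t) dt
# equals Gamma(1 - alpha) * s^(alpha - 1) for 0 < alpha < 1.
import numpy as np
from math import gamma

alpha, s = 0.3, 2.0
t = np.logspace(-10, 2, 20001)         # log-spaced grid resolves t -> 0
y = t**(-alpha) * np.exp(-s * t)
numeric = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t))   # trapezoid rule
exact = gamma(1.0 - alpha) * s**(alpha - 1.0)
print(numeric, exact)
```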

To further elaborate on this point, there are several ways to define fractional derivatives. We recall that sX(s) in the Laplace domain is the first derivative of x(t), just as iωX(ω) is the first derivative in the Fourier domain. A second derivative would then be s²X(s), and a derivative of arbitrary order is then s^α X(s). When α is not an integer, fractional dynamics are present. We have seen that s^α, which is a derivative of order α (and may be a non-integer), has power law dynamics in the time domain [8]. In the time domain, the fractional derivative D^α with order α for 0 < α < 1 can be written as [9,22]:

D^α x(t) = (1/Γ(1 − α)) d/dt ∫_0^t x(u)/(t − u)^α du

where the power law kernel on the right-hand side of the equation is evident. This is known as the Riemann-Liouville definition, and it clearly shows the link between fractional derivatives and power laws. Thus, at least if properly weighted, multiple timescale adaptation is closely related to power law dynamics and can be succinctly characterized by fractional differentiation of order α. The limits of integration from 0 to t demonstrate the history dependence of fractional differentiation.
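Numerically, the Riemann-Liouville derivative is commonly approximated by the equivalent Grünwald-Letnikov sum, a weighted history over all past samples. A small sketch, checked against the known result D^α t = t^(1−α)/Γ(2 − α):

```python
# Gruenwald-Letnikov approximation of the fractional derivative:
# D^alpha x(t_i) ~ dt^(-alpha) * sum_j w_j * x(t_i - j*dt), where the
# binomial weights w_j = (-1)^j * C(alpha, j) encode the memory kernel.
import numpy as np
from math import gamma

def gl_fractional_derivative(x, dt, alpha):
    n = len(x)
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):                      # recursive binomial weights
        w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
    d = np.array([np.dot(w[:i + 1], x[i::-1]) for i in range(n)])
    return d / dt**alpha

alpha, dt = 0.5, 1e-3
t = np.arange(0.0, 2.0, dt)
d_num = gl_fractional_derivative(t, dt, alpha)   # D^alpha of x(t) = t
d_exact = t**(1.0 - alpha) / gamma(2.0 - alpha)
print(d_num[-1], d_exact[-1])
```

The weights w_j decay as a power law in j, so every output sample depends on the entire stimulus history, which is exactly the non-locality noted above.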

When α = 1, the system performs a first derivative on the input, as described previously [11,12], and the mathematical computation is local in time. However, when 0 < α < 1 multiple timescales are present, stimulus history is maintained in the output, and adaptation is fundamentally non-local (Fig 2).

Fig 2.

Neuronal spike frequency adaptation (a) approximates fractional differentiation with order ~0.1–0.5. Properties for the fractional derivative transfer function (i2πf)^α include a power law amplitude response and frequency-independent phase response (see also S1 Text). (b) Higher fractional orders correspond to increased spike frequency adaptation, i.e., increased negative feedback inhibition.

https://doi.org/10.1371/journal.pcbi.1010527.g002

Adaptation, power spectra, and broadband power

Our previous work demonstrated multiple timescale adaptation consistent with fractional dynamics for in vitro and in vivo preparations. Neocortical neurons in slice preparations [8] and in vivo rodent thalamus and neocortical neurons [18] adapt to inputs in a manner consistent with fractional differentiation of order ~0.1–0.5. Here, we explore how these single neuron and neural circuit properties affect macroscale recordings such as EEG-recorded power spectral densities (PSD) and whether there is a PSD signature that may tell us something about underlying excitability.

Power spectra from local field potentials (LFPs) and EEG are generally thought to be dominated by post-synaptic membrane potentials [23], and here we consider summed post-synaptic membrane potentials as the source of the LFP and EEG signal. We focus on frequencies less than ~100 Hz and consider recorded LFP and EEG activity as resulting from a combination of high-pass and low-pass filtering mechanisms. Eqs (1) and (3) describe the high-pass components within the transfer function, which provide negative feedback and define the adaptive timescales, and can be generally represented by:

H(s) = ∏_n g_n(τ_n s + 1)/(τ_n s + A_n)  (4)

where the nth adaptive mechanism is represented by time constant τn with associated gain constants gn and An, where An > 1 (closer to 1 signifies less adaptation). We assume adaptive mechanisms are in series and multiplicative in the frequency domain (see S1 Text). To illustrate the characteristics of these adaptive mechanisms, we show the step response for each of three adaptive mechanisms, where the decay of the step response represents the timescale of adaptation (Fig 3). In this case, the time constants are 0.05, 0.5 and 5 sec, corresponding to three timescales. For comparison, we also plotted an approximate decaying power law, which inherently contains multiple timescales and is scale-invariant.
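The step-response behavior described here can be reproduced with a small simulation: three cascaded Eq (4)-style stages with τ = 0.05, 0.5, and 5 s (the shared weight A = 1.4 is an illustrative choice, not a fitted value) decay from a high-frequency gain of ~1 toward a DC gain of A^(−3):

```python
# Step response of three cascaded high-pass (adaptation) stages
# H(s) = (tau*s + 1)/(tau*s + A). Each stage is simulated as
# y = x - v with tau*dv/dt = -A*v + (A - 1)*x.
import numpy as np

def cascade_step_response(taus=(0.05, 0.5, 5.0), A=1.4, dt=2e-4, t_max=50.0):
    n = int(t_max / dt)
    y = np.ones(n)                       # unit step input
    for tau in taus:
        v = 0.0
        out = np.empty(n)
        for i in range(n):
            out[i] = y[i] - v
            v += dt / tau * (-A * v + (A - 1.0) * y[i])
        y = out
    return y

y = cascade_step_response()
print(y[1], y[-1])    # ~1 at onset, ~A**-3 after the slowest stage adapts
```

Between those limits the decay proceeds in stages set by the three time constants, which is the multi-timescale relaxation that a single power law approximates.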

Fig 3. Multiple timescale adaptation can approximate fractional dynamics.

(a) Step responses and (b) frequency domain characteristics of three high-pass filters with three timescales (τ = 0.05s, 0.5s, 5s). The combination of the three high-pass filters is similar to an approximate power law with exponent -0.1, which is equivalent to fractional differentiation of order 0.1. (c) Fractional dynamics demonstrate a frequency-independent phase advance that can be approximated by multiple timescale adaptation with either one gain parameter (A1 = A2 = A3) or three gain parameters (A1≠ A2≠ A3). (d) Weighting each timescale equally (i.e. with one gain parameter) can provide a reasonable approximation of fractional dynamics, especially for α<0.5. The value of A (for one gain parameter) or median value of A (for three gain parameters) are similar as fractional order α increases.

https://doi.org/10.1371/journal.pcbi.1010527.g003

Eq (4) may be constrained to ensure the multiple timescales are balanced such that there is not a single dominant timescale, as in the case of power laws and fractional dynamics [8]. In other words, for a given set of τn, balanced gain constants An yield an approximate power law response. One approach involves fitting An such that a frequency-independent phase results [16]. For example, a system comprised of three logarithmically spaced time constants and equal A shows an approximate frequency independent phase advance (phase advances are additive), which is consistent with scale-invariant power law dynamics for frequencies ~0.01–10 Hz (Fig 3B, right panel).
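This can be checked directly by evaluating the cascade's frequency response. With τ = 0.05, 0.5, 5 s and a single shared weight (A = 1.37 here, an illustrative value rather than a fitted one), the phase advance stays nearly flat across ~0.05–5 Hz, corresponding to a fractional order of roughly 0.13:

```python
# Phase of three Eq (4)-style stages with log-spaced time constants and one
# shared weight A: an approximately frequency-independent phase advance,
# alpha ~ 2*phase/pi, mimicking fractional dynamics.
import numpy as np

taus = (0.05, 0.5, 5.0)
A = 1.37
f = np.logspace(np.log10(0.05), np.log10(5.0), 200)   # Hz
s = 1j * 2 * np.pi * f
H = np.ones_like(s)
for tau in taus:
    H *= (tau * s + 1) / (tau * s + A)
phase = np.angle(H)                    # radians; stage phases are additive
alpha_est = 2.0 * phase.mean() / np.pi
print(phase.min(), phase.max(), alpha_est)
```

The residual ripple in the phase reflects the finite number of stages; adding more log-spaced timescales flattens it further.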

We more rigorously examine how well three adaptive timescales can approximate fractional dynamics. Fractional differentiation implies a frequency-independent phase advance of (απ/2) that can be seen from its definition as (iω)^α and Euler's identity exp(iπ/2) = i (see Fig 2A and also S1 Text). Practically, fractional dynamics imply a constant phase shift over some frequency range. Here, we use three timescales (0.05 s, 0.5 s, 5 s) to approximate the constant phase over the frequency range ~0.05–5 Hz. Examples are shown in Fig 3C under two conditions: with the same weight (An) for each time scale and with separate weights. We minimize the difference between Eq (4) and the constant phase defined by α using multidimensional unconstrained nonlinear minimization (Nelder-Mead method). We find that equal weights for distinct well-separated timescales can approximate fractional dynamics, especially for α < 0.5 (Fig 3D).

Adaptive negative feedback mechanisms can be combined with a low-pass filter to represent a basic unit of neural computation. This fundamental computational neural unit can be thought of as spike generation through to postsynaptic membrane potentials, which is largely the source of LFP or EEG signals. The input is comprised of colored noise arriving at the neuron cell body, which can approximate neural activity and synchrony [24–28]. This input causes action potentials that propagate along the axon and eventually give rise to postsynaptic potentials (PSPs). Action potentials and excitatory PSPs are high-pass filtered by adaptive mechanisms, while resistive and capacitive membrane characteristics serve as a low-pass filter (Fig 4).

Fig 4. Model of neural components leading to LFP or EEG signals.

Colored noise inputs are filtered by high-pass and low-pass filtering mechanisms. Neural circuit topologies with parallel feed-forward connections and asymmetric time delays have similar computational components.

https://doi.org/10.1371/journal.pcbi.1010527.g004

The general transfer function H(s) for this series of low-pass and high-pass filters can be represented by:

H(s) = ∏_i [k_i/(τ_i s + 1)] · ∏_n [g_n(τ_n s + 1)/(τ_n s + A_n)]  (5)

where ki are gain constants for the low-pass filters. This is a general form of a multiple timescale adaptive neural circuit element. When the high-pass filters include multiple timescales that are appropriately weighted, as previously discussed, the transfer function is approximately:

H(s) ≈ g s^α · k/(τ_m s + 1)  (6)

where α succinctly describes multiple timescale negative feedback of the system, g and k are gain constants, and τm is the membrane (low-pass) time constant. The frequency-independent phase related to α can be used to find An for a given set of τn such that fractional dynamics are approximated [16] (S1 Text). If Eq (4) is being used to describe a neuron with multiple adaptation currents, then An/τn is approximately proportional to the adaptation channel conductances; a slightly modified form of the filter can yield a more precise correspondence between channel conductances and An [16].

Effects of adaptation on power spectral densities (PSD)

Negative feedback adaptation such as SFA and STD is typically relevant for frequencies less than 100 Hz, since the fastest relevant time constants are at least ~10 msec or larger [19]. The effect of multiple timescales is multiplicative in the frequency domain such that the largest effect of adaptation will be for lower frequencies. Thus, multiple timescale adaptation can change the PSD slope, as seen by α in Eq (6). To help exemplify this, we show the magnitude of the transfer function for a model with a single timescale of SFA, a single timescale of STD, and a low-pass filter:

|H(iω)| = γ √((τ1²ω² + 1)/(τ1²ω² + (1 + cF)²)) · (1/(1 + x0F)) √((τ2²ω² + 1)/(τ2²ω² + (1 + x0F)²)) · k/√(τm²ω² + 1)  (7)

For SFA and STD, the denominator contains a frequency-dependent term τ²ω² and a frequency-independent term, which is either (1 + cF)² or (1 + x0F)² for SFA and STD, respectively; all constants are positive. In the frequency domain, this form leads to sigmoidal high-pass filters that are multiplied with the low-pass filter and more strongly affect lower frequencies (Fig 5A).

When there are multiple timescales of adaptation, the high-pass filters can be simplified such that for low to midrange frequencies (where adaptation is relevant) the magnitude of the transfer function is:

|H(iω)| ≈ g ω^α · k/√(τm²ω² + 1)

where the approximation of Eq (6) is applied to Eq (7). For high frequencies (e.g. >10–100 Hz), the PSD is that of the low-pass filter. For low to midrange frequencies, the slope of the PSD is approximately −(2 − 2α).
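The slope prediction is easy to verify from the approximate transfer function (parameter values here are illustrative): fitting the log-log PSD between 5 and 50 Hz, where the membrane low-pass dominates, recovers a slope near −(2 − 2α):

```python
# PSD slope implied by |H| ~ g * w^alpha * k / sqrt(tau_m^2*w^2 + 1):
# once 2*pi*f*tau_m >> 1, the log-log slope of |H|^2 approaches -(2 - 2*alpha).
import numpy as np

alpha, g, k, tau_m = 0.3, 1.0, 1.0, 0.3
f = np.logspace(np.log10(5.0), np.log10(50.0), 100)
w = 2 * np.pi * f
psd = (g * w**alpha * k)**2 / (tau_m**2 * w**2 + 1.0)
slope = np.polyfit(np.log10(f), np.log10(psd), 1)[0]
print(slope)    # near -(2 - 2*alpha) = -1.4 for alpha = 0.3
```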

Fig 5. Adaptation affects the gain and slope of neural power spectral densities.

(a) Example showing interaction of high-pass filters with low-pass filters. Eq (7) is plotted with three timescales (γ = k = c = x0 = 1, F = 0.5). (b) Relative steady-steady output for synaptic depression initially increases and then decreases as x0 increases (modeled via Eq (8) with n = 3, F = 0.5, γ = 2). (c) Adaptation alters the slope of PSDs, decreases the overall gain, and affects lower frequencies (e.g. 0–10 Hz) more than higher frequencies (e.g. 30–50 Hz). Adaptation interacts with input levels and includes Spike Frequency Adaptation (SFA) and Short-term Synaptic Depression (STD).

https://doi.org/10.1371/journal.pcbi.1010527.g005

It is typically assumed that overall levels of power generally reflect overall levels of neural activity. Consistent with this, evidence from human single unit recordings suggests that, at least under some conditions, broadband increases in the PSD are correlated with increased neural firing rates [29]. We expect this for SFA since Eq (1) is linear; if the input power doubles, then the overall broadband output power also doubles.

However, the non-linearity inherent in synaptic depression, as seen in Eq (3), means that increasing the input may lead to decreasing overall power. This dependence results from the fundamental non-linearity of synaptic depression, where the amplitude of post-synaptic potentials depends on the product of pre-synaptic spikes and the amount of depression [6,12]. For small deviations, we can examine some ramifications of this non-linearity using linear analyses. The steady-state output of synaptic depression, y0 = γa0x0, has an inverse dependence on the input x0, since a0 = 1/(1 + Fx0). This means that for increasing levels of static (i.e. not time-varying) inputs, the DC output gain (i.e. the gain when s = iω = 0) may decrease. x0 is the steady state input, which in this case can be thought of as the mean firing rate that serves as the input for STD. Specifically, assuming that F is the same for all timescales (0 < F < 1), we see that the steady-state output for n timescales of synaptic depression would be:

y0 = γ x0/(1 + Fx0)^n  (8)

which means that as x0 increases (i.e. as the steady-state firing rate increases between experiments or conditions) the steady-state output (or DC gain) increases until the derivative is zero at x0 = 1/((n − 1)F), after which further increases in x0 cause the gain to decrease (Fig 5B). Similarly shaped curves were obtained via STD simulations with varying levels of Poisson inputs. This non-monotonic non-linearity also means that the effect of STD depends on its input, which is in part determined by SFA. In other words, the input x0 for STD is a function of SFA. For example, for a given input level, increasing SFA leads to fewer action potentials per second, which then decreases the input for STD (e.g. decreases input x0). Counterintuitively, this may lead to increased overall power in the postsynaptic membrane potential (Fig 5B).
Similarly, increased inputs to a network with SFA and STD can lead to decreased power, especially for lower frequencies where STD has stronger effects. This interaction between input levels, SFA, and STD can paradoxically lead to PSDs that show increased power for high frequencies and decreased power for low frequencies as steady-state input levels are increased.
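The non-monotonic DC gain can be seen directly from the steady-state form y0 = γx0/(1 + Fx0)^n for n timescales of depression, using the Fig 5B parameters (n = 3, F = 0.5, γ = 2): the output peaks at x0 = 1/((n − 1)F) = 1 and then falls as input increases further:

```python
# Steady-state (DC) output of n multiplicative timescales of synaptic
# depression, y0 = gamma * x0 / (1 + F*x0)**n: it rises, peaks at
# x0 = 1/((n - 1)*F), then decreases with further input.
import numpy as np

n, F, g_out = 3, 0.5, 2.0
x0 = np.linspace(0.01, 10.0, 10000)
y0 = g_out * x0 / (1.0 + F * x0)**n
x_peak = x0[np.argmax(y0)]
print(x_peak, 1.0 / ((n - 1) * F))   # numerical vs analytical peak input
```

Past the peak, driving the circuit harder yields less steady-state output, which is the counterintuitive input-power relationship described above.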

Several basic properties emerge from the general filter models in Eq (5):

  1. Adaptation affects power for lower frequencies more than higher frequencies.
  2. Increased input levels may decrease broadband power if synaptic depression is present.
  3. Multiple timescale adaptation flattens the slope of power spectra according to α.
  4. The interaction of increased input and adaptation can decrease low frequency power in the output while increasing high frequency power.

These properties are schematically demonstrated in Fig 5C. To summarize, adaptation affects low frequencies due to its high-pass characteristics. Multiple adaptation timescales decrease power at lower frequencies, thereby flattening PSDs. The non-linearity of STD means that higher levels of steady-state input can paradoxically lead to decreased broadband power. When only synaptic depression is present, increased input to the computational unit effectively increases the amount of negative feedback or adaptation in the neural circuitry, which decreases power especially for lower frequencies. However, when SFA and STD are both present, moderately increased inputs decrease power for low frequencies and increase power for high frequencies, leading to a crossing of the baseline and final PSDs. The key difference between SFA and STD, in this respect, is the dependence on the steady-state portion of the input x0 in the denominator of STD, which is the result of inherent non-linearities in STD that are not present in SFA. If inputs are further increased, then power for all frequencies may be decreased.

Effects of multiple timescale feedback on power spectra

We wanted to examine how these properties may relate to PSDs from conductance-based models with spiking neurons and human EEG data from epilepsy patients. Previous work shows that the power spectra of EEG data demonstrate 1/f^β scaling [30,31], where β defines the exponent of the scaling and often varies between ~2 and ~4, primarily for higher frequencies. The shape or exponent of the power spectra may relate to the decay of synaptic inputs [30], ion channel or synaptic noise sources [32], or transitions between Up and Down states [33]. Decreases in the exponent, i.e. a counter-clockwise PSD tilt, have been associated with visuomotor task-related neuronal activation, with an average intersection or pivot point of 28 Hz [34].

We see a similar PSD tilt when comparing previous data recorded near the seizure onset zone (SOZ) with data far from the SOZ for 91 patients implanted with invasive EEG electrodes [35,36]. Specifically, power for frequencies less than ~1 Hz was decreased, while power for frequencies greater than ~1 Hz was increased near the SOZ compared to the non-SOZ (Fig 6). The cause of this tilt or crossing of the PSDs is not known, although the SOZ is generally considered to be hyperexcitable cortex.

Fig 6. Power spectra changes related to the seizure onset zone and increased adaptation.

(a) Previously published invasive EEG power spectra from 8 patients (left panel) [35] and 83 patients (right panel) [36] showing decreased low frequency activity and increased high frequency activity near the seizure onset zone (SOZ) compared to the non-SOZ. (b) Similar shapes can be obtained from constant input models (with one low-pass timescale to approximate membrane filtering) when more negative feedback is present (e.g., increased α or A), implemented either via fractional dynamics (left panel; α = 0.3, g = 0.7 or α = 0.1, g = 1; for both k = 100, τm = 0.3 s) or multiple timescales (right panel; A1 = A2 = A3 = 1.1, g1 = g2 = g3 = 1.3 or A1 = A2 = A3 = 1.5, g1 = g2 = g3 = 1.5; for both τ1 = 0.05 s, τ2 = 0.5 s, τ3 = 5 s, k = 500, τm = 5 s).

https://doi.org/10.1371/journal.pcbi.1010527.g006

A tilt in the PSD is also seen for Eq (5) and Eq (6) when the amount of adaptation or negative feedback is altered. Specifically, an increased α or A reflects increased negative feedback. Fig 6 shows that the power law exponent in the power spectral densities decreases as adaptation (negative feedback) increases. These examples don’t account for differing levels of input. To more realistically model PSD changes, we considered conductance-based models.
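Using the two Fig 6B parameter sets quoted above (α = 0.3, g = 0.7 versus α = 0.1, g = 1; k = 100, τm = 0.3 s for both) with the fractional filter written as |H| = g(2πf)^α · k/|i2πfτm + 1| (expressing the filter in f rather than ω only shifts the crossover frequency, not the qualitative tilt), the more-adapted PSD sits below the less-adapted one at low frequencies and above it at high frequencies:

```python
# PSD "tilt" under the fractional-dynamics filter of Eq (6), using the two
# parameter sets given for Fig 6B. More negative feedback (larger alpha)
# lowers low-frequency power and raises high-frequency power.
import numpy as np

def psd(f, alpha, g, k=100.0, tau_m=0.3):
    w = 2.0 * np.pi * f
    return (g * w**alpha * k)**2 / (tau_m**2 * w**2 + 1.0)

f = np.logspace(-2, 2, 400)                 # 0.01-100 Hz
ratio = psd(f, alpha=0.3, g=0.7) / psd(f, alpha=0.1, g=1.0)
print(ratio[f < 0.1].max(), ratio[f > 20.0].min())   # < 1 low, > 1 high
```

The two spectra cross near ~1 Hz under these assumptions, echoing the SOZ versus non-SOZ crossover in Fig 6A.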

We implemented a feedforward neural network (Fig 7A) with conductance-based neurons (n = 100) that included three timescales of slow potassium currents, which implement spike frequency adaptation [16], as well as synaptic depression with two timescales [18]. As predicted, increasing the input amplitude leads to decreased broadband power when STD is present. If the input is increased and STD is decreased, either directly or indirectly by increasing SFA, the resulting PSDs show a “tilt” (Fig 7B). Generally, filters that act directly upon the membrane current, like STD, more strongly affect resulting timescales than filters that act upon hidden variables, like SFA [37].

Fig 7. Spiking neuron network model with spike frequency adaptation (SFA) and synaptic depression (STD).

(a) Feedforward neural network includes 100 conductance-based neurons with multiple timescales of SFA and STD. (b) PSDs show degrees of “tilt” or crossing when input is increased, and STD is decreased. Increasing SFA effectively decreases STD. (c) Increased adaptation, whether SFA or STD, generally leads to decreased power. However, increased SFA leads to decreased STD (rightmost panel). (d) Increasing input increases overall power when SFA is present. However, when STD is present, increasing the input leads to decreased overall power. When both SFA and STD are present, nonlinear effects are evident.

https://doi.org/10.1371/journal.pcbi.1010527.g007

As expected, increased adaptation results in decreased power (Fig 7C). However, increasing SFA can increase broadband power if STD is present, as the input to STD is then decreased (Fig 7C, rightmost panel). Thus, increasing input to the neural circuit can have contrasting effects. If only SFA is present, then power increases. However, if only STD is present, then power in fact can decrease (Fig 7D), as above. When both SFA and STD are present, these two negative feedback mechanisms can have opposing effects, with STD having a greater effect on lower frequencies. Overall, the results from the conductance-based modeling are consistent with results from the filter models.

Discussion

Negative feedback mechanisms, such as spike frequency adaptation (SFA) and short-term synaptic depression (STD), are thought to be prevalent in human neural circuitry and yet their effects on local field potentials and EEG remain unclear. Here, we develop an approach for understanding multiple timescales of adaptation. This approach incorporates fractional dynamics, a form of non-integer order calculus that is related to power laws and has history dependence as a key feature. We find that alterations in neural excitability, consistent with mechanisms of SFA and STD, are most evident in low frequency (e.g. <10 Hz) EEG changes. In general, increased adaptation can lead to a counterclockwise tilt of the PSD. However, with a concomitant increase of input, decreased adaptation can also lessen low frequency (<1 Hz) activity. Counterintuitively, we find that increased inputs can decrease broadband power via the effects of STD.

Typically, the SOZ is considered hyperexcitable cortex. Adaptive mechanisms may play a role in dampening hyperexcitability in the SOZ during the interictal period and fail during seizure initiation [38,39]. Previous interictal results show decreased low frequency (<1 Hz) and increased high frequency (~2–50 Hz) activity near the SOZ [35,36]. Present results suggest that this pattern is consistent with an interaction between altered adaptation and increased input to the local computational unit.

Previously, we have described computational properties consistent with fractional-order derivatives in rodents. Single neocortical neurons from brain slices were found to adapt with multiple timescales in response to injected currents of varying amplitudes and variances [8]. When compared with a time-varying input, the firing rate output was a waveform consistent with the fractional derivative of the input. The order of the fractional derivative was found to be approximately 0.15, and this order was increased or decreased when slow after-hyperpolarization (sAHP) currents were increased or decreased, respectively. Similar results were obtained from in vivo rodent neurons [18]. Adaptation to whisker movements of anesthetized rats was found in thalamic and neocortical neurons with orders of approximately 0.2 and 0.5, respectively. This is consistent with the idea that multiple SFA and STD processes along the sensory pathway contribute to the adaptation seen in the neocortex. We have previously suggested an approach for fitting multiple exponential adaptation processes to achieve a desired order of fractional differentiation, which approximates a power law [16], thus relating adaptation mechanisms to neural circuit responses.
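The link between multiple exponential adaptation processes and an approximate power law can be made concrete with a minimal numerical sketch (this is an illustration of the general principle, not the fitting procedure of [16]): relaxation rates spread log-uniformly over many decades, weighted as in the integral ∫ λ^(α-1) e^(-λt) dλ = Γ(α) t^(-α), sum to a kernel close to t^(-α). Here α = 0.15, the order reported for neocortical neurons [8].

```python
import numpy as np

alpha = 0.15                      # fractional order reported for neocortex [8]
# Log-spaced relaxation rates; a broad range is needed because alpha is small,
# so very slow processes still contribute to the kernel
rates = np.logspace(-12, 4, 161)  # 10 rates per decade
dlog = np.log(rates[1] / rates[0])
weights = rates ** alpha * dlog   # Riemann sum of the integral above

def kernel(t):
    """Sum of exponential relaxations; approximates a power law t**(-alpha)."""
    return float(np.sum(weights * np.exp(-rates * t)))

# A pure power law predicts kernel(1.0)/kernel(0.1) = 10**(-alpha) ≈ 0.708
ratio = kernel(1.0) / kernel(0.1)
```

The summed-exponential kernel matches the power-law prediction to within about half a percent over this range, which is the sense in which multiple timescales of adaptation approximate fractional differentiation.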

Fractional differentiation and fractional derivatives are fundamentally related to power laws [5,9,22,40]. These mechanisms may promote efficient encoding and information transmission [4,20,21]. In general, adaptation has been linked to prediction [11] and memory [41], and multiple timescales may be critical for performing computational tasks [42]. As we have seen, mechanisms of pre- and post-synaptic adaptation can interact, and it is critical to consider them together [7].

Here, we presented a neural circuit model with SFA and STD to relate neuron-level adaptation mechanisms with changes in human EEG. These results suggest that EEG changes in the frequency range ~0.01–1 Hz may reflect, at least in part, changes in the level of adaptation of neural circuitry. Analysis of this EEG frequency range may provide estimates of neural excitability. Neural adaptation may be evident in macroscale electrophysiological recordings and provide a window to understanding neural circuit excitability.

Future work could include extending these results to larger neural circuit models and to data related to human neural computation. Given its relationship to negative feedback mechanisms, fractional dynamics may be relevant for neural circuit topology and stability. The extent to which fractional dynamics is present in, and gives important insight into, human neural computation remains unclear. Fractional dynamics offers an attractive way of preserving history dependence and scale invariance during oscillatory responses, which could yield deep insights into the nature and controllability of neural excitability.

Methods

Biophysical modeling

The neural network consisted of Hodgkin-Huxley (HH) neurons [2,43] and synapses that incorporated depression [6], implemented similarly to previous work [8,18]. The single-compartment, conductance-based HH model neuron was used with standard parameters: GNa = 120, GK = 36, and GLeak = 0.3 mS/cm2; ENa = 50, EK = -77, and ELeak = -54.4 mV; and C = 1 μF/cm2. Equations were solved numerically using fourth-order Runge-Kutta integration with a fixed time step of 0.05 msec, and spike times were identified as upward crossings of the voltage trace at -10 mV (resting potential = -65 mV) separated by more than 2 msec. Input stimuli were zero-mean, exponentially filtered (τ = 1 msec) Gaussian white noise with a SD of 10, 15, or 20 μA/cm2 for the Low Input, Medium Input, and High Input conditions, respectively. Without adaptation, these inputs corresponded to spike outputs of approximately 1, 10, and 20 Hz.
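A minimal single-neuron version of this simulation can be sketched as follows, using the standard HH parameters, RK4 integration, and spike detection described above. For brevity, the sketch drives the neuron with a constant current rather than the filtered noise used in the network simulations.

```python
import math

# Standard HH parameters (see Methods): conductances in mS/cm^2, potentials in mV
GNA, GK, GL = 120.0, 36.0, 0.3
ENA, EK, EL = 50.0, -77.0, -54.4
CM = 1.0  # uF/cm^2

def rates(V):
    """HH gating rate functions (1/ms), removable singularities guarded."""
    am = 1.0 if abs(V + 40.0) < 1e-7 else \
        0.1 * (V + 40.0) / (1.0 - math.exp(-(V + 40.0) / 10.0))
    bm = 4.0 * math.exp(-(V + 65.0) / 18.0)
    ah = 0.07 * math.exp(-(V + 65.0) / 20.0)
    bh = 1.0 / (1.0 + math.exp(-(V + 35.0) / 10.0))
    an = 0.1 if abs(V + 55.0) < 1e-7 else \
        0.01 * (V + 55.0) / (1.0 - math.exp(-(V + 55.0) / 10.0))
    bn = 0.125 * math.exp(-(V + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

def deriv(state, I):
    V, m, h, n = state
    am, bm, ah, bh, an, bn = rates(V)
    dV = (I - GNA * m**3 * h * (V - ENA)
            - GK * n**4 * (V - EK) - GL * (V - EL)) / CM
    return (dV, am * (1 - m) - bm * m,
            ah * (1 - h) - bh * h, an * (1 - n) - bn * n)

def rk4_step(state, I, dt):
    def shift(s, k, f):
        return tuple(si + f * ki for si, ki in zip(s, k))
    k1 = deriv(state, I)
    k2 = deriv(shift(state, k1, dt / 2), I)
    k3 = deriv(shift(state, k2, dt / 2), I)
    k4 = deriv(shift(state, k3, dt), I)
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def simulate(I, t_ms, dt=0.05):
    """Fixed-step RK4; spikes are upward crossings of -10 mV separated
    by more than 2 ms, as in the Methods."""
    V0 = -65.0
    am, bm, ah, bh, an, bn = rates(V0)  # start gating at steady state
    state = (V0, am / (am + bm), ah / (ah + bh), an / (an + bn))
    spikes, last = [], -1e9
    for i in range(int(t_ms / dt)):
        v_prev = state[0]
        state = rk4_step(state, I, dt)
        t = (i + 1) * dt
        if v_prev < -10.0 <= state[0] and t - last > 2.0:
            spikes.append(t)
            last = t
    return spikes

spikes = simulate(I=15.0, t_ms=200.0)  # constant current; repetitive firing
```

At a constant 15 μA/cm2, this neuron fires repetitively at roughly 70–80 Hz, consistent with the middle of the input range used in the network.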

Spike frequency adaptation

Three slow adaptation currents were added to the HH neurons as previously described [8]. For each AHP current, an additional term of the form -GAHP a (V - Ereversal) was added to the equation for dV/dt, where a was incremented by one after each spike and decayed according to da/dt = -a/τ, with τ = [0.3, 1, 6] sec and GAHP = [0.05, 0.006, 0.004]GLeak. The Low SFA condition employed GAHP, while High SFA used 5GAHP.
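The effect of these three AHP currents can be isolated in a simpler carrier model. The sketch below attaches the spike-triggered adaptation variables, with the τ and GAHP values above, to a hypothetical leaky integrate-and-fire neuron rather than the HH neuron used in the actual simulations; the threshold, reset, and input values are illustrative assumptions.

```python
import math

GLEAK = 0.3                                # mS/cm^2, as in the Methods
TAUS_MS = [300.0, 1000.0, 6000.0]          # AHP decay constants: 0.3, 1, 6 s
GAHP = [0.05 * GLEAK, 0.006 * GLEAK, 0.004 * GLEAK]
EK, EL, CM = -77.0, -65.0, 1.0             # mV, mV, uF/cm^2
VTH, VRESET = -50.0, -65.0                 # illustrative LIF threshold/reset

def simulate_lif_sfa(I, t_ms, dt=0.1):
    """LIF neuron with three spike-triggered AHP currents -GAHP*a*(V - EK);
    each a jumps by one at a spike and decays as da/dt = -a/tau."""
    V = EL
    a = [0.0, 0.0, 0.0]
    decay = [math.exp(-dt / tau) for tau in TAUS_MS]
    spikes = []
    for step in range(int(t_ms / dt)):
        i_ahp = sum(g * ai * (V - EK) for g, ai in zip(GAHP, a))
        V += dt * (-GLEAK * (V - EL) - i_ahp + I) / CM  # forward Euler
        a = [ai * dk for ai, dk in zip(a, decay)]
        if V >= VTH:
            spikes.append(step * dt)
            V = VRESET
            a = [ai + 1.0 for ai in a]
    return spikes

spikes = simulate_lif_sfa(I=10.0, t_ms=10000.0)  # 10 s of constant drive
early = sum(1 for t in spikes if t < 500.0)      # spikes in first 0.5 s
late = sum(1 for t in spikes if t >= 9500.0)     # spikes in last 0.5 s
```

Because the slowest AHP variable accumulates over seconds, the firing rate relaxes over multiple timescales: the early rate is several-fold higher than the rate after the slow conductances have built up.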

Synaptic depression

Synaptic depression was modeled as previously described [6,18], with facilitation omitted. Depression variables (d and D) in the synapses relaxed exponentially to one with time constants of 0.6 and 9 s for fast and slow depression, respectively. Each input spike decreased d and D to new values of 0.4d and 0.995D, respectively. These parameters are similar to those found in the best-fit model for short-term synaptic depression [6]. This is equivalent to calculating the depression variables at successive spikes according to d(ti+1) = 1 - [1 - 0.4 d(ti)] exp(-(ti+1 - ti)/0.6) and D(ti+1) = 1 - [1 - 0.995 D(ti)] exp(-(ti+1 - ti)/9), where ti is the ith presynaptic spike time in seconds. The synaptic conductance amplitude, G = d D, for each neuron was then summed and filtered by an exponential with a time constant of 0.3 s. Presynaptic amplitudes were summed across neurons, and Welch's method was used to find PSDs. The above parameters were used for the "High STD" condition; for Low STD, each input spike decreased d and D to new values of 0.7d and 0.999D.
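The per-spike depression recursion can be sketched directly. The snippet below implements the High STD parameters above; sampling the amplitude G = d D before applying the current spike's depression factors is an assumption about the update ordering.

```python
import math

TAU_D, TAU_S = 0.6, 9.0    # recovery time constants (s): fast, slow
F_D, F_S = 0.4, 0.995      # per-spike depression factors (High STD)

def std_amplitudes(spike_times):
    """Synaptic amplitude G = d*D at each presynaptic spike: d and D relax
    exponentially toward one between spikes and are multiplied by the
    depression factors at each spike."""
    d = D = 1.0
    amps, t_prev = [], None
    for t in spike_times:
        if t_prev is not None:
            dt = t - t_prev
            d = 1.0 - (1.0 - d) * math.exp(-dt / TAU_D)
            D = 1.0 - (1.0 - D) * math.exp(-dt / TAU_S)
        amps.append(d * D)   # amplitude sampled before this spike's depression
        d *= F_D
        D *= F_S
        t_prev = t
    return amps

amps = std_amplitudes([0.1 * i for i in range(20)])  # regular 10 Hz train
```

For a regular 10 Hz train, the first amplitude is 1 and subsequent amplitudes decrease monotonically toward a depressed steady state, with the slow variable D continuing to decline long after the fast variable d has equilibrated.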

Supporting information

S1 Text. Further details supporting the mathematical equations presented in the main manuscript.

https://doi.org/10.1371/journal.pcbi.1010527.s001

(DOCX)

S1 Code. Data and code needed to create figures presented in the main manuscript.

This includes data and simulation approaches that have been previously published, as cited in the main manuscript.

https://doi.org/10.1371/journal.pcbi.1010527.s002

(ZIP)

References

  1. Trappenberg TP. Fundamentals of Computational Neuroscience. Oxford University Press, USA; 2002.
  2. Dayan P, Abbott LF. Theoretical neuroscience: computational and mathematical modeling of neural systems. Cambridge, Mass.: Massachusetts Institute of Technology Press; 2001.
  3. Kandel ER, Schwartz JH, Jessell TM, Siegelbaum SA, Hudspeth AJ, editors. Principles of Neural Science. 5th ed. New York: McGraw-Hill Professional Pub; 2012.
  4. Wark B, Lundstrom BN, Fairhall A. Sensory adaptation. Curr Opin Neurobiol. 2007;17: 423–429. pmid:17714934
  5. Thorson J, Biederman-Thorson M. Distributed relaxation processes in sensory adaptation. Science. 1974;183: 161–172. pmid:4587440
  6. Varela JA, Sen K, Gibson J, Fost J, Abbott LF, Nelson SB. A quantitative description of short-term plasticity at excitatory synapses in layer 2/3 of rat primary visual cortex. J Neurosci. 1997;17: 7926–7940. pmid:9315911
  7. Mohan A, McDonnell M, Stricker C. Interaction of short-term depression and firing dynamics in shaping single neuron encoding. Front Comput Neurosci. 2013;7. pmid:23626533
  8. Lundstrom BN, Higgs MH, Spain WJ, Fairhall AL. Fractional differentiation by neocortical pyramidal neurons. Nat Neurosci. 2008;11: 1335–1342. pmid:18931665
  9. Kleinz M, Osler TJ. A Child's Garden of Fractional Derivatives. The College Mathematics Journal. 2000;31: 82–88.
  10. Chari A, Thornton RC, Tisdall MM, Scott RC. Microelectrode recordings in human epilepsy: a case for clinical translation. Brain Commun. 2020;2. pmid:32954332
  11. Puccini GD, Sanchez-Vives MV, Compte A. Integrated mechanisms of anticipation and rate-of-change computations in cortical circuits. PLoS Comput Biol. 2007;3: e82. pmid:17500584
  12. Tripp BP, Eliasmith C. Population models of temporal differentiation. Neural Comput. 2010;22: 621–659. pmid:19922294
  13. Adrian ED, Zotterman Y. The impulses produced by sensory nerve endings: Part 2. The response of a Single End-Organ. J Physiol. 1926;61: 151–171.
  14. Ermentrout B. Linearization of F-I curves by adaptation. Neural Comput. 1998;10: 1721–1729. pmid:9744894
  15. Benda J, Herz AV. A universal model for spike-frequency adaptation. Neural Comput. 2003;15: 2523–2564. pmid:14577853
  16. Lundstrom BN. Modeling multiple time scale firing rate adaptation in a neural network of local field potentials. J Comput Neurosci. 2015;38: 189–202. pmid:25319064
  17. Liu YH, Wang XJ. Spike-frequency adaptation of a generalized leaky integrate-and-fire model neuron. J Comput Neurosci. 2001;10: 25–45. pmid:11316338
  18. Lundstrom BN, Fairhall AL, Maravall M. Multiple timescale encoding of slowly varying whisker stimulus envelope in cortical and thalamic neurons in vivo. J Neurosci. 2010;30: 5071–5077. pmid:20371827
  19. La Camera G, Rauch A, Thurbon D, Luscher HR, Senn W, Fusi S. Multiple time scales of temporal response in pyramidal and fast spiking cortical neurons. J Neurophysiol. 2006. pmid:16807345
  20. Fairhall AL, Lewen GD, Bialek W, de Ruyter Van Steveninck RR. Efficiency and ambiguity in an adaptive neural code. Nature. 2001;412: 787–792. pmid:11518957
  21. Pozzorini C, Naud R, Mensi S, Gerstner W. Temporal whitening by power-law adaptation in neocortical neurons. Nat Neurosci. 2013;16: 942–948. pmid:23749146
  22. Podlubny I. Fractional differential equations: an introduction to fractional derivatives, fractional differential equations, to methods of their solution and some of their applications. San Diego: Academic Press; 1999.
  23. Nunez PL, Srinivasan R. Electric fields of the brain: the neurophysics of EEG. 2nd ed. Oxford; New York: Oxford University Press; 2006.
  24. Destexhe A, Rudolph M, Fellous JM, Sejnowski TJ. Fluctuating synaptic conductances recreate in vivo-like activity in neocortical neurons. Neuroscience. 2001;107: 13–24. pmid:11744242
  25. Moreno R, de la Rocha J, Renart A, Parga N. Response of spiking neurons to correlated inputs. Phys Rev Lett. 2002;89: 288101. pmid:12513181
  26. Destexhe A, Rudolph M, Pare D. The high-conductance state of neocortical neurons in vivo. Nat Rev Neurosci. 2003;4: 739–751. pmid:12951566
  27. Rauch A, La Camera G, Luscher HR, Senn W, Fusi S. Neocortical pyramidal cells respond as integrate-and-fire neurons to in vivo-like input currents. J Neurophysiol. 2003;90: 1598–1612. pmid:12750422
  28. Richardson MJ. Effects of synaptic conductance on the voltage distribution and firing rate of spiking neurons. Phys Rev E Stat Nonlin Soft Matter Phys. 2004;69: 51918. pmid:15244858
  29. Manning JR, Jacobs J, Fried I, Kahana MJ. Broadband shifts in local field potential power spectra are correlated with single-neuron spiking in humans. J Neurosci. 2009;29: 13613–13620. pmid:19864573
  30. Bédard C, Kröger H, Destexhe A. Does the 1/f frequency scaling of brain signals reflect self-organized critical states? Phys Rev Lett. 2006;97: 118102. pmid:17025932
  31. Miller KJ, Sorensen LB, Ojemann JG, Den Nijs M. Power-law scaling in the brain surface electric potential. PLoS Comput Biol. 2009;5. pmid:20019800
  32. Pettersen KH, Lindén H, Tetzlaff T, Einevoll GT. Power laws from linear neuronal cable theory: power spectral densities of the soma potential, soma membrane current and single-neuron contribution to the EEG. PLoS Comput Biol. 2014;10: e1003928. pmid:25393030
  33. Baranauskas G, Maggiolini E, Vato A, Angotzi G, Bonfanti A, Zambra G, et al. Origins of 1/f2 scaling in the power spectrum of intracortical local field potential. J Neurophysiol. 2012;107: 984–994. pmid:22090461
  34. Podvalny E, Noy N, Harel M, Bickel S, Chechik G, Schroeder CE, et al. A unifying principle underlying the extracellular field potential spectral responses in the human cortex. J Neurophysiol. 2015;114: 505–519. pmid:25855698
  35. Lundstrom BN, Boly M, Duckrow R, Zaveri HP, Blumenfeld H. Slowing less than 1 Hz is decreased near the seizure onset zone. Sci Rep. 2019;9: 6218. pmid:30996228
  36. Lundstrom BN, Brinkmann BH, Worrell GA. Low frequency novel interictal EEG biomarker for localizing seizures and predicting outcomes. Brain Commun. 2021;3: fcab231. pmid:34704030
  37. Beiran M, Ostojic S. Contrasting the effects of adaptation and synaptic filtering on the timescales of dynamics in recurrent networks. PLoS Comput Biol. 2019;15: e1006893. pmid:30897092
  38. Meisel C, Storch A, Hallmeyer-Elgner S, Bullmore E, Gross T. Failure of Adaptive Self-Organized Criticality during Epileptic Seizure Attacks. PLoS Comput Biol. 2012;8: e1002312. pmid:22241971
  39. Wenzel M, Hamm JP, Peterka DS, Yuste R. Acute Focal Seizures Start As Local Synchronizations of Neuronal Ensembles. J Neurosci. 2019;39: 8562–8575. pmid:31427393
  40. Anastasio TJ. Nonuniformity in the linear network model of the oculomotor integrator produces approximately fractional-order dynamics and more realistic neuron behavior. Biol Cybern. 1998;79: 377–391. pmid:9851019
  41. Fitz H, Uhlmann M, van den Broek D, Duarte R, Hagoort P, Petersson KM. Neuronal spike-rate adaptation supports working memory in language processing. Proc Natl Acad Sci U S A. 2020;117: 20881–20889. pmid:32788365
  42. Mi Y, Lin X, Wu S. Neural Computations in a Dynamical System with Multiple Time Scales. Front Comput Neurosci. 2016;10.
  43. Hodgkin AL, Huxley AF. A quantitative description of membrane current and its application to conduction and excitation in nerve. J Physiol. 1952;117: 500–544. pmid:12991237