## Persistence Models

Climatic noise often exhibits persistence (Section 1.3). Chapter 3 presents bootstrap methods as resampling techniques aimed at providing realistic confidence intervals or error bars for the various estimation problems treated in the subsequent chapters. The bootstrap works with resamples of the noise process, produced artificially by means of a random number generator. Accurate bootstrap results therefore require the resamples to preserve the persistence of Xnoise(i). Achieving this requires a model of the noise process, or at least a quantification of the size of the dependence. Model fits to the noise data inform about the "memory" of the climate fluctuations, that is, the span of the persistence. The fitted models and their estimated parameters can then be used directly in the bootstrap resampling procedure.
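As a rough illustration of this idea (a minimal sketch, not the book's algorithm), one can fit an AR(1) persistence parameter to an observed noise series and then draw parametric resamples that preserve that persistence. The function names (`fit_ar1`, `ar1_resample`) and the lag-1 estimator are illustrative assumptions, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_ar1(x):
    """Estimate the AR(1) parameter a via the lag-1 autocorrelation
    (a simple, commonly used estimator; illustrative only)."""
    x = x - x.mean()
    return np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)

def ar1_resample(n, a, rng):
    """Draw one AR(1) series with zero mean and unit variance."""
    x = np.empty(n)
    x[0] = rng.normal(0.0, 1.0)
    # innovation variance 1 - a**2 keeps the process variance at unity
    x[1:] = rng.normal(0.0, np.sqrt(1.0 - a ** 2), n - 1)
    for i in range(1, n):
        x[i] += a * x[i - 1]
    return x

# Fit a to a (here, simulated) noise series, then draw resamples
# that preserve its persistence for use in a bootstrap procedure.
noise = ar1_resample(500, 0.7, rng)   # stand-in for an observed noise series
a_hat = fit_ar1(noise)
resamples = [ar1_resample(len(noise), a_hat, rng) for _ in range(100)]
```

Each resample then feeds into the estimation procedure of interest, and the spread of the resulting estimates yields the bootstrap error bar.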

It turns out that for climate time series with discrete times and uneven spacing, the class of persistence models with a unique correspondence to continuous-time models is rather limited. This "embedding" is necessary because it guarantees that our persistence description has a foundation in physics. The first-order autoregressive or AR(1) process has this desirable property.

### 2.1 First-order autoregressive model

The AR(1) process is a simple persistence model in which a realization of the noise process, Xnoise(i), depends only on the value at one time step earlier, Xnoise(i − 1). We analyse even and uneven spacing separately.

M. Mudelsee, Climate Time Series Analysis, Atmospheric and Oceanographic Sciences Library 42, DOI 10.1007/978-90-481-9482-7_2, © Springer Science+Business Media B.V. 2010

#### 2.1.1 Even spacing

In Eq. (1.2) we let the time increase with constant spacing, d(i) = d > 0, and write the discrete-time Gaussian AR(1) noise model,

$$
X_{\text{noise}}(i) = a \cdot X_{\text{noise}}(i-1) + E_{\text{N}(0,\,1-a^2)}(i), \qquad i = 2, \ldots, n. \tag{2.1}
$$

Herein, $-1 < a < 1$ is a constant and $E_{\text{N}(\mu,\,\sigma^2)}(\cdot)$ is a Gaussian random process with mean $\mu$, variance $\sigma^2$ and no serial dependence, that means, $\mathrm{E}\!\left[E_{\text{N}(\mu,\sigma^2)}(i) \cdot E_{\text{N}(\mu,\sigma^2)}(j)\right] = 0$ for $i \neq j$. It readily follows that Xnoise(i) has zero mean and unity variance, as assumed in our decomposition (Eq. 1.2). Figure 2.1 shows a realization of an AR(1) process.
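A realization such as the one in Figure 2.1 can be simulated directly from Eq. (2.1). The sketch below (a NumPy-based illustration, not code from the book) starts from the stationary distribution N(0, 1) so that every Xnoise(i) has zero mean and unit variance:

```python
import numpy as np

rng = np.random.default_rng(0)
n, a = 200, 0.7   # parameters as in Fig. 2.1

# Start from the stationary distribution N(0, 1).
x = np.empty(n)
x[0] = rng.normal(0.0, 1.0)
for i in range(1, n):
    # Innovation standard deviation sqrt(1 - a^2) preserves unit variance.
    x[i] = a * x[i - 1] + rng.normal(0.0, np.sqrt(1.0 - a ** 2))

# Sample moments fluctuate around the theoretical mean 0 and variance 1.
print(x.mean(), x.var())
```

Because a = 0.7 implies an effective sample size well below n, the sample mean and variance scatter more widely around 0 and 1 than they would for independent data.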

Figure 2.1. Realization of an AR(1) process (Eq. 2.1); n = 200 and a = 0.7.


The autocorrelation function,

$$
\rho(h) = \mathrm{E}\!\left[\{X_{\text{noise}}(i+h) - \mathrm{E}[X_{\text{noise}}(i+h)]\} \cdot \{X_{\text{noise}}(i) - \mathrm{E}[X_{\text{noise}}(i)]\}\right],
$$

needs no further normalization here because Xnoise(i) has zero mean and unit variance; for the AR(1) process (Eq. 2.1) it equals $\rho(h) = a^{|h|}$.
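The geometric decay $\rho(h) = a^{|h|}$ can be checked empirically against sample autocorrelations of a long simulated AR(1) series. The following is an illustrative sketch (the helper `sample_autocorr` is an assumption, not a named routine from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
n, a = 100_000, 0.7   # long series so sample autocorrelations are stable

# Simulate a stationary, unit-variance AR(1) series (Eq. 2.1).
x = np.empty(n)
x[0] = rng.normal(0.0, 1.0)
x[1:] = rng.normal(0.0, np.sqrt(1.0 - a ** 2), n - 1)
for i in range(1, n):
    x[i] += a * x[i - 1]

def sample_autocorr(x, h):
    """Sample autocorrelation at lag h >= 1 (mean-corrected,
    normalized by the sample variance)."""
    x = x - x.mean()
    return np.dot(x[h:], x[:-h]) / np.dot(x, x)

for h in (1, 2, 5):
    print(h, sample_autocorr(x, h), a ** h)   # empirical vs. theoretical a^h
```

For n this large, the sample values lie close to a, a², a⁵, illustrating how quickly the "memory" of an AR(1) process fades with lag.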