where the "hats" have to be inserted. On the sample level, the parameter estimates and the elements of the estimated covariance matrix are plugged in.

Declustering records prior to fitting a GP distribution discards excess data and loses information, as noted by Coles (2001b). A more efficient GP estimation may come from retaining all excess data (including those within a cluster) and modelling the serial dependence. Fawcett and Walshaw (2006) present Monte Carlo evidence supporting this approach and an example where the AR(1) persistence model is applied to hourly wind-speed data from central and northern England (time interval 1974–1991, 10 sites; 1975–1984, 2 sites). An alternative for efficient GP estimation (Fawcett and Walshaw 2007) may be inflating the covariance matrix (Eq. 6.14). Referring to a preprint by Smith RL, this paper advises to replace the covariance matrix by H⁻¹VH⁻¹, where H is the observed Fisher information matrix and V is the covariance matrix of the likelihood gradient vector. Ferro and Segers (2003) devised an automatic declustering scheme that relies on the extremal index, which is estimated before declustering. Ledford and Tawn (2003) developed a diagnostic tool (an autocorrelation measure for extreme values), which helps to assess whether declustering of a series has been successful.
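As an illustration of the basic fitting step these refinements build on, the following sketch fits a GP distribution to all threshold excesses of a synthetic series, without declustering. The use of `scipy.stats.genpareto`, the synthetic Gumbel data and the threshold value are my own choices for illustration, not the methods of the cited papers.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
x = rng.gumbel(size=5000)      # synthetic series standing in for climate data
u = 1.0                        # threshold (chosen ad hoc for illustration)
exc = x[x > u] - u             # all excesses over the threshold, no declustering

# fix the location at 0 so only shape (xi) and scale (sigma) are estimated
xi_hat, _, sigma_hat = genpareto.fit(exc, floc=0.0)
```

With serially dependent data, the point estimates remain approximately valid, but the naive covariance matrix is too optimistic, which is what motivates the H⁻¹VH⁻¹ inflation mentioned above.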

The efficiency of a statistical estimator refers to its standard error under a particular parent distribution. Higher efficiency means a smaller standard error.
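A minimal Monte Carlo illustration of this notion (my own toy example, not from the text): under a normal parent distribution, the sample mean is a more efficient location estimator than the sample median, with relative efficiency approaching 2/π ≈ 0.64.

```python
import numpy as np

rng = np.random.default_rng(0)
sims = rng.normal(size=(2000, 100))          # 2000 samples of size 100

se_mean = np.std(sims.mean(axis=1))          # empirical se of the mean
se_median = np.std(np.median(sims, axis=1))  # empirical se of the median
rel_eff = (se_mean / se_median) ** 2         # approaches 2/pi under normality
```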

Fisher information is a measure of the amount of information provided by a sample about an unknown parameter (Kullback 1983). In the case of maximum likelihood estimation of the parameters of the GEV distribution (Eq. 6.8), the information is related to the expectation of the negative of the matrix that gives the curvature of the log-likelihood function. Efron and Hinkley (1978) gave the advice, and this is also the modern tendency (Davison AC 2009, personal communication), to use the observed information matrix instead of the Fisher expected information, that is, not to use the expectation (Eq. 6.8), but rather the numerically determined derivatives of the log-likelihood function.
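The observed-information recipe can be sketched numerically: evaluate the Hessian of the log-likelihood at the fitted parameters by central differences, negate it, and invert to obtain the covariance matrix of the estimates. The example below uses a normal likelihood for brevity (my own choice); the same recipe applies to the GEV log-likelihood.

```python
import numpy as np

def num_hessian(f, theta, eps=1e-4):
    """Central-difference Hessian of the scalar function f at theta."""
    k = len(theta)
    H = np.empty((k, k))
    for i in range(k):
        for j in range(k):
            t = np.array(theta, dtype=float)
            def g(a, b):
                t2 = t.copy()
                t2[i] += a
                t2[j] += b
                return f(t2)
            H[i, j] = (g(eps, eps) - g(eps, -eps)
                       - g(-eps, eps) + g(-eps, -eps)) / (4.0 * eps**2)
    return H

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=3.0, size=1000)

def loglik(theta):
    mu, sigma = theta
    return -len(x) * np.log(sigma) - np.sum((x - mu)**2) / (2.0 * sigma**2)

theta_hat = np.array([x.mean(), x.std()])      # ML estimates for the normal
obs_info = -num_hessian(loglik, theta_hat)     # observed information matrix
se = np.sqrt(np.diag(np.linalg.inv(obs_info)))  # standard errors
```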

Optimal estimation is a general theme; it opens a wide research field (Section 9.5). Regarding the GEV and GP models, in addition to the estimation methods presented in Section 6.2, many articles devoted to improving the estimation have appeared, including the following. Castillo and Hadi (1997) reviewed GP estimation methods and suggested a new one (elemental percentile method), which is based on a two-stage procedure. Castillo and Hadi (1997: p. 1611 therein) wrote: "In the first stage, preliminary initial estimates of the parameters are calculated [based on the order statistics, {x_out,sort(j)}]. These initial estimates are combined in the second stage to give overall estimates of the parameters. These final estimates are then substituted in the quantile function to obtain estimates of all desired quantiles." They provided theoretical asymptotic as well as Monte Carlo evidence in support of their estimator. Martins and Stedinger (2000) augmented maximum likelihood estimation of GEV parameters with a Bayesian method to restrict values of the "critical" shape parameter, ξ, to "statistically/physically reasonable" ranges. Subsequently, Martins and Stedinger (2001) extended this "generalized maximum likelihood estimation" to the case of GP parameters and quantiles. A full Bayesian estimation method with computing-intensive determination of the distribution of the GEV parameter estimates was presented by Reis and Stedinger (2005). Bayesian methods for GP parameter estimation were developed also to include either prior expert knowledge (Parent and Bernier 2003b) or additional historical information (Parent and Bernier 2003a). Documentary data, although less accurate than runoff measurements, may cover longer time intervals (Section 6.1) and can therefore lead to an improved tail estimation. Hewa et al. (2007) applied an adaptation of PWM estimation of the GEV model, where weighting is imposed on the extreme part of the distribution, to study low river flows in Australia.

Time-dependent extreme value distributions have been applied in a number of climatological and environmental studies, with the GEV model seemingly preferred over the GP. Smith (1989) fitted the GEV model with linearly time-dependent location (Eq. 6.23) and constant scale and shape parameters to an hourly ground-level ozone concentration time series from a station in Texas, April 1973 to December 1986 (n = 119,905). Despite this simple form of time-dependence, he reported that the maximization of the log-likelihood function was nontrivial and that numerical techniques that approximate the second derivatives (instead of explicitly calculating them) performed better. In his later review, Smith (2004) applied the same GEV model to wind-speed extremes, where he allowed seasonality in the time-dependence of the location by including terms ∝ sin(2πT/T0), where T0 = 1 a. In that paper, he compared this model with a more elaborate model (exponential increases with time in location and scale) for rainfall extremes from a number of stations in the United States of America, period 1951–1997. Coles (2001a) used the GEV model with time-dependent location and scale parameters and a constant shape for analysing annual maxima of wind speed recorded at stations in the United States of America between 1912 and 1987. Regarding time-dependence in shape, Coles (2001a: Section 2.2 therein) remarks that "such a model is likely to be difficult to identify." Katz et al. (2002) considered the GEV model with linear trend in location, exponential increase in scale and constant shape for studying extreme precipitation and runoff events in a changing climate. Seasonality can be taken into account at the stage of data selection by setting the block size to 1 year or by dividing the year into seasons and building separate models. Coles and Pericchi (2003) and Coles (2004) formulated the division of the year into two seasons as an inference problem of the 2 days of change.
These papers also present an adaptation of the likelihood function for a GEV model to a situation where partly only annual maxima were recorded and partly daily values exist. Their example, rainfall in Venezuela, with d(i) = 1 year for 1951–1961 and d(i) = 1 day for 1961–1999, is not unusual within the context of direct meteorological observations. Naveau et al. (2005) applied the GEV model with exponential trend in the location parameter to time series of lichen size from a moraine formation in the Andes mountains, with the objective to study glacier retreats over the past approximately 700 years. Kharin and Zwiers (2005) studied global, gridded near-surface air temperature and precipitation for the interval 1990–2100 using the CGCM2 climate model driven by various greenhouse gas emission scenarios. These authors applied the GEV model with linear trends in location and exponential trends in scale. Interestingly, they allowed for a linear trend in the shape parameter and found no "serious computational obstacle" to solving the maximum likelihood estimation, although ξ(T_out) ≈ 0 was found as a result for most of the grid-point time series. Kharin and Zwiers (2005) also preferred error bars from nonparametric bootstrap resampling over the more traditional estimates from the covariance matrix. Rust et al. (2009) fitted the GEV model with seasonal trends in location and scale (and constant shape) to daily rainfall at 689 stations across the United Kingdom. From their analysis of the interval from 1 January 1900 to 31 December 2006, they concluded that during the winter season (Rust et al. 2009: p. 106 therein) "the entire west coast shows a band of return levels larger than the inland and the east coast." Pujol et al. (2007) tested for trends in the GEV distribution fitted with maximum likelihood to time series of monthly and annual rainfall maxima from 92 stations in the French Mediterranean region.
The competing models were the stationary model (three parameters) and the model with linear trends in location and scale and constant shape (five parameters). The test statistic employed was the deviance, defined as D = 2(l1 − l0), where l1 and l0 are the maximized log-likelihoods of the trend model and the stationary model, respectively. Under stationarity and for large m, D is approximately chi-squared distributed with degrees of freedom equal to the difference in the number of parameters (Coles 2001b), that is, two in this case. Zhang et al. (2004) analysed the test power by means of Monte Carlo simulations and showed its superiority over the Mann-Kendall test for detecting trends in GEV parameters. In a series of papers, Strupczewski et al. (2001a), Strupczewski and Kaczmarek (2001) and Strupczewski et al. (2001b) developed the methodology of time-dependent moments and analysed runoff extremes from Polish rivers, interval 1921–1990. Trends in location and scale of various degrees of complexity were fitted by maximum likelihood or an adaptation of weighted least squares, and model selection was based on the AIC, similar to the deviance test. Instead of letting, say, the location parameter depend directly on time, one may let it depend on another, informative variable (covariate): μ(T_out) = a0 + a1 · Y(T_out) is a linear model. Smith and Shively (1995) analysed trends in ground-level ozone concentration, X(T_out), by means of GP distributions dependent on time and other covariates, Y(T_out), such as maximum temperature or average wind speed. The GP distribution with time-dependent scale parameter was applied in other work dealing with surface-air temperature extremes in the North Atlantic region during 1948–2004 (Nogaj et al. 2006) or river floods in the Czech Republic during 1825–2003 (Yiou et al. 2006). Time-dependent GP and GEV models were fitted to runoff records from Germany during 1941–2000 (Kallache 2007).
Assuming a constant shape parameter, this author found no major numerical problems in likelihood maximization using the simplex method, even for polynomial time-dependences in location and scale of orders up to four (Kallache M 2008, personal communication).
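The deviance test of nested models described above amounts to comparing maximized log-likelihoods against a chi-squared distribution. A sketch, where the log-likelihood values are invented purely for illustration:

```python
from scipy.stats import chi2

# hypothetical maximized log-likelihoods (illustrative numbers only)
l0 = -412.7   # stationary GEV model, 3 parameters
l1 = -408.1   # model with linear trends in location and scale, 5 parameters

D = 2.0 * (l1 - l0)        # deviance statistic
p = chi2.sf(D, df=5 - 3)   # df = difference in number of parameters
```

A small p would favour the trend model over the stationary one, subject to the large-m approximation noted in the text.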

Covariates, Y(i), bear information about the extremal part of the climate variable of interest, X(i). This chapter focuses on the time, T(i), as covariate. However, other covariates, possibly jointly, may also help to predict X(i) extremes. This leads to methods of regression between two processes (Chapter 8). In particular, a climate model may perform better at predicting the Y(i) than the extremal part of X(i). Better climate risk forecasts should then come from the model-predicted Y(i). For example, Cooley et al. (2007) use as covariates (1) mean precipitation and (2) topography to model extreme precipitation return levels for Colorado (time series from 56 stations, interval 1948–2001).

Semi-parametric estimation of the time-dependent GEV distribution based on kernel weighting and local likelihood estimation was introduced by Davison and Ramesh (2000) and Hall and Tajvidi (2000). The unweighted local log-likelihood function, see Eq. (6.6), is written as ln[L(μ, σ, ξ; y(j))], where μ, σ and ξ are the GEV parameters and y(j) is a scaled extreme (Eq. 6.7). The weighted log-likelihood function is formed by applying a kernel weight, K, to the local log-likelihood:

ln [L(μ, σ, ξ; T)] = Σ_{j=1}^{m} K([T − T_out(j)] / h) · ln [L(μ, σ, ξ; y(j))] ,   (6.57)

where h is the bandwidth. Hall and Tajvidi (2000) present several bandwidth selectors. Maximization of the weighted log-likelihood function produces the local (in T) maximum likelihood estimates. Davison and Ramesh (2000) further adapted bootstrap resampling by studentizing to determine the estimation uncertainty. They presented Monte Carlo experiments for sample size m = 100, which demonstrated acceptable coverage performance. The semi-parametric method was then applied to the central England temperature time series (Section 2.6), which showed that (Davison and Ramesh 2000: p. 202 therein) "the change in upper extremes is mostly due not to changes in the location or in the shape of their distribution but in their variability." In a later paper (Ramesh and Davison 2002), the authors applied semi-parametric local likelihood estimation to study time-dependent extremes in sea-level data from Venice, 1887–1981. Butler et al. (2007) employed local likelihood estimation to quantify trends in extremes of modelled North Sea surges for the period 1955–2000. Another semi-parametric estimation method (Pauli and Coles 2001; Chavez-Demoulin and Davison 2005) uses spline functions (Eq. 4.62) to model the time-dependences of the GEV parameters. This was applied to annual temperature maxima between 1900 and 1980 at two stations in England (Pauli and Coles 2001) and daily winter temperature minima between 1971 and 1997 at 21 stations in Switzerland (Chavez-Demoulin and Davison 2005).
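A sketch of the kernel-weighted local likelihood idea, restricted here to the Gumbel case (ξ = 0) with a Gaussian kernel for simplicity; the synthetic data, bandwidth and evaluation times are my own choices, not those of the cited papers.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
m = 100
t = np.linspace(0.0, 1.0, m)                   # times of the block extremes
x = rng.gumbel(loc=10.0 + 2.0 * t, scale=1.0)  # extremes with a location trend

def neg_wll(theta, t0, h):
    """Kernel-weighted negative Gumbel log-likelihood at time t0
    (the xi = 0 simplification of the weighted GEV log-likelihood)."""
    mu, sigma = theta
    if sigma <= 0.0:
        return np.inf
    y = (x - mu) / sigma
    ll = -np.log(sigma) - y - np.exp(-y)       # local log-likelihood terms
    w = np.exp(-0.5 * ((t0 - t) / h)**2)       # Gaussian kernel weights
    return -np.sum(w * ll)

h = 0.2                                        # bandwidth (ad hoc choice)
mu_hat = [minimize(neg_wll, x0=[x.mean(), x.std()], args=(t0, h),
                   method='Nelder-Mead').x[0] for t0 in (0.1, 0.9)]
```

Repeating the maximization over a grid of t0 values traces out the time-dependent location estimate.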

Poisson and point processes are treated in the books by Cox and Lewis (1966), Cox and Isham (1980) and Karr (1986).

Occurrence rate is the name employed in this book for the parameter λ or the function λ(T) of the Poisson process; it prevents misunderstandings arising from the alternatively used name "intensity."

Parametric occurrence rate models are often used in combination with statistical tests. Loader (1992) developed tests, based on maximum likelihood estimation, to choose among three models. The first is a gradual change-point model,

λ(T) = exp(a0 + a1 T) for T(1) ≤ T ≤ T_change,
λ(T) = exp(a0 + a1 T + a2) for T_change < T ≤ T(n),

where T_change is the change-point in time. It includes the second, abrupt change-point model, which has a1 = 0, and it includes also the simple model (Eq. 6.39), which has a2 = 0. Loader (1992) derived analytical approximations of the test powers. Worsley (1986) had previously devised a test for the abrupt change-point model with null hypothesis "constant occurrence rate." Frei and Schär (2001) constructed a test for increasing (decreasing) occurrence rate in the logistic model (Eq. 6.40) and carried out Monte Carlo simulations to evaluate the test power. A caveat is that their experiments do not simulate serial dependence, which may lead to an overestimated power when the test is applied to a climate time series that stems from a persistent process.
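The simplest of these rate models, the log-linear one, can be fitted by maximizing the inhomogeneous Poisson process log-likelihood, Σ ln λ(T_i) − ∫ λ(T) dT. A sketch on simulated event times (the simulation setup and starting values are my own):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
# simulate a homogeneous Poisson process (rate 2 per time unit) on [0, 50]
n = rng.poisson(2.0 * 50.0)
ti = np.sort(rng.uniform(0.0, 50.0, size=n))

def negll(b):
    """Negative log-likelihood of an inhomogeneous Poisson process with
    log-linear rate lambda(T) = exp(b0 + b1*T) on the interval [0, 50]."""
    b0, b1 = b
    if abs(b1) > 1e-12:
        integral = (np.exp(b0 + b1 * 50.0) - np.exp(b0)) / b1
    else:
        integral = np.exp(b0) * 50.0
    return -(np.sum(b0 + b1 * ti) - integral)

fit = minimize(negll, x0=[1.0, 0.0], method='Nelder-Mead',
               options={'xatol': 1e-6, 'fatol': 1e-6, 'maxiter': 2000})
b0_hat, b1_hat = fit.x
```

Here the fitted slope b1 should be near zero, since the simulated rate is constant; the same machinery with a change-point term gives the gradual change-point model.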

Model suitability of the inhomogeneous Poisson process can be tested using methods (Solow 1991; Smith and Shively 1994, 1995) based on the spacing of the event times, S_out(j) = T_out(j) − T_out(j − 1). One procedure is to construct a probability plot (as in Fig. 6.3e) to test the shape of the distribution function; the other is to calculate the correlation (Chapter 7) between successive S_out(j) to assess statistical independence. Further tests are reviewed by Lang et al. (1999).
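A sketch of the second procedure: for a homogeneous Poisson process the spacings are independent exponentials, so their lag-1 correlation should be near zero. The simulated data are my own illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
# event times of a homogeneous Poisson process (unit rate)
tout = np.cumsum(rng.exponential(scale=1.0, size=2000))
s = np.diff(tout)                        # spacings S_out(j)

# lag-1 correlation between successive spacings
r = np.corrcoef(s[:-1], s[1:])[0, 1]
```

A value of r far from zero, relative to roughly 1/√m, hints at clustering, i.e. at a violation of the Poisson assumption.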

Quantile regression (Section 4.4) may in principle be used for estimating time-dependent quantiles. Few studies exist yet in climatology. Sankarasubramanian and Lall (2003) presented a Monte Carlo experiment that compares this method with the semi-parametric local likelihood estimation (Davison and Ramesh 2000). Both methods exhibited similar bias and RMSE values of quantile estimates. Sankarasubramanian and Lall (2003) further applied both methods to estimate time-dependent risk of floods in the river Clark Fork, based on daily runoff data from the interval 1930–2000. Elsner et al. (2008) found an increasing magnitude of Atlantic tropical cyclones for the period from 1981 to 2006. This result should be interpreted with caution, as the study deliberately did not take persistence into account. Allamano et al. (2009) found that "global warming increases flood risk in mountainous areas" on the basis of quantile regression analyses of annual maxima of 27 Swiss runoff series over the past approximately 100 years. Unfortunately, their paper did not provide the details required to reproduce their finding (station names, data sizes and missing values). For example, spurious upwards (downwards) trends might arise if missing values cluster in the earlier (later) period. A second caveat against accepting the found significance of the increased flood risk comes from the authors' deliberate ignorance of the Hurst phenomenon of long-term persistence (Section 2.5.3).

Timescale-uncertainty effects on extreme value analyses seem not to have been studied yet. For stationary models (Section 6.2), we anticipate sizable effects on block extremes-GEV estimation only when the uncertainties strongly distort the blocking procedure. For nonstationary models (Section 6.3), one may augment confidence band construction by inserting a timescale simulation step (after Step 4 in Algorithm 6.1).

The Elbe flood in August 2002 has received extensive scientific coverage. Ulbrich et al. (2003b) analyse the meteorological situation that led to this extreme event. Engel et al. (2002) and Ulbrich et al. (2003a) explain the hydrographical development. Grünewald et al. (2003) and Becker and Grünewald (2003) assess the damage caused by the catastrophe and consider consequences such as improving the risk protection.

The Elbe flood occurrence rate since 1021 was estimated by Mudelsee et al. (2003). This paper and Mudelsee et al. (2004) consider, besides climatological influences, the following other potential factors: deforestation, solar activity variations, river engineering, reservoir construction and land-use changes. Analyses of flood risk, not only of the Elbe, benefit from considering seasonal effects. In many parts of central Europe, floods in the hydrological summer are caused by heavy rainfall, in the winter additionally by thawing snow (Fischer 1907; Grünewald et al. 1998). Breaking river ice may act as a barrier, severely enhancing winter floods (Grünewald et al. 1998). Elbe summer flood risk during the instrumental period (from 1852) does not show trends in the occurrence of heavy floods (Mudelsee et al. 2003). This season can therefore be analysed using a stationary model (Fig. 6.3). Elbe winter flood risk decreased significantly during the instrumental period (Fig. 6.7).

Volcanism and climate are coupled: a volcanic eruption releases material into the atmosphere, which changes the radiative forcing and leads generally to cooling. This and other mechanisms have been observed for the past millennium via proxy variables (Robock 2000). Volcanic influences on climate act also on longer timescales: the Holocene (Zielinski et al. 1994), the late Pleistocene (Zielinski et al. 1996) and the Pliocene (Prueher and Rea 2001). The results obtained with kernel occurrence rate estimation on sulfate data from the NGRIP ice core (Fig. 6.8), interval 10–110 ka, may be compared with the findings (Zielinski et al. 1996) from histogram estimation on sulfate data from the GISP2 ice core. These authors report elevated levels of activity during [6 ka; 17 ka] and [22 ka; 35 ka]. These time intervals, and possibly also that of another high during [55 ka; 70 ka] (Zielinski et al. 1996: Fig. 5 therein), agree qualitatively well with the results from NGRIP. Quantitative agreement (at maximum a few tens of eruptions per ka) is approached when adopting the more liberal detection threshold (Fig. 6.8a). The occurrence rate of volcanic eruptions, restricted to the tropical region and shorter timescales (period 1400–1998), was estimated by application of a parametric logistic model to sulfate records from ice cores (Ammann and Naveau 2003). These authors found indications for the existence of a cycle of 76-year period in occurrence rate and adapted the logistic model (Eq. 6.40) by adding a sinusoidal time-dependence.

A hurricane activity peak during medieval times was also found in proxy data in the form of overwash sediment records from sites along the North American East Coast (Mann et al. 2009), confirming the previous finding by Besonen et al. (2008). A hurricane is a tropical cyclone in the North Atlantic-West Indies region with near-surface wind speed equal to or larger than 64 knots or about 119 km h⁻¹ (Elsner and Kara 1999). There is a considerable, partly heated debate in the scientific literature, before and after the Katrina hurricane in August 2005, on the trend in hurricane risk during the twentieth century. Papers on data and analysis include Landsea (1993), Bengtsson et al. (1996), Landsea et al. (1996, 1997), Michener et al. (1997), Wilson (1997), Pielke and Landsea (1998), Elsner et al. (1999), Landsea et al. (1999), Easterling et al. (2000), Meehl et al. (2000), Goldenberg et al. (2001), Cutter and Emrich (2005), Emanuel (2005), Pielke et al. (2005), Elsner (2006), Mann and Emanuel (2006), Chang and Guo (2007), Holland (2007), Landsea (2007), Mann et al. (2007a,b), Nyberg et al. (2007), Elsner et al. (2008), Landsea et al. (2008), Vecchi and Knutson (2008), Knutson et al. (2010) and Landsea et al. (2010). While the issue of the trend seems not resolved, it appears clear that (1) economic losses are not a good proxy variable of hurricane occurrence or magnitude and (2) there is room for enhancing the analyses by means of advanced statistical methods.

Heatwaves are events of extreme temperature lasting several days to weeks. An example is the summer heat 2003 in Europe (Beniston 2004). To capture the intensity and duration aspects of a heatwave, various index variables (Kysely 2002; Meehl and Tebaldi 2004; Khaliq et al. 2005; Alexander et al. 2006; Della-Marta et al. 2007) can be constructed from measured daily temperature series. A direct approach is the exceedance product (Kürbis et al. 2009), an index variable formed by multiplying the exceedance of a previous record temperature by the number of days an exceedance occurs within a summer season. Kürbis et al. (2009) devise a hypothesis test based on MBB resampling to evaluate trends in the exceedance product and apply it to long instrumental records from Potsdam (1893–2005) and Prague-Klementinum (1775–2004). An open research field is the analysis of the distributional properties of functionals like the heatwave index variables within the context of multivariate extremes (Beirlant et al. 2004: Chapters 8 and 9 therein). In an application to daily minimum temperature from a station in Ohio, interval 1893–1987, Smith et al. (1997) studied various functionals such as the length of a cluster of cold extremes.

Applications of a fitted inhomogeneous Poisson process with bootstrap confidence band to extreme events in the climate system include the following. Solow (1991) studied explosive volcanism in the northern hemisphere, 1851–1985, and linked the upwards trend in occurrence rate to the increase in northern hemisphere temperature. Mudelsee et al. (2006) estimated the flood risk of the German river Werra over the past 500 years and found trends that partly deviate from those of the neighbouring rivers Elbe and Oder (Mudelsee et al. 2003). This demonstrates the spatial variability of river flood risk. Fleitmann et al. (2007b) explored, via Ba/Ca proxy evidence from a coral, events of extreme soil erosion in Kenya, 1700–2000, and detected upwards trends that set in around 1900, after the colonization. Girardin et al. (2006b) inferred dendroclimatically a record of wildfires in Canada that goes back to 1769. Augmenting this data set with other series from the region and climate model output, Girardin and Mudelsee (2008) studied past and possible future (up to 2100) trends in wildfire risk and concluded that past high levels (λ(T) ≈ 0.2 a⁻¹) may again be reached. Abram et al. (2008) explored the Indian Ocean Dipole (IOD, east-west sea-surface temperature gradient), 1846–2008, using coral proxy evidence and found an increase in the occurrence of extreme IOD events during the past decades.

6.6 Technical issues

Maximum likelihood estimation of the GEV distribution has the following regularity conditions (Smith 1985):

■ for ξ > −0.5, the estimators have the asymptotic properties of multivariate normality, with the covariance matrix as described earlier in this chapter;

■ for −1 < ξ < −0.5, the estimators may exist but do not have the asymptotic properties;

■ for ξ < −1, consistent maximum likelihood estimators do not exist.

The log-likelihood function to be employed for the GEV model with ξ = 0 ("Gumbel likelihood") is (Coles 2001b)

ln [L(μ, σ)] = −m ln(σ) − Σ_{j=1}^{m} y(j) − Σ_{j=1}^{m} exp[−y(j)].   (6.59)
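Eq. (6.59) translates directly into code; a minimal sketch, taking raw data and computing the scaled values y(j) = (x(j) − μ)/σ internally:

```python
import math

def gumbel_negloglik(mu, sigma, x):
    """Negative of the Gumbel log-likelihood, Eq. (6.59): the xi = 0
    limit of the GEV model, with y(j) = (x(j) - mu) / sigma."""
    m = len(x)
    y = [(xj - mu) / sigma for xj in x]
    return m * math.log(sigma) + sum(y) + sum(math.exp(-v) for v in y)
```

Minimizing this function over (μ, σ), for example with the simplex method described below, gives the Gumbel maximum likelihood estimates.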

Kharin and Zwiers (2005: Appendix therein) describe details (starting values, local minima) of the numerical maximization of the log-likelihood function of the GEV model. Van Montfort and Witter (1985: Appendix B therein) do the same for the GP model.

The digamma function ψ(x) is the logarithmic derivative of the gamma function, ψ(x) = d ln [Γ(x)] /dx. See Abramowitz and Stegun (1965: Section 6.3 therein) for more details on the digamma function.

The simplex method is a numerical search technique applicable to optimization problems (Press et al. 1992: Section 10.4 therein) such as high-dimensional maximum likelihood estimation. Consider a space of dimension (number of estimation parameters) k. A simplex is a non-degenerate geometric figure spanned by k + 1 points (starting values) in the space. The task is to move and shrink the simplex in the space in such a way that it encloses with sufficient precision the maximum likelihood solution. The method does not perform gradient calculations to decide how to move or shrink; it selects among possible steps in a more brute-force manner. It may be slower than gradient search techniques but, on the other hand, also more robust.
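A minimal illustration of the simplex (Nelder-Mead) search, here applied to the classic Rosenbrock test function via `scipy.optimize` (my own example); note that no derivatives are supplied:

```python
from scipy.optimize import minimize

# Rosenbrock function: a standard test problem with its minimum at (1, 1)
rosen = lambda p: (1.0 - p[0])**2 + 100.0 * (p[1] - p[0]**2)**2

res = minimize(rosen, x0=[-1.2, 1.0], method='Nelder-Mead',
               options={'xatol': 1e-8, 'fatol': 1e-8})
```

In a likelihood setting, `rosen` would be replaced by the negative log-likelihood of the GEV or GP model, and `x0` by the starting parameter values spanning the initial simplex.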

Gaussian kernel functions for occurrence rate estimation offer the advantage that Eq. (6.33) can be computed fast in the Fourier domain (Silverman 1982; Jones and Lotwick 1984). Fourier transform algorithms (FFT) are abundant (Monro 1975, 1976; Press et al. 1996).
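A sketch of grid-based kernel occurrence rate estimation with an FFT-based convolution (`scipy.signal.fftconvolve`): binning the event times first turns the kernel sum into a discrete convolution. Grid, bandwidth and simulated data are illustrative choices of mine.

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(2)
tout = np.sort(rng.uniform(0.0, 100.0, size=300))  # event times, rate ~3 per unit

h = 2.0                                   # kernel bandwidth
grid = np.linspace(0.0, 100.0, 1001)      # estimation grid
dt = grid[1] - grid[0]

# bin the event times onto the grid
edges = np.append(grid - dt / 2.0, grid[-1] + dt / 2.0)
counts, _ = np.histogram(tout, bins=edges)

# sampled Gaussian kernel, truncated at +/- 5 bandwidths
ks = np.arange(-5.0 * h, 5.0 * h + dt, dt)
kern = np.exp(-0.5 * (ks / h)**2) / (h * np.sqrt(2.0 * np.pi))

# lam[i] approximates sum_j K_h(grid[i] - tout[j]), the occurrence rate
lam = fftconvolve(counts, kern, mode='same')
```

Away from the interval boundaries, lam should fluctuate around the true rate; boundary bias near the edges is a separate issue (pseudodata), treated elsewhere in the chapter.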

Cross-validation function evaluation for kernel occurrence rate estimation (Eqs. 6.37 and 6.38) is computationally expensive. The second term on the right-hand side of Eq. (6.37) constitutes a sum of exponentials over a rectangle (j = 1, ..., m; k = 1, ..., m′). Because of the symmetry, only approximately half of the summands have to be determined. The summands near the upper left or lower right corner of the rectangle are small (∝ exp{−[(T_out(j) − T′_out(k))/h]²/2}); the summands near the 1:1 line are around unity. The following approximation could in principle reduce computing costs further: calculate the summands only in the intermediate range, set the summands near ("near" defined by machine precision) the 1:1 line equal to unity, and omit the summands near the two corners. However, for typical sample sizes, m, in climatology (less than a few thousand) and typical machine precisions (PC and workstation systems with 32- or 64-bit processors), the reduction is negligible (Mudelsee 2001, unpublished manuscript).

Software tools for fitting stationary extreme value distributions to data are abundant, while programs for estimating nonstationary extreme value models are rare.

MLEGEV is a Fortran subroutine (Hosking 1985; Macleod 1989) for maximum likelihood estimation of the parameters of the stationary GEV model. It serves as a basis for many software tools developed later. A download site is http://lib.stat.cmu.edu/apstat/215 (14 July 2008).

Statistical Modelling in Hydrology is the title of a book (Clarke 1994) that contains Genstat and Matlab programs implementing various estimation methods for stationary extreme value distributions.

Xtremes (Reiss and Thomas 1997) is a compiled Windows software package for analysing stationary extreme value models by means of several estimation methods, bootstrap resampling and model suitability tests.

Flood Frequency Analysis is the title of a book (Rao and Hamed 2000) that includes Matlab programs for maximum likelihood and PWM estimation of stationary GEV and GP distributions.

WAFO is a Matlab package (WAFO group 2000) that includes maximum likelihood and PWM estimation of stationary GEV and GP distributions. The software can be downloaded from the following site: http://www.maths.lth.se/matstat/wafo (7 July 2008).

The ismev package for the R computing environment supports the computations carried out in the book by Coles (2001b). It is available at http://cran.r-project.org/web/packages/ismev (7 July 2008).

The evd package for the R computing environment augments ismev. It is available at http://cran.r-project.org/web/packages/evd (7 July 2008).

EVIM is a Matlab package (Gencay et al. 2001) for stationary extreme value analysis: declustering, fitting GEV and GP models and assessing suitability. It is available at the following internet address: http://www.bilkent.edu.tr/~faruk/evim.htm (7 July 2008).

Dataplot is a software package (Unix, Linux, Windows) for fitting stationary extreme value distributions with bootstrap CIs and performing model suitability analysis. It can be obtained from the following internet address: http://www.itl.nist.gov/div898/winds/dataplot.htm (4 July 2008).

Extremes is a software tool (R language), based on ismev and evd routines, for interactively analysing stationary extreme value models. It is available at http://www.isse.ucar.edu/extremevalues/evtk.html (4 July 2008).

GEVFIT is a module (Stata computing environment) for maximum likelihood estimation of a GEV model. It resides on the following internet address: http://ideas.repec.org/c/boc/bocode/s456892.html (4 July 2008).

The declustering method for GP estimation (Fawcett and Walshaw 2006) was implemented as an R code. It is available at the internet address http://www.mas.ncl.ac.uk/~nlf8 (25 May 2010).

VGAM is a mixed package (C, Fortran 77 and 90, S-Plus/R) for fitting a wide class of regression models, so-called vector generalized additive models (Yee and Wild 1996), to time series. This includes not only estimation of stationary extreme value distributions but also quantile regression (nonstationarity). The software can be downloaded from http://www.stat.auckland.ac.nz/~yee/VGAM (7 July 2008).

Statistics of Extremes is the title of a book (Beirlant et al. 2004) that is accompanied by a set of routines written in S-Plus and FORTRAN 77. Besides fitting stationary models and estimating distribution parameters and quantiles, the routines for Chapter 7 of the book allow for covariates and may be used for fitting nonstationary models. The software resides at http://lstat.kuleuven.be/Wiley (7 July 2008).

Caliza is a Fortran 90 program for fitting a nonstationary inhomogeneous Poisson process with bootstrap confidence band to POT data. It includes CLIM-X-DETECT for threshold selection and extremes detection (Chapter 4). Caliza also performs the Cox-Lewis test for trends in the occurrence of extreme events. A demo version is available at the web site for this book.

Part III

Bivariate Time Series

Chapter 7
