Evidence That Plant Disease Patterns Have Changed Due To Climate Change

If patterns of plant disease in an area have shifted at the same time that changes in climate are observed, when can this correlation be taken as evidence of climate change impacts on disease? Such an analysis is complicated by the number of factors that interact to produce plant disease. For example, if a disease becomes important in an area in which it was not important in the past, there are several possible explanations. The pathogen populations may have changed so that they can more readily infect and damage hosts. The pathogen species or particular vectors of the pathogen may be newly introduced to the area. In agricultural systems, host populations may have changed as managers have selected new cultivars based on criteria other than resistance to the disease in question. Management of the abiotic environment may have changed, such as changes in how commonly fields are tilled (tillage often reduces disease pressure), or changes in planting dates (which may result in more or less host exposure to pathogens). To rule out such competing explanations for changes in plant disease pattern, the argument for climate change as an important driver is strongest when (a) the pathogen is known to have been present throughout the area during the period in question, (b) the genetic composition of the pathogen and host populations has apparently not shifted to change resistance dynamics, (c) management of the system has not changed in a way that could explain the changes in disease pattern, (d) the climatic requirements of the pathogen and/or vector are well understood and better match the climate during the period of greater disease pressure, and (e) the change in disease pattern has been observed long enough to establish a convincing trend beyond possible background variation.
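The five criteria above can be sketched as a simple checklist. The criterion names and the count-based "strong/moderate/weak" grading below are illustrative assumptions for exposition, not a published attribution method.

```python
# Hypothetical checklist for grading a climate-change attribution argument
# using criteria (a)-(e) from the text. Names and thresholds are invented.

CRITERIA = (
    "pathogen_present_throughout_period",      # (a)
    "no_shift_in_pathogen_or_host_genetics",   # (b)
    "no_confounding_management_change",        # (c)
    "climate_requirements_match_new_climate",  # (d)
    "trend_exceeds_background_variation",      # (e)
)

def attribution_strength(evidence: dict) -> str:
    """Grade the argument by how many criteria are satisfied."""
    met = sum(bool(evidence.get(c)) for c in CRITERIA)
    if met == len(CRITERIA):
        return "strong"
    return "weak" if met <= 2 else "moderate"

print(attribution_strength({c: True for c in CRITERIA}))  # strong
```

In practice each criterion is itself a substantial research question, so such a checklist only organises the argument; it does not replace the underlying evidence.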

Even though the impacts of changes in temperature, humidity and precipitation patterns have been quantified, simulations of the potential impact of climate change remain just that: simulations. By their very nature, these simulations depend on the best available projections from meteorological models.

Real evidence for the impact of climate change on plant disease could come from verification of the accuracy of these projections. This would require long-term records of disease intensity for the regions where impacts are projected and for control regions. Long-term monitoring of pathogens and other plant-associated microbes is necessary in general to understand their ecology, and to develop predictions of their impact on plant pathology [35]. The lack of availability of long-term data about disease dynamics in natural systems, and even in agricultural systems, limits opportunities for analysis of climate change effects on plant disease [36,37].

Interannual variation in climatic conditions can have important effects on disease risk. For wheat stripe rust (caused by P. striiformis Westend. f. sp. tritici Eriks.) in the US Pacific Northwest, disease severity was lower in El Niño years than in non-El Niño years [38]. If climate change alters the frequency and/or the intensity of El Niño events [39] or other extreme weather events, it will also alter patterns of disease risk; knowledge of the associations between disease and climate cycles is needed to inform predictions about plant disease epidemics under climate change [38].
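The kind of association reported in [38] can be illustrated by grouping yearly severity observations by ENSO phase and comparing group means. The years and severity values below are invented toy data, not the published observations.

```python
# Illustrative comparison of stripe rust severity in El Nino vs other years.
# All numbers are made up for demonstration.
from statistics import mean

severity_by_year = {            # year: (el_nino?, severity index)
    1983: (True, 12.0), 1987: (True, 15.5), 1992: (True, 10.2),
    1984: (False, 30.1), 1989: (False, 27.4), 1994: (False, 33.0),
}

el_nino = [s for nino, s in severity_by_year.values() if nino]
other = [s for nino, s in severity_by_year.values() if not nino]
print(f"mean severity, El Nino years: {mean(el_nino):.1f}")   # 12.6
print(f"mean severity, other years:   {mean(other):.1f}")     # 30.2
```

A real analysis would of course require a formal test of the difference and many more years of data; with only a few years per group, apparent differences can easily be background variation.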

Some general historical analyses of the relationship between disease and environmental factors have been developed. For example, the first annual appearance of wheat stem rust (caused by Puccinia graminis Pers.:Pers. f. sp. tritici Eriks. and E. Henn.) was compared for cool (1968–1977) and warm (1993–2002) periods in the US Great Plains, but a significant difference in arrival date was not observed [40]. In the UK, the abundance of two different wheat pathogens shifted in close correlation with patterns of SO2 pollution during the 1900s [41,42]. For potato late blight, Zwankhuizen and Zadoks [43] have analysed epidemics in the Netherlands from 1950 to 1996 using agronomic and meteorological variables as predictors of disease severity. They found that some factors were associated with enhanced disease, such as greater numbers of days with precipitation, greater numbers of days with temperatures between 10 and 27 °C, and a relative humidity >90% during the growing season. Temperatures above 27 °C and higher levels of global radiation in the Netherlands appeared to reduce disease risk [43]. Baker et al. [44] evaluated late blight risk in central North America and found that the trends in climatic conditions should result in increased risk. Hannukkala et al. [45] evaluated late blight incidence and first appearance in Finland from 1933 to 2002, concluding that there was higher risk in more recent years. The comparison of years is complicated in this case by changes in the pathogen population and management practices. Increases in fungicide use were consistent with increased disease risk; records of pesticide use or other management change are one potential form of evidence for climate change impacts.
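The meteorological predictors used by Zwankhuizen and Zadoks [43] amount to counting favourable and unfavourable days over a growing season. The sketch below tallies such counts from daily weather records; the variable selection follows the factors cited above, but the function itself is an illustrative assumption, not the published model.

```python
# Hedged sketch: count the day-based late blight predictors discussed in the
# text (rain days, mild days 10-27 C, humid days RH > 90%, hot days > 27 C).

def late_blight_risk_factors(days):
    """days: iterable of (rain_mm, mean_temp_c, rh_percent) per day."""
    rain_days = sum(1 for rain, _, _ in days if rain > 0)
    mild_days = sum(1 for _, t, _ in days if 10 <= t <= 27)   # favourable
    humid_days = sum(1 for _, _, rh in days if rh > 90)       # favourable
    hot_days = sum(1 for _, t, _ in days if t > 27)           # unfavourable [43]
    return {"rain_days": rain_days, "mild_days": mild_days,
            "humid_days": humid_days, "hot_days": hot_days}

# Toy three-day season: one rainy mild humid day, one hot dry day,
# one rainy mild humid day.
season = [(5.0, 18, 95), (0.0, 30, 50), (1.2, 12, 92)]
print(late_blight_risk_factors(season))
```

Counts like these would then be entered as predictors in a regression against observed disease severity, which is where long-term records such as the 1950–1996 Dutch series become indispensable.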

Pathogens and insect pests of lodgepole pine (Pinus contorta) have been well studied and offer an interesting example of a potential climate change fingerprint. Lodgepole pine is the most widely distributed pine species in natural (unmanaged) forests in western North America [46], including forests in British Columbia, where there are more than 14 million ha of lodgepole pine [47]. Due to a lack of natural or human-mediated disturbances, lodgepole pine has been increasing in abundance in British Columbia since the 1900s [47,48]. Recently, there have been increased cases of decline of lodgepole pines in these forests, and researchers are evaluating the potential effects of climate change on these events.

Mountain pine beetle (Dendroctonus ponderosae) is a bark beetle native to western North American forests [49]. This beetle can infest many pine species, and lodgepole pine is a preferred host [46,48]. The distribution range has not been limited by availability of the host but by the temperature range required for beetle survival through the winter [46,50]. The beetle causes physiological damage to the host trees by creating tunnels (insect galleries) underneath the bark; in addition, microorganisms, such as the blue-stain fungi complex, can take advantage of these wounds to cause secondary infestation that may further reduce plant health [46,49]. Dead pines are not marketable and can also facilitate the spread of wildfire [51]. Beetle populations can be very low for many decades, but when there is an outbreak, a large area of susceptible hosts may be killed. The beetle is known to be native to British Columbia [48], but outbreaks were historically uncommon, probably due to low winter temperatures. However, there have been a series of outbreaks in recent years, and 8 million hectares in British Columbia were affected in 2004 [48,51]. Carroll et al. [50] evaluated the shift in infestation range and concluded that the trend toward warmer temperatures more suitable for the beetle is part of the reason for this series of outbreaks. Further, in a study by Mock et al. [48], genetic markers did not reveal any significant differences among beetle genotypes from inside and outside of British Columbia, indicating that the beetle population had not changed. Thus, other factors, including climate change, are likely to be the reason why there have been more outbreaks in northern areas.

Dothistroma needle blight is a fungal disease (causal agent Dothistroma septosporum) of a variety of pine species worldwide [52], including lodgepole pines. The disease is associated with mild temperature ranges (18 °C is the optimum temperature for sporulation [53]) and rain events [52,54], and causes extensive defoliation, mortality and a reduced growth rate in pine [52,55]. As with the mountain pine beetle, Dothistroma needle blight has been found in British Columbia in the past, but damage due to this disease was relatively minor. However, the number of cases and the intensity of epidemics in this region have increased since the late 1990s [55]. A study by Woods et al. [55] evaluated the relationship between these disease outbreaks and (i) regional climate change and (ii) long-term climate records (utilising the Pacific Decadal Oscillation, PDO, as an indicator variable). Although they found neither a substantial increase in regional temperature nor a significant correlation between the PDO and directional increases in precipitation or temperature, they did observe increased mean summer precipitation in the study area. The authors also found that in some locations, up to 40% of forest stands became dominated by lodgepole pine due to plantation development, and they hypothesised that a combination of increased rain events and the abundance of the favoured host was the probable cause of increased disease occurrence.

For both mountain pine beetle and Dothistroma needle blight, it is reasonable to assume that climate has influenced pathogen and pest behaviour; however, at the same time, there has been a substantial increase in the abundance of the host (lodgepole pine) in British Columbia [47,48]. Widely available and genetically similar hosts generally increase plant disease risk [56], and these factors may also explain at least part of the change in risk observed for lodgepole pine.

Another important disease that has exhibited recent changes in its pattern of occurrence is wheat stripe rust (or yellow rust, caused by the fungus P. striiformis f. sp. tritici). This disease decreased and then increased in importance in the US during the past century. Stripe rust was economically important from the 1930s to the 1960s, but the development of resistant wheat varieties successfully reduced the number of epidemic events. However, several epidemic events have been observed since 2000 [57,58]. The disease can cause 100% yield loss at a local scale [58], and epidemics in 2003 in the US resulted in losses estimated to total $300 million. Are these changes related to climate change?

Historically, P. striiformis f. sp. tritici was known to be active at relatively low temperatures. Under favourable conditions (i.e. with dew or free water on plant surfaces), its spores can germinate at 0 °C [59], and the temperature range for infection was measured as between 2 and 15 °C with an optimum temperature of 7–8 °C [60,61]. Spores could be produced between 0 and 24.5 °C [59]. This pathogen species was not well adapted to higher temperature conditions: disease development declined at temperatures above 20 °C [60–62], while spores produced at 30 °C were shown to be nonviable [59].
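The cardinal temperatures above (infection between 2 and 15 °C, optimum around 7–8 °C) can be expressed as a simple suitability curve. The piecewise-linear (triangular) response below is a common modelling convention assumed here for illustration; it is not taken from [59–62].

```python
# Sketch: temperature suitability (0-1) for infection by historical
# P. striiformis f. sp. tritici isolates, using the cardinal temperatures
# cited in the text. The triangular shape is an assumption.

T_MIN, T_OPT, T_MAX = 2.0, 7.5, 15.0  # infection range and mid-optimum (C)

def infection_suitability(temp_c: float) -> float:
    """Piecewise-linear suitability: 0 outside [T_MIN, T_MAX], 1 at T_OPT."""
    if temp_c <= T_MIN or temp_c >= T_MAX:
        return 0.0
    if temp_c <= T_OPT:
        return (temp_c - T_MIN) / (T_OPT - T_MIN)
    return (T_MAX - temp_c) / (T_MAX - T_OPT)

print(infection_suitability(7.5))   # 1.0 at the optimum
print(infection_suitability(20.0))  # 0.0 above the infection range
```

Curves of this kind are the building blocks of the disease risk simulations discussed earlier: shifting the input temperature series under a climate scenario shifts the modelled window of infection-favourable conditions.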

However, more recent populations of P. striiformis f. sp. tritici were adapted to warmer temperature ranges [63]. Isolates from the 1970s to 2003 were compared, and newer (post-2000) isolates had a significantly (P < 0.05) higher germination rate and shorter latent period (the period between infection and production of spores) than older isolates when incubated at 18 °C, whereas the isolates did not differ when incubated at 12 °C. In a follow-up study, Markell and Milus [64] examined isolates from the 1960s to 2004 with genetic markers and morphological comparisons, and found that isolates collected pre- and post-2000 could be classified into two different groups. Although fewer than nine polymorphic markers were identified within either population group, when pre- and post-2000 populations were compared there were 110 polymorphic markers [64]. The large difference between pre- and post-2000 groups led the authors to conclude that post-2000 isolates were introduced from outside of the US, rather than resulting from mutations in pre-2000 isolates.
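The marker comparison in [64] rests on a simple definition: a marker (locus) is polymorphic within a set of isolates if more than one allele occurs at it. The sketch below counts polymorphic loci within and between two groups; the genotype strings are invented toy data, not the published marker set.

```python
# Illustrative count of polymorphic markers within and between isolate groups,
# in the spirit of Markell and Milus [64]. Genotypes are toy data.

def polymorphic_markers(genotypes):
    """genotypes: list of equal-length allele strings, one per isolate."""
    n_loci = len(genotypes[0])
    return sum(1 for i in range(n_loci)
               if len({g[i] for g in genotypes}) > 1)

pre_2000 = ["AATG", "AATG", "AATC"]    # little variation within the group
post_2000 = ["GGCA", "GGCA", "GGTA"]   # little variation within the group
print(polymorphic_markers(pre_2000))              # 1
print(polymorphic_markers(post_2000))             # 1
print(polymorphic_markers(pre_2000 + post_2000))  # 4: groups differ at every locus
```

The pattern mirrors the published result in miniature: low polymorphism within each group but high polymorphism across groups points to two distinct populations rather than gradual divergence by mutation.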

Results from annual race surveys conducted by the United States Department of Agriculture, Agricultural Research Service, in Pullman, WA, indicated that pre-2000 isolates were not commonly collected in surveys after 2000 [64]. Thus, it seems that post-2000 isolates took the place of pre-2000 isolates. The question remains whether the success of post-2000 isolates is due to the change in climatic conditions (i.e. an increase in overall temperature) or something else. Since post-2000 isolates were better adapted to a warmer temperature range, climate change might have played a role in selection for the new isolates, but there is another important factor favouring post-2000 isolates. All post-2000 isolates examined were able to cause disease on wheat plants with resistance genes Yr8 and Yr9, whereas these resistance genes were effective at preventing disease by pre-2000 isolates [57,64]. There are other wheat varieties that are resistant to post-2000 isolates, but these varieties were less commonly grown since their resistance was not effective against older isolates. Thus, the ability of new isolates to overcome these resistance genes was most likely the major factor behind the drastic change in populations of P. striiformis f. sp. tritici and recent epidemic events.

In summary, there is no doubt that plant disease responds to weather and that changes in weather events due to climate change are likely to shift the frequency and intensity of disease epidemics. Simulated climate change experiments reveal changes in plant disease intensity and the profile of plant diseases. When evidence for climate change is sought in observed changes in plant disease patterns, conclusions are less clear. Since the search for fingerprints of climate change is correlative by nature, there may always be alternative predictors for the changes, but this seems particularly true for plant disease. It is a typical biological irony that, while plant disease risk may be particularly sensitive to climatic variables and climatic shifts, plant disease may also be particularly difficult to use as an indicator of climate change because of the many interactions that take place to result in disease. However, as more data sets are collected and synthesised [37], and climate patterns exhibit greater changes over a longer period, the impacts of climate change on plant disease are likely to become clearer.
