The DNA molecule degrades over time, just like other cellular components, if it is not repaired. Degradation is often relatively fast: fossil remains only a few hundred years old can contain little or no amplifiable endogenous DNA. A basic question in ancient DNA research is therefore "how long can DNA and cells survive?" The question is not easily answered because survival depends on numerous interacting factors, but theoretical considerations and empirical studies suggest a maximum DNA survival of 50,000 to 1 million years.

Temperature is clearly an important factor, because low temperatures and dry conditions slow the chemical processes that degrade DNA. Given that reaction rates generally drop by an order of magnitude for every 10°C drop in temperature, colder environments are naturally better suited to the long-term preservation of DNA (Smith et al. 2001). Other natural processes that accelerate degradation of the DNA molecule include endogenous and exogenous nucleases, as well as hydrolysis (Lindahl 1993; Handt et al. 1994; Hofreiter et al. 2001a).

Despite this predicted maximum age, several studies have claimed to extract DNA many millions of years old, while others have failed to amplify DNA of very recent origin. How do we explain this discrepancy? On the one hand, we know that DNA degrades over time and that fossil remains can contain very little or no DNA. When so little endogenous DNA is present, studies are very prone to contamination, which can yield false-positive results.
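The temperature scaling described above can be sketched numerically. This is a minimal illustration, assuming the tenfold rate change per 10°C step quoted in the text is applied as a simple Q10-style scaling factor; the function names and the 20°C reference temperature are hypothetical choices for the example, not values from the source.

```python
def relative_degradation_rate(temp_c: float, ref_temp_c: float = 20.0,
                              fold_per_10c: float = 10.0) -> float:
    """DNA degradation rate relative to the rate at ref_temp_c.

    Assumes the rate changes by fold_per_10c for every 10 degree C step,
    i.e. the "order of magnitude per 10 C" figure from the text.
    """
    return fold_per_10c ** ((temp_c - ref_temp_c) / 10.0)


def relative_survival_time(temp_c: float, ref_temp_c: float = 20.0,
                           fold_per_10c: float = 10.0) -> float:
    """Expected DNA survival time relative to ref_temp_c.

    Survival time is taken as the inverse of the degradation rate.
    """
    return 1.0 / relative_degradation_rate(temp_c, ref_temp_c, fold_per_10c)


# A burial site at 0 degrees C is two 10-degree steps colder than one at
# 20 degrees C, so under this assumption DNA should persist about 100x longer.
print(relative_survival_time(0.0))
```

Under these assumptions, a cold, dry cave or permafrost deposit would preserve amplifiable DNA orders of magnitude longer than a tropical site, which is consistent with the preservation pattern the text describes.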