Article

Woody species do not differ in dormancy progression: Differences in time to budbreak due to forcing and cold hardiness

Publisher

National Academy of Sciences
DOI: 10.1073/pnas.2112250119

Keywords

dormancy; spring phenology; cold hardiness; budbreak; climate change

Funding

  1. Putnam Fellowship Program of the Arnold Arboretum


Budbreak, an important phenological phase in perennial plants, is influenced by dormancy, which remains poorly understood. Exposure to temperature, specifically chilling and forcing, is generally used to model budbreak. Cold hardiness is shown here to be a crucial aspect of dormancy that should be considered when studying dormancy and predicting budbreak. Rates of cold hardiness loss (deacclimation) vary among species, leading to different times to budbreak, and increase with the accumulation of chill. Inherent differences in deacclimation rates between species can be standardized using a deacclimation potential measurement. This finding contradicts previous estimates of dormancy transitions based on budbreak assays and emphasizes the need to understand cold hardiness dynamics when comparing dormancy control.
Budbreak is one of the most observed and studied phenological phases in perennial plants, but predictions remain a challenge, largely due to our poor understanding of dormancy. Two dimensions of exposure to temperature are generally used to model budbreak: accumulation of time spent at low temperatures (chilling) and accumulation of heat units (forcing). These two effects have a well-established negative correlation; with more chilling, less forcing is required for budbreak. Furthermore, temperate plant species are assumed to vary in chilling requirements for dormancy completion allowing proper budbreak. Here, dormancy is investigated from the cold hardiness standpoint across many species, demonstrating that it should be accounted for to study dormancy and accurately predict budbreak. Most cold hardiness is lost prior to budbreak, but rates of cold hardiness loss (deacclimation) vary among species, leading to different times to budbreak. Within a species, deacclimation rate increases with accumulation of chill. When inherent differences between species in deacclimation rate are accounted for by normalizing rates throughout winter by the maximum rate observed, a standardized deacclimation potential is produced. Deacclimation potential is a quantitative measurement of dormancy progression based on responsiveness to forcing as chill accumulates, which increases similarly for all species, contradicting estimations of dormancy transition based on budbreak assays. This finding indicates that comparisons of physiologic and genetic control of dormancy require an understanding of cold hardiness dynamics. Thus, an updated framework for studying dormancy and its effects on spring phenology is suggested where cold hardiness in lieu of (or in addition to) budbreak is used.
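
The normalization behind the deacclimation potential described in the abstract can be sketched as a simple ratio; the notation below is illustrative and not taken from the paper:

$$\Psi_{\mathrm{deacc}}(C) \;=\; \frac{k_{\mathrm{deacc}}(C)}{k_{\mathrm{deacc,max}}}$$

where $k_{\mathrm{deacc}}(C)$ is a species' rate of cold hardiness loss under forcing after chill accumulation $C$, and $k_{\mathrm{deacc,max}}$ is the maximum deacclimation rate observed for that species over the winter. The standardized quantity $\Psi_{\mathrm{deacc}}$ thus ranges from near 0 (buds largely unresponsive to forcing) to 1 (fully responsive) and, per the abstract, increases with chill accumulation in a similar fashion across species.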


