Thongchai Thailand

Archive for November 2019






  1. The claimed causal connection between AGW climate change and the destructiveness of tropical cyclones did not emerge from the science nor from an extensive study of historical data but from an unlikely event in 2005, when Hurricane Katrina damaged a levee system that had not been properly maintained. The damage to the levee caused catastrophic flooding of New Orleans, and that flood became the signature issue in the destructiveness of Hurricane Katrina, as seen in the 2005-2009 reports in Paragraph#2 below. The role of levee management in the destruction was downplayed and forgotten, and the entire destruction was thus attributed to fossil fueled climate change, with the subsumed climate action lesson of Katrina being that such destruction can and must be attenuated by reducing and eventually eliminating the use of fossil fuels.
  2. Hurricane Katrina Historical Notes: (i) HURRICANE KATRINA IS THE HARBINGER OF WORSE YET TO COME: The IPCC claimed that it had the scientific evidence to prove that our use of fossil fuels caused Hurricane Katrina and forecast with great certainty that there was more to come in the 2006 hurricane season, but the 2006 hurricane season turned out to be milder than normal. The IPCC blamed the dissipation of El Nino for the mild hurricane season in 2006 and issued a new warning that 2007 would be the hottest year on record and would bring a killer hurricane season worse than 2005, but the 2007 forecast also failed. The IPCC’s dream hurricane season has finally arrived in 2008, unannounced and unexpected, with strong hurricanes Gustav and Hanna expected to be followed by Ike and a dozen others before the season is through. More info: [LINK] . (ii) The IPCC’s claim that Hurricane Katrina was caused by man-made global warming has been thoroughly discredited, and its forecasts of more severe hurricane seasons in 2006 and 2007 have been proven wrong. They are merchants of fear and their method is the dissemination of convenient lies. More info: [LINK] . (iii) Climate science shows that AGW climate change is increasing the frequency and severity of extreme weather, as for example Hurricane Katrina. Further research shows a causal link between AGW and increasing wave intensity that provides direct evidence of the extreme weather impacts of AGW. More info: [LINK] . (iv) 2009: In 2005 climate science declared that Hurricane Katrina was the harbinger of killer super storms yet to come, created by fossil fueled global warming, but after no further cyclone activity in the North Atlantic Basin in the next three years, new evidence for the destructiveness of AGW extreme weather was found in Cyclone Nargis in the North Indian Basin. Though not unusually strong, Nargis did create a freak storm surge in rising tides that swept up the Irrawaddy River in Burma and claimed a horrific death toll. Nargis thus became an AGW issue, and climate scientists shifted their extreme weather focus from the North Atlantic Basin to the North Indian Basin, declaring Cyclone Nargis a creation of climate change caused by fossil fuel emissions and the harbinger of “destruction on a global scale” by human caused global warming. More [LINK] .
  3. The Climate Science of Hurricane Katrina: It was thus that the climate science of the destructiveness of hurricanes did not predict Katrina but was in fact constructed from Katrina, based only on the destructiveness of the flood caused by the break in the levee system and by discounting the broader climatology data showing that it was Wilma, not Katrina, that was by far the stronger hurricane of the 2005 season; but Katrina, not Wilma, was clearly the better tool to sell AGW’s fear based climate action agenda. This relationship between climate science and real science is seen more clearly in the foundational and keynote paper by noted MIT climate scientist Professor Kerry Emanuel, reviewed in detail in a related post [LINK] . There it is shown that the need by climate science to establish the fear of climate change in terms of hurricanes made it possible for Professor Emanuel to abandon all pretension to scientific principles and statistical rigor and publish in a peer reviewed journal a circular reasoning paper that begins with the assumption that AGW increases the destructiveness of hurricanes and then proves that AGW increases the destructiveness of hurricanes [LINK] .
  4. Yet another issue is that the single minded focus on the North Atlantic Basin (NA) for the detection of the impact of climate change on tropical cyclone destructiveness, during periods when NA is unusually active, is a form of circular reasoning; particularly so because NA by itself is not a globally important source of cyclone energy. An additional consideration is the finding by Knutson (2010) and others that the year-to-year variance of total cyclone energy in a single basin is too large for meaningful conclusions about trends, and that only the aggregate of all six basins can contain useful trend information.
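Knutson's aggregation point can be illustrated numerically: summing several independent basins shrinks the relative year-to-year variance of the total. A minimal synthetic sketch (the per-basin mean and standard deviation below are hypothetical placeholders, not measured values):

```python
import numpy as np

rng = np.random.default_rng(7)
years, basins = 70, 6
# Hypothetical annual cyclone energy per basin: mean 100 units, sd 40
ace = rng.normal(100, 40, size=(years, basins))

# Coefficient of variation (sd/mean) for one basin vs the six-basin total
cv_single = ace[:, 0].std() / ace[:, 0].mean()
cv_total = ace.sum(axis=1).std() / ace.sum(axis=1).mean()
print(f"single-basin CV = {cv_single:.2f}, six-basin CV = {cv_total:.2f}")
```

For independent basins the relative variance of the aggregate falls by roughly a factor of the square root of the number of basins, which is why a single noisy basin can mask a trend that the global aggregate could reveal.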
  5. There are six ocean basins in the world where tropical cyclones form. These are the West Pacific (WP), South Indian Ocean (SI), East Pacific (EP), North Atlantic (NA), North Indian Ocean (NI), and the South Pacific (SP). The most intense and active basin is the West Pacific Basin, where tropical cyclones are called Typhoons. The North Atlantic Basin, where tropical cyclones are called Hurricanes, is a lesser basin and not a significant source of total global cyclone energy. Of the other four basins, the South Indian Ocean basin is the most active. Together, WP and SI generate more than 60% of the total global cyclone energy, with the East Pacific and the North Atlantic together coming in second with about 25% of the world’s cyclone energy. The North Atlantic alone generates about 14% of the world’s cyclone energy. The details of this comparison are tabulated in Paragraph#6.
  7. Since AGW climate change is proposed as a global phenomenon, its effect on tropical cyclones must be studied and understood only in terms of global measures of tropical cyclone activity, and not in terms of convenient localized phenomena that fit the narrative or that might derive from a USA bias of American researchers and news media. This line of inquiry is presented in three related posts where the global impact of AGW on tropical cyclone activity is measured as a global aggregate of all six cyclone basins.
  8. Study#1: Trends [LINK] . The trend study presents data for total cyclone energy for all six basins for the 70-year study period 1945 to 2014. The object variable is the Accumulated Cyclone Energy (ACE), used as a measure of total cyclone energy. Knutson (2010) and others have suggested that year to year variance in cyclone energy is too large and random for meaningful interpretation and recommended a decadal time scale for the study of tropical cyclone trends. Accordingly, the total global ACE for all six cyclone basins is computed for each of the seven decades in the 70-year study period. Trend analysis is carried out by comparing each decade against the other six. The results are summarized in the Table presented in Paragraph#9 below. They show only two statistically significant differences. Specifically, we find that Decade#5 (1985-1994) and Decade#6 (1995-2004) show higher total global cyclone energy than Decade#1 (1945-1954). No other statistically significant difference is found among the seven decades studied.
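The decade-by-decade comparison described above can be sketched as pairwise two-sample tests. The ACE values below are synthetic placeholders (the study's actual data are behind its link), and the t-statistic cutoff of 2.1 is an illustrative threshold rather than an exact critical value:

```python
import numpy as np
from itertools import combinations

def welch_t(x1, x2):
    """Welch's t statistic for two independent samples."""
    n1, n2 = len(x1), len(x2)
    v1, v2 = np.var(x1, ddof=1), np.var(x2, ddof=1)
    return (np.mean(x1) - np.mean(x2)) / np.sqrt(v1 / n1 + v2 / n2)

rng = np.random.default_rng(0)
# Hypothetical annual global ACE totals: 10 years per decade, 7 decades
decades = {f"Decade#{i + 1}": rng.normal(700, 120, 10) for i in range(7)}

# Each decade is compared against the other six: 21 pairwise tests
pairs = list(combinations(decades, 2))
results = {(a, b): welch_t(decades[a], decades[b]) for a, b in pairs}
flagged = [(a, b) for (a, b), t in results.items() if abs(t) > 2.1]
print(f"{len(pairs)} pairwise comparisons, {len(flagged)} flagged as significant")
```

With real data the flagged pairs would be checked against a proper t or permutation p-value; the point here is only the structure of the 21 pairwise decade comparisons.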
  9. It is tempting here to conclude that the higher global cyclone energy in the two recent decades from 1985 to 2004 than in the decade 1945-1954 can and should be attributed to AGW climate change, but there are other well understood considerations that explain this difference. It is well established and generally accepted in the tropical cyclone research community that the earliest decade in this study, 1945-1954, suffered from a measurement bias such that not all tropical cyclones were detected, and of those that were, not all were adequately measured. In other words, the early data are incomplete, and the incompleteness of the data provides a stronger and more rational explanation of the observed statistically significant difference in total cyclone energy. We conclude from these results that the data do not show an impact of AGW climate change in the form of increasing the destructiveness of tropical cyclones.
  10. Study #2: SST: [LINK] . Sea surface temperature (SST) is the link that connects climate change with tropical cyclone activity, the proposition being that higher SST provides more energy for the tropical cyclones that form over warm water. Cyclone theory tells us that cyclone formation and intensification are related to SST (Vecchi, 2007) (Knutson, 2010). Testable implications of the theory for empirical research are derived from climate model simulations (Knutson, 2010). Knutson’s work also suggests that the high variance in tropical cyclone activity at an annual time scale, or for any single cyclone basin, means that data analysis must be carried out on a global basis for all six tropical cyclone basins and at time scales longer than annual. Detrended correlation analysis for total cyclone energy and SST is carried out at a decadal time scale for 1945-2014. The results are tabulated in Paragraph#12. They show that the high correlation seen between total global cyclone energy (ACE) and global sea surface temperature (SST) derives from a rising trend in both time series and not from a responsiveness of ACE to SST at a decadal time scale.
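Detrended correlation analysis of the kind described above can be sketched in a few lines. The two synthetic series below share only a linear trend: the raw correlation is high, but once each series is detrended the correlation collapses, which is exactly the distinction the study draws. All numbers here are illustrative, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(70)                           # e.g. years 1945-2014
sst = 0.01 * t + rng.normal(0, 0.05, 70)    # rising trend + noise
ace = 5.0 * t + rng.normal(0, 60, 70)       # same trend shape, independent noise

def detrend(y, x):
    """Remove the ordinary least-squares straight-line fit from y."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

r_raw = np.corrcoef(sst, ace)[0, 1]
r_det = np.corrcoef(detrend(sst, t), detrend(ace, t))[0, 1]
print(f"raw r = {r_raw:.2f}, detrended r = {r_det:.2f}")
```

Detrending with a fitted straight line is the simplest choice; the same logic applies with any shared-trend removal, and a high raw correlation with a near-zero detrended correlation indicates that the correlation is carried by the common trend rather than by year-to-year responsiveness.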
  11. We conclude from the results presented in Paragraph#8 to Paragraph#12 that no evidence is found for the usual assumption in climate science that AGW climate change is intensifying tropical cyclone activity by way of SST.
  12. Study#3: Pre-Industrial: [LINK] . The fundamental theoretical basis for the theory of AGW climate change is a stark difference, in terms of climate, between “pre-industrial times” and the “era of the industrial economy” as assumed in climate science. A testable implication of the claimed impact of AGW climate change on tropical cyclones is that a comparison of the two eras should show a stark difference in tropical cyclone activity, with intense and destructive tropical cyclones absent from the pre-industrial record.
  13. The Treasure Coast Hurricanes of 1715 & 1733, The Dreadful Hurricane of 1667, The Calcutta Cyclone of 1737, The Great Hurricane of 1780, The Great September Gale of 1815, The Coringa Cyclone of 1839, The Last Island Hurricane of 1856, and The San Diego Hurricane of 1858 are described and presented as tropical cyclones with intensity and destructiveness comparable to the high profile hurricanes cited by climate science as evidence of the impact of AGW climate change. We conclude from the comparison that it provides no convincing evidence that destructive hurricanes of the kind cited by climate science as creations of AGW are unique to the industrial economy and could not have occurred in pre-industrial times.
  14. CONCLUSION: The data and their interpretation presented in these posts reveal serious weaknesses in the claim by climate science that the industrial economy has caused greater intensity and destructiveness of tropical cyclones by way of global warming and rising sea surface temperature.


  1. The lead author of the paper (photo above) is Adam Pellegrini PhD, Earth System Science Department, Stanford University, where he is a NOAA Climate and Global Change Postdoc Fellow. The paper (item#14 in the “Forest Fire Effects on Topsoil Chemistry” Bibliography below) reports findings on the effect of forest fires on soil chemistry. It is an extensive study across 48 globally distributed sites and three different vegetation types: grassland, broadleaf forest, and needleleaf forest. The authors studied the effects of forest fires on the chemistry of topsoil, the uppermost loose layer of the soil, about six inches in depth and normally rich in life and organic matter. Unlike the other studies of forest fires listed below, where effects of prescribed fires and wildfires are reported, the Pellegrini 2018 study is experimental: forest fires in selected regions and forest types are intentionally started by the researchers at different frequencies in order to study the long term effects over a period of 65 years.
  2. They found that frequent fires in grasslands and broadleaf forests caused significant declines in carbon and nitrogen in topsoil. No significant effect was found for frequent fires in needleleaf forests or for infrequent fires in any of the three vegetation types. The findings are consistent with field data and with a dynamic computer model of global vegetation. The findings are thus taken as a validation of the dynamic global vegetation model (DGVM).
  3. Further study with the DGVM predicts that the long-term losses of soil nitrogen from more frequent burning may lower the ability of the topsoil to perform its carbon sequestration function. This function is relevant to the carbon cycle and climate change because healthy topsoil sequesters 20% of the carbon released by wildfire activity in forests and grasslands.
  4. The relevance of this finding to the climate change issue is that if wildfire frequency is increased by climate change, the expected loss of the topsoil’s ability to sequester carbon may act as a feedback, increasing atmospheric CO2 concentration and thereby accelerating climate change and, with it, the rise in wildfire frequency.
  5. The assumed causal connection, that AGW climate change increases wildfire frequency, is derived from the various works (2006-2011) of Leroy Westerling, Professor of Climatology at the University of California, Merced, and some later works by other authors. The references are listed in the “climate change wildfire” bibliography below. These references show that ….
  6. In certain specific regions (e.g. California), but not in others, wildfires have increased since the mid-1980s, while at the same time AGW climate change was causing increased warmth and desiccation that could enhance wildfire conditions. These relationships are taken as evidence that AGW climate change did in fact cause an increase in wildfires. The weaknesses in this argument are many, as listed below.
  7. (i) Evidence of the effect of global warming on wildfire frequency or severity is not established globally; rather, specific regions where rising devastation by wildfires is known to have occurred are selected for evaluation. (ii) That variable y is rising while at the same time variable x is also rising establishes neither correlation nor causation, even when x and y could have a causal relationship; yet this is the sole argument presented for the attribution of wildfire severity and/or frequency to AGW, other than the rationalization that AGW is expected to cause increased warmth and desiccation. (iii) Other concomitant factors are not considered, such as changes in California logging regulations made around the time the Spotted Owl was declared an endangered species threatened by logging. Logging in California’s wilderness was banned. At the same time, prescribed forest management fires were banned or severely curtailed. These management changes also occurred in the late 1980s and early 1990s, but they have been removed from consideration to make way for a single minded focus on a pre-determined causal relationship between AGW climate change and wildfires. Even more egregious, if the wildfire devastation in California is in fact related to the failure of forest management by way of inadequate prescribed fires, the Pellegrini 2018 implication that prescribed fires are bad because of lost carbon sequestration is the exact opposite of the forest management change needed in California. (iv) Computer modeling of the impact of AGW climate change on wildfires will of course show that relationship because the relationship has been programmed into the model. Such results serve only to show that the model works the way it is supposed to work and cannot be presented as evidence that the observed increase in California wildfire devastation since the 1990s was caused by AGW. Computer models of expected theoretical relationships are an expression of the theory itself and cannot also serve as the data with which to test the theory.
  8. The works listed in the AGW Wildfire bibliography below, particularly those by Professor Westerling, are biased in this way. Results of modeling studies and climate theories of the impact of AGW climate change on wildfires have created a sense that the truth of the causal relationship is a given and that the observational evidence plays the minor role of providing the kind of data expected in going through the required formality for a causation the researchers have fully accepted a priori. In other words, that AGW climate change increases wildfire devastation is treated as the null hypothesis. However, an empirical test of theory must be carried out in exactly the opposite way: the null hypothesis is the absence of the causal relationship, and sufficient, convincing, unbiased evidence must be presented before that null hypothesis can be rejected.
  9. The finding in {Pellegrini, Adam FA, et al. “Fire frequency drives decadal changes in soil carbon and nitrogen and ecosystem productivity.” Nature 553.7687 (2018)} that climate change causes loss of soil carbon sequestration, which in turn exacerbates climate change, contains two significant flaws. The first, as stated above, is that insufficient evidence is provided for the relationship that AGW climate change increases wildfire devastation. The second flaw in the Pellegrini 2018 paper is the interpretation of carbon sequestration loss in soils in terms of AGW climate change. AGW is a theory that the carbon in fossil fuels is extraneous and unknown to the current carbon cycle and climate system, such that this system of nature suffered an unnatural perturbation when humans dug up ancient carbon in fossil fuels and injected that extraneous carbon (carbon that does not belong in the current account of the carbon cycle) into the carbon cycle and climate system.
  10. A related issue is the importance of the Industrial Revolution, which divides climate history into the pre-industrial era (the reference climate relative to which AGW is measured as the impact of the industrial economy) and the post industrial era, in which fossil fuel emissions of the industrial economy are said to have created a new, unprecedented, and artificial climate regime. AGW climate change must be understood only in this context. Thus, the impact of wildfires in general is not an issue, because there were wildfires in pre-industrial times. Only the effect of those wildfires that would not have occurred without the industrial economy can be treated as a creation of fossil fuels and thus be related to AGW and used in climate change arguments.
  11. The argument made or implied by the authors is that the loss of carbon sequestration capacity of topsoil due to forest fires, whether prescribed control burns or wildfires, will serve to increase atmospheric CO2 concentration and thereby act as an AGW feedback loop: the more warming there is, the more forest fires there will be, the more soil sequestration capacity will be lost, and so still more warming, and so on. The statistical significance of this relationship must first be established in the context of the very high uncertainties in carbon cycle flows described in a related post [LINK] . The interpretation of this change must be made in the context of the AGW theoretical framework that AGW warming is the result of the perturbation of the carbon cycle with external carbon dug up from under the ground where it had been sequestered for millions of years.
  12. The related post linked above [LINK] shows extremely high uncertainties, as expected, since carbon cycle flows are not directly measurable but must be inferred. For example, the carbon cycle flow for land use change is listed as “Land use change, surface to atmosphere: Mean = 1.1, SD = 0.8”, implying a 90% confidence interval of -0.216 to 2.416 GT (IPCC AR5 2018). The relevance of the proposed forest fire effect must be shown in the context of these large uncertainties in unmeasurable carbon cycle flows.
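The quoted interval follows from normal-distribution arithmetic: mean ± 1.645 × SD gives a two-sided 90% interval. A quick check:

```python
# Check of the 90% confidence interval quoted above for the land-use
# carbon cycle flow (mean 1.1, SD 0.8), assuming a normal distribution.
mean, sd = 1.1, 0.8
z90 = 1.645  # two-sided 90% critical value of the standard normal
lo, hi = mean - z90 * sd, mean + z90 * sd
print(f"90% CI: {lo:.3f} to {hi:.3f}")  # 90% CI: -0.216 to 2.416
```

Note that the interval includes zero, which is the basis for the detectability argument that follows.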
  13. An analysis presented in related posts on this site shows that, given the large uncertainties in carbon cycle flows, fossil fuel emissions are statistically not detectable [LINK] [LINK] . Carbon cycle flows are not directly measurable but must be inferred. Given these uncertainties, a global estimate of the carbon sequestration lost due to forest fires, together with its uncertainty, must be shown to be detectable and relevant in the context of the carbon cycle and its extreme uncertainties. Until this additional information is made available, the findings of {Pellegrini, Adam FA, et al. “Fire frequency drives decadal changes in soil carbon and nitrogen and ecosystem productivity.” Nature 553.7687 (2018)}, though interesting, cannot be assumed to have practical implications for AGW climate change and the corresponding propositions for climate action.
  14. In summary, what we see in the Spotted Owl case is that an excess of environmentalist zeal can harm the environment. That pattern now emerges in climate change environmentalism.



This post was motivated by a related post on the WUWT site [LINK] 




Featured Author: Anthony Leroy Westerling, Professor of Climatology, UC Merced


  1. Fried, Jeremy S., Margaret S. Torn, and Evan Mills. “The impact of climate change on wildfire severity: a regional forecast for northern California.” Climatic Change 64.1-2 (2004): 169-191.  We estimated the impact of climatic change on wildland fire and suppression effectiveness in northern California by linking general circulation model (GCM) output to local weather and fire records and projecting fire outcomes with an initial-attack suppression model. The warmer and windier conditions corresponding to a 2 × CO2 climate scenario produced fires that burned more intensely and spread faster in most locations. Despite enhancement of fire suppression efforts, the number of escaped fires (those exceeding initial containment limits) increased 51% in the south San Francisco Bay area, 125% in the Sierra Nevada, and did not change on the north coast. Changes in area burned by contained fires were 41%, 41% and –8%, respectively. When interpolated to most of northern California’s wildlands, these results translate to an average annual increase of 114 escapes (a doubling of the current frequency) and an additional 5,000 hectares (a 50% increase) burned by contained fires. On average, the fire return intervals in grass and brush vegetation types were cut in half. The estimates reported represent a minimum expected change, or best-case forecast. In addition to the increased suppression costs and economic damages, changes in fire severity of this magnitude would have widespread impacts on vegetation distribution, forest condition, and carbon storage, and greatly increase the risk to property, natural resources and human life. [FULL TEXT PDF]
  2. Westerling, Anthony L., et al. “Warming and earlier spring increase western US forest wildfire activity.” Science 313.5789 (2006): 940-943.  Western United States forest wildfire activity is widely thought to have increased in recent decades, yet neither the extent of recent changes nor the degree to which climate may be driving regional changes in wildfire has been systematically documented. Much of the public and scientific discussion of changes in western United States wildfire has focused instead on the effects of 19th- and 20th-century land-use history. We compiled a comprehensive database of large wildfires in western United States forests since 1970 and compared it with hydroclimatic and land-surface data. Here, we show that large wildfire activity increased suddenly and markedly in the mid-1980s, with higher large-wildfire frequency, longer wildfire duration, and longer wildfire seasons. The greatest increases occurred in mid-elevation, Northern Rockies forests, where land-use histories have relatively little effect on fire risks and are strongly associated with increased spring and summer temperatures and an earlier spring snowmelt. In the Conclusions section of the paper the authors write “Robust statistical associations between wildfire and hydroclimate in western forests indicate that increased wildfire activity over recent decades reflects sub-regional responses to changes in climate. Historical wildfire observations exhibit an abrupt transition in the mid-1980s from a regime of infrequent large wildfires of short (average of 1 week) duration to one with much more frequent and longer burning (5 weeks) fires. This transition was marked by a shift toward unusually warm springs, longer summer dry seasons, drier vegetation (which provoked more and longer burning large wildfires), and longer fire seasons. Reduced winter precipitation and an early spring snowmelt played a role in this shift.
Increases in wildfire were particularly strong in mid-elevation forests. [LINK TO FULL TEXT DOWNLOAD]
  3. Scholze, Marko, et al. “A climate-change risk analysis for world ecosystems.” Proceedings of the National Academy of Sciences 103.35 (2006): 13116-13120.  We quantify the risks of climate-induced changes in key ecosystem processes during the 21st century by forcing a dynamic global vegetation model (DGVM) with multiple scenarios from 16 climate models and mapping the proportions of model runs showing forest/nonforest shifts or exceedance of natural variability in wildfire frequency and freshwater supply. Our analysis does not assign probabilities to scenarios or weights to models. Instead, we consider distribution of outcomes within three sets of model runs grouped by the amount of global warming they simulate: <2°C (including simulations in which atmospheric composition is held constant, i.e., in which the only climate change is due to greenhouse gases already emitted), 2–3°C, and >3°C. High risk of forest loss is shown for Eurasia, eastern China, Canada, Central America, and Amazonia, with forest extensions into the Arctic and semiarid savannas; more frequent wildfire in Amazonia, the far north, and many semiarid regions; more runoff north of 50°N and in tropical Africa and northwestern South America; and less runoff in West Africa, Central America, southern Europe, and the eastern U.S. Substantially larger areas are affected for global warming >3°C than for <2°C; some features appear only at higher warming levels. A land carbon sink of ≈1 Pg of C per yr is simulated for the late 20th century, but for >3°C this sink converts to a carbon source during the 21st century (implying a positive climate feedback) in 44% of cases. The risks continue increasing over the following 200 years, even with atmospheric composition held constant. [FULL TEXT PDF DOWNLOAD] .
  4. Westerling, A. L., and B. P. Bryant. “Climate change and wildfire in California.” Climatic Change 87.1 (2008): 231-249.  Wildfire risks for California under four climatic change scenarios were statistically modeled as functions of climate, hydrology, and topography. Wildfire risks for the GFDL and PCM global climate models (note: GFDL and PCM are two different GCM climate models) and the A2 and B1 emissions scenarios were compared for 2005–2034, 2035–2064, and 2070–2099 against a modeled 1961–1990 reference period in California and neighboring states. Outcomes for the GFDL model runs, which exhibit higher temperatures than the PCM model runs, diverged sharply for different kinds of fire regimes, with increased temperatures promoting greater large fire frequency in wetter, forested areas, via the effects of warmer temperatures on fuel flammability. At the same time, reduced moisture availability due to lower precipitation and higher temperatures led to reduced fire risks in some locations where fuel flammability may be less important than the availability of fine fuels. Property damages due to wildfires were also modeled using the 2000 U.S. Census to describe the location and density of residential structures. In this analysis the largest changes in property damages under the climate change scenarios occurred in wildland/urban interfaces proximate to major metropolitan areas in coastal southern California, the Bay Area, and in the Sierra foothills northeast of Sacramento. [FULL TEXT PDF]
  5. Cannon, Susan H., and Jerry DeGraff. “The increasing wildfire and post-fire debris-flow threat in western USA, and implications for consequences of climate change.” Landslides–disaster risk reduction. Springer, Berlin, Heidelberg, 2009. 177-190.  In southern California and the intermountain west of the USA, debris flows generated from recently-burned basins pose significant hazards. Increases in the frequency and size of wildfires throughout the western USA can be attributed to increases in the number of fire ignitions, fire suppression practices, and climatic influences. Increased urbanization throughout the western USA, combined with the increased wildfire magnitude and frequency, carries with it the increased threat of subsequent debris-flow occurrence. Differences between rainfall thresholds and empirical debris-flow susceptibility models for southern California and the intermountain west indicate a strong influence of climatic and geologic settings on post-fire debris-flow potential. The linkages between wildfires, debris-flow occurrence, and global warming suggests that the experiences in the western United States are highly likely to be duplicated in many other parts of the world, and necessitate hazard assessment tools that are specific to local climates and physiographies. [FULL TEXT PDF]
  6. Abatzoglou, John T., and Crystal A. Kolden. “Climate change in western US deserts: potential for increased wildfire and invasive annual grasses.” Rangeland Ecology & Management 64.5 (2011): 471-478.  The influence of climate change on future invasions depends on both climate suitability that defines a potential species range and the mechanisms that facilitate invasions and contractions. A suite of downscaled climate projections for the mid–21st century was used to examine changes in physically based mechanisms, including critical physiological temperature thresholds, the timing and availability of moisture, and the potential for large wildfires. Results suggest widespread changes in 1) the length of the freeze-free season that may favor cold-intolerant annual grasses, 2) changes in the frequency of wet winters that may alter the potential for establishment of invasive annual grasses, and 3) an earlier onset of fire season and a lengthening of the window during which conditions are conducive to fire ignition and growth furthering the fire-invasive feedback loop. We propose that a coupled approach combining bioclimatic envelope modeling with mechanistic modeling targeted to a given species can help land managers identify locations and species that pose the highest level of overall risk of conversion associated with the multiple stressors of climate change. [FULL TEXT PDF]
  7. Girardin, Martin P., et al. “Vegetation limits the impact of a warm climate on boreal wildfires.” New Phytologist 199.4 (2013): 1001-1011.  Strategic introduction of less flammable broadleaf vegetation into landscapes was suggested as a management strategy for decreasing the risk of boreal wildfires projected under climatic change. However, the realization and strength of this offsetting effect in an actual environment remain to be demonstrated. Here we combined paleoecological data, global climate models and wildfire modelling to assess regional fire frequency (RegFF, i.e. the number of fires through time) in boreal forests as it relates to tree species composition and climate over millennial time‐scales. Lacustrine charcoals from northern landscapes of eastern boreal Canada indicate that RegFF during the mid‐Holocene (6000–3000 yr ago) was significantly higher than pre‐industrial RegFF (ad c. 1750). In southern landscapes, RegFF was not significantly higher than the pre‐industrial RegFF in spite of the declining drought severity. The modelling experiment indicates that the high fire risk brought about by a warmer and drier climate in the south during the mid‐Holocene was offset by a higher broadleaf component. Our data highlight an important function for broadleaf vegetation in determining boreal RegFF in a warmer climate. We estimate that its feedback may be large enough to offset the projected climate change impacts on drought conditions. [FULL TEXT]
  8. Westerling, Anthony LeRoy. “Increasing western US forest wildfire activity: sensitivity to changes in the timing of spring.” Philosophical Transactions of the Royal Society B: Biological Sciences 371.1696 (2016): 20150178.  Prior work shows western US forest wildfire activity increased abruptly in the mid-1980s. Large forest wildfires and areas burned in them have continued to increase over recent decades, with most of the increase in lightning-ignited fires. Northern US Rockies forests dominated early increases in wildfire activity, and still contributed 50% of the increase in large fires over the last decade. However, the percentage growth in wildfire activity in Pacific northwestern and southwestern US forests has rapidly increased over the last two decades. Wildfire numbers and burned area are also increasing in non-forest vegetation types. Wildfire activity appears strongly associated with warming and earlier spring snowmelt. Analysis of the drivers of forest wildfire sensitivity to changes in the timing of spring demonstrates that forests at elevations where the historical mean snow-free season ranged between two and four months, with relatively high cumulative warm-season actual evapotranspiration, have been most affected. Increases in large wildfires associated with earlier spring snowmelt scale exponentially with changes in moisture deficit, and moisture deficit changes can explain most of the spatial variability in forest wildfire regime response to the timing of spring. [FULL TEXT]





  1. Shakesby, Richard A., et al. “Impacts of prescribed fire on soil loss and soil quality: an assessment based on an experimentally-burned catchment in central Portugal.” Catena 128 (2015): 278-293.  Prescribed (controlled) fire has recently been adopted as an important wildfire-fighting strategy in the Mediterranean. Relatively little research, however, has assessed its impacts on soil erosion and soil quality. This paper investigates hillslope-scale losses of soil, organic matter and selected nutrients before and after a ‘worst-case scenario’ prescribed fire in a steep, shrub-vegetated catchment with thin stony soil in central Portugal. Comparison is made with soil erosion measured: (1) on a nearby hillslope burned by wildfire and monitored at the hillslope scale; and (2) on long-unburned terrain at small-plot, hillslope- and catchment-scales. Hillslope-scale pre- and post-fire soil erosion was recorded over periods of 6 weeks to 5 months for (1) 9.5 months pre-fire and 27 months post-fire in the prescribed fire catchment, and (2) c. 3 years post-fire at the wildfire site. Organic matter content, pH, total N, K2O, P2O5, Ca2+ and Mg2+ were measured in the eroded sediment and in pre- and post-prescribed fire surface soil. Results indicate that: (1) both the prescribed fire and the wildfire caused expected marked increases in erosion compared with unburned terrain; and (2) the hillslope-scale post-prescribed fire soil losses (up to 2.41 t ha−1 yr−1) exceeded many reported plot-scale post-prescribed fire and post-wildfire erosion rates in the Mediterranean. By comparison, post-fire erosion for both fire types was less than that caused by some other forms of common soil disturbance (e.g. types of tillage) and even that on undisturbed shrubland in low rainfall areas of the region. 
Total estimated post-prescribed fire particulate losses of organic matter and nutrients represent only 0.2–2.9% of the content in the upper 2 cm of soil, suggesting only a modest fire effect on soil quality, although this may reflect in part a lack of extreme rainfall events following the fire. The longer-term implications for soil conservation of repeated prescribed fire in the Mediterranean are explored and future research priorities identified.
  2. Pellegrini, Adam FA, et al. “Fire alters ecosystem carbon and nutrients but not plant nutrient stoichiometry or composition in tropical savanna.” Ecology 96.5 (2015): 1275-1285.  Fire and nutrients interact to influence the global distribution and dynamics of the savanna biome (biome = a large naturally occurring community of flora and fauna, such as a forest), but the results of these interactions are both complex and poorly known. A critical but unresolved question is whether short‐term losses of carbon and nutrients caused by fire can trigger long‐term and potentially compensatory responses in the nutrient stoichiometry of plants, or in the abundance of dinitrogen‐fixing trees. There is disagreement in the literature about the potential role of fire on savanna nutrients, and, in turn, on plant stoichiometry and composition. A major limitation has been the lack of fire manipulations over time scales sufficiently long for these interactions to emerge. We use a 58‐year, replicated, large‐scale, fire manipulation experiment in Kruger National Park (South Africa) in savanna to quantify the effect of fire on (1) distributions of carbon, nitrogen, and phosphorus at the ecosystem scale; (2) carbon : nitrogen : phosphorus stoichiometry of above‐ and below-ground tissues of plant species; and (3) abundance of plant functional groups including nitrogen fixers. Our results show dramatic effects of fire on the relative distribution of nutrients in soils, but that individual plant stoichiometry and plant community composition remained unexpectedly resilient. Moreover, measures of nutrients and carbon stable isotopes allowed us to discount the role of tree cover change in favor of the turnover of herbaceous biomass as the primary mechanism that mediates a transition from low to high soil carbon and nutrients in the absence of fire. 
We conclude that, in contrast to extra‐tropical grasslands or closed‐canopy forests, vegetation in the savanna biome may be uniquely adapted to nutrient losses caused by recurring fire.
  3. Fultz, Lisa M., et al. “Forest wildfire and grassland prescribed fire effects on soil biogeochemical processes and microbial communities: Two case studies in the semi-arid Southwest.” Applied Soil Ecology 99 (2016): 118-128.  Fire is a natural disturbance that shapes many ecosystems. In semi-arid regions, where high temperatures and low soil moisture limit nutrient cycling and plant growth, fire is critical to supply nutrients and drive vegetation composition. We examined soil chemical and biological properties to assess the short-term impacts of wildfire and prescribed fires on soil functioning in semi-arid regions of Texas. Better understanding of soil organic matter transformation and nutrient cycling processes will aid land managers in predicting ecosystem recovery response post-fire. Soil samples were collected following both prescribed grassland fires in June of 2009 in Lubbock, TX and the April 2012 Livermore Ranch Complex Fire located in the Davis Mountains, TX. Prescribed fire samples (0–2.5 cm) were collected within 6 hours prior to burning and again at 0.5, 24, 48, and 168 hours post-fire to experimentally examine short-term influences of fire and fire frequency (1× vs. 2×) on soil carbon dynamics, inorganic nitrogen, and microbial community composition. Wildfire samples (0–5 cm) were collected two and six months following the wildfire. We evaluated the effects of three burn severity levels and sampled under three tree species (Juniperus deppeana, Pinus cembroides, and Quercus grisea). Within 0.5 h of the prescribed fire, CO2 flux, NH4+-N concentration and total microbial biomass (as estimated by total fatty acid methyl esters) increased. A shift in the microbial community from a predominance of fungi to Gram positive bacteria occurred immediately following the fire. Chemical shifts were short lived (decreased within 24 h), but the biotic shift to a dominance of Gram negative bacteria and actinomycetes was measured in samples collected after 168 h. 
Soil pH and NH4+-N concentration increased at two and six months following the wildfire. In contrast, soil organic matter content decreased at two months post wildfire which, in combination with abiotic conditions such as low moisture content (<3.3%), resulted in reduced soil microbial biomass and enzyme activity. Increased soil moisture six months post fire created more favorable conditions for nitrification resulting in increased NO3-N concentration (0.8 to 36.1 mg NO3-N kg−1 soil), particularly following high severity fire. Prescribed fire did not have lasting impacts on soil nutrients, but both prescribed and wildfire resulted in increased NH4+-N, shifts in microbial community structure and decreased microbial biomass. While the increase in nitrogen may be beneficial to plant growth and revegetation, the loss of microbial biomass may have far-reaching implications for the overall sustainability of the soils in these systems.
  4. Brown, Julian, Alan York, and Fiona Christie. “Fire effects on pollination in a sexually deceptive orchid.” International Journal of Wildland Fire 25.8 (2016): 888-895. Research into the effectiveness of prescribed fire in managing pollination has only recently begun. The effects of fire on pollination have not been explored in sexually deceptive systems. Further, the potential for multiple effects operating at different spatial scales has not been explored in any pollination system despite multi-scale effects on pollination observed in agricultural landscapes. We observed the frequency of pollinator visitation to flowers of sexually deceptive Caladenia tentaculata and related it to the post-fire age class of the vegetation at local and landscape scales. We also related the number of the pollinator’s putative larval hosts (scarab beetles) captured at these sites to age class. At the local scale (i.e. the sample location), visitation was highest in recently burnt sites. At the landscape scale, positive associations were observed between (1) putative pollinator hosts and vegetation burnt 36–50 years ago, and (2) pollinator visitation and vegetation burnt ≥50 years ago. Local- and landscape-scale effects on visitation were synergistic, such that visitation was greatest when fire age was heterogeneous within pollinator foraging range.
  5. Alcañiz, M., et al. “Long-term dynamics of soil chemical properties after a prescribed fire in a Mediterranean forest (Montgrí Massif, Catalonia, Spain).” Science of The Total Environment 572 (2016): 1329-1335.  This study examines the effects of a prescribed fire on soil chemical properties in the Montgrí Massif (Girona, Spain). The prescribed forest fire was conducted in 2006 to reduce understory vegetation and so prevent potential severe wildfires. Soil was sampled at a depth of 0–5 cm at 42 sampling points on four separate occasions: prior to the event, immediately after, one year after and nine years after. The parameters studied were pH, electrical conductivity (EC), total carbon (C), total nitrogen (N), available phosphorus (P), potassium (K+), calcium (Ca2+) and magnesium (Mg2+). All parameters (except pH) increased significantly immediately after the fire. One year after burning, some chemical parameters – namely, EC, available P and K+ – had returned to their initial, or even lower, values; while others – pH and total C – continued to rise. Total N, Ca2+ and Mg2+ levels had fallen one year after the fire, but levels were still higher than those prior to the event. Nine years after the fire, pH, total C, total N and available P are significantly lower than pre-fire values and nutrient concentrations are now higher than at the outset but without statistical significance. The soil system, therefore, is still far from being recovered nine years later.
  6. Armas-Herrera, Cecilia M., et al. “Immediate effects of prescribed burning in the Central Pyrenees on the amount and stability of topsoil organic matter.” Catena 147 (2016): 238-244.  Prescribed burning is the deliberate application of fire under selected conditions to accomplish predetermined management objectives. It is generally accepted that controlled use of fire has neutral or even positive effects on soils due to its lower temperature, intensity and severity compared to wildfires. However, very few studies have examined the effects of prescribed burning of shrub vegetation in humid mountain areas on soil properties. The objective of this work was to determine the immediate effects of prescribed burning on the quality and biochemical stability of soil organic matter (SOM) in areas encroached by shrubs in the Central Pyrenees (NE Spain). Soil samples were collected in triplicate immediately before and after burning from the Ah horizon at 0–1, 1–2 and 2–3 cm depths. We quantified the variations as a direct result of burning in (1) the SOM content, (2) the content and mineralization rates of labile and recalcitrant C pools as inferred from incubation assays (141 days), and (3) the soil biological activity related to C cycling (microbial biomass C and β-D-glucosidase activity). Nearly all the soil properties studied were significantly affected by fire, varying in terms of extent of the effect and the soil depth affected. The total soil organic C (SOC), C/N ratio, β-D-glucosidase activity, C-CO2 efflux and estimated content of labile SOC decreased significantly up to 3 cm depth. The total N and microbial biomass C were significantly affected only in the upper cm of the soil (0–1 cm). These results describe a short-term stronger impact of the prescribed fire on topsoil properties than usually reported. 
However, comparing these findings to other studies should be performed with caution because of the different environments considered in each case, as well as the differing soil thicknesses found in the literature, typically between 5 and 15 cm, which can lead to a dilution effect associated with the actual impacts of fire on soil properties. In this sense, the choice of a suitable soil thickness or sampling just after burning can be relevant factors in the detection of the immediate effects of fire. Short- and medium-term monitoring of the soils is needed to assess the suitability of this practice for pasture maintenance and for adapting the frequency of prescribed fires in order to minimize its impact on soil.
  7. Sun, Hui, et al. “Bacterial community structure and function shift across a northern boreal forest fire chronosequence.” Scientific Reports 6 (2016): 32411.  Soil microbial responses to fire are likely to change over the course of forest recovery. Investigations on long-term changes in bacterial dynamics following fire are rare. We characterized the soil bacterial communities across three different times post fire in a 2 to 152-year fire chronosequence by Illumina MiSeq sequencing, coupled with a functional gene array (GeoChip). The results showed that the bacterial diversity did not differ between the recently and older burned areas, suggesting a concomitant recovery in the bacterial diversity after fire. The differences in bacterial communities over time were mainly driven by the rare operational taxonomic units (OTUs < 0.1%). Proteobacteria (39%), Acidobacteria (34%) and Actinobacteria (17%) were the most abundant phyla across all sites. Genes involved in C and N cycling pathways were present in all sites showing high redundancy in the gene profiles. However, hierarchical cluster analysis using gene signal intensity revealed that the sites with different fire histories formed separate clusters, suggesting potential differences in maintaining essential biogeochemical soil processes. Soil temperature, pH and water contents were the most important factors in shaping the bacterial community structures and function. This study provides functional insight on the impact of fire disturbance on soil bacterial community.
  8. Badía, David, et al. “Burn effects on soil properties associated to heat transfer under contrasting moisture content.” Science of the Total Environment 601 (2017): 1119-1128. The aim of this work is to investigate the topsoil thickness affected by burning under contrasting soil moisture content (field capacity versus air-dried conditions). A mollic horizon of an Aleppo pine forest was sampled and burned in the laboratory, recording the temperature continuously at the topsoil surface and at soil depths of 1, 2, and 3 cm. Changes in soil properties were measured at 0–1, 1–2, 2–3, and 3–4 cm. Both the maximum temperature and the charring intensities were significantly lower in wet soils than in air-dried soils up to 3 cm in depth. Moreover, soil heating was slower and cooling faster in wet soils as compared to dry soils. Therefore, the heat capacity increase of the soil moistened at field capacity plays a more important role than the thermal conductivity increase on heat transfer on burned soils. Burning did not significantly modify the pH, the carbonate content and the chroma, for either wet or dry soil. Fire caused an immediate and significant decrease in water repellency in the air-dried soil, even at 3 cm depth, whereas the wet soil remained hydrophilic throughout its thickness, without being affected by burning. Burning depleted 50% of the soil organic C (OC) content in the air-dried soil and 25% in the wet soil at the upper centimeter, which was blackened. Burning significantly decreased the total N (TN) content only in the dry soil (to one-third of the original value) through the first centimeter of soil depth. Soluble ions, measured by electrical conductivity (EC), increased after burning, although only significantly in the first centimeter of air-dried soils. Below 2 cm, burning had no significant effects on the brightness, OC, TN, or EC, for either wet or dry soil.
  9. Dove, Nicholas C., and Stephen C. Hart. “Fire reduces fungal species richness and in situ mycorrhizal colonization: a meta-analysis.” Fire Ecology 13.2 (2017): 37-65.  Soil fungal communities perform many functions that help plants meet their nutritional demands. However, overall trends for fungal response to fire, which can be especially critical in a post-fire context, have been difficult to elucidate. We used meta-analytical techniques to investigate fungal response to fire across studies, ecosystems, and fire types. Change in fungal species richness and mycorrhizal colonization were used as the effect size metrics in random effects models. When different types of methods for assessing fungal species richness and mycorrhizal colonization were considered together, there was an average reduction of 28 % in fungal species richness post fire, but no significant response in mycorrhizal colonization. In contrast, there was a 41 % reduction in fungal species richness post fire when assessed by sporocarp surveys, but fungal species richness was not significantly affected when assessed by molecular methods. Measured in situ, fire reduced mycorrhizal colonization by 21 %, yet no significant response occurred when assessed by ex situ bioassays. These findings suggest that the putative magnitude of fire effects on soil fungal communities may be dependent on the approach and assessment method used. Furthermore, biome, but not fire type (i.e., wildfire versus prescribed fire) was a significant moderator of our categorical models, suggesting that biome might be a more useful predictor of fungal species richness response to fire than fire type. Reductions in fungal species richness and in situ mycorrhizal colonization post fire declined logarithmically and approached zero (i.e., no effect) at 22 and 11 years, respectively. 
We concluded that fire reduces fungal species richness and in situ mycorrhizal colonization, but if conditions allow communities to recover (e.g., without subsequent disturbance, favorable growing conditions), soil fungi are resilient on decadal time scales; the resiliency of soil fungi likely contributes to the overall rapid ecosystem recovery following fire.
  10. Girona-García, Antonio, et al. “Effects of prescribed burning on soil organic C, aggregate stability and water repellency in a subalpine shrubland: Variations among sieve fractions and depths.” Catena 166 (2018): 68-77.  Soil organic matter, aggregation and water repellency are relevant interrelated soil properties that can be affected by fire. The aim of this work was to analyse the effects of shrub prescribed burning for pasture reclamation on the soil aggregate stability, organic carbon and water repellency of different soil depths and aggregate sizes in a subalpine environment. Soil samples were collected from an area treated by an autumnal low-intensity prescribed fire in the Central Pyrenees (NE-Spain) at 0–1, 1–2, 2–3 and 3–5 cm depths just before and ~1 h, 6 months and 12 months after burning. Samples were separated as whole soil (<10 mm) and 6 sieve fractions, <0.25, 0.25–0.5, 0.5–1, 1–2, 2–4 and 4–10 mm. We analysed soil organic carbon (SOC), aggregate stability (AS) and soil water repellency (SWR). In the unburned samples, SOC and SWR were higher in the <0.25 to 2 mm sieve fractions than the 2 to 10 mm sieve fractions. Fire severely and significantly decreased the SOC content in the whole soil and the <0.25 mm fraction at 0–1 cm depth and in the 0.25–0.5 mm fraction at 0–2 cm depth. SWR was reduced by burning mainly at 0–1 cm depth for the whole soil and the <0.25 to 2 mm sieve fractions. Nevertheless, the AS of the 0.25–0.5 mm aggregates increased after fire, while the rest of the sieve fractions remained virtually unaffected. One year after the prescribed burning, SOC slightly increased and SWR recovered in the fire-affected fractions, while the AS for all aggregate sizes and depths showed a considerable decrease. The results suggest that the direct effects of burning are still present one year after burning, and the post-fire situation may pose an increased risk of soil loss. 
Furthermore, our results indicate that fine soil fractions are more likely to be affected by fire than coarser soil fractions and highly influence the whole soil behaviour.
  11. Butler, Orpheus M., et al. “The phosphorus‐rich signature of fire in the soil–plant system: a global meta‐analysis.” Ecology Letters 21.3 (2018): 335-344.  The biogeochemical and stoichiometric signature of vegetation fire may influence post‐fire ecosystem characteristics and the evolution of plant ‘fire traits’. Phosphorus (P), a potentially limiting nutrient in many fire‐prone environments, might be particularly important in this context; however, the effects of fire on phosphorus cycling often vary widely. We conducted a global‐scale meta‐analysis using data from 174 soil studies and 39 litter studies, and found that fire led to significantly higher concentrations of soil mineral phosphorus as well as significantly lower soil and litter carbon:phosphorus and nitrogen:phosphorus ratios. These results demonstrate that fire has a phosphorus‐rich signature in the soil–plant system that varies with vegetation type. Further, they suggest that burning can ease phosphorus limitation and decouple the biogeochemical cycling of phosphorus, carbon and nitrogen. These effects resemble a transient reversion to an earlier stage of ecosystem development, and likely underpin at least some of fire’s impacts on ecosystems and organisms.
  12. Alcañiz, M., et al. “Effects of prescribed fires on soil properties: a review.” Science of The Total Environment 613 (2018): 944-957.  Soils constitute one of the most valuable resources on earth, especially because soil is not renewable on human time scales. During the 20th century, a period marked by a widespread rural exodus and land abandonment, fire suppression policies were adopted facilitating the accumulation of fuel in forested areas, exacerbating the effects of wildfires, leading to severe degradation of soils. Prescribed fires have emerged as an option for protecting forests and their soils from wildfires through the reduction of fuel levels. However, such fires can serve other objectives, including stimulating the regeneration of a particular plant species, maintaining biological diversity or as a tool for recovering grasslands in encroached lands. This paper reviews studies examining the short- and long-term impacts of prescribed fires on the physical, chemical and biological soil properties; in so doing, it provides a summary of the benefits and drawbacks of this technique, to help determine if prescribed fires can be useful for managing the landscape. From the study conducted, we can affirm that prescribed fires affect soil properties, but the effects differ greatly depending on initial soil characteristics, vegetation or type of fire. Also, it is possible to see that soil’s physical and biological properties are more strongly affected by prescribed fires than are its chemical properties. Finally, we conclude that prescribed fires clearly constitute a disturbance on the environment (positive, neutral or negative depending on the soil property studied), but most of the studies reviewed report a good recovery, and their effects could be less pronounced than those of wildfires because of the limited soil heating and lower fire intensity and severity.
  13. Koltz, Amanda M., et al. “Global change and the importance of fire for the ecology and evolution of insects.” Current Opinion in Insect Science 29 (2018): 110-116.  Climate change is drastically altering global fire regimes, which may affect the structure and function of insect communities. Insect responses to fire are strongly tied to fire history, plant responses, and changes in species interactions. Many insects already possess adaptive traits to survive fire or benefit from post-fire resources, which may result in community composition shifting toward habitat and dietary generalists as well as species with high dispersal abilities. However, predicting community-level resilience of insects is inherently challenging due to the high degree of spatio-temporal and historical heterogeneity of fires, diversity of insect life histories, and potential interactions with other global change drivers. Future work should incorporate experimental approaches that specifically consider spatiotemporal variability and regional fire history in order to integrate eco-evolutionary processes in understanding insect responses to fire.
  14. Pellegrini, Adam FA, et al. “Fire frequency drives decadal changes in soil carbon and nitrogen and ecosystem productivity.” Nature 553.7687 (2018): 194. Fire frequency is changing globally and is projected to affect the global carbon cycle and climate. However, uncertainty about how ecosystems respond to decadal changes in fire frequency makes it difficult to predict the effects of altered fire regimes on the carbon cycle; for instance, we do not fully understand the long-term effects of fire on soil carbon and nutrient storage, or whether fire-driven nutrient losses limit plant productivity. Here we analyse data from 48 sites in savanna grasslands, broadleaf forests and needleleaf forests spanning up to 65 years, during which time the frequency of fires was altered at each site. We find that frequently burned plots experienced a decline in surface soil carbon and nitrogen that was non-saturating through time, having 36 per cent (±13 per cent) less carbon and 38 per cent (±16 per cent) less nitrogen after 64 years than plots that were protected from fire. Fire-driven carbon and nitrogen losses were substantial in savanna grasslands and broadleaf forests, but not in temperate and boreal needleleaf forests. We also observe comparable soil carbon and nitrogen losses in an independent field dataset and in dynamic model simulations of global vegetation. The model study predicts that the long-term losses of soil nitrogen that result from more frequent burning may in turn decrease the carbon that is sequestered by net primary productivity by about 20 per cent of the total carbon that is emitted from burning biomass over the same period. Furthermore, we estimate that the effects of changes in fire frequency on ecosystem carbon storage may be 30 per cent too low if they do not include multidecadal changes in soil carbon, especially in drier savanna grasslands. 
Future changes in fire frequency may shift ecosystem carbon storage by changing soil carbon pools and nitrogen limitations on plant growth, altering the carbon sink capacity of frequently burning savanna grasslands and broadleaf forests. CONCLUSION: Our results reveal the sensitivity of surface soils to fire and the substantial effects that changes in soil pools have on long-term ecosystem C exchange. The large empirical and conservative model-based estimates of soil C changes suggest that present estimates of fire-driven C losses, which primarily consider losses from plant biomass pools, may substantially underestimate the effects of long-term trends in fire frequencies in savanna grasslands and broadleaf forests in particular. Our findings suggest that future alterations in fire regimes in savanna grasslands and broadleaf forests may shift ecosystem C storage by changing soil C levels and changing the N limitation of plant growth, altering the carbon-sink capacity of these fire-prone ecosystems.
  15. Pressler, Yamina, John C. Moore, and M. Francesca Cotrufo. “Belowground community responses to fire: meta‐analysis reveals contrasting responses of soil microorganisms and mesofauna.” Oikos 128.3 (2019): 309-327.  Global fire regimes are shifting due to climate and land use changes. Understanding the responses of below-ground communities to fire is key to predicting changes in the ecosystem processes they regulate. We conducted a comprehensive meta‐analysis of 1634 observations from 131 empirical studies to investigate the effect of fire on soil microorganisms and mesofauna. Fire had a strong negative effect on soil biota biomass, abundance, richness, evenness, and diversity. Fire reduced microorganism biomass and abundance by up to 96%. Bacteria were more resistant to fire than fungi. Fire reduced nematode abundance by 88% but had no significant effect on soil arthropods. Fire reduced richness, evenness and diversity of soil microorganisms and mesofauna by up to 99%. We found little evidence of temporal trends towards recovery within 10 years post‐disturbance suggesting little resilience of the soil community to fire. Interactions between biome, fire type, and depth explained few of these negative trends. Future research at the intersection of fire ecology and soil biology should aim to integrate soil community structure with the ecosystem processes they mediate under changing global fire regimes.
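Several of the papers above (e.g. #9 Dove & Hart 2017 and #15 Pressler et al. 2019) pool per-study effect sizes with random-effects models. For readers unfamiliar with the mechanics, the sketch below shows how such pooling works using the DerSimonian-Laird estimator of between-study variance. All numbers are invented for illustration (hypothetical log response ratios of burned vs. unburned richness with assumed variances); this is a minimal sketch of the general technique, not a reproduction of either paper's analysis.

```python
import numpy as np

# Hypothetical per-study effect sizes (log response ratios, burned vs unburned)
# and their within-study variances -- invented values for illustration only.
yi = np.array([-0.60, -0.05, -0.55, 0.10, -0.45])  # lnRR per study (assumed)
vi = np.array([0.02, 0.03, 0.05, 0.04, 0.02])      # variance per study (assumed)

# Fixed-effect weights, pooled mean, and Cochran's Q heterogeneity statistic
w = 1.0 / vi
mu_fe = np.sum(w * yi) / np.sum(w)
Q = np.sum(w * (yi - mu_fe) ** 2)
df = yi.size - 1

# DerSimonian-Laird estimate of the between-study variance tau^2
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / c)

# Random-effects weights incorporate tau^2, giving the pooled estimate
w_re = 1.0 / (vi + tau2)
mu_re = np.sum(w_re * yi) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))

# A pooled lnRR converts to a percent change, the form reported in the abstracts
pct_change = 100.0 * (np.exp(mu_re) - 1.0)
```

The random-effects weights shrink toward equality as between-study heterogeneity (tau^2) grows, which is why meta-analyses of heterogeneous fire studies report wider intervals than a fixed-effect pooling would.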



This post was motivated by a related post on the WUWT site [LINK] .

Millar, Richard J., and Pierre Friedlingstein. “The utility of the historical record for assessing the transient climate response to cumulative emissions.” Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 376.2119 (2018): 20160449. ABSTRACT: The historical observational record offers a way to constrain the relationship between cumulative carbon dioxide emissions and global mean warming. We use a standard detection and attribution technique, along with observational uncertainties to estimate the all-forcing or ‘effective’ transient climate response to cumulative emissions (TCRE) from the observational record. Accounting for observational uncertainty and uncertainty in historical non-CO2 radiative forcing gives a best-estimate from the historical record of 1.84°C/TtC (1.43–2.37°C/TtC 5–95% uncertainty) for the effective TCRE and 1.31°C/TtC (0.88–2.60°C/TtC 5–95% uncertainty) for the CO2-only TCRE. While the best-estimate TCRE lies in the lower half of the IPCC likely range, the high upper bound is associated with the not-ruled-out possibility of a strongly negative aerosol forcing. Earth System Models have a higher effective TCRE range when compared like-for-like with the observations over the historical period, associated in part with a slight underestimate of diagnosed cumulative emissions relative to the observational best-estimate, a larger ensemble mean-simulated CO2-induced warming, and rapid post-2000 non-CO2 warming in some ensemble members. This article is part of the theme issue ‘The Paris Agreement: understanding the physical and social challenges for a warming world of 1.5°C above pre-industrial levels’.









  1. It has long been recognized that the climate sensitivity of surface temperature to the logarithm of atmospheric CO2 (ECS), which lies at the heart of the anthropogenic global warming and climate change (AGW) proposition, was a difficult issue for climate science because of the large range of empirical values reported in the literature and the so-called “uncertainty problem” it implies {Caldeira, et al “Climate sensitivity uncertainty and the need for energy without CO2 emission.” Science 299.5615 (2003): 2052-2054}. The ECS uncertainty issue was interpreted in two very different ways. Climate science took the position that ECS uncertainty implies that climate action has to be greater than that implied by the mean value of ECS in order to ensure that higher values of ECS that are possible will be accommodated, while skeptics argued that the large range means that we don’t really know. At the same time skeptics also presented convincing arguments against the assumption that observed changes in atmospheric CO2 concentration can be attributed to fossil fuel emissions [LINK] , [LINK] .
  2. A breakthrough came in 2009 when Damon Matthews, Myles Allen, and a few others almost simultaneously published nearly identical papers reporting the discovery of a “near perfect” correlation (ρ≈1) between surface temperature and cumulative emissions {2009: Matthews, H. Damon, et al. “The proportionality of global warming to cumulative carbon emissions.” Nature 459.7248 (2009): 829}. They had found that, irrespective of the timing of emissions or of atmospheric CO2 concentration, emitting a trillion tonnes of carbon will cause 1.0–2.1°C of global warming. The linear regression coefficient corresponding to this near-perfect correlation between cumulative warming and cumulative emissions (note: temperature = cumulative warming), initially described as the Carbon Climate Response (CCR), was later termed the Transient Climate Response to Cumulative Emissions (TCRE). Initially a curiosity, it gained in importance when it was found to predict future temperatures consistent with model predictions. The consistency with climate models was taken as a validation of the new tool, and the TCRE became integrated into the theory of climate change. However, as noted in a related post [LINK] [LINK] , the consistency likely derives from the assumption that emissions accumulate in the atmosphere.
  3. Thereafter the TCRE became incorporated into the foundation of climate change theory, particularly in terms of its utility in the construction of carbon budgets for climate action plans for any given target temperature rise, an application for which the TCRE appeared to be tailor-made. Most importantly, it solved, or perhaps bypassed, the messy and inconclusive uncertainty issue in ECS climate sensitivity. The importance of this aspect of the TCRE is found in the 2017 paper “Beyond Climate Sensitivity” by prominent climate scientist Reto Knutti, who declared that the TCRE metric should replace the ECS as the primary tool for relating warming to human caused emissions {2017: Knutti, Reto, Maria A. A. Rugenstein, and Gabriele C. Hegerl. “Beyond equilibrium climate sensitivity.” Nature Geoscience 10.10 (2017): 727}. The anti-ECS Knutti paper was not only published but received with great fanfare by the journal and by the climate science community in general.
  4. The TCRE has continued to gain in importance and prominence as a tool for the practical application of climate change theory in terms of its utility in the construction and tracking of carbon budgets for limiting warming to a target such as the Paris Climate Accord target of +1.5C above pre-industrial.  {Matthews, H. Damon. “Quantifying historical carbon and climate debts among nations.” Nature climate change 6.1 (2016): 60}. A bibliography on the subject of TCRE carbon budgets is included below at the end of this post.
  5. However, a vexing issue has arisen in the practical matter of applying and tracking TCRE-based carbon budgets: the remaining carbon budget puzzle {Rogelj, Joeri, et al. “Estimating and tracking the remaining carbon budget for stringent climate targets.” Nature 571.7765 (2019): 335-342}. Midway through the implementation of a carbon budget, the remaining carbon budget computed by subtraction does not match the TCRE carbon budget for the latter period computed directly from the Damon Matthews proportionality of temperature with cumulative emissions for that period. The difference between the two estimates of the remaining carbon budget has a rational explanation in terms of the statistics of a time series of cumulative values of another time series, described in a related post [LINK] .
  6. It is shown in the related post that a time series of the cumulative values of another time series has neither time scale nor degrees of freedom, and that therefore statistical properties of this series can have no practical interpretation. It is demonstrated with random numbers that the only practical implication of the “near perfect proportionality” correlation reported by Damon Matthews is that the two time series being compared (annual warming and annual emissions) tend to have positive values. In the case of emissions we have all positive values, and during a time of global warming the annual warming series contains mostly positive values. The correlation between temperature (cumulative warming) and cumulative emissions derives from this sign bias, as demonstrated with random numbers with and without sign bias in a related post [LINK] .
  7.  The sign bias explains the correlation between cumulative values of time series data and also the remaining carbon budget puzzle. It is shown in the related post [LINK] that the TCRE regression coefficient between these time series of cumulative values derives from the positive value bias in the annual warming data. Thus, during a period of accelerated warming, the second half of the carbon budget period may contain a higher percentage of positive values for annual warming and it will therefore show a carbon budget that exceeds the proportional budget for the second half computed from the full span regression coefficient that is based on a lower bias for positive values.
  8. In short, the bias for positive annual warming is highest for the second half, lowest for the first half, and midway between these two values for the full span – and therein lies the simple statistical explanation of the remaining carbon budget issue that climate science is trying to solve in terms of climate theory and its extension to Earth System Models. The Millar and Friedlingstein 2018 paper is yet another in a long line of studies that ignore the statistical issues in the TCRE correlation and instead try to explain its anomalous behavior in terms of climate theory, whereas in fact the explanation lies in statistical issues that have been overlooked by these young scientists [LINK] .
  9. The fundamental problem with the construction of TCRE carbon budgets and their interpretation in terms of climate action is that the TCRE is a spurious correlation that has no interpretation in terms of a relationship between emissions and warming. Complexities in these carbon budgets such as the remaining carbon budget are best understood in these terms and not in terms of new and esoteric variables such as those in earth system models. 
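The sign-bias argument summarized in the paragraphs above can be sketched with random numbers. The code below is a minimal illustration, not the analysis from the related post: it generates two statistically independent series, one all-positive (standing in for annual emissions) and one with a positive mean (standing in for annual warming during a warming period), and shows that their cumulative sums correlate strongly even though the underlying series share no causal link, while removing the sign bias removes the correlation. All parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n, trials = 100, 1000  # 100 "years", averaged over 1000 random trials

r_biased, r_unbiased = [], []
for _ in range(trials):
    # "Annual emissions": all positive values, like real emissions data
    emissions = np.cumsum(rng.uniform(5.0, 10.0, n))
    # "Annual warming" with a positive sign bias (a warming period) ...
    warming_biased = np.cumsum(rng.normal(0.03, 0.02, n))
    # ... and the same series without sign bias (zero mean)
    warming_unbiased = np.cumsum(rng.normal(0.0, 0.02, n))
    r_biased.append(np.corrcoef(emissions, warming_biased)[0, 1])
    r_unbiased.append(np.corrcoef(emissions, warming_unbiased)[0, 1])

# The annual series are independent by construction, yet the cumulative
# series correlate strongly when both have a positive sign bias, and show
# no systematic correlation when the sign bias is removed.
print(f"mean correlation with sign bias:    {np.mean(r_biased):.3f}")
print(f"mean correlation without sign bias: {np.mean(r_unbiased):.3f}")
```

The high correlation in the biased case arises purely because both cumulative series drift upward, which is the sense in which the TCRE correlation is argued above to be spurious.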





  1. MacDougall, Andrew H., et al. “Sensitivity of carbon budgets to permafrost carbon feedbacks and non-CO2 forcings.” Environmental Research Letters 10.12 (2015): 125003.  The near proportionality between cumulative CO2 emissions and change in near surface temperature can be used to define a carbon budget: a finite quantity of carbon that can be burned associated with a chosen ‘safe’ temperature change threshold. Here we evaluate the sensitivity of this carbon budget to permafrost carbon dynamics and changes in non-CO2 forcings. The carbon budget for 2.0 °C of warming is reduced from 1320 Pg C when considering only forcing from CO2 to 810 Pg C when considering permafrost carbon feedbacks as well as other anthropogenic contributions to climate change. We also examined net carbon budgets following an overshoot of and return to a warming target. That is, the net cumulative CO2 emissions at the point in time a warming target is restored following artificial removal of CO2 from the atmosphere to cool the climate back to a chosen temperature target. These overshoot net carbon budgets are consistently smaller than the conventional carbon budgets. Overall carbon budgets persist as a robust and simple conceptual framework to relate the principal cause of climate change to the impacts of climate change. [FULL TEXT PDF]
  2. Millar, Richard, et al. “The cumulative carbon budget and its implications.” Oxford Review of Economic Policy 32.2 (2016): 323-342.  The cumulative impact of carbon dioxide (CO 2 ) emissions on climate has potentially profound economic and policy implications. It implies that the long-term climate change mitigation challenge should be reframed as a stock problem, while the overwhelming majority of climate policies continue to focus on the flow of CO 2 into the atmosphere in 2030 or 2050. An obstacle, however, to the use of a cumulative carbon budget in policy is uncertainty in the size of this budget consistent with any specific temperature-based goal such as limiting warming to 2°C. This arises from uncertainty in the climate response to CO 2 emissions, which is relatively tractable, and uncertainty in future warming due to non-CO 2 drivers, which is less so. We argue these uncertainties are best addressed through policies that recognize the need to reduce net global CO 2 emissions to zero to stabilize global temperatures but adapt automatically to evolving climate change. Adaptive policies would fit well within the Paris Agreement under the UN Framework Convention on Climate Change.
  3. Rogelj, Joeri, et al. “Differences between carbon budget estimates unravelled.” Nature Climate Change 6.3 (2016): 245-252.  Several methods exist to estimate the cumulative carbon emissions that would keep global warming to below a given temperature limit. Here we review estimates reported by the IPCC and the recent literature, and discuss the reasons underlying their differences. The most scientifically robust number — the carbon budget for CO2-induced warming only — is also the least relevant for real-world policy. Including all greenhouse gases and using methods based on scenarios that avoid instead of exceed a given temperature limit results in lower carbon budgets. For a >66% chance of limiting warming below the internationally agreed temperature limit of 2 °C relative to pre-industrial levels, the most appropriate carbon budget estimate is 590–1,240 GtCO2 from 2015 onwards. Variations within this range depend on the probability of staying below 2 °C and on end-of-century non-CO2 warming. Current CO2 emissions are about 40 GtCO2 yr−1, and global CO2 emissions thus have to be reduced urgently to keep within a 2 °C-compatible budget.
  4. MacDougall, Andrew H., et al. “Corrigendum: Sensitivity of carbon budgets to permafrost carbon feedbacks and non-CO2 forcings (2015 Environ. Res. Lett. 10 125003).” Environmental Research Letters (2016). The near proportionality between cumulative CO2 emissions and change in near surface temperature can be used to define a carbon budget: a finite quantity of carbon that can be burned associated with a chosen ‘safe’ temperature change threshold. Here we evaluate the sensitivity of this carbon budget to permafrost carbon dynamics and changes in non-CO2 forcings. The carbon budget for 2.0 °C of warming is reduced from 1320 Pg C when considering only forcing from CO2 to 810 Pg C when considering permafrost carbon feedbacks as well as other anthropogenic contributions to climate change. We also examined net carbon budgets following an overshoot of and return to a warming target. That is, the net cumulative CO2 emissions at the point in time a warming target is restored following artificial removal of CO2 from the atmosphere to cool the climate back to a chosen temperature target. These overshoot net carbon budgets are consistently smaller than the conventional carbon budgets. Overall carbon budgets persist as a robust and simple conceptual framework to relate the principal cause of climate change to the impacts of climate change.
  5. Friedlingstein, P. “Differences between carbon budget estimates unravelled.” (2016).  Several methods exist to estimate the cumulative carbon emissions that would keep global warming to below a given temperature limit. Here we review estimates reported by the IPCC and the recent literature, and discuss the reasons underlying their differences. The most scientifically robust number — the carbon budget for CO2-induced warming only — is also the least relevant for real-world policy. Including all greenhouse gases and using methods based on scenarios that avoid instead of exceed a given temperature limit results in lower carbon budgets. For a >66% chance of limiting warming below the internationally agreed temperature limit of 2 °C relative to pre-industrial levels, the most appropriate carbon budget estimate is 590–1,240 GtCO2 from 2015 onwards. Variations within this range depend on the probability of staying below 2 °C and on end-of-century non-CO2 warming. Current CO2 emissions are about 40 GtCO2 yr–1, and global CO2 emissions thus have to be reduced urgently to keep within a 2 °C-compatible budget [FULL TEXT] .
  6. Matthews, H. Damon, et al. “Estimating carbon budgets for ambitious climate targets.” Current Climate Change Reports 3.1 (2017): 69-77.  Carbon budgets, which define the total allowable CO2 emissions associated with a given global climate target, are a useful way of framing the climate mitigation challenge. In this paper, we review the geophysical basis for the idea of a carbon budget, showing how this concept emerges from a linear climate response to cumulative CO2 emissions. We then discuss the difference between a “CO2-only carbon budget” associated with a given level of CO2-induced warming and an “effective carbon budget” associated with a given level of warming caused by all human emissions. We present estimates for the CO2-only and effective carbon budgets for 1.5 and 2 °C, based on both model simulations and updated observational data. Finally, we discuss the key contributors to uncertainty in carbon budget estimates and suggest some implications of this uncertainty for decision-making. Based on the analysis presented here, we argue that while the CO2-only carbon budget is a robust upper bound on allowable emissions for a given climate target, the size of the effective carbon budget is dependent on how quickly we are able to mitigate non-CO2 greenhouse gas and aerosol emissions. This suggests that climate mitigation efforts could benefit from being responsive to a changing effective carbon budget over time, as well as to potential new information that could narrow uncertainty associated with the climate response to CO2 emissions.
  7. MacDougall, Andrew H. “The oceanic origin of path-independent carbon budgets.” Scientific reports 7.1 (2017): 10373.  Virtually all Earth system models (ESM) show a near proportional relationship between cumulative emissions of CO2 and change in global mean temperature, a relationship which is independent of the emissions pathway taken to reach a cumulative emissions total. The relationship, which has been named the Transient Climate Response to Cumulative CO2 Emissions (TCRE), gives rise to the concept of a ‘carbon budget’. That is, a finite amount of carbon that can be burnt whilst remaining below some chosen global temperature change threshold, such as the 2.0 °C target set by the Paris Agreement. Here we show that the path-independence of TCRE arises from the partitioning ratio of anthropogenic carbon between the ocean and the atmosphere being almost the same as the partitioning ratio of enhanced radiative forcing between the ocean and space. That these ratios are so close in value is a coincidence unique to CO2. The simple model used here is underlain by many assumptions and simplifications but does reproduce key aspects of the climate system relevant to the path-independence of carbon budgets. Our results place TCRE and carbon budgets on firm physical foundations and therefore help validate the use of these metrics for climate policy.
  8. van der Ploeg, Frederick. “The safe carbon budget.” Climatic change 147.1-2 (2018): 47-59.  Cumulative emissions drive peak global warming and determine the carbon budget needed to keep temperature below 2 or 1.5 °C. This safe carbon budget is low if uncertainty about the transient climate response is high and risk tolerance (willingness to accept risk of overshooting the temperature target) is low. Together with energy costs, this budget determines the optimal carbon price and how quickly fossil fuel is abated and replaced by renewable energy. This price is the sum of the present discounted value of all future losses in aggregate production due to emitting one ton of carbon today plus the cost of peak warming that rises over time to reflect the increasing scarcity of carbon as temperature approaches its upper limit. If policy makers ignore production losses, the carbon price rises more rapidly. If they ignore the peak temperature constraint, the carbon price rises less rapidly. The alternative of adjusting damages upwards to factor in the peak warming constraint leads initially to a higher carbon price which rises less rapidly.
  9. Matthews, H. Damon, et al. “Focus on cumulative emissions, global carbon budgets and the implications for climate mitigation targets.” Environmental Research Letters 13.1 (2018): 010201.  The Environmental Research Letters focus issue on ‘Cumulative Emissions, Global Carbon Budgets and the Implications for Climate Mitigation Targets’ was launched in 2015 to highlight the emerging science of the climate response to cumulative emissions, and how this can inform efforts to decrease emissions fast enough to avoid dangerous climate impacts. The 22 research articles published represent a fantastic snapshot of the state-of-the-art in this field, covering both the science and policy aspects of cumulative emissions and carbon budget research. In this Review and Synthesis, we summarize the findings published in this focus issue, outline some suggestions for ongoing research needs, and present our assessment of the implications of this research for ongoing efforts to meet the goals of the Paris climate agreement.
  10. Millar, Richard J., and Pierre Friedlingstein. “The utility of the historical record for assessing the transient climate response to cumulative emissions.” Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 376.2119 (2018): 20160449. The historical observational record offers a way to constrain the relationship between cumulative carbon dioxide emissions and global mean warming. We use a standard detection and attribution technique, along with observational uncertainties to estimate the all-forcing or ‘effective’ transient climate response to cumulative emissions (TCRE) from the observational record. Accounting for observational uncertainty and uncertainty in historical non-CO2 radiative forcing gives a best-estimate from the historical record of 1.84°C/TtC (1.43–2.37°C/TtC 5–95% uncertainty) for the effective TCRE and 1.31°C/TtC (0.88–2.60°C/TtC 5–95% uncertainty) for the CO2-only TCRE. While the best-estimate TCRE lies in the lower half of the IPCC likely range, the high upper bound is associated with the not-ruled-out possibility of a strongly negative aerosol forcing. Earth System Models have a higher effective TCRE range when compared like-for-like with the observations over the historical period, associated in part with a slight underestimate of diagnosed cumulative emissions relative to the observational best-estimate, a larger ensemble mean-simulated CO2-induced warming, and rapid post-2000 non-CO2 warming in some ensemble members. This article is part of the theme issue ‘The Paris Agreement: understanding the physical and social challenges for a warming world of 1.5°C above pre-industrial levels’.
  11. Rogelj, Joeri, et al. “Estimating and tracking the remaining carbon budget for stringent climate targets.” Nature 571.7765 (2019): 335-342.  Research reported during the past decade has shown that global warming is roughly proportional to the total amount of carbon dioxide released into the atmosphere. This makes it possible to estimate the remaining carbon budget: the total amount of anthropogenic carbon dioxide that can still be emitted into the atmosphere while holding the global average temperature increase to the limit set by the Paris Agreement. However, a wide range of estimates for the remaining carbon budget has been reported, reducing the effectiveness of the remaining carbon budget as a means of setting emission reduction targets that are consistent with the Paris Agreement. Here we present a framework that enables us to track estimates of the remaining carbon budget and to understand how these estimates can improve over time as scientific knowledge advances. We propose that application of this framework may help to reconcile differences between estimates of the remaining carbon budget and may provide a basis for reducing uncertainty in the range of future estimates.















  1. Sea Ice Volume is calculated using the Pan-Arctic Ice Ocean Modeling and Assimilation System (PIOMAS, Zhang and Rothrock, 2003) and published as anomalies by the Polar Science Center of the University of Washington [LINK] . Anomalies for each day are calculated relative to the average over the 1979-2016 period for that day of the year to remove the annual cycle. The data as received for the calendar months January to September in the period 1979-2019 are displayed graphically in Figure 1. The red line through the data is a 3rd order polynomial regression curve. Sustained sea ice volume decline is seen for all nine months studied. As of this writing, data for all years in the study period 1979-2019 are available for the calendar months January to September. This period contains the seasonal minimum and maximum sea ice extents that occur in September and March respectively.
  2. The average rate of decline in sea ice volume in thousands of cubic kilometers per year for each of the nine calendar months studied is summarized graphically in Figure 2. It shows high rates of decline in winter and spring and lower rates of decline in summer and fall.
  3. The alarming decline in sea ice is thought to be driven by anthropogenic global warming (AGW) climate change and is thought to be ecologically harmful to the region and to the world as a whole, as well as posing the possibility of initiating runaway global warming through a feedback system caused by lost albedo. It is proposed that the dangerous rate of sea ice decline can and must be attenuated by taking climate action in the form of reducing or eliminating fossil fuel emissions.
  4. To determine whether the observed loss in sea ice volume can be attributed to AGW climate change such that it can be attenuated with climate action in terms of reducing or eliminating fossil fuel emissions, we compute the correlation between AGW driven temperature rise and loss in sea ice volume. The UAH satellite data for lower troposphere temperature over the North Polar Ocean region is used as the relevant temperature record. Since rising temperature is expected to cause declining sea ice volume, the attribution of sea ice loss to temperature requires a statistically significant negative correlation. The correlation between temperature and the loss in sea ice volume for each of the nine calendar months is shown with the blue line in Figure 3 and it does show statistically significant negative correlations between temperature and the rate of decline in Arctic sea ice volume.
  5. However, source data correlation between x and y in time series data derives from the responsiveness of y to x at the time scale of interest and also from shared long-term trends. These two effects can be separated by detrending both time series and then computing the correlation between the detrended series. When the trend effect is removed, only the responsiveness of y to x remains. This is why detrended correlation is a better measure of responsiveness than source data correlation, as explained very well by Alex Tolley in his YouTube lecture [LINK] . That spurious correlations can be found in time series data when detrended analysis is not used is demonstrated with examples at the Tyler Vigen Spurious Correlation website [LINK] .
  6. Accordingly, the correlations between the detrended series are computed and reported in the red line of Figure 3. There we find that without the shared opposing trends in the two time series, the correlation is essentially zero. None of the correlation in the source data survives into the detrended series indicating that the correlation is an artifact of shared trends and not an indication of responsiveness at an annual time scale. Thus we find no evidence in the data that the observed decline in sea ice volume can be attributed to AGW climate change.
  7. The continued attribution of sea ice dynamics, whether in extent, area, or volume, to AGW climate change (see history of attribution below) likely derives from the atmosphere bias of climate science, whereby there is a tendency to explain all observed changes in the Arctic, such as sea ice melt, in terms of AGW climate change and to overlook the extensive geothermal heat sources in the Arctic – an area known to be geologically active. Some of the geological features of the Arctic, including the Mid Arctic Rift system and the Jan Mayen Trend, are described in related posts [LINK] [LINK] and in the graphic below.
  8. SUMMARY: The data do show declining Arctic sea ice in its various measures such as extent, area, thickness, age, and PIOMAS volume but without evidence for the assumed attribution of these changes to AGW climate change and therefore without support for the claim that these changes can be attenuated with climate action in the form of reducing or eliminating fossil fuel emissions. A historical list of such attributions of convenience that have eroded the credibility of climate science is provided below.
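The distinction drawn in paragraph 5 between source-data correlation and detrended correlation can be sketched as follows. This is a minimal illustration with synthetic data, not the UAH/PIOMAS analysis itself: both series are given strong opposing trends but statistically independent year-to-year fluctuations, so there is no responsiveness of one to the other at an annual time scale. Series names, trends, and noise levels are hypothetical.

```python
import numpy as np

def detrended_corr(x, y):
    """Correlation between the residuals of linear trends fit to x and y."""
    t = np.arange(len(x))
    x_resid = x - np.polyval(np.polyfit(t, x, 1), t)
    y_resid = y - np.polyval(np.polyfit(t, y, 1), t)
    return np.corrcoef(x_resid, y_resid)[0, 1]

rng = np.random.default_rng(7)
t = np.arange(41)  # 41 annual values, e.g. a 1979-2019 study period

# Temperature trends up; ice volume trends down; fluctuations independent
temperature = 0.05 * t + rng.normal(0.0, 0.3, t.size)
ice_volume = 30.0 - 0.3 * t + rng.normal(0.0, 1.0, t.size)

r_source = np.corrcoef(temperature, ice_volume)[0, 1]
r_detrended = detrended_corr(temperature, ice_volume)

# The source correlation is strongly negative, driven entirely by the
# shared opposing trends; the detrended correlation is near zero.
print(f"source r = {r_source:.2f}, detrended r = {r_detrended:.2f}")
```

A strongly negative source correlation alongside a near-zero detrended correlation is exactly the pattern reported in Figure 3, which is why the source correlation by itself cannot establish responsiveness.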


[Graphic: geological features of the Arctic]





  1. The atmosphere bias of climate science in terms of its study of sea ice is evident in the following historical notes on the continuing concern that AGW climate change is melting away the sea ice. 
    An unprecedented 4-year study of the Arctic shows that polar bears, walruses, and some seals are becoming extinct. Arctic summer sea ice may disappear entirely. Combined with a rapidly melting Greenland ice sheet, it will raise the sea level 3 feet by 2100 inundating lowlands from Florida to Bangladesh. Average winter temperatures in Alaska and the rest of the Arctic are projected to rise an additional 7 to 13 degrees over the next 100 years because of increasing emissions of greenhouse gases from human activities. The area is warming twice as fast as anywhere else because of global air circulation patterns and natural feedback loops, such as less ice reflecting sunlight, leading to increased warming at ground level and more ice melt. Native peoples’ ways of life are threatened. Animal migration patterns have changed, and the thin sea ice and thawing tundra make it too dangerous for humans to hunt and travel.
    The Arctic Climate Impact Assessment (ACIA) report says: increasing greenhouse gases from human activities is causing the Arctic to warm twice as fast as the rest of the planet; in Alaska, western Canada, and eastern Russia winter temperatures have risen by 2C to 4C in the last 50 years; the Arctic will warm by 4C to 7C by 2100. A portion of Greenland’s ice sheet will melt; global sea levels will rise; global warming will intensify. Greenland contains enough melting ice to raise sea levels by 7 meters; Bangkok, Manila, Dhaka, Florida, Louisiana, and New Jersey are at risk of inundation; thawing permafrost and rising seas threaten Arctic coastal regions; climate change will accelerate and bring about profound ecological and social changes; the Arctic is experiencing the most rapid and severe climate change on earth and it’s going to get a lot worse; Arctic summer sea ice will decline by 50% to 100%; polar bears will be driven towards extinction; this report is an urgent SOS for the Arctic; forest fires and insect infestations will increase in frequency and intensity; changing vegetation and rising sea levels will shrink the tundra to its lowest level in 21000 years; vanishing breeding areas for birds and grazing areas for animals will cause extinctions of many species; “if we limit emission of heat trapping carbon dioxide we can still help protect the Arctic and slow global warming”.
    Climate science declares that the low sea ice extent in the Arctic is the leading indicator of climate change. We are told that the Arctic “is screaming”, that Arctic sea ice extent is the “canary in the coal mine”, and that Polar Bears and other creatures in the Arctic are dying off and facing imminent extinction. Scientists say that the melting sea ice has set up a positive feedback system that would cause the summer melts in subsequent years to be greater and greater until the Arctic becomes ice free in the summer of 2012. We must take action immediately to cut carbon dioxide emissions from fossil fuels.
    The unusual summer melt of Arctic sea ice in 2007 has encouraged climate science to warn the world that global warming will cause a steep decline in the amount of ice left in subsequent summer melts until the Arctic becomes ice free in summer and that could happen as soon as 2080 or maybe 2060 or it could even be 2030. This time table got shorter and shorter until, without a “scientific” explanation, the ice free year was brought up to 2013. In the meantime, the data showed that in 2008 and 2009 the summer melt did not progressively increase as predicted but did just the opposite by making a comeback in 2008 that got even stronger in 2009. More info:
    Our use of fossil fuels is devastating the Arctic where the volume of sea ice “fell to its lowest recorded level to date” this year and that reduced ice coverage is causing a non-linear acceleration in the loss of polar ice because there is less ice to reflect sunlight. More info:
  7. 2008: THE ARCTIC WILL BE ICE FREE IN SUMMER IN 2008, 2013, or 2030
    Following the unusually low summer sea ice extent in the Arctic in 2007, the IPCC has revised its projection of an ice-free Arctic first from 2008 to 2013 and then again from 2013 to 2030. The way things are going it may be revised again to the year 2100. More info:
    When there was a greater focus on Antarctica, climate scientists said that global warming was melting the West Antarctic Ice Shelf; but the melting was found to be localized, with an active volcano underneath it, and the attention of “melt forecast” climate science shifted to Arctic sea ice after an extensive summer melt was observed in September 2007. More info:
    The second lowest was 2008 and the first lowest was 2007. This is not a trend that shows that things are getting worse. It shows that things are getting better and yet it is being sold and being bought as evidence that things are getting worse due to rising fossil fuel emissions. More info:
    An alarm is raised that the extreme summer melt of Arctic sea ice in 2007 was caused by humans using fossil fuels and it portends that in 20 years human caused global warming will leave the Arctic Ocean ice-free in the summer raising sea levels and harming wildlife. More info:
    Climate scientists continue to extrapolate the extreme summer melt of Arctic sea ice in 2007 to claim that the summer melt of 2007 was a climate change event and that it implies that the Arctic will be ice free in the summer from 2012 onwards. This is a devastating effect on the planet and our use of fossil fuels is to blame. More info:
    Summer melt of Arctic ice was the third most extensive on record in 2009, the second most extensive in 2008, and the most extensive in 2007. These data show that warming due to our carbon dioxide emissions is causing summer Arctic ice to gradually diminish until it is gone altogether. More info:










  1. TRANSCRIPTION: The humpbacks have returned to the shores of western Greenland. They bounced back from near extinction thanks to an international effort to stop their slaughter and here they are feeding at the edge of Greenland’s largest glacier. The Jakobshavn glacier stretches inland for around 40 miles. But for how long? The melt here in Greenland hit record levels in the summer (of 2019). Greenland’s ice cap, which holds about 8% of the world’s fresh water, just lost 12.5 billion tonnes of ice in a single day. It raised sea levels globally. That melt, on August 2nd 2019, was the largest single day loss in recorded history.
  2. The entire ice sheet that covers Greenland contains enough ice to raise sea levels across the globe by 20 feet if it melted. Climate models say that won’t happen for a while but consider this: the summer’s level of ice melt wasn’t supposed to happen for another 50 years. Greenland is now losing so much ice that it is shaping the world in such a way that you cannot ignore it any more.
  3. Question: But doesn’t the world naturally warm up and melt ice in interglacials? Answer: Laughter! Greenland is losing approximately 8,500 tonnes of ice per second day in day out around the clock, 8,500 tonnes of ice per second every second! Mind blowing! Hard to…. that’s why it’s a concern!
  4. In fact over half the Arctic’s permanent ice has melted revealing a landscape that has been hidden for 40,000 years. Up the glacier an area of ice 10 times the size of the UK pouring billions of tonnes of water into the Atlantic Ocean every day of the year every year and the melt rate is gaining momentum meaning that unless we act soon, this incredible destruction of Greenland’s glaciers will be unstoppable.
  5. Disappearing sea ice means shipping routes like the Northwest Passage are opening up. That could cut freight miles but others say these waters should be protected as a world heritage site. We’re expecting more traffic from the metal industry, oil industry, tourism, and more traffic in the Northwest Passage.
  6. Question: But ships are safe now. The fact that you can open something up to shipping doesn’t mean necessarily that the area will be destroyed, does it? Answer: Right here in this pristine area you can use heavy fuel oil. Once you bring heavy fuel oils into this area there could be a catastrophe of enormous dimensions.
  7. Moments later as if on cue we suddenly have company – diesel powered company (a boatload of tourists arrive). The problem is these Arctic waters are some of the most fragile marine environments on the planet; and there is no control over the number of luxury cruise ships driving through it.
  8. (Natives singing and dancing performing for tourists) In 17-degree summer heat it is t-shirt weather. It’s hot work performing in seal skin jackets and polar bear trousers. Traditions of the past played out for tourists. But the old way of life is disappearing partly due to climate change. Rainfall, unusual in the High Arctic, is triggering landslides. This village is entirely built on permafrost but its foundations are melting in the summer. Now more than half the town will need to be demolished and rebuilt. 75-year-old Inuit Joseph Manumina is a village elder. He’s seen the glaciers melt at a rapid rate and he says “The sun is warmer now than it has been and that is the reason for the melt”. It was not like this before.
  9. A young Inuit speaks: We can see this time of year. That is a benefit for us because we can travel further and faster by boat but when it is getting dark in the early winter like if we say when the sun comes down like in October and November that is where it starts, we cannot go up by boat because we cannot see anything and sea ice will be too thin to go out by dogsled. That’s where the problem is.
  10. So the dog population reduces year by year. Without sea ice you can’t use sled dogs so they shoot the dogs.
  11. Tourism is developing a new economy here. Whale watching! Some Inuits are still allowed to hunt whales but many find that there is more money to be made in the whale watching tourism business than in harpooning. But the tourists are also coming for the Arctic surroundings, and as Greenland melts four times faster than previously predicted, the future of the humpbacks and of us humans around the world is in jeopardy.


  1. RESPONSES: CLAIM: The humpbacks have returned to the shores of western Greenland. They bounced back from near extinction thanks to an international effort to stop their slaughter and here they are feeding at the edge of Greenland’s largest glacier. RESPONSE: The only relevance of this introduction is possibly to set an environmental and ecological high ground for the climate change discussion to follow.
  2. CLAIM: “The melt here in Greenland hit record levels in the summer (of 2019) and Greenland lost 12.5 billion tonnes of ice in a single day.” RESPONSE: The melt reported for a single day is acknowledged as an extreme event and cannot be treated as an average; but even if it were, at 12.5 gigatonnes per day every day Greenland would lose 4,563 gigatonnes per year, which would raise sea level by 0.417 inches per year, and at that rate the whole of the ice sheet would be gone in 576 years. Even more important is that this melt rate came from a single extreme day in August, when the most rapid melt occurs, while in the winter months Greenland gains ice. The extreme one-day summer melt event, though presented in alarming tone and language, does not appear to provide a reason for the alarm.
  3. CLAIM: The entire ice sheet that covers Greenland contains enough ice to raise sea levels across the globe by 20 feet if it melted. RESPONSE: Yes it would, but as noted in the prior response, even at the extreme melt rate of 12.5 gigatonnes per day, every day, summer and winter, it would take 576 years to melt the whole of the ice sheet, raising sea levels by 0.417 inches per year until the scary level of 20 feet is reached 576 years from now. The fear mongering is presented with great emotion and strong language but with little substance in the data offered as the rationale for the fear.
  4. CLAIM: Greenland is losing 8,500 tonnes of ice per second, every second of the day, every day of the year. RESPONSE: Year-round ice loss is normally not possible because Greenland loses ice in summer but gains ice in winter; but even if it were possible, 8,500 tonnes per second amounts to 268 gigatonnes per year, a rate at which the Greenland ice sheet would be gone in 9,800 years while raising sea level by 0.024 inches per year in the meantime. The data presented do not support the fear being sold with them.
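The unit conversions in the two responses above can be checked with a short script. Note that the assumed total ice-sheet mass (about 2.63 million gigatonnes, consistent with commonly cited figures) and the 20-foot total sea-level-rise equivalent are round numbers used only for the check, not data from the documentary.

```python
# Sanity check of the melt-rate arithmetic above. The ice-sheet mass and
# the 20 ft sea-level-rise equivalent are assumed round figures.

ICE_SHEET_GT = 2.63e6        # assumed Greenland ice sheet mass, gigatonnes
TOTAL_SLR_INCHES = 20 * 12   # the oft-quoted 20 ft, expressed in inches

def years_to_melt(rate_gt_per_year):
    """Years to exhaust the ice sheet at a constant annual loss rate."""
    return ICE_SHEET_GT / rate_gt_per_year

def slr_inches_per_year(rate_gt_per_year):
    """Annualized sea level rise if the 20 ft total accrues evenly."""
    return TOTAL_SLR_INCHES / years_to_melt(rate_gt_per_year)

# Scenario A: the extreme one-day melt (12.5 Gt/day) treated as a year-round average
rate_a = 12.5 * 365                 # about 4,563 Gt/year
# Scenario B: the documentary's "8,500 tonnes per second, every second"
rate_b = 8500 * 86400 * 365 / 1e9   # about 268 Gt/year

print(f"A: {years_to_melt(rate_a):.0f} years, {slr_inches_per_year(rate_a):.3f} in/yr")
print(f"B: {years_to_melt(rate_b):.0f} years, {slr_inches_per_year(rate_b):.3f} in/yr")
```

Scenario A reproduces, to rounding, the 576-year and 0.417 inches-per-year figures quoted above, and Scenario B the 9,800-year and 0.024 inches-per-year figures.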
  5. CLAIM: In fact over half the Arctic’s permanent ice has melted revealing a landscape that has been hidden for 40,000 years. RESPONSE: The current interglacial is only about 10,000 years old, so a 40,000-year span into the past takes us 30,000 years into the last glacial period and its glacial maximum. A comparison of glacial-maximum conditions with interglacial conditions may contain useful information about glaciation cycles but it tells us nothing about Holocene ice dynamics. This claim, though meant to be a reason to fear fossil fuel driven AGW, does not provide any information that would serve as such a reason.
  6. CLAIM: The opening of the Northwest Passage will allow commercial shipping into the Arctic, and the ships use “heavy fuel oil” that poses a catastrophic ecological threat to the region. RESPONSE: The North Sea is a major offshore oil province that has been actively producing and shipping oil since 1980, with production up from 0.4 million barrels per day in 1980 to more than 1.8 million barrels per day today (2019). Significant shipping of oil, equipment, and personnel is seen in the region, yet there has been no significant ecological damage caused by ships “using heavy fuel oil”. It appears that no climate fear could be attributed to the opening of the Northwest Passage and so an ecological fear was attempted, but the argument does not hold up in light of what we know about shipping.
  7. CLAIM: A young Inuit says that dog sleds can no longer be used because the sea ice is too thin and so they have to shoot the dogs. RESPONSE: That may be so but it can’t be blamed on AGW as described in three related posts [LINK] [LINK] [LINK] that describe the overlooked role of geological activity in the Arctic. 
  8. CLAIM: 75-year-old Inuit Joseph Manumina is a village elder. He’s seen the glaciers melt at a rapid rate and he says “The sun is warmer now than it has been and that is the reason for the melt. It was not like this before.” RESPONSE: Figure 1 below shows the temperature at Nuuk, Greenland from 1866 to 2013 for each calendar month. A significant rate of warming is not seen, particularly in the summer months. The absence of a strong warming trend in summer is made somewhat clearer in Figure 2, where we see that the overall warming trend derives mostly from the early portion of the time span, 1866 to 1939, with very little warming and even some cooling seen in the second half, 1940 to 2013. These data do not support the village elder’s claim that in his lifetime he has seen rapid warming that is causing the ice to melt.







SUMMARY: At current sea level rise forecasts, it was projected that 110 million people will be affected by coastal high tide flooding events by the year 2100; but new, improved DEM data for coastal land elevation show that coastal lands are not as high as we had thought, so the number of people affected by high tide flood events at the same rate of sea level rise will be larger: perhaps 190 million, or somewhere between 150 and 250 million. A problem with that assessment is that the large uncertainty in coastal land elevation data may mean that we don’t really know what the coastal land elevation is.


  1. How the media sees it: November 1, 2019 at 6:37 a.m. GMT+7: HERE’S ANOTHER piece of evidence that climate change might be worse than scientists previously predicted. The seas are rising, and will continue to rise, because hotter temperatures melt land-based ice and expand the volume existing ocean water takes up. But while much study has examined the shift in amount and warmth of seawater humans will face, there is another variable scientists must get right to assess the risk to humanity: just how many people live in low-lying areas. A new paper suggests previous estimates of land elevation — and, therefore, the number of at-risk people — were wrong. The study, published Tuesday in the journal Nature Communications, corrects satellite elevation data, and it “reveals a developed global coastline three times more exposed to extreme coastal water levels than previously thought,” the authors warn. Even under an optimistic scenario in which heat-warming greenhouse emissions are restrained and Antarctic ice sheets remain stable, “the global impacts of sea-level rise and coastal flooding this century will likely be far greater than indicated by the most pessimistic past analyses.” [LINK]
  2. TRANSLATION: Sadly, it looks like AGW climate change driven sea level rise won’t be as high and as scary as we were hoping for but there is still hope for us. What if coastal lands are not as high as we think they are? That would cause the same degree of devastation at the lower sea level rise that we now have to live with. All those people in Bangladesh and elsewhere living close to sea level will die and it will all be your fault for using fossil fuels. [RELATED POST] .
  3. What their source paper says: Article, Open Access, Published 29 October 2019: New elevation data triple estimates of global vulnerability to sea-level rise and coastal flooding, Scott A. Kulp & Benjamin H. Strauss, Nature Communications volume 10, Article number: 4844 (2019). ABSTRACT: Most estimates of global mean sea-level rise this century fall below 2 m. This quantity is comparable to the positive vertical bias of the principal digital elevation model (DEM) used to assess global and national population exposures to extreme coastal water levels, NASA’s SRTM. CoastalDEM is a new DEM utilizing neural networks to reduce SRTM error. Here we show, employing CoastalDEM, that 190 M people (150–250 M, 90% CI) currently occupy global land below projected high tide lines for 2100 under low carbon emissions, up from 110 M today, for a median increase of 80 M. These figures triple SRTM-based values. Under high emissions, CoastalDEM indicates up to 630 M people live on land below projected annual flood levels for 2100, and up to 340 M for mid-century, versus roughly 250 M at present. We estimate one billion people now occupy land less than 10 m above current high tide lines, including 250 M below 1 m.
  4. TRANSLATION: Using a new improved digital elevation model (DEM) developed by NASA, we find that land is not as high as we thought it was. We used to think that land was higher such that only 110 million people were at risk of suffering from coastal flooding at current SLR projections for the year 2100. But new land elevation data shows that land is lower than we thought it was such that at the same sea level rise, 190 million people are at risk of suffering from coastal flooding at current SLR projections for the year 2100. The uncertainty in this projection shows a range of {190-40=150} to {190+60=250} million people affected for a 90% confidence interval. Subtracting 40 million for the low end of the confidence interval and adding 60 million for the high end of the confidence interval makes it a little scarier, we thought.
  5. UNCERTAINTY: The Zhang et al. 2019 paper shows that the error in the DEM estimations may be even higher, with error measures ranging from 1.74 m to 14.29 m, while the Dai et al. 2019 paper finds a much lower uncertainty, within 2 meters, with correspondingly less scary predictions. Although uncertainty measures the extent of our ignorance, the advantage of uncertainty in climate alarmism is that the less we know, the wider the 90% confidence interval gets, and the scarier climate change becomes, as explained in paragraph#5 of this related post [LINK] {Activism needs of researchers also corrupt how the statistical property of variance is viewed and interpreted. In statistics, and also in information theory, high variance implies low information content. In other words, the higher the variance, the less we know. In this context high variance is undesirable because it degrades the information we can derive from the data. However, high variance also yields large confidence intervals, and where either end of this confidence interval is extreme enough to support the activism needs of climate researchers, high variance is interpreted not as an absence of information but as information about how extreme the danger could be. This interpretation, in conjunction with the precautionary principle, leads to a perverse interpretation of uncertainty such that uncertainty becomes transformed into certainty of extreme values}. In other words, THE LESS WE KNOW THE SCARIER IT GETS.
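The variance-to-fear mechanism described above can be made concrete with a toy calculation: for a normally distributed estimate, the 90% confidence interval widens in direct proportion to the standard deviation, so the scary upper bound grows exactly as our knowledge shrinks. The 190 million central estimate is from the Kulp and Strauss abstract; the two standard deviations are hypothetical values chosen only for illustration.

```python
# Toy illustration: a wider sigma (less information) yields a scarier
# upper bound for the same central estimate. Sigmas are hypothetical.

Z90 = 1.645  # two-sided 90% quantile of the standard normal distribution

def ci90(mean, sigma):
    """Return the (low, high) bounds of a 90% confidence interval."""
    return mean - Z90 * sigma, mean + Z90 * sigma

more_certain = ci90(190, 20)   # we know more: narrower interval
less_certain = ci90(190, 40)   # we know less: the upper bound jumps

print(f"sigma=20: {more_certain[0]:.0f}-{more_certain[1]:.0f} million")
print(f"sigma=40: {less_certain[0]:.0f}-{less_certain[1]:.0f} million")
```

Doubling the standard deviation moves the headline upper bound from about 223 million to about 256 million people, with no new information about the world, only less of it.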
  6. An alternative methodology that bypasses the uncertainty problem in SRTM v4.1 and MERIT DEM data was suggested in a WUWT comment by Hans Erren as follows: “November 3, 2019 at 4:46 am. A simple solution springs to my mind: use sea level gauges in coastal areas and do not use satellites at all” [LINK]. The Hans Erren insight is that an unnecessary complexity is imposed on the study of high tide floods at a list of specific locations by the climate science reliance on global mean eustatic sea level (GMES). It is true that sea level rise and its overall impacts should be studied in terms of GMES, but studying localized events in terms of global data creates an unnecessary complication that introduces layers of uncertainty that do not exist in local data. To understand localized high tide floods as a function of GMES requires land elevation data on a standardized global scale; to understand them as a function of local sea level is a very simple exercise that does not require uncertain satellite measures of land elevation. Since the coastal areas at risk have already been identified, and their tidal gauge data and high tide flood events are recorded, much greater precision in forecasting future high tide floods can be realized if these at-risk areas are studied separately instead of translating them into global data and then back again to local data. The number of people at risk at each coastal area can then be assessed, and the global number estimated by summation. The study of global GMES and DEM data in order to understand localized phenomena is the source of the large uncertainties that ultimately erode the utility of such findings.
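The "local first" accounting suggested by the Hans Erren comment can be sketched as follows: each at-risk coastal area is projected forward with its own tide-gauge trend and its own population-by-elevation profile, and the global figure is obtained by summation at the end. All site names, trends, and populations below are hypothetical placeholders, not real gauge data.

```python
# Sketch of per-site risk accounting from local tide-gauge trends.
# All data are hypothetical; a real study would use recorded gauge
# trends and surveyed population-by-elevation profiles for each site.

# (site, local relative SLR trend in mm/yr, people per 0.1 m shore elevation band)
sites = [
    ("Site A", 3.2, 120_000),
    ("Site B", 1.8, 45_000),
    ("Site C", 9.5, 300_000),   # high local trend, e.g. land subsidence
]

def at_risk_by_2100(trend_mm_per_yr, people_per_band, years=80):
    """People below the locally projected high tide line after `years` years."""
    rise_m = trend_mm_per_yr * years / 1000.0
    bands_flooded = rise_m / 0.1      # how many 0.1 m elevation bands are inundated
    return int(bands_flooded * people_per_band)

per_site = {name: at_risk_by_2100(t, p) for name, t, p in sites}
total = sum(per_site.values())        # global figure by summation, no DEM required
```

Note how Site C dominates the total: a local trend driven by subsidence is captured directly by its own gauge, whereas a global mean eustatic figure would smear it across all sites.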



  1. Kulp, Scott A., and Benjamin H. Strauss. “CoastalDEM: A global coastal digital elevation model improved from SRTM using a neural network.” Remote Sensing of Environment 206 (2018): 231-239.  Positive vertical bias in elevation data derived from NASA’s Shuttle Radar Topography Mission (SRTM) is known to cause substantial underestimation of coastal flood risks and exposure. Previous attempts to correct SRTM elevations have used regression to predict vertical error from a small number of auxiliary data products, but these efforts have been focused on reducing error introduced solely by vegetative land cover. Here, we employ a multilayer perceptron artificial neural network to perform a 23-dimensional vertical error regression analysis, where in addition to vegetation cover indices, we use variables including neighborhood elevation values, population density, land slope, and local SRTM deviations from ICESat altitude observations. Using lidar data as ground truth, we train the neural network on samples of US data from 1–20 m of elevation according to SRTM, and assess outputs with extensive testing sets in the US and Australia. Our adjustment system reduces mean vertical bias in the coastal US from 3.67 m to less than 0.01 m, and in Australia from 2.49 m to 0.11 m. RMSE is cut by roughly one-half at both locations, from 5.36 m to 2.39 m in the US, and from 4.15 m to 2.46 m in Australia. Using ICESat data as a reference, we estimate that global bias falls from 1.88 m to −0.29 m, and RMSE from 4.28 m to 3.08 m. The methods presented here are flexible and effective, and can be applied to land cover of all types, including dense urban development. The resulting enhanced global coastal DEM (CoastalDEM) promises to greatly improve the accuracy of sea level rise and coastal flood analyses worldwide.
  2. Kulp, Scott Andrew, and B. Strauss. “Improved elevation data more than doubles estimates of global coastal vulnerability to sea level rise.” AGU Fall Meeting Abstracts. 2018.  As sea levels rise and damaging storm surge becomes more intense and frequent, accurate flooding vulnerability assessments are essential to prepare coastal communities for the growing impacts and damage these threats may bring. A digital elevation model (DEM) is the foundation of such analyses, but large numbers of assessments performed outside of the United States use NASA’s SRTM, which has a multimeter mean vertical bias in the coastal zone globally – more than most sea level projections for this century. Here, we apply an improved global coastal elevation model we have developed using artificial neural networks, CoastalDEM, that reduces mean vertical bias to on the order of 10 cm. A global vulnerability assessment with our new model suggests that SRTM has gravely underestimated coastal threats from sea level rise. Across multiple carbon emission pathways and sea level rise projection models, CoastalDEM predicts more than twice as many people living on land at risk of permanent inundation this century than SRTM does.
  3. Hirt, Christian. “Artefact detection in global digital elevation models (DEMs): The Maximum Slope Approach and its application for complete screening of the SRTM v4.1 and MERIT DEMs.” Remote Sensing of Environment 207 (2018): 27-41.  Despite post-processing efforts by space agencies and research institutions, contemporary global digital elevation models (DEMs) may contain artefacts, i.e., erroneous features that do not exist in the actual terrain, such as spikes, holes and line errors. The goal of the present paper is to illuminate the artefact issue of current global DEM data sets that might be an obstacle for any geoscience study using terrain information. We introduce the Maximum Slope Approach (MSA) as a technique that uses terrain slopes as indicator to detect and localize spurious artefacts. The MSA relies on the strong sensitivity of terrain slopes for sudden steps in the DEM that is a direct feature of larger artefacts. In a numerical case study, the MSA is applied for globally complete screening of two SRTM-based 3 arc-second DEMs, the SRTM v4.1 and the MERIT-DEM. Based on 0.1° × 0.1° sub-divisions and a 5 m/m slope threshold, 1341 artefacts were detected in SRTM v4.1 vs. 108 in MERIT. Most artefacts spatially correlate with SRTM voids (and thus with the void-filling) and not with the SRTM-measured elevations. The strong contrast in artefact frequency (factor ~12) is attributed to the SRTM v4.1 hole filling. Our study shows that over parts of the Himalaya Mountains the SRTM v4.1 data set is contaminated by step artefacts where the use of this DEM cannot be recommended. Some caution should be exercised, e.g., over parts of the Andes and Rocky Mountains. The same holds true for derived global products that depend on SRTM v4.1, such as gravity maps. Primarily over the major mountain ranges, the MERIT model contains artefacts, too, but in smaller numbers.
As a conclusion, globally complete artefact screening is recommended prior to the public release of any DEM data set. However, such a quality check should also be considered by users before using DEM data. MSA-based artefact screening is not only limited to DEMs, but can be applied as quality assurance measure to other gridded data sets such as digital bathymetric models or gridded physical quantities such as gravity or magnetics.
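The Maximum Slope Approach described in the Hirt abstract above lends itself to a compact illustration: flag any DEM cell whose steepest slope to one of its 8 neighbours exceeds the threshold (5 m/m in the paper). The toy 4x4 grid, the 30 m cell spacing, and the spike value below are invented for illustration only.

```python
import math

# Toy version of the Maximum Slope Approach (MSA): a cell is flagged as a
# possible artefact if its steepest slope to any 8-neighbour exceeds the
# threshold. Grid values and spacing are invented for illustration.

CELL_M = 30.0      # assumed grid spacing in metres
THRESHOLD = 5.0    # slope threshold in m/m, as used in the paper

def max_slope(dem, r, c):
    """Steepest elevation change per metre from cell (r, c) to its neighbours."""
    best = 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if (dr, dc) != (0, 0) and 0 <= rr < len(dem) and 0 <= cc < len(dem[0]):
                dist = CELL_M * math.hypot(dr, dc)
                best = max(best, abs(dem[rr][cc] - dem[r][c]) / dist)
    return best

def flag_artefacts(dem):
    """Return (row, col) indices of all cells exceeding the slope threshold."""
    return [(r, c) for r in range(len(dem)) for c in range(len(dem[0]))
            if max_slope(dem, r, c) > THRESHOLD]

dem = [[100.0, 102.0, 101.0, 100.0],
       [ 99.0, 650.0, 103.0, 101.0],   # 650 m spike: a void-fill style artefact
       [101.0, 100.0, 102.0, 100.0],
       [100.0, 101.0,  99.0, 102.0]]

flags = flag_artefacts(dem)   # the spike and its 8 neighbours get flagged
```

Because the slope indicator is symmetric, the neighbours of a spike are flagged along with the spike itself, which is why the paper localizes artefacts to sub-divisions rather than single cells.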
  4. Jain, Akshay O., et al. “Vertical accuracy evaluation of SRTM-GL1, GDEM-V2, AW3D30 and CartoDEM-V3.1 of 30-m resolution with dual frequency GNSS for lower Tapi Basin India.” Geocarto International 33.11 (2018): 1237-1256.  Shuttle Radar Topography Mission (SRTM-GL1), Advanced Space Borne Thermal Emission and Reflection Radiometer Global DEM (GDEM-V2), recently released Advanced Land Observing Satellite (‘DAICHI’) DEM (AW3D30) and Indian National Cartosat-1 DEM v3 (CartoDEM-V3.1) provide free topographic data at a 30-m resolution for the Indian peninsula. In this research study, the vertical accuracy of DEM is evaluated for the above data-sets and compared with high accuracy dual frequency GNSS of a millimetre accuracy. The extensive field investigation is carried out using a stratified random fast static DGPS survey for collecting 117 high accuracy ground control points in a predominantly agriculture catchment. Further, the effect of land cover, slope and low-lying coastal zone on DEM vertical accuracy was also analysed and presented in this study. The results for RMSE of terrain elevation are 2.88 m, 5.46 m, 2.45 m and 3.52 m for SRTM-GL1, GDEM-V2, AW3D30 and CartoDEM-V3.1 respectively.
  5. Zhang, Keqi, et al. “Accuracy assessment of ASTER, SRTM, ALOS, and TDX DEMs for Hispaniola and implications for mapping vulnerability to coastal flooding.” Remote Sensing of Environment 225 (2019): 290-306.  Digital elevation models (DEMs) derived from remote sensing data provide a valuable and consistent data source for mapping coastal flooding at local and global scales. Mapping of flood risk requires quantification of the error in DEM elevations and its effect on delineation of flood zones. The ASTER, SRTM, ALOS, and TanDEM-X (TDX) DEMs for the island of Hispaniola were examined by comparing them with GPS and LiDAR measurements. The comparisons were based on a series of error measures including root mean square error (RMSE) and absolute error at 90% quantile (LE90). When compared with >2000 GPS measurements with elevations below 7 m, RMSE and LE90 values for ASTER, SRTM, ALOS, TDX DEMs were 8.44 and 14.29, 3.82 and 5.85, 2.08 and 3.64, and 1.74 and 3.20 m, respectively. In contrast, RMSE and LE90 values for the same DEMs were 4.24 and 6.70, 4.81 and 7.16, 4.91 and 6.82, and 2.27 and 3.66 m when compared to DEMs from 150 km2 LiDAR data, which included elevations as high as 20 m. The expanded area with LiDAR coverage included additional types of land surface, resulting in differences in error measures. Comparison of RMSEs indicated that the filtering of TDX DEMs using four methods improved the accuracy of the estimates of ground elevation by 20–43%. DTMs generated by interpolating the ground pixels from a progressive morphological filter, using an empirical Bayesian kriging method, produced an RMSE of 1.06 m and LE90 of 1.73 m when compared to GPS measurements, and an RMSE of 1.30 m and LE90 of 2.02 m when compared to LiDAR data. Differences in inundation areas based on TDX and LiDAR DTMs were between −13% and −4% for scenarios of 3, 5, 10, and 15 m water level rise, a much narrower range than inundation differences between ASTER, SRTM, ALOS and LiDAR.
The TDX DEMs deliver high resolution global DEMs with unprecedented elevation accuracy, hence, it is recommended for mapping coastal flood risk zones on a global scale, as well as at a local scale in developing countries where data with higher accuracy are unavailable.
  6. Dai, Chunli, et al. “Coastline extraction from repeat high resolution satellite imagery.” Remote Sensing of Environment 229 (2019): 260-270.  This paper presents a new coastline extraction method that improves water classification accuracy by benefitting from an ever-increasing volume of repeated measurements from commercial satellite missions. The widely-used Normalized Difference Water Index (NDWI) method is tested on a sample of around 12,600 satellite images for statistical analysis. The core of the new water classification method is the use of a water probability algorithm based on the stacking of repeat measurements, which can mitigate the effects of translational offsets of images and the classification errors caused by clouds and cloud shadows. By integrating QuickBirdWorldView-2 and WorldView-3 multispectral images, the final data product provides a 2 m resolution coastline, as well as a 2 m water probability map and a repeat-count measurement map. Improvements on the existing coastline (GSHHS-the Global Self-consistent, Hierarchical, High-resolution Shoreline Database, 50 m–5000 m) in terms of resolution (2 m) is substantial, thanks to the combination of multiple data sources.