Thongchai Thailand

Archive for July 2020

 

PLANETARY ENVIRONMENTALISM AND ITS IMPLIED ROLE OF HUMANS AS CARETAKERS OF THE PLANET EARTH IN THE ANTHROPOCENE.

 

In his paper “Geology of Mankind”, atmospheric chemist Paul Crutzen calls on geologists to adopt the term ‘Anthropocene’ for the current “human-dominated” geological epoch, which sits piggy-back on the Holocene [LINK]. Since then a number of papers, mostly by Will Steffen, have developed the Anthropocene concept, as seen in the bibliography below. A succinct summary of the concept is provided by Noam Chomsky in the video below. It describes a state of the world in which humans are in control of the planet and are now its keepers and caretakers, so that the fate of the planet depends on how well humans take care of it. This is the extent to which global environmentalism has been taken and the context in which the ozone crisis and the climate crisis of our time should be understood.

 

 

THE IMPOSSIBILITY OF PLANETARY ENVIRONMENTALISM

In this post we argue that the concept of the Anthropocene, and of human-caused planetary catastrophe by way of things like an industrial economy running on fossil fuels, is inconsistent with the relative insignificance of humans on a planetary scale.

Consider, for example, that even as humans worry about things like carbon pollution and a population bomb in which the planet is overwhelmed by the sheer number of humans on earth, humans, like all life on earth, are carbon life forms created from carbon that came from the mantle of the planet, and a rather insignificant portion of it at that. In terms of total weight, humans constitute 0.05212% of the total mass of life on earth. Yet we imagine that our numbers are so huge that the planet will be overwhelmed by our population bomb. All the life on earth taken together is 0.000002875065% of the crust of the planet by weight. The crust of the planet, where we live and where we have things like land, ocean, atmosphere, climate, and carbon life forms, is 0.3203% of the planet by weight. The other 99.6797% of the planet, the mantle and core, is a place where we have never been and will never be and on which we have no impact whatsoever. As for the much feared element carbon that is said to cause planetary devastation by way of climate change and ocean acidification, a mass balance shows that the crust of the planet where we live contains 0.201% of the planet’s carbon, with the other 99.799% of the planet’s carbon inventory being in the mantle and core.
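The chained percentages above can be checked with a short Python sketch. The input fractions are taken straight from this post’s own figures, not from any external dataset; the point is only to show how the ratios compose.

```python
# Hedged check of the post's chained mass-balance percentages.
# All fractions are the post's own figures; the sketch only shows
# how the ratios compose, it does not vouch for the inputs.

humans_over_life   = 0.05212 / 100         # humans as a share of all life by weight
life_over_crust    = 0.000002875065 / 100  # all life as a share of the crust
crust_over_planet  = 0.3203 / 100          # crust as a share of the planet

life_over_planet   = life_over_crust * crust_over_planet
humans_over_planet = humans_over_life * life_over_planet

print(f"life / planet   = {life_over_planet:.3e}")    # ~9.2e-11
print(f"humans / planet = {humans_over_planet:.3e}")  # ~4.8e-14
```

Multiplying the chained fractions puts all life at roughly 9.2e-11 of the planet by weight, and humans at roughly 4.8e-14.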

 

CONCLUSIONS

  1. The crust of the planet where we live is an insignificant portion of the planet.
  2. Life on earth is an insignificant portion of the crust of the planet. 
  3. Humans are an insignificant portion of life on earth. 

Although it is true that humans must take care of their environment, we propose that the environment should be given a more rational definition, because the mass balance above does not show that humans are a significant force on a planetary scale, or that they are in a position either to save the planet or to destroy it, even with the much feared power of their fossil fueled industrial economy. This implies that there cannot be such a thing as an Anthropocene in which humans are the dominant geological force on the planet.

Like ants and bees, humans are social creatures that live in communities of humans, so that when they look around all they see is humans. This is the likely source of our human-oriented view of the world. Paul Ehrlich’s overpopulation theory derives from his first visit to India, which he described as “people people people people people!” It is this biased view of the planet that makes it possible for us to extrapolate Calcutta to the planet and arrive at the fearful image described by Jeff Gibbs as “Have you ever wondered what would happen if a single species took over an entire planet?”

 

 

RELATED POSTS

  1. THE HUMANS MUST SAVE THE PLANET [LINK]  
  2. THE ISSUE IS FOSSIL FUELS NOT CLIMATE CHANGE:  [LINK]
  3. NO EVIDENCE THAT ATMOSPHERIC COMPOSITION IS RESPONSIVE TO FOSSIL FUEL EMISSIONS:  [LINK] 
  4. NO EVIDENCE THAT SURFACE TEMPERATURE IS RESPONSIVE TO CUMULATIVE FOSSIL FUEL EMISSIONS AS CLAIMED IN THE “TRANSIENT CLIMATE RESPONSE TO CUMULATIVE EMISSIONS” [LINK]
  5. NO EVIDENCE THAT CLIMATE ACTION WILL CHANGE THE RATE OF WARMING: [LINK]  


 

ANTHROPOCENE BIBLIOGRAPHY

  1. Anthropocene doomsday scenario: Steffen 2018: Steffen, Will, et al. “Trajectories of the Earth System in the Anthropocene.” Proceedings of the National Academy of Sciences (2018): 201810141. {We explore the risk that self-reinforcing feedbacks could push the Earth System toward a planetary threshold that, if crossed, could prevent stabilization of the climate at intermediate temperature rises and cause continued warming on a “Hothouse Earth” pathway even as human emissions are reduced. Crossing the threshold would lead to a much higher global average temperature than any interglacial in the past 1.2 million years and to sea levels significantly higher than at any time in the Holocene. We examine the evidence that such a threshold might exist and where it might be. If the threshold is crossed, the resulting trajectory would likely cause serious disruptions to ecosystems, society, and economies. Collective human action is required to steer the Earth System away from a potential threshold and stabilize it in a habitable interglacial-like state. Such action entails stewardship of the entire Earth System—biosphere, climate, and societies—and could include decarbonization of the global economy, enhancement of biosphere carbon sinks, behavioral changes, technological innovations, new governance arrangements, and transformed social values.}
  2. Anthropocene doomsday scenario: Steffen 2015: Steffen, Will, et al. “The trajectory of the Anthropocene: the great acceleration.” The Anthropocene Review 2.1 (2015): 81-98. {The ‘Great Acceleration’ graphs, originally published in 2004 to show socio-economic and Earth System trends from 1750 to 2000, have now been updated to 2010. In the graphs of socio-economic trends, where the data permit, the activity of the wealthy (OECD) countries, those countries with emerging economies, and the rest of the world have now been differentiated. The dominant feature of the socio-economic trends is that the economic activity of the human enterprise continues to grow at a rapid rate. However, the differentiated graphs clearly show that strong equity issues are masked by considering global aggregates only. Most of the population growth since 1950 has been in the non-OECD world but the world’s economy (GDP), and hence consumption, is still strongly dominated by the OECD world. The Earth System indicators, in general, continued their long-term, post-industrial rise, although a few, such as atmospheric methane concentration and stratospheric ozone loss, showed a slowing or apparent stabilisation over the past decade. The post-1950 acceleration in the Earth System indicators remains clear. Only beyond the mid-20th century is there clear evidence for fundamental shifts in the state and functioning of the Earth System that are beyond the range of variability of the Holocene and driven by human activities. Thus, of all the candidates for a start date for the Anthropocene, the beginning of the Great Acceleration is by far the most convincing from an Earth System science perspective.}
  3. Anthropocene doomsday scenario: McGill 2015: McGill, Brian J., et al. “Fifteen forms of biodiversity trend in the Anthropocene.” Trends in Ecology & Evolution 30.2 (2015): 104-113. {Humans are transforming the biosphere in unprecedented ways, raising the important question of how these impacts are changing biodiversity. Here we argue that our understanding of biodiversity trends in the Anthropocene, and our ability to protect the natural world, is impeded by a failure to consider different types of biodiversity measured at different spatial scales. We propose that ecologists should recognize and assess 15 distinct categories of biodiversity trend. We summarize what is known about each of these 15 categories, identify major gaps in our current knowledge, and recommend the next steps required for better understanding of trends in biodiversity.}
  4. Anthropocene doomsday scenario: Dirzo 2014: Dirzo, Rodolfo, et al. “Defaunation in the Anthropocene.” Science 345.6195 (2014): 401-406. {We live amid a global wave of anthropogenically driven biodiversity loss: species and population extirpations and, critically, declines in local species abundance. Particularly, human impacts on animal biodiversity are an under-recognized form of global environmental change. Among terrestrial vertebrates, 322 species have become extinct since 1500, and populations of the remaining species show 25% average decline in abundance. Invertebrate patterns are equally dire: 67% of monitored populations show 45% mean abundance decline. Such animal declines will cascade onto ecosystem functioning and human well-being. Much remains unknown about this “Anthropocene defaunation”; these knowledge gaps hinder our capacity to predict and limit defaunation impacts. Clearly, however, defaunation is both a pervasive component of the planet’s sixth mass extinction and also a major driver of global ecological change.}
  5. Anthropocene doomsday scenario: Braje 2013: Braje, Todd J., and Jon M. Erlandson. “Human acceleration of animal and plant extinctions: A Late Pleistocene, Holocene, and Anthropocene continuum.” Anthropocene 4 (2013): 14-23. {One of the most enduring and stirring debates in archeology revolves around the role humans played in the extinction of large terrestrial mammals (megafauna) and other animals near the end of the Pleistocene. Rather than seeking a prime driver (e.g., climate change, human hunting, disease, or other causes) for Pleistocene extinctions, we focus on the process of human geographic expansion and accelerating technological developments over the last 50,000 years, changes that initiated an essentially continuous cascade of ecological changes and transformations of regional floral and faunal communities. Human hunting, population growth, economic intensification, domestication and translocation of plants and animals, and landscape burning and deforestation all contributed to a growing human domination of earth’s continental and oceanic ecosystems. We explore the deep history of anthropogenic extinctions, trace the accelerating loss of biodiversity around the globe, and argue that Late Pleistocene and Holocene extinctions can be seen as part of a single complex continuum increasingly driven by anthropogenic factors that continue today.}
  6. Anthropocene doomsday scenario: Steffen 2011: Steffen, Will, et al. “The Anthropocene: From global change to planetary stewardship.” Ambio 40.7 (2011): 739. {Over the past century, the total material wealth of humanity has been enhanced. However, in the twenty-first century, we face scarcity in critical resources, the degradation of ecosystem services, and the erosion of the planet’s capability to absorb our wastes. Equity issues remain stubbornly difficult to solve. This situation is novel in its speed, its global scale and its threat to the resilience of the Earth System. The advent of the Anthropocene, the time interval in which human activities now rival global geophysical processes, suggests that we need to fundamentally alter our relationship with the planet we inhabit. Many approaches could be adopted, ranging from geo-engineering solutions that purposefully manipulate parts of the Earth System to becoming active stewards of our own life support system. The Anthropocene is a reminder that the Holocene, during which complex human societies have developed, has been a stable, accommodating environment and is the only state of the Earth System that we know for sure can support contemporary society. The need to achieve effective planetary stewardship is urgent. As we go further into the Anthropocene, we risk driving the Earth System onto a trajectory toward more hostile states from which we cannot easily return.}
  7. Anthropocene doomsday scenario: Wagler 2011  : Wagler, Ron. “The anthropocene mass extinction: An emerging curriculum theme for science educators.” The American Biology Teacher 73.2 (2011): 78-83. {There have been five past great mass extinctions during the history of Earth. There is an ever-growing consensus within the scientific community that we have entered a sixth mass extinction. Human activities are associated directly or indirectly with nearly every aspect of this extinction. This article presents an overview of the five past great mass extinctions; an overview of the current Anthropocene mass extinction; past and present human activities associated with the current Anthropocene mass extinction; current and future rates of species extinction; and broad science-curriculum topics associated with the current Anthropocene mass extinction that can be used by science educators. These broad topics are organized around the major global, anthropogenic direct drivers of habitat modification, fragmentation, and destruction; overexploitation of species; the spread of invasive species and genes; pollution; and climate change.}
  8. Anthropocene doomsday scenario: Zalasiewicz 2010: Zalasiewicz, Jan, et al. “The new world of the Anthropocene.” Environmental Science & Technology 44.7 (2010): 2228-2231. {Global events such as mass extinctions, the onset of Ice Ages, and changes in geochemistry linked with changes in atmospheric chemistry are timeposts in geological strata. In the timeline for Earth history, they allow segmentation of its 4.6 billion year existence into eons, eras, periods, and epochs. As human activity makes its recently initiated yet globally extensive mark that is leading to mass extinctions, changes in atmospheric and marine chemistry, and altering terrestrial features, should a new epoch be declared? Can such an Anthropocene be geologically standardized in strata? Zalasiewicz et al make their case in this article featured in ES&T’s April 1, 2010 print issue recognizing the 40th Anniversary of Earth Day.}
  9. Anthropocene doomsday scenario: Saxon 2008  : Saxon, Earl. “Noah’s Parks: A partial antidote to the Anthropocene extinction event.” Biodiversity 9.3-4 (2008): 5-10. {Climate change will rapidly alter the abiotic environment of many localities leading to significant losses of biodiversity in ecosystems unable to adapt quickly. However, local extirpation will be least likely where environmental change is slowest. Such locations will offer refugia for species with narrow environmental ranges, provide persistent sources of colonists, offer transitory homes for dispersers and serve as platform sites on which new community assemblages develop. Consequently, networks of protected areas that include such sites will conserve more biodiversity. Conventional protected area network selection algorithms give priority to areas with the lowest current cost. I added projected environmental change as a cost factor. I applied the modified algorithm in three arctic ecoregions where climate change is predicted to be extremely rapid and to 20 tropical ecoregions where the pace of climate change will be slower but many species are vulnerable to small changes. I identified protected area networks that protect places where change will be slowest in all ecoregions. These climate-adaptive protected area networks differ substantially from both current protected area networks and near-optimal networks that are based only on current costs. The modified method will help protected area planners to acquire potential climate refugia and to help implement adaptive conservation strategies for potential refugia that are already protected. It will also help reduce the risk that projected refugia are unknowingly allocated to land uses incompatible with their critical role in biodiversity conservation.}
  10. Anthropocene doomsday scenario: Steffen 2007: Steffen, Will, Paul J. Crutzen, and John R. McNeill. “The Anthropocene: are humans now overwhelming the great forces of nature.” AMBIO: A Journal of the Human Environment 36.8 (2007): 614-621. {We explore the development of the Anthropocene, the current epoch in which humans and our societies have become a global geophysical force. The Anthropocene began around 1800 with the onset of industrialization, the central feature of which was the enormous expansion in the use of fossil fuels. We use atmospheric carbon dioxide concentration as a single, simple indicator to track the progression of the Anthropocene. From a preindustrial value of 270–275 ppm, atmospheric carbon dioxide had risen to about 310 ppm by 1950. Since then the human enterprise has experienced a remarkable explosion, the Great Acceleration, with significant consequences for Earth System functioning. Atmospheric CO2 concentration has risen from 310 to 380 ppm since 1950, with about half of the total rise since the preindustrial era occurring in just the last 30 years. The Great Acceleration is reaching criticality. Whatever unfolds, the next few decades will surely be a tipping point in the evolution of the Anthropocene.}


WITH THANKS TO ERIC WORRALL AT WUWT [LINK] WHO ALERTED US TO THIS TOPIC

 

 

 

A CRITICAL REVIEW OF A GUARDIAN ARTICLE ON CLIMATE DENIERS  [LINK]

THE AUTHOR IS DAMIAN CARRINGTON, the Guardian’s Environment editor

 

 

PART-1: WHAT THE GUARDIAN ARTICLE SAYS

The four types of climate denier, and why you should ignore them all, by Damian Carrington: (1) The shill, (2) The grifter, (3) The egomaniac and (4) The ideological fool. Each distorts the urgent global debate in their own way. Serious debates about what to do about the climate crisis are turning into action. The deniers have nothing to contribute to this.

  1. A new book (on climate denial), described as “deeply and fatally flawed” by an expert reviewer, recently reached the top of Amazon’s bestseller list for environmental science and made it into a weekly top 10 list for all nonfiction titles.
  2. How did this happen? Because, as Brendan Behan put it, “there’s no such thing as bad publicity”. In an article promoting his book, Michael Shellenberger – with jaw-dropping hubris – apologises on behalf of all environmentalists for the “climate scare we created over the last 30 years”.
  3. Shellenberger was named a hero of the environment by Time magazine in 2008 and is a loud advocate of nuclear power, but the article was described by six leading scientists as “cherry-picking”, “misleading” and containing “outright falsehoods”.
  4. The article was widely republished, even after being removed from its first home, Forbes, for violating the title’s editorial guidelines on self-promotion, adding further heat to the storm. And this is why all those who deny the reality or danger of the climate emergency should be ignored. Obviously, I have broken my own rule here, but only to make this vital point once and for all.
  5. The science is clear, the severity understood at the highest levels everywhere, and serious debates about what to do are turning into action. The deniers have nothing to contribute to this.
  6. However infuriating they are, arguing with them or debunking their theories is likely only to generate publicity or money for them. It also helps to generate a fake air of controversy over climate action that provides cover for the vested interests seeking to delay the end of the fossil fuel age. But the deniers are not all the same. They tend to fit into one of four different categories: the shill, the grifter, the egomaniac and the ideological fool.
  7. The shill: The shill is the easiest to understand. He, and it almost always is he, is paid by vested interests to emit clouds of confusion about the science or economics of climate action. This uncertainty creates a smokescreen behind which polluters can lobby against measures that cut their profits.
  8. The grifter: A sadder case is that of the grifters. They have found themselves earning a living by grinding out contrarian articles for right-wing media outlets. Do they actually believe the guff they write? It doesn’t matter: they just warm their hands on the outrage, count the clicks and wait for the pay cheque.
  9. The egomaniac:  The egomaniacs are also tragic figures. They are disappointed, frustrated people whose careers have stalled and who can’t understand why the world refuses to give full reverence to their brilliance. They are desperate for recognition, and, when it stubbornly refuses to arrive, they are drawn to make increasingly extreme pronouncements, in the hope of finally being proved a dogma-busting, 21st-century Galileo.
  10. The ideological fool: The ideological fool is the fourth type of climate denier, and they can be intelligent. But they are utterly blinded by their inane, no-limits version of the free-market creed. The climate emergency requires coordinated global action, they observe, and that looks horribly like communism in disguise. They could explore the many credible climate action plans being pursued, including by those on the political right. But their cognitive dissonance forces them to the conclusion that because state intervention is wrong, acting to avert climate danger cannot be right. Intellectual gymnastics to “expose” climate alarmism then follow naturally.
  11. But why do I say ignore them all? The climate crisis is urgent, and we need debate to drive action. However, vigorous debates over action are already taking place in good faith all over the world, from the tops of governments to the smallest local action groups.
  12. Every nation in the world signed up to the 2015 Paris climate deal, pledging to keep global heating below 2C and ideally to 1.5C. The IPCC involves thousands of international scientists and is arguably the greatest scientific endeavour in history. It has spent three decades spelling out in painstaking detail how humanity is causing global heating, how catastrophic that threatens to be – and how drastic action is required to avert the worst.
  13. The world of finance and business is catching up fast with the science, and almost all the technology needed already exists. In short, no sane or serious actor can countenance denial of climate danger. Bad-faith arguments motivated by greed, egomania or ideology have nothing to add.
  14. Which brings me to the US president, Donald Trump. Political leaders are the exception to the rule. Their climate idiocy should be challenged, as they hold actual power. But even in this case, reality is fast debunking their proclamations.
  15. In the US, coal is dying, because green energy is cheaper and cleaner, however great Trump claims he will make the miners. Even if Trump, and Brazil’s president, Jair Bolsonaro, persist, other nations will begin to ostracise them via trade sanctions and border taxes.
  16. As for the shill, grifter, egomaniac and ideological fool, the reality of increasing climate impacts and successful action is fast exposing them as well. Those willing to employ the shills and the grifters are dwindling.
  17. The book I started with has now been knocked off the environmental bestsellers list, fittingly enough by one published by the environmental hero Rachel Carson, in 1951. I can’t profess to know what Shellenberger’s motivation was, but one thing is clear: the egomaniacs and ideological fools will get the place in history they so lust for. It will be a small footnote marking the useful idiots of the climate war.
  18. We’ve never had a better chance to make a greener world. Covid-19 has delivered unusual environmental benefits: cleaner air, lower carbon emissions, a respite for wildlife. Now the big question is whether we can capitalise on this moment. The Guardian aims to lead the debate from the front.
  19. In the weeks and months ahead, our journalism will investigate the prospects for a new green settlement. We will showcase the big thinkers and protagonists and amplify the arguments for authorities everywhere to consider as they lead us out of coronavirus. Our credentials suit us well to the task: we are independent, we have no owners, no paymasters or oligarchs pulling the strings. We have committed to carbon neutrality by 2030, divested from the oil and gas sectors and renounced fossil fuel advertising. But at this crucial moment, news organisations like ours are facing a daunting financial challenge. As businesses everywhere feel the pinch, the advertising revenue that has long helped to sustain our work has plummeted. We need you to help fill the gap. You’ve read 52 articles in the last nine months. Our journalism is open to all because we believe everyone deserves access to factual information, regardless of where they live or what they can afford to pay. If you can afford to, we hope you will consider supporting our journalism today.
  20. The Guardian believes that the climate crisis we face is systemic. We will inform our readers about threats to the environment based on scientific facts, not driven by commercial or political interests. We will keep reporting on the efforts of individuals and communities around the world who are fearlessly taking a stand for future generations and the preservation of human life on earth. We want their stories to inspire hope. We need your support to keep delivering this kind of open, committed independent journalism. Every reader contribution, however big or small, is so valuable.

 

 

PART-2: CRITICAL COMMENTARY

  1. This piece by the Guardian was motivated by the Shellenberger book and starts out as a critical analysis of the book with vague statements and charges: (1) it is deeply and fatally flawed according to an expert reviewer, (2) the author is an advocate for nuclear power, (3) it is not credible because it relies on cherry picking, (4) its content is misleading, (5) it contains outright falsehoods, and (6) it is guilty of self-promotion. This is all the Guardian has to say to discredit the book in terms of climate denialism. No details for these charges are provided and no argument in support of them is made.
  2. Yet, on the basis of these vague and unsupported charges alone, the Guardian leaves the Shellenberger book altogether, its purpose apparently served, and turns to climate denialism in general, classifying climate deniers into four types: (1) the shill, (2) the grifter, (3) the egomaniac and (4) the ideological fool, with no connection or relevance to the Shellenberger book established. Climate denialism is here defined, in effect, as disagreement with climate scientists, the UN, and the IPCC; and the many charges made against the book are left behind without supporting evidence or commentary for any of them.
  3. Climate denialism is then attacked by describing climate deniers as shills, grifters, egomaniacs, or ideological fools, without providing any details of denialist claims or why those claims are wrong. Name-calling is apparently all the evidence the Guardian needs to prove deniers wrong. The statement that “the science is clear” is too vague to serve as an argument against climate denial; specific charges are necessary to show why the deniers are wrong.
  4. As for the reliance on the statement “the science is clear”, perhaps the Guardian can explain or contest some of the oddities in this clear and settled science. For example, climate science says that the global warming we see is a creation of the industrial economy, such that the amount of warming caused by the industrial economy must be measured from a reference pre-industrial temperature that existed prior to the industrial revolution. Yet the clear science is vague on what this reference temperature is and in which year it was measured. The question here is: “In what year did anthropogenic global warming driven by the industrial economy start?” The answers provided by climate science are as follows:
  5. Callendar 1938: It started in 1900 and warmed steadily from 1900 to 1938 with the warming driven by rising CO2 which in turn is attributable to fossil fuel emissions.
    Hansen 1988: It started in 1950 because in the 30-year period 1950-1980 there is a strong measurable warming rate with 99% probability for human cause.
    IPCC 2001: It started in 1750 when the Industrial Revolution kicked in and atmospheric CO2 began to rise.
    IPCC 2015: It started in 1850 by when sufficient fossil fuel carbon had entered the atmosphere for a measurable response of temperature to CO2.
    NASA 2020 : It started in 1950 because from then the relationship between CO2 and temperature we see in the climate models closely matches the observational data.
    Climate Scientist Peter Cox 2018: It started in the 1970s because it is since then that we see a measurable responsiveness of surface temperature to atmospheric CO2 concentration according to the theory of the greenhouse effect of CO2.
  6. What we find in this case is that although the science is said to be clear, it is a science in which the “pre-industrial” reference plays a crucial role, and yet, given the uncertainty above, it is not clear what that reference is.
  7. Yet another critical issue is the basis for planning climate action to attenuate future global warming so as to keep the temperature rise below the critical and dangerous level that may cause mass extinctions and the collapse of civilization. The important question in this regard is the critical amount of warming that can occur before the horrific impacts of climate change that must be avoided set in. Surprisingly, the answer to that question is not as clear as assumed.
  8. In the IPCC 2001 report we were told that fossil fuel emissions are causing atmospheric CO2 concentration to go up and that in turn is causing global mean temperature to go up; and that if we don’t take climate action to reduce and eliminate emissions, the temperature will continue to go up, and when it warms 5C above pre-industrial, warming will become irreversible and out of control, making it impossible for us to save the planet.
  9. In the IPCC 2007 report we were told that if we don’t take climate action to reduce and eliminate emissions, the temperature will continue to go up and when it warms 4C above pre-industrial, warming will become irreversible and out of control making it impossible for us to save the planet.
  10. In the IPCC 2013 report we were told that if we don’t take climate action to reduce and eliminate emissions, the temperature will continue to go up and when it warms 3C above pre-industrial, warming will become irreversible and out of control making it impossible for us to save the planet.
  11. In the IPCC 2015 report we were told that if we don’t take climate action to reduce and eliminate emissions, the temperature will continue to go up and when it warms 2C above pre-industrial, warming will become irreversible and out of control making it impossible for us to save the planet.
  12. In the IPCC SPECIAL REPORT OF 2018 we were told that if we don’t take climate action to reduce and eliminate emissions, the temperature will continue to go up and when it warms 1.5C above pre-industrial, warming will become irreversible and out of control making it impossible for us to save the planet.
  13. The large uncertainty in this crucial value, which forms the basis of the climate action being promoted by the Guardian, does not support the Guardian’s assumption that the science is clear and that we therefore know how climate change will evolve and what we must do in terms of climate action to control it.
  14. Yet another consideration in terms of the degree of clarity in the science that the Guardian says is clear, is the significant issue of climate change impacts in the form of extreme weather and forest fires. These impact claims serve the important function in climate science of providing the incentive for us to take climate action. The causal connection between anthropogenic global warming and extreme weather events is made post hoc (after the fact) using a procedure called “Event Attribution Science” that clearly suffers from an extreme form of confirmation bias that renders its findings something other than science. However, the greater issue with Event Attribution Science is that the study involves extremely short time spans of a year or less and an extremely localized portion of the globe that is usually less than 1% of the globe. The issue here is that anthropogenic global warming (AGW) is a theory about long term trends (longer than 30 years) in global mean temperature. Its interpretation in terms of such localized events over brief time periods is not possible because under these conditions the internal variability of climate, unrelated to AGW, dominates. As explained in a related post [LINK] , “Internal variability in the climate system confounds assessment of human-induced climate change and imposes irreducible limits on the accuracy of climate change projections, at regional and decadal scales”. The relevant implication is that extreme weather events cannot be understood in terms of AGW climate change because of the internal variability issue.
  15. Finally, we address the scientific method of inquiry that requires data analysis and hypothesis tests to be constructed from the point of view of a denier. In hypothesis tests, the null hypothesis is that of the denier. Unbiased and objective data analysis must show the denialism null hypothesis to be wrong with sufficient statistical power in order to accept the alternative hypothesis and thereby prove the denier wrong. This is how science works. In the absence of denialism and in the context of the 97% consensus valued by climate science, the critical and denialist approach in the scientific method becomes corrupted by confirmation bias.
  16. A specific example is that of the injection of relatively small quantities of fossil fuel emissions into nature’s carbon cycle, which involves a number of different CO2 flows that are an order of magnitude larger than fossil fuel emissions and carry very large uncertainties in their flow rates. The assumed effect of fossil fuel emissions on atmospheric composition can be ascertained only if uncertainties in carbon cycle flows are ignored; only then does the mass balance show that half of the fossil fuel emissions accumulate in the atmosphere. However, when uncertainties in carbon cycle flows are taken into account, no impact of fossil fuel emissions can be detected [LINK] . The critical evaluation of the data from a denier’s perspective, as required by the scientific method, is absent from the way climate science carries out this test, and it is only because of this absence that the procedure appears to support the climate science position that atmospheric composition is responsive to fossil fuel emissions. [LINK] [LINK] .
  17. These serious issues cannot be evaluated in terms of the Shellenberger book and cannot be appreciated from an activism point of view that sets out to prove the catastrophic global warming hypothesis as the null hypothesis. In its extreme eagerness to push the idea of climate action against fossil fuels, climate science has had to overlook or bypass the scientific method and to vilify the critical evaluation of climate science as an undesirable activity called climate denial.
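The mass balance argument in point 16 can be illustrated with a toy uncertainty calculation. The flow values and uncertainty fractions below are round illustrative assumptions, not measured carbon cycle figures; the sketch only shows the mechanism: when the natural flows carry percentage uncertainties, the combined uncertainty of the inferred net flux can exceed the fossil fuel signal.

```python
import math

# Illustrative round numbers (GtC/year) with assumed fractional
# uncertainties; these are NOT measured carbon cycle figures.
natural_flows = {
    "ocean outgassing":     (90.0, 0.10),   # (flow, fractional uncertainty)
    "ocean uptake":         (92.0, 0.10),
    "photosynthesis":      (120.0, 0.15),
    "respiration & decay": (118.0, 0.15),
}
fossil_fuel_emissions = 10.0  # GtC/year, order of magnitude only

# Combine the flow uncertainties in quadrature to estimate the
# uncertainty of the net natural flux inferred by mass balance.
net_flux_sigma = math.sqrt(
    sum((flow * frac) ** 2 for flow, frac in natural_flows.values())
)

print(f"uncertainty of inferred net natural flux: +/-{net_flux_sigma:.1f} GtC/yr")
print(f"fossil fuel emissions:                       {fossil_fuel_emissions:.1f} GtC/yr")
# With these assumed uncertainties the emissions signal is smaller
# than the flux uncertainty, so it cannot be resolved by mass balance.
print(f"signal resolvable: {fossil_fuel_emissions > net_flux_sigma}")
```

Whether the signal is resolvable depends entirely on the uncertainty fractions assumed; the related posts linked above present the author's version of this analysis.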


 

THIS POST IS A CRITICAL REVIEW OF A PHYS.ORG ARTICLE [LINK] ABOUT ARCTIC SEA ICE  MELT IN SEPTEMBER 2018  

 

 

PART-1: WHAT THE ARTICLE SAYS

A snapshot of melting Arctic sea ice during the summer of 2018 by Cell Press: Arctic sea ice concentration and thickness on September 23 2018. Credit: Juhi Yadav


  1. As sea ice in the Arctic retreats further and melts faster every decade, scientists are racing to understand the vulnerabilities of one of the world’s most remote and unforgiving places. A study appearing July 29 in the journal Heliyon details the changes that occurred in the Arctic in September of 2018, a year when nearly 10 million kilometers of sea ice were lost over the course of the summer  (do they mean 10 million square km?). Their findings give an overview at different timescales of how sea ice has receded over the 40 years of the satellite era and show how the summer’s extensive decline is linked to global atmospheric processes in the tropics.
  2. At the peak of its melting season, in July 2018, the Arctic was losing sea ice at a rate of 105,500 square kilometers per day—an area bigger than Iceland or the state of Kentucky. “On the ground, I am sure it would have looked like an excellent summer month in the Arctic, in general, but over the past four decades, September sea-ice loss has accelerated to a rate of 12.8% per decade and 82,300 square kilometers per year,” says co-author Avinash Kumar, a senior scientist at the National Centre for Polar and Ocean Research (NCPOR) in India.
  3. The researchers followed the warm water currents of the Atlantic north to the Arctic Ocean and tracked the ice as it subsequently retreated through the Chukchi, East Siberian, Laptev, Kara, and Barents seas. Thanks to higher temporal resolution and greater satellite coverage than had previously been available, they could also measure the ice’s decline through variables such as its thickness, concentration, and volume in addition to its extent throughout the Arctic. This dramatic loss of sea ice culminated at the end of the boreal summer, when in September, the ice had been reduced to a mere third of its winter extent.
  4. Then, the team compared the decline to the previous four decades of data. “In the summer of 2018, the loss of sea ice was three times higher than the reported loss at the beginning of the satellite era,” says Kumar. “Our study shows that both the minimum sea-ice extent and the warmest September records occurred in the last twelve years.”
  5. “Every year, news pops up of a new record high temperature or fastest loss of sea ice in the Arctic region, but in the global system, each portion of the planet receiving climate feedback will lead to changes in the other parts as well,” Kumar says. “If the sea-ice decline continues at this pace, it can have a catastrophic impact by raising air temperatures and slowing down global ocean circulation.” These global impacts are partly why he became interested in trying to decipher the mysteries of the polar regions as a doctoral student studying the coastal zone in India. Now, he works at NCPOR, whose scientific programs, he says, are “truly trans-hemispheric, cutting across from north to south.”
  6. The researchers also turned their attention to the atmosphere, where they were able to gain insight into the processes that contribute to the loss of Arctic sea ice. They found not only that September of 2018 was the third warmest on record, but that there was a temperature difference within the Arctic itself: the temperature of the air above the Arctic Ocean (~3.5°C) was slightly higher than that of the Arctic land (~2.8°C).
  7. Their findings provide further evidence that ocean warming around the globe has influenced the natural cycle of the wind and pressure patterns in the Arctic. El Niños, or warm phases in long-term temperature cycles stemming from tropical regions, have long been known to drive extreme weather events around the world and are occurring with greater frequency as the world warms. El Niño cycles in the equatorial Pacific Ocean can carry warm air and water from tropical circulations to the Arctic, spurring the sea ice to melt. As the ice retreats, it cascades the Arctic into a positive feedback loop known as Arctic amplification, whereby the reduced ice extent gives way to darker ocean waters that absorb more of the sun’s radiation. As it retains more heat, temperatures rise and more ice melts, causing the Arctic region to heat up faster—about four times so—than the rest of the world.
  8. “If the decline of sea ice continues to accelerate at a rate of 13% per decade in September, the Arctic is likely to be free of ice within the next three decades,” Kumar says. And just as sea-ice retreat is largely the result of anthropogenic pressures from across the globe, its impacts will be felt worldwide: this work adds to the mounting body of evidence that changes in the Arctic sea ice could be detrimental to weather patterns spanning the globe. He says, “The changes taking place in the Arctic can lead to other changes in lower latitudes, such as extreme weather conditions. The world should be watching tropical countries like India, with our research center saddled close to the beaches of Goa, and trying to understand—even in a small way—more about climate change and the polar regions.”

 

 

PART-2: CRITICAL COMMENTARY

  1. The article says that over the 40 years of satellite data for Arctic sea ice (extent and volume) and the 40 years of satellite data for the lower troposphere temperature above the Arctic Ocean, the temperature above the Arctic Ocean has been rising, and that this rise is attributed to anthropogenic global warming (AGW), which is thought to be a creation of the fossil fuel emissions of humans and which can therefore be moderated by reducing or eliminating the use of fossil fuels.
  2. At the same time, we find that over these 40 years, Arctic sea ice extent and volume have been declining.
  3. Based on this observed relationship between temperature and sea ice data, the article concludes that the evidence therefore shows that AGW warming has caused the decline in Arctic sea ice.
  4. This attribution is tested with detrended correlation analysis at an annual time scale in a related post on this site [LINK] . The need for detrended correlation arises because shared trends create spurious correlations and because the time scale for the proposed causation must be specified in the correlation test. Spurious correlations in time series data not corrected for shared trends are demonstrated by Tyler Vigen in his book and also on his website [LINK] [LINK]


5.  Tyler’s work implies that to establish a causal relationship between two time series, the time scale of the causation must be specified, the two time series must be detrended, and a statistically significant detrended correlation between the two series at the specified time scale must be shown.
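The procedure in point 5 can be sketched numerically. The series below are synthetic (not the PIOMAS or temperature data): two independently generated series that share only a deterministic time trend, which is enough to produce a strong raw correlation that disappears after detrending.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 41                    # 41 annual observations, as in 1979-2019
t = np.arange(n)

# Two series with independent noise but a shared time trend;
# neither one responds to the other at an annual time scale.
temperature = 0.05 * t + rng.normal(0.0, 0.3, n)   # rising trend
sea_ice     = -0.3 * t + rng.normal(0.0, 2.0, n)   # falling trend

def detrend(x):
    """Subtract the OLS linear trend, leaving the residuals."""
    return x - np.polyval(np.polyfit(t, x, 1), t)

raw_corr = np.corrcoef(temperature, sea_ice)[0, 1]
det_corr = np.corrcoef(detrend(temperature), detrend(sea_ice))[0, 1]

print(f"raw correlation:       {raw_corr:+.2f}")   # strongly negative
print(f"detrended correlation: {det_corr:+.2f}")   # near zero
```

The raw correlation here is a creation of the shared trend alone; the near-zero detrended correlation is what reveals that neither series responds to the other at the annual time scale.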

6. Detrended correlation analysis at an annual time scale between the AGW temperature above the Arctic Ocean and sea ice volume measured with the PIOMAS procedure is shown in a related post [LINK] . The charts below show the findings of this analysis. [Chart: Arctic sea ice volume by calendar month, 1979-2019]

[Chart: average rate of sea ice volume decline for each calendar month, 1979-2019]

[Chart: correlations (blue) and detrended correlations (red) between temperature and sea ice volume, by calendar month]

7. The first chart shows the Arctic sea ice volume for each year of the study period 1979-2019 and for each calendar month from January to September. The second chart shows the average rate of sea ice decline for each calendar month over the study period; it shows that the highest rate of decline occurs in July and the lowest in April.

8. The results of the correlation analysis are displayed in the third chart. Correlations are shown in blue and detrended correlations in red. The hypothesis is that rising temperatures cause declining sea ice volume, and that requires a statistically significant negative detrended correlation between temperature and sea ice volume. Here we see what appear to be strong correlations in the source data (blue line), from corr=-0.4 to corr=-0.6. However, the corresponding detrended correlations (red line) are close to zero and even positive. This means that the correlations seen in the blue line are spurious correlations created by shared trends and not indicative of responsiveness at an annual time scale.

9. We conclude from this analysis as follows: the data do show declining Arctic sea ice volume during a period of rising temperature, but without evidence for the assumed causation of sea ice decline by AGW, and therefore without support for the claim that these changes are driven by AGW and that they can be attenuated with climate action in the form of reducing or eliminating fossil fuel emissions.

10. A MORE DETAILED ANALYSIS IS PROVIDED IN A RELATED POST ON THIS SITE [LINK] . SIMILAR ANALYSIS FOR SEA ICE EXTENT AND AREA ARE PROVIDED IN YET ANOTHER RELATED POST ON THIS SITE [LINK] . No evidence is found that the observed decline in Arctic sea ice extent or volume is driven by the rising temperature  of anthropogenic global warming above the Arctic Ocean.


THIS POST IS A BIBLIOGRAPHY ON ARCTIC WILDFIRES OF JUNE, JULY, AND AUGUST WHEN THE SUN SHINES 24 HOURS A DAY ON THE ARCTIC. THE BIBLIOGRAPHY PROVIDES A CONTEXT FOR THE MEDIA REPORTS ABOUT THESE FIRES IN TERMS OF CLIMATE CHANGE AND THE PRESUMED NEED FOR CLIMATE ACTION IMPLIED BY ARCTIC TUNDRA FIRES. 

 

ARCTIC WILDFIRES: THE RELEVANT BIBLIOGRAPHY

IMAGE FROM HIGUERA ET AL 2008. ANCIENT TUNDRA FIRES

  1. Wein, Ross W. “Frequency and characteristics of arctic tundra fires.” Arctic (1976): 213-222.
  2. Jones, Benjamin M., et al. “Recent Arctic tundra fire initiates widespread thermokarst development.” Scientific reports 5 (2015): 15865. Fire-induced permafrost degradation is well documented in boreal forests, but the role of fires in initiating thermokarst development in Arctic tundra is less well understood. Here we show that Arctic tundra fires may induce widespread thaw subsidence of permafrost terrain in the first seven years following the disturbance. Quantitative analysis of airborne LiDAR data acquired two and seven years post-fire, detected permafrost thaw subsidence across 34% of the burned tundra area studied, compared to less than 1% in similar undisturbed, ice-rich tundra terrain units. The variability in thermokarst development appears to be influenced by the interaction of tundra fire burn severity and near-surface, ground-ice content. Subsidence was greatest in severely burned, ice-rich upland terrain (yedoma), accounting for ~50% of the detected subsidence, despite representing only 30% of the fire disturbed study area. Microtopography increased by 340% in this terrain unit as a result of ice wedge degradation. Increases in the frequency, magnitude and severity of tundra fires will contribute to future thermokarst development and associated landscape change in Arctic tundra regions. [FULL TEXT]
  3. Hu, Feng Sheng, et al. “Arctic tundra fires: natural variability and responses to climate change.” Frontiers in Ecology and the Environment 13.7 (2015): 369-377.  Anthropogenic climate change may result in novel disturbances to Arctic tundra ecosystems. Understanding the natural variability of tundra‐fire regimes and their linkages to climate is essential in evaluating whether tundra burning has increased in recent years. Historical observations and charcoal records from lake sediments reveal a wide range of fire regimes in Arctic tundra, with fire‐return intervals varying from decades to millennia. Analysis of historical data shows strong climate–fire relationships, with threshold effects of summer temperature and precipitation. Projections based on 21st‐century climate scenarios suggest that annual area burned will approximately double in Alaskan tundra by the end of the century. Fires can release ancient carbon from tundra ecosystems and catalyze other biogeochemical and biophysical changes, with local to global consequences. Given the increased likelihood of tundra burning in coming decades, land managers and policy makers need to consider the ecological and socioeconomic impacts of fire in the Far North.
  4. Higuera, Philip E., et al. “Frequent fires in ancient shrub tundra: implications of paleorecords for arctic environmental change.” PloS one 3.3 (2008): e0001744.  Understanding feedbacks between terrestrial and atmospheric systems is vital for predicting the consequences of global change, particularly in the rapidly changing Arctic. Fire is a key process in this context, but the consequences of altered fire regimes in tundra ecosystems are rarely considered, largely because tundra fires occur infrequently on the modern landscape. We present paleoecological data that indicate frequent tundra fires in northcentral Alaska between 14,000 and 10,000 years ago. Charcoal and pollen from lake sediments reveal that ancient birch-dominated shrub tundra burned as often as modern boreal forests in the region, every 144 years on average (+/− 90 s.d.; n = 44). Although paleoclimate interpretations and data from modern tundra fires suggest that increased burning was aided by low effective moisture, vegetation cover clearly played a critical role in facilitating the paleofires by creating an abundance of fine fuels. These records suggest that greater fire activity will likely accompany temperature-related increases in shrub-dominated tundra predicted for the 21st century and beyond. Increased tundra burning will have broad impacts on physical and biological systems as well as on land-atmosphere interactions in the Arctic, including the potential to release stored organic carbon to the atmosphere.
  5. Chen, Yaping, Mark Jason Lara, and Feng Sheng Hu. “A robust visible near-infrared index for fire severity mapping in Arctic tundra ecosystems.” ISPRS Journal of Photogrammetry and Remote Sensing 159 (2020): 101-113.  Tundra fires are projected to increase with anthropogenic climate change, yet our ability to assess key wildfire metrics such as fire severity remains limited. The Normalized Burn Ratio (NBR) is the most commonly applied index for fire severity mapping. However, the computation of NBR depends on short-wave infrared (SWIR) data, which are not commonly available from historical and contemporary high-resolution (≤4 m) optical imagery. The increasing availability of visible near-infrared (VNIR) measurements from proximal to spaceborne sensors/platforms has the potential to advance our understanding of the spatiotemporal patterns of fire severity within tundra fires. Here we systematically assess the feasibility of using VNIR data for fire severity mapping in ten Alaskan tundra fires (cumulatively burned ~1700 km2). We compared the accuracy of 10 published VNIR-based fire indices using both uni-temporal (post-fire image) and bi-temporal (pre-fire and post-fire image difference) assessments against ground-based fire severity data (Composite Burn Index, CBI) at 109 tundra sites. The Global Environmental Monitoring Index (GEMI) had the highest correspondence with CBI (R2 = 0.77 uni-temporal; R2 = 0.85 bi-temporal), with similar performance to NBR (R2 = 0.77 uni-temporal; R2 = 0.83 bi-temporal). Tundra vegetation types affected NBR but not GEMI, as SWIR reflectance was influenced to a greater extent in shrub than graminoid tundra. We applied GEMI to contemporary high-resolution (i.e. Quickbird 2) and historical meso-resolution imagery (i.e. Landsat Multispectral Scanner) to demonstrate the capability of GEMI for resolving fine-scale patterns of fire severity and extending fire severity archives. Results suggest that GEMI accurately captured the heterogeneous patterns of tundra fire severity across fire seasons, ecoregions, and vegetation types.
  6. Mack, Michelle C., et al. “Carbon loss from an unprecedented Arctic tundra wildfire.” Nature 475.7357 (2011): 489-492.  Arctic tundra soils store large amounts of carbon (C) in organic soil layers hundreds to thousands of years old that insulate, and in some cases maintain, permafrost soils1,2. Fire has been largely absent from most of this biome since the early Holocene epoch3, but its frequency and extent are increasing, probably in response to climate warming4. The effect of fires on the C balance of tundra landscapes, however, remains largely unknown. The Anaktuvuk River fire in 2007 burned 1,039 square kilometres of Alaska’s Arctic slope, making it the largest fire on record for the tundra biome and doubling the cumulative area burned since 1950 (ref. 5). Here we report that tundra ecosystems lost 2,016 ± 435 g C m−2 in the fire, an amount two orders of magnitude larger than annual net C exchange in undisturbed tundra6. Sixty per cent of this C loss was from soil organic matter, and radiocarbon dating of residual soil layers revealed that the maximum age of soil C lost was 50 years. Scaled to the entire burned area, the fire released approximately 2.1 teragrams of C to the atmosphere (=0.0021 gigatonnes), an amount similar in magnitude to the annual net C sink for the entire Arctic tundra biome averaged over the last quarter of the twentieth century7. The magnitude of ecosystem C lost by fire, relative to both ecosystem and biome-scale fluxes, demonstrates that a climate-driven increase in tundra fire disturbance may represent a positive feedback, potentially offsetting Arctic greening8 and influencing the net C balance of the tundra biome.
  7. Rocha, Adrian V., et al. “The footprint of Alaskan tundra fires during the past half-century: implications for surface properties and radiative forcing.” Environmental Research Letters 7.4 (2012): 044039.  Recent large and frequent fires above the Alaskan arctic circle have forced a reassessment of the ecological and climatological importance of fire in arctic tundra ecosystems. Here we provide a general overview of the occurrence, distribution, and ecological and climate implications of Alaskan tundra fires over the past half-century using spatially explicit climate, fire, vegetation and remote sensing datasets for Alaska. Our analyses highlight the importance of vegetation biomass and environmental conditions in regulating tundra burning, and demonstrate that most tundra ecosystems are susceptible to burn, providing the environmental conditions are right. Over the past two decades, fire perimeters above the arctic circle have increased in size and importance, especially on the North Slope, indicating that future wildfire projections should account for fire regime changes in these regions. Remote sensing data and a literature review of thaw depths indicate that tundra fires have both positive and negative implications for climatic feedbacks including a decadal increase in albedo radiative forcing immediately after a fire, a stimulation of surface greenness and a persistent long-term (>10 year) increase in thaw depth. In order to address the future impact of tundra fires on climate, a better understanding of the control of tundra fire occurrence as well as the long-term impacts on ecosystem carbon cycling will be required.
  8. Jones, Benjamin M., et al. “Fire behavior, weather, and burn severity of the 2007 Anaktuvuk River tundra fire, North Slope, Alaska.” Arctic, Antarctic, and Alpine Research 41.3 (2009): 309-316.  In 2007, the Anaktuvuk River Fire (ARF) became the largest recorded tundra fire on the North Slope of Alaska. The ARF burned for nearly three months, consuming more than 100,000 ha. At its peak in early September, the ARF burned at a rate of 7000 ha d−1. The conditions potentially responsible for this large tundra fire include modeled record high summer temperature and record low summer precipitation, a late-season high-pressure system located over the Beaufort Sea, extremely dry soil conditions throughout the summer, and sustained southerly winds during the period of vegetation senescence. Burn severity mapping revealed that more than 80% of the ARF burned at moderate to extreme severity, while the nearby Kuparuk River Fire remained small and burned at predominantly (80%) low severity. While this study provides information that may aid in the prediction of future large tundra fires in northern Alaska, the fact that three other tundra fires that occurred in 2007 combined to burn less than 1000 ha suggests site specific complexities associated with tundra fires on the North Slope, which may hamper the development of tundra fire forecasting models.  [FULL TEXT]

 

 

SUMMARY AND CONCLUSION

 

What we find in the literature for high latitude tundra fires is that the long history of these fires does not imply that they are a creation of anthropogenic global warming and climate change (AGW). Significant works in the literature did investigate a possible link between AGW and the severity and extent of these fires but without firm conclusions. In that regard the only attribution we find is that in the long term, perhaps a hundred years from now, if AGW continues to intensify, tundra fires may become more severe with a possibility of feedbacks from soil carbon. More importantly, the literature does not support claims in the media of the oddity of high Arctic tundra fires in 2020 and of their attribution to AGW. It is also noted that the polar region, where hours of sunshine vary from close to zero in winter to almost 24 hours in summer, undergoes an extreme seasonal temperature range of more than 30C, compared with 8C in the tropics and 14C in the temperate zone. Therefore the Arctic should be understood not just as a very cold and icy place but also in terms of its extreme seasons. An additional consideration is the relatively high level of geological activity in that region, as described in a related post [LINK] , which includes, for example, a “river of molten iron” under Russia [LINK] [LINK] .


 

THIS POST IS A CRITICAL COMMENTARY ON THE GUARDIAN REPORT [LINK] THAT FORMER UN BUREAUCRAT BAN KI-MOON IS “BEWILDERED THAT PRESIDENT TRUMP WOULD IMPERIL AMERICA BY ABANDONING THE PARIS AGREEMENT.”

 

 

THE TEXT OF THE GUARDIAN ARTICLE WITH CRITICAL COMMENTARY INSERTED

  1. CLAIM: The Paris agreement to tackle climate change is an extraordinary opportunity. In a remarkable display of unity, almost every nation on Earth has agreed to make critical changes that will help humanity avoid disaster. By aiming to limit global warming to 1.5C, it represents the world’s best chance of adapting to a crisis that threatens our planet’s very existence. But Donald Trump is walking away. RESPONSE:  What is being referred to as an agreement to tackle climate change is actually a collection of Intended Nationally Determined Contributions (INDCs) that do not agree with one another and that are not derived from the global emission reduction needed to meet a given warming target. It should also be mentioned that an international agreement for emission reduction to limit warming is not an adaptation strategy but a mitigation strategy. If the Guardian favors adaptation over mitigation, it has more in common with the Trump administration than it realizes; the climate science position is that adaptation later will be more costly than mitigation now. It should also be mentioned that it is not possible for the climate to destroy the planet. Details here: [LINK] .
  2. CLAIM:  This decision is politically shortsighted, scientifically wrong and morally irresponsible. By leaving the Paris agreement, he is undermining America’s future.  RESPONSE:  To claim that global warming can be attenuated by reducing fossil fuel emissions, that this idea is scientifically correct, and that to ignore it is scientifically wrong, it must be shown with scientifically and statistically sound empirical evidence that atmospheric composition is responsive to the rate of fossil fuel emissions. No such evidence exists. What climate science assumes to be empirical evidence for this relationship is scientifically wrong and a creation of circular reasoning, as explained in these related posts [LINK] [LINK] [LINK] [LINK] .
  3. CLAIM:  Every single day, we see the effects of climate change across the US. From catastrophic forest fires in California to rising sea levels in Miami and devastating flooding in Texas, these changes are a real and present danger. Our climate is visibly changing and the consequences will be disastrous for everyone.  RESPONSE:  The attribution of extreme weather events to AGW climate change involves the use of event attribution analysis and the interpretation of localized short term climate data in terms of global warming. As described in a related post, recent research has found that such attribution is not possible because of what is termed “internal climate variability” [LINK] . Anthropogenic global warming is a theory about long term (more than 30 years) trends in global mean temperature. Its interpretation in terms of short term climate events (less than 30 years) and localized climate events that span less than the whole globe or a significant latitudinal portion of it, is not possible. As explained in these papers, under such limiting conditions, “Internal variability in the climate system confounds assessment of human-induced climate change and imposes irreducible limits on the accuracy of climate change projections, especially at regional and decadal scales. A new collection of initial-condition large ensembles generated with seven Earth system models under historical and future radiative forcing scenarios provides new insights into uncertainties due to internal variability versus model differences”. Therefore, the attribution of forest fires in California and other localized climate events in the USA to anthropogenic global warming, as assumed by the Guardian, is not possible.
  4. CLAIM:  Despite this, the president is closing his eyes to reality. He is turning away from the only opportunity to save humanity from the effects of rising temperatures. Far from making America great again, his decision leaves it isolated – as everyone else comes together to face this great challenge.  RESPONSE:  Being “isolated” does not mean being wrong. It is possible to be isolated with a correct decision when everyone else is wrong. That he is isolated proves only that he is unpopular and not that therefore he is wrong. Climate science cannot be decided by polls or by a popularity contest. If it is a science, it must be evaluated in terms of the scientific method whether or not the results are popular.
  5. CLAIM:  President Trump’s stance is all the more bewildering because climate change does not respect borders. This crisis will not bypass America because he chooses to ignore it. Fires will burn just as wildly and rising seas continue to threaten coastal cities. No country is an island and America cannot pull up the drawbridge to escape a crisis enveloping the whole world.  RESPONSE:  As explained in a related post on the Paris Agreement [LINK] , the so-called “Agreement” turned out to be nothing but a collection of INDCs that do not actually agree on, or add up to, any global emission reduction target. There is no greater evidence for its failure than the language of the UN bureaucrats that created it. What we see is that the UN is no longer counting on a coordinated global effort for a global emission reduction target. Instead it is goading individual nations to have something called “climate ambition” and thereby to create something called “climate momentum” that would somehow do the job of moderating the rate of warming. Yet, as we point out in the related post on this issue [LINK] , national emission reduction policies contain a fatal economics flaw. Climate action by an individual nation state will not lead to global emission reduction because its climate action plan will increase the economic cost of production and make the climate action nation less competitive in international trade and hand over a cost advantage to nations that do not have a national climate action plan. The cost advantage of non-climate-action takers will cause their production and exports to rise by virtue of demand from climate action taking nations. The net result will be that economic activity and fossil fuel emissions will decline in climate action taking nations but with a corresponding rise in economic activity and fossil fuel emissions in non-climate-action taking nations. It is not likely that in the net there will be global emission reduction.
The Trump administration’s decision must be analyzed in this context and not in the context of a global emission reduction plan because there is no such plan. The Paris Agreement is not a global emission reduction plan but a collection of national intentions independently composed and not coordinated. 
  6. CLAIM:  Walking away will do nothing to stop the consequences of climate change arriving on America’s doorstep. According to the World Bank, the effects of rising temperatures could force 1.4 million people to abandon their homes in Mexico and Central America, where one-third of all jobs remain linked to agriculture. Many of these climate refugees will head to the US.  RESPONSE:  The reference to the mass migration of Central Americans to the USA in 2019 is based on the claim in climate science that these people are climate migrants or climate refugees, the migration being the result of a climate impact in Central America that created cycles of extreme rainfall and drought at 2-year or 3-year time scales. However, as described in a related post [LINK] , it is not possible to attribute these short term climate events in a highly localized region to global warming because of the limitations posed by internal climate variability [LINK] . This attribution is one of convenience and a grossly unscientific, illogical, racist, and cruel attempt to use the plight of poor people to sell the climate agenda.
  7. CLAIM:  Tackling climate change is an international problem that needs an international solution. The Paris agreement is the result of decades of careful work and a solution that will benefit everyone – including America – long-term. We need a low-carbon strategy for everything from food and water systems to transport plans and we must design climate resilience into our infrastructure. By investing in climate-adaptation strategies now, we can protect against the worst impacts of the risks and dangers that lie ahead.  RESPONSE:  Once again, we find the Guardian confused about the terms “adaptation” and “mitigation”. Mitigation means to take climate action and thereby to control the amount of warming whereas adaptation means to allow the warming to occur but to make changes that will make it easier for us to live with climate change. We strongly agree that if climate change has a solution in terms of reducing and eliminating the use of fossil fuels, it has to be a global effort. However, as noted above, the so-called “Paris Agreement” is not an international solution. It is a flawed and failed effort by the UN that the UN itself has tacitly admitted in its new strategy to promote national level climate action “ambition”. The greater issue here is not whether climate change is an international problem. The issue is whether climate action will change the rate of rise in atmospheric CO2 concentration. No evidence for that assumed relationship has been provided by climate science probably because no such evidence exists [LINK] .
  8. CLAIM:  A Global Commission on Adaptation report found that investing $1.8tn globally in adaptation by 2030 could yield $7.1tn in net benefits. Planning now and prospering, rather than delaying and paying for the consequences later, will sort the winners from the losers in this crisis response.  RESPONSE:  Once again we find that the Guardian appears to be confused about the difference between adaptation and mitigation, oddly promoting adaptation while at the same time championing mitigation by way of the Paris Agreement.
  9. CLAIM: There is a brutal irony in that the world at large is finally waking up to the climate crisis as President Trump ignores the science. The EU is creating a Green Deal for a more sustainable economy and China is greening its infrastructure spending as leaders across the globe realise that we are running out of options. Without the Paris agreement, America will start sliding backwards just as everyone else accelerates. RESPONSE:  The other irony here is the admission by a leading climate alarmist publishing house that “the world at large is finally waking up to the climate crisis” in the year 2020, 32 years after the alarm was raised in the Congressional Testimony of James Hansen in 1988 [LINK] . This kind of delayed response does not speak to the severity or the credibility of the alarm.
  10. CLAIM: History does not look kindly on leaders who do not lead when disaster threatens. There is a moral bankruptcy in looking away in a time of crisis, which resonates down the decades. This is all the more poignant as, across America, we can see many local efforts to try to plug the gap in the country’s climate strategy. Many Americans understand what their leader does not: we are running out of time to try to stem disaster, and their very lives may be under threat.  RESPONSE:   History would surely look less kindly on leaders that depend on alarmist publications like the Guardian in making significant decisions of this sort.
  11. CLAIM: Politicians from across the US political divide can also see what is coming – and what is necessary to avert disaster. In Boston, city leaders have launched Climate Ready Boston to help create a more resilient future by redesigning buildings and waterfront parks, and elevating pathways. In Miami, the Miami Forever Bond includes nearly $200m for climate-change adaptation, countering sea-level rise through measures such as planting mangroves along the waterfront and raising sea walls.  RESPONSE:  Yet again we find the Guardian unable to distinguish between the mitigation and adaptation options for climate action while at the same time assuming the role of climate advisor to the President of the United States of America. In any case, that there is an opposition to the President’s climate policy implies only that America is a democracy where alternate viewpoints are the norm and not some kind of dictatorship where no opposition is tolerated.
  12. CLAIM:  Politicians from across the US political divide can also see what is coming – and what is necessary to avert disaster – from Republicans such as Miami’s mayor, Francis Suarez, to the Democrats, who have presented a Green New Deal. But this international crisis cannot be solved by local action, important though that is. We need the US to show leadership and place the whole might of US innovation and expertise behind this most important of endeavors.  RESPONSE:  Once again, that there are different views in the USA political spectrum implies only that the USA is a well functioning democracy and not that therefore the President has made a mistake.
  13. CLAIM:  President Trump has made a grave mistake in withdrawing from the Paris agreement at this critical juncture. His actions lessen America, a country that has always taken pride in doing the right thing, at the right time, and seized opportunities for technological and economic transformation. But it is not yet too late to find a way back and this is one error that can be undone. We can only hope that America recognizes this before it is too late.  RESPONSE: As described in detail in previous responses, the Paris Agreement is a confused, conflicted, and failed effort by the UN to put together a global agreement for global emission reduction targets. More specifically, it is NOT that global agreement. As admitted by the UN bureaucrats themselves, the Paris agreement implies that emission reduction must be achieved by the “ambition” and “momentum” of individual nations and that therefore they have failed to repeat their Montreal Protocol success in the area of climate action.
  14. CLAIM:  Ban Ki-moon was the eighth secretary general of the United Nations and is chair of the Global Center on Adaptation. We’ve never had a better chance … to make a greener world. Covid-19 has delivered unusual environmental benefits: cleaner air, lower carbon emissions, a respite for wildlife. Now the big question is whether we can capitalise on this moment. The Guardian aims to lead the debate from the front.  RESPONSE:  Once again, we find that the climate alarmist from Britain, pretending to be a credible climate adviser to the President of the United States, is apparently unaware that adaptation does not mean taking climate action to slow down the rate of warming. It means taking action to adapt to the observed rate of warming. Climate action to slow down the rate of warming is mitigation, not adaptation.
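The internal climate variability argument made in the responses above can be illustrated with a toy simulation. This is not a climate model: the trend and noise values below are illustrative assumptions, chosen only to show why trend estimates from short, localized records scatter widely while estimates from long records converge on the underlying forced trend.

```python
# Toy illustration (assumed numbers, not measured values): a small forced
# trend plus random "internal variability" produces wildly different
# 10-year trend estimates across ensemble members, while 50-year trend
# estimates cluster near the true value.
import random

random.seed(42)
TRUE_TREND = 0.02   # assumed forced warming, deg C per year
NOISE_SD   = 0.15   # assumed internal variability, deg C

def simulate(years):
    """One ensemble member: forced trend plus random internal variability."""
    return [TRUE_TREND * t + random.gauss(0.0, NOISE_SD) for t in range(years)]

def trend(series):
    """Ordinary least-squares slope of the series against time."""
    n = len(series)
    t_mean = (n - 1) / 2.0
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

for window in (10, 50):
    slopes = [trend(simulate(window)) for _ in range(200)]
    print(f"{window}-yr windows: trend estimates range "
          f"{min(slopes):+.3f} to {max(slopes):+.3f} C/yr")
```

Running this shows 10-year slopes scattered across positive and negative values while 50-year slopes stay close to the assumed 0.02 C/yr, which is the sense in which short-window, local records are said to be dominated by internal variability.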

 

 

THE NYT TIMES DESCRIBES THE CREATION OF CLIMATE MIGRANTS

  1. CLAIM:  Early in 2019, a year before the world shut its borders completely, Jorge A. knew he had to get out of Guatemala. The land was turning against him. For five years, it almost never rained. Then it did rain, and rained and rained, and Jorge rushed his last seeds into the ground. The corn sprouted into healthy green stalks, and there was hope — until, without warning, the river flooded his fields. Jorge waded chest-deep into his fields searching in vain for cobs he could still eat. Soon he made a last desperate bet, signing away the tin-roof hut where he lived with his wife and three children against a $1,500 advance in okra seed. But after the flood, the rain stopped again, and everything died. Jorge knew then that if he didn’t get out of Guatemala, his family might die, too. So Jorge, along with hundreds of thousands of Guatemalans, fled north for salvation in the United States. The odd weather phenomena that drove them to the USA, the drought and the El Niño, have been made more frequent by human caused global warming (AGW) that is turning semiarid parts of Guatemala into a desert.
  2. RESPONSE PART-1: ANTHROPOGENIC GLOBAL WARMING AND CLIMATE CHANGE (AGW) is a theory about long term warming trends in global mean temperature. Therefore only those climate events that relate to long term trends in global mean temperature can be related to AGW in terms of causation and mitigation. This analysis may also apply to significant latitudinal sections of the globe but it cannot be understood in terms of localized or short term climate events that tend to be driven mostly by what has been termed “internal climate variability”. Details of the internal climate variability issue may be found in a related post [LINK] .
  3. RESPONSE PART-2: On the basis of the internal climate variability issue, it is not possible to understand the cycle of extreme rainfall and extreme dry weather over a period of a few years (less than 30 years) in a highly localized region, described either as Guatemala or Central America, in terms of AGW. This means that AGW cannot be proposed as the cause of this short term climate variability and it cannot be proposed that such climate events can be mitigated by taking climate action in the form of reducing or eliminating the use of fossil fuels.
  4. RESPONSE PART-3: In summary, the short term localized climate events that forced these people to migrate do not imply that the migration was caused by AGW or that such events can be mitigated by taking climate action. Therefore, these migrants cannot be described as climate change migrants or as climate change refugees, however tragic their situation may have been.

 

THIS POST IS A CRITICAL REVIEW OF A PHYS.ORG ARTICLE  [LINK]  ON HIGH TEMPERATURES RECORDED IN SVALBARD IN JULY 2020  

 

 

 

PART-1 WHAT THE ARTICLE SAYS

 

Highest-ever temperature recorded in the Svalbard archipelago

  1. The Svalbard Archipelago is known for its polar bears, which a recent study predicts could all but disappear within the span of a human lifetime due to Climate Change.
  2. The Svalbard archipelago on Saturday recorded its highest-ever temperature, the country’s meteorological institute reported.
  3. According to a scientific study, global warming in the Arctic is happening twice as fast as for the rest of the planet.
  4. For the second day in a row, the Svalbard archipelago registered 21.2C in the afternoon, just under the 21.3C recorded in 1979. Later in the afternoon however, it recorded 21.7C at 6pm, a new all-time record.
  5. The island group, dominated by Spitzbergen, the only inhabited isle in the Svalbard Archipelago, sits 1,000 kilometres from the North Pole. The relative heatwave, expected to last until Monday, is a huge spike above normal temperatures for July, the hottest month in the Arctic.
  6. The Svalbard islands would normally expect temperatures of 5C to 8C at this time of year. The region has seen temperatures 5C above normal since January, peaking at 38C in Siberia in mid-July, just beyond the Arctic Circle.
  7. According to the report “The Svalbard climate in 2100,” the average temperatures for the archipelago between 2070 and 2100 will rise by 7C-10C, due to projected levels of greenhouse gas emissions. Changes are already visible. From 1971 to 2017, 3C to 5C of warming have been observed, with the biggest rise in the winter, according to the report.
  8. Svalbard, known for its polar bear population, houses both a coal mine, extracting the most global-warming-intensive of all energy sources, and a “doomsday” seed vault which has since 2008 collected stocks of the world’s agricultural bounty in case of global catastrophe. The vault required 20 million euros ($23.3 million) worth of work after the infiltration of water due to thawing permafrost in 2016.

 

 

 

PART-2: CRITICAL COMMENTARY

  1. Anthropogenic global warming and climate change (AGW) is a theory about very long term warming trends, over 30 years or more, in global mean temperature. Its extension to specific regions of the globe is possible to a limited extent such that the climatology of large latitudinal sections of the globe can be interpreted in terms of AGW. For example, the Arctic latitudinal section of the globe, defined as above 60 degrees North latitude, can be studied in terms of the impacts of AGW for phenomena that span more than 30 years.
  2. Recent studies have found that localized weather events in more limited geographical regions and over shorter time spans cannot be understood in terms of AGW because of what has been termed Internal Climate Variability. Details of this issue can be found in a related post [LINK] .
  3. The findings of the study are that “Internal variability in the climate system confounds assessment of human-induced climate change and imposes irreducible limits on the accuracy of climate change projections, especially for regional climate or over limited time scales less than 30 years”, thereby making it impossible to relate the climatology of limited geographical regions, or of limited time spans, to AGW.
  4. Here we find that, in the context of internal climate variability, the high temperatures observed in the Svalbard Archipelago, specifically on the island of Spitzbergen, in July of 2020 are too localized to be generalized to the Arctic region as a whole. As well, the time spans of these data are too short to be interpreted in terms of AGW.
  5. Therefore these data presented in the Phys.Org article cannot be interpreted in terms of AGW and therefore they do not imply that these temperature events are the creation of fossil fuel emissions or that they can be moderated by taking climate action in the form of reducing or eliminating the use of fossil fuels.
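A further statistical point about “highest-ever” readings can be sketched numerically. Even in a perfectly stationary series with no trend at all, new records keep occurring: for n independent years the expected number of record highs is the harmonic number H(n), roughly ln(n). The simulation below uses made-up standard-normal data purely to illustrate that arithmetic; it says nothing about the actual Svalbard record.

```python
# Toy illustration: count record highs in stationary (trend-free) data
# and compare with the theoretical expectation H_n = 1 + 1/2 + ... + 1/n.
import random

random.seed(1)

def count_records(series):
    """Number of values that exceed every value before them."""
    best, records = float("-inf"), 0
    for x in series:
        if x > best:
            best, records = x, records + 1
    return records

n_years, trials = 60, 2000
mean_records = sum(
    count_records([random.gauss(0, 1) for _ in range(n_years)])
    for _ in range(trials)
) / trials
harmonic = sum(1.0 / k for k in range(1, n_years + 1))
print(f"mean records in {n_years} stationary years: {mean_records:.2f}")
print(f"theoretical expectation H_{n_years}: {harmonic:.2f}")
```

For 60 years of data this expectation is about 4.7 records, so a record by itself does not distinguish a trending climate from a stationary one; it is the long term trend that carries the information.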

 

POSTSCRIPT

With apologies for the random insertion of weird fonts by WordPress, a new feature of WordPress over which I have no control.

THIS POST IS A CRITICAL REVIEW OF A NATIONAL GEOGRAPHIC MAGAZINE ARTICLE ON FOSSIL FUELS PUBLISHED IN APRIL 2019 [LINK]  

 

 

CRITICAL EVALUATION OF WHAT THE NATGEO ARTICLE SAYS

  1. CLAIM: WHAT ARE FOSSIL FUELS? Decomposing plants and other organisms, buried beneath layers of sediment and rock, have taken millennia to become the carbon-rich deposits we now call fossil fuels.  RESPONSE: The word millennia sounds a lot like millions of years but it means thousands of years. Organic matter does not turn into fossil fuels in thousands of years. It takes millions of years.
  2. CLAIM: These non-renewable fuels, which include coal, oil, and natural gas, supply about 80 percent of the world’s energy. They provide electricity, heat, and transportation, while also feeding the processes that make a huge range of products, from steel to plastics.  RESPONSE: It is a popular misinformation fed by climate science that fossil fuel is non-renewable such that it can be made only once and, once made, can only be depleted. This was the failed model that led to the now discredited assessment by the Club of Rome back in the 1970s of its imminent depletion in terms of peak oil and end oil forecasts. These forecasts have been proven false but the “non-renewable” idea lingers even in esteemed magazines like the NatGeo. The process that turns organic matter into fossil fuel is a continuous feature of nature in which large stores of organic matter are continually being recycled back to the surface as fossil fuels. The more important issue is the assumption, unsupported by evidence, that what we call fossil fuels are not geological carbon. In terms of geological carbon one should note that the middle of the planet contains a relatively infinite source of carbon. 99.8% of the planet’s carbon is in the mantle and core, and what we are dealing with up here on the surface, fossil fuels, carbon life forms and all, represents about 0.2% of the planet’s carbon. It is not likely that our use of carbon fuels stored under the ground will run out any time soon as they are replenished by geological carbon flows.
  3. CLAIM:  When fossil fuels are burned, they release carbon dioxide and other greenhouse gases, which in turn trap heat in our atmosphere, making them the primary contributors to global warming and climate change.  RESPONSE: What climate science actually says is that the burning of fossil fuels changes atmospheric composition by increasing its carbon dioxide content. In that context, if the global mean temperature of earth is understood as a logarithmic function of atmospheric CO2 concentration, then as the CO2 concentration of the atmosphere goes up, global mean surface temperature will respond by also going up. In other words, rising atmospheric CO2 concentration causes warming – and if the rise in atmospheric CO2 is attributable to fossil fuel emissions of humans, then the warming can be described as human caused or “anthropogenic”. However, the crucial step in this logic, that the observed rise in atmospheric CO2 concentration is attributable to fossil fuel emissions and that it can be attenuated by reducing or eliminating the use of fossil fuels, is an assumption without empirical evidence as shown in these related posts: [LINK] [LINK] [LINK] .
  4. CLAIM: FOSSIL FUEL TYPES: There are several main groups of fossil fuels, including: (1) Coal: coal supplies a third of all energy worldwide, with the top coal consumers and producers in 2018 being China, India, and the United States. Coal is classified into four categories—anthracite, bituminous, sub-bituminous, and lignite–depending on its carbon content. (2)  Oil: Crude oil, a liquid composed mainly of carbon and hydrogen, is often black, but exists in a variety of colors and viscosities depending on its chemical composition. Much of it formed during the Mesozoic period, between 252 and 66 million years ago, as plankton, algae, and other matter sank to the bottom of ancient seas and was eventually buried. Extracted from onshore and offshore wells, crude oil is refined into a variety of petroleum products, including gasoline, diesel, and heating oil. The top oil-producing countries are the U.S., Saudi Arabia, and Russia, which together account for nearly 40 percent of the world’s supply. Petroleum use accounts for nearly half the carbon emissions in the U.S. and about a third of the global total. In addition to the air pollution released when oil is burned, drilling and transport have led to several major accidents, such as the Exxon Valdez spill in 1989, the Deepwater Horizon disaster in 2010, the devastating Lac Megantic oil train derailment in 2013, and thousands of pipeline incidents. Nonetheless, oil demand continues to rise, driven not only by our thirst for mobility, but for the many products—including plastics—made using petrochemicals, which are generally derived from oil and gas.  (3) Natural gas: An odorless gas composed primarily of methane, natural gas often lies in deposits that, like those for coal and oil, formed millions of years ago from decaying plant matter and organisms. Both natural gas and oil production have surged in the U.S. over the past two decades because of advances in the drilling technique most people know as fracking. 
By combining fracking—or hydraulic fracturing—with horizontal drilling and other innovations, the fossil-fuel industry has managed to extract resources that were previously too costly to reach. As a result, natural gas has surpassed coal to become the top fuel for U.S. electricity production, and the U.S. leads the world in natural gas production, followed by Russia and Iran. Natural gas is cleaner than coal and oil in terms of emissions, but nonetheless accounts for a fifth of the world’s total emissions, not counting the so-called fugitive emissions that escape from the industry, which can be significant. Not all of the world’s natural gas sources are being actively mined. Undersea methane hydrates, for example, where gas is trapped in frozen water, are being eyed as a potential gas resource.  RESPONSE: Good summary of fossil fuels although the assumption about their creation from biological decay (i.e. fossils) is unproven. There is no way to tell fossil fuel carbon apart from geological carbon.
  5. CLAIM:  Reducing emissions from fossil fuels:  Governments around the world are now engaged in efforts to ramp down greenhouse gas emissions from fossil fuels to prevent the worst effects of climate change. At the international level, countries have committed to emissions reduction targets as part of the 2015 Paris Agreement, while other entities—including cities, states, and businesses—have made their own commitments. These efforts generally focus on replacing fossil fuels with renewable energy sources, increasing energy efficiency, and electrifying sectors such as transportation and buildings. However, many sources of carbon emissions, such as existing power plants that run on natural gas and coal, are already locked in. Considering the world’s continuing dependence on fossil fuels, many argue that in addition to efforts aimed at replacing them, we also need to suck carbon from the air with technologies such as carbon capture, in which emissions are diverted to underground storage or recycled before they reach the atmosphere. A handful of commercial-scale projects around the world already capture carbon dioxide from the smokestacks of fossil fuel-fired plants, and while its high costs have prevented wider adoption, advocates hope advances in the technology will eventually make it more affordable.  RESPONSE: The so called Paris Agreement of 2015 was an agreement to submit intended nationally determined contributions (INDC) that were independently composed by the participating nations to express their intention and to independently determine what their contribution might be with no commitment to actually deliver on that “intention”. The logical fallacy of treating the collection of INDCs that don’t agree as an agreement is a puzzling contradiction as described in a related post [LINK] . 
What is called the Paris Agreement is more accurately understood as a desperate improvisation by the United Nations, which is credited with the success of the Montreal Protocol in solving the ozone depletion problem and was expected to repeat that success with the Kyoto Protocol to solve the climate change problem. The UN bureaucrats could not fathom the enormous difference between changing refrigerants and overhauling the world’s energy infrastructure. After 20 attempts in Conferences of Parties (COP) to repeat the Montreal Protocol success, they simply gave up, and at the 21st meeting they were desperate to produce something they could claim to be a repeat of the Montreal Protocol. So they asked the delegates to sign whatever they wanted to sign. This mishmash collection of INDCs is called an Agreement but its treatment as such leads to gross conceptual errors as seen in a related post here [LINK] . The truth about the Paris Agreement is that it was in Paris in 2015 at the COP21 that the UN bureaucrats gave up on the idea of a global agreement for global emission reduction targets and changed their language to words such as “ambition” and “momentum” to goad individual nation states to pursue climate action, not as a global agreement, but as national goals [LINK] . And so it was in Paris where the idea of a global agreement on global emission reduction was finally written off as a failed effort. Paris is where Montreal died.
  6. POSTSCRIPT: Even as nation states are being goaded to pursue their own national emission reduction plans based on “ambition” and “momentum” calls by UN cheerleaders, economists have pointed out a fatal flaw in this emission reduction effort as described in a related post [LINK] . The heroic climate action nations being cheered on by UN bureaucrats are up against an economics trap created by the non-global nature of the emission reduction program because although the world of humans is separated into nation states, the nation states are connected by economics. This connection is vast and complex and involves cross border investments, stocks, bonds, monetary policy, technology, intellectual property rights, and so on, but most importantly in this respect, the nations of the world are connected by trade. International trade is so important that even though we think of our civilization in terms of the nation states, we are really one huge global economy tied together by trade. Because nation states are independent nations in some respects but global in terms of trade, a climate action decision by an individual nation state will not lead to global emission reduction. This is because any national climate action plan by a single nation state will increase the economic cost of production and make that nation state less competitive in international trade and hand over a cost advantage to nations that do not have a national climate action plan. The cost advantage of non-climate-action takers will cause their production and exports to rise by virtue of demand from climate action taking nations. The net result will be that economic activity (and fossil fuel emissions) will decline in climate action taking nations but with a corresponding rise in economic activity (and fossil fuel emissions) in non-climate-action taking nations. In the net there may be no emission reduction.
This is the Catch-22 of national level emission reduction plans being cheered on by UN bureaucrats with buzzwords like ambition and momentum.
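The logarithmic CO2–temperature relation discussed in the response to claim 3 above can be written as dT = S · log2(C/C0). The sketch below uses an assumed equilibrium climate sensitivity S of 3 C per doubling and an assumed pre-industrial baseline C0 of 280 ppm; both are illustrative placeholders (the actual sensitivity is a contested quantity), and the output is arithmetic, not a climate-model result.

```python
# Sketch of the logarithmic CO2-temperature relation:
#   dT = S * log2(C / C0)
# S and C0 below are assumed values for illustration only.
import math

S = 3.0     # assumed sensitivity, deg C per doubling of CO2
C0 = 280.0  # assumed pre-industrial CO2 concentration, ppm

def warming(c_ppm):
    """Equilibrium warming relative to the C0 baseline, in deg C."""
    return S * math.log2(c_ppm / C0)

for c in (280, 414, 560):
    print(f"{c:4d} ppm -> {warming(c):+.2f} C")
```

The logarithm is why each successive increment of CO2 has a smaller temperature effect than the last: going from 280 to 560 ppm yields exactly one "doubling" of warming, and a further equal effect would require going all the way from 560 to 1120 ppm.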

 

THIS POST IS A CRITICAL REVIEW OF A SLATE ARTICLE ON EVENT ATTRIBUTION SCIENCE TITLED “The First Undeniable Climate Change Deaths. In 2018 in Japan, more than 1,000 people died during an unprecedented heat wave and Scientists prove that it would have been impossible without climate change”, JULY 23 2019 [LINK]

 

 

 

PART-1: WHAT THE ARTICLE SAYS

  1. July 23, 2018, was a day unlike any other seen before in Japan. It was the peak of a weeks long heat wave that smashed previous temperature records across the historically temperate nation. The heat started on July 9, on farms and in cities that only days earlier were fighting deadly rains, mudslides, and floods. As the waters receded, temperatures climbed. By July 15, 200 of the 927 weather stations in Japan recorded temperatures of 35 degrees Celsius, about 95 degrees Fahrenheit, or higher. Food and electricity prices hit multiyear highs as the power grid and water resources were pushed to their limits. Tens of thousands of people were hospitalized due to heat exhaustion and heatstroke. On Monday, July 23, the heat wave reached its zenith. The large Tokyo suburb of Kumagaya was the epicenter, and around 3 p.m., the Kumagaya Meteorological Observatory measured a temperature of 41.1 degrees Celsius, or 106 F. It was the hottest temperature ever recorded in Japan, but the record was more than a statistic. It was a tragedy: Over the course of those few weeks, more than a thousand people died from heat-related illnesses. These people are the first provable deaths of climate change.  On July 24, the day after the peak of the heat wave, the Japan Meteorological Agency declared it a natural disaster. A disaster it was. But a natural one? Not so much.
  2. In early 2019, researchers at the Japan Meteorological Agency started looking into the circumstances that had caused the unprecedented, deadly heat wave. They wanted to consider it through a relatively new lens—through the young branch of meteorology called Attribution Science, which allows researchers to directly measure the impact of climate change on individual extreme weather events.
  3. Attribution science, at its most basic, calculates how likely an extreme weather event is in today’s climate-changed world and compares that with how likely a similar event would be in a world without anthropogenic warming. Any difference between those two probabilities can be attributed to climate change.
    Attribution science was first conceived in the early 2000s, and since then, researchers have used it as a lens to understand the influence of climate change on everything from droughts to rainfall to coral bleaching. As scientists have long predicted, the vast majority of extreme weather events studied to date have been made more likely because of climate change. But the 2018 Japan heat wave is different. As people who lived in Japan knew at the time, the oppressive temperatures were more than unusual. They were unprecedented. In fact, without climate change, they would have been impossible. “We would never have experienced such an event without global warming,” says Yukiko Imada of the Japan Meteorological Agency.
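The probability comparison at the heart of attribution science can be sketched with synthetic numbers. Everything below is an illustrative assumption, not data from the study: two Gaussian distributions of summer peak temperature stand in for the large model ensembles of the factual (with warming) and counterfactual (without warming) worlds, and the 1.2 C offset between them is made up for the sketch.

```python
import random

random.seed(42)

# Illustrative sketch only: synthetic distributions of summer peak
# temperature (deg C) stand in for model ensembles of the factual
# (with warming) and counterfactual (without warming) climates.
# The means, spread, and sample size are assumptions for this example.
def simulate_peaks(mean, sd, n=100_000):
    return [random.gauss(mean, sd) for _ in range(n)]

factual = simulate_peaks(mean=36.0, sd=2.0)         # world with warming
counterfactual = simulate_peaks(mean=34.8, sd=2.0)  # world without warming

threshold = 41.1  # the record temperature observed at Kumagaya

# probability of exceeding the threshold in each world
p1 = sum(t >= threshold for t in factual) / len(factual)
p0 = sum(t >= threshold for t in counterfactual) / len(counterfactual)

print(f"P1 (with warming)    = {p1:.5f}")
print(f"P0 (without warming) = {p0:.5f}")
# as P0 approaches zero, the event becomes "impossible without warming"
```

Even a modest shift in the mean makes rare threshold exceedances many times more likely in the shifted distribution, which is why attribution statements are most dramatic for record-breaking events in the far tail.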

 

 

PART-2: CRITICAL COMMENTARY

  1. Event Attribution Science is a methodology that uses climate model simulations to attribute extreme weather events such as heat waves, floods, and droughts, post hoc (after the fact), to anthropogenic global warming (AGW) thought to be driven by fossil fuel emissions, and thereby ultimately to the use of fossil fuels, with the presumption that such extreme weather events can be avoided by climate action in the form of overhauling the world’s energy infrastructure away from fossil fuels.
  2. The procedure suffers from confirmation bias, circular reasoning, and an extreme localization in time and space of a theory about long-term trends in global mean temperature. The localization issue is described in the literature as the “internal variability” of climate.
  3. Internal Climate Variability: The localization issue refers to the impossibility of separating the natural from the anthropogenic in what is described as internal climate variability. It derives from the finding that although climate models can relate long-term global trends to fossil-fueled anthropogenic global warming, the relationship falls apart at brief time scales of 30 years or less, or when climate is localized to geographical regions smaller than large latitudinal bands. Global warming theory is a global issue, and it cannot be interpreted in specific regions, particularly when the region is selected post hoc.
  4. On June 7, 2019, Imada, Masahiro Watanabe, and others published an attribution study of the 2018 Japan heat wave in the journal Scientific Online Letters on the Atmosphere. The study found that the deadly event of the previous summer “could not have happened without human-induced global warming.” This heat wave is not the first extreme event found to be possible only because of climate change, but it is the first short-lived event, and the first to have direct impacts on human health. Given that tens of thousands were hospitalized and more than a thousand died due to the heat wave, in a sense, these people are the first provable deaths of climate change. For Watanabe, the result wasn’t unexpected. It was more of a grim inevitability. “It was not that surprising,” he says of his unprecedented result. An event like this was “naturally expected as global mean temperature continued to rise.”
  5. But for both Watanabe and Imada, it holds real historical significance. “It is very sensational for me because human activity has created a new phenomenon. Human activity has created a new phase of the climate,” says Imada. Attribution science is giving us the ability to watch, in real time, the consequences of our actions. The future that the heat wave of 2018 represents is one we knew was coming. It is here, today, and attribution science gives scientists and the world the ability to say so with conviction. There’s another way in which the new field might prove useful. At the end of our conversation, Watanabe paused to reflect on the work he has done. Attribution science compares the world of today with a world without climate change. In some ways, he’s started to see his work as a signpost in history, reminding us of a world that used to exist, but no longer does.
  6. In a related post on Internal Variability [LINK] we find that “at short time scales of 30 years or less, or in limited geographical extents, internal variability of climate confounds assessment of human-induced climate change and imposes irreducible limits on the accuracy of climate change projections and climate change impacts”.
  7. The Dark Bureaucratic Origins of Event Attribution  [LINK]  : Event Attribution Analysis is best understood in the context of its origins. A necessary and assumed catastrophic nature of AGW is needed as the rationale for the UNFCCC policy that requires Annex I countries to reduce emissions by changing their energy infrastructure from fossil fuels to renewables. This line of reasoning is weakened by an inability of climate science to produce empirical evidence that relates extreme weather disasters to emissions. Of particular note in this regard is that claims made by the IPCC in 2007 with regard to the effect of AGW on the frequency and intensity of tropical cyclones, droughts, and floods were retracted in their next Assessment Report in 2014. Thus, climate scientists, though convinced of the causal connection between AGW and extreme weather events, are nevertheless unable to provide acceptable empirical evidence to support what to them is obvious and “unequivocal” but for which climate science has neither empirical evidence nor a methodology that could serve as the tool for presenting such evidence.
  8. A breakthrough came for climate science in 2013 when the Warsaw International Mechanism (WIM) was signed [LINK]. The mechanism has to do with the complex classification of nation states in the Kyoto Protocol and the UNFCCC, in which poor developing nations of the Global South are classified as non-Annex countries with no climate action obligations, while rich developed Western countries of the Global North (Annex-1 countries) are assigned the entire burden of global emission reduction along with the additional burden of providing financial compensation to the non-Annex countries of the Global South for extreme weather impacts of climate change.
  9. When the Annex-1 providers of climate impact compensation funds requested evidence to separate extreme weather events that are natural from those caused by AGW climate change, the United Nations organized the meeting in Warsaw in 2013 to discuss and resolve the issue. The resulting Warsaw International Mechanism (WIM) of 2013 redefined climate change adaptation funding as a form of compensation for “loss and damage” suffered by non-Annex countries because of sea level rise or extreme weather events caused by fossil fuel emissions, which are thought to be mostly a product of Annex-1 countries. Accordingly, the WIM requires that loss and damage suffered by the non-Annex countries, for which compensation is sought from climate adaptation funds, must be attributable to fossil fuel emissions.
  10. A probabilistic methodology was devised to address the need for attribution in the WIM, and it has gained widespread acceptance in both technical and policy circles as a tool for the allocation of limited climate adaptation funds among the competing needs of the non-Annex countries. The probabilistic event attribution (PEA) methodology uses a large number of climate model experiments with multiple models and a multiplicity of initial conditions. A large sample size is needed because extreme weather events are rare and their probability small by definition. The probability of an observed extreme weather event with anthropogenic emissions and the probability without anthropogenic emissions are derived from climate model experiments as P1 and P0. If the probability with emissions (P1) exceeds the probability without emissions (P0), the results are interpreted to mean that emissions played a role in the occurrence of the event in question and that the event is therefore fundable; otherwise the event is assumed to be a product of natural variation alone. The fraction of the event’s probability attributable to fossil fuel emissions is expressed as the Fraction of Attributable Risk, FAR = (P1−P0)/P1. The procedure serves the bureaucratic needs of the UN but is mired in procedural issues such as confirmation bias and uncertainty.
  11. A contentious issue in PEA analysis is that of uncertainty in the values of P0 and P1 and in the model results themselves. Policy analysts fear that the large uncertainties of climate models provide sufficient reason to question the reliability of PEA to serve its intended function as a criterion for access to climate adaptation funds. Mike Hulme and others argue that much greater statistical confidence in the PEA test is needed to justify denial of adaptation funding for loss and damage from weather extremes that do not pass the PEA test.
  12. The greater concern is that climate science assumes the relationship between AGW and extreme weather impacts but suffers from a critical need for a methodology to provide evidence for it. It is in this context that climate science seized upon the bureaucratic PEA procedure of the WIM, extended the interpretation of PEA results beyond their intended function of fund allocation, renamed it as Event Attribution Science, and adopted it as the climate science methodology that can relate extreme weather events to Anthropogenic Global Warming (AGW).
  13. This enthusiastic innovation in climate science was initiated by climate scientist Friederike Otto (Oxford). For the purpose of this extension of the PEA procedure of the WIM to a form of climate science, its name was changed from PEA to Event Attribution Analysis and then elevated by Scientific American to Event Attribution Science in an article extolling its virtues. More detail is provided in a related post [LINK].
  14. Yet, the methodology suffers from confirmation bias and as mentioned above, the procedure has no validity in the context of the internal climate variability issue because extreme weather events are by definition localized in time and space such that their causal relationship to long term trends in global mean temperature cannot be ascertained.
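The WIM-style decision rule described above can be sketched in a few lines. The probabilities P1 and P0 below are hypothetical placeholders, not values from any actual ensemble, and the FAR formula used is the one standard in the attribution literature, FAR = (P1 − P0)/P1.

```python
# Minimal sketch of the probabilistic event attribution (PEA) decision
# rule. In practice P1 and P0 come from large climate model ensembles;
# the numbers used here are hypothetical.

def fraction_attributable_risk(p1: float, p0: float) -> float:
    """Fraction of Attributable Risk: FAR = (P1 - P0) / P1."""
    if p1 <= 0:
        raise ValueError("P1 must be positive")
    return (p1 - p0) / p1

def fundable(p1: float, p0: float) -> bool:
    """The event qualifies for adaptation funding if P1 exceeds P0."""
    return p1 > p0

p1, p0 = 0.0054, 0.0008  # hypothetical ensemble-derived probabilities
print(f"FAR = {fraction_attributable_risk(p1, p0):.2f}")
print(f"fundable: {fundable(p1, p0)}")
```

The bureaucratic use of the rule is the binary P1 > P0 test; the scientific extension criticized in this post reads the continuous FAR value as a measure of how much of the event is "caused by" emissions.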


 

 

[Images: ARC Centre of Excellence for Climate Extremes, UNSW]

THIS POST IS A CRITICAL REVIEW OF AN UNPUBLISHED MANUSCRIPT ON ECS UNCERTAINTY BY STEVEN SHERWOOD AND HIS LARGE TEAM OF CLIMATE SCIENTISTS AT THE UNIVERSITY OF NEW SOUTH WALES. THE FULL TEXT OF THE UNPUBLISHED MANUSCRIPT IS AVAILABLE FOR DOWNLOAD IN PDF FORMAT  [LINK] . THE WORK IS BASED ON WAYS OF REDUCING ECS UNCERTAINTY SUGGESTED BY BJORN STEVENS OF THE MAX PLANCK INSTITUTE OF METEOROLOGY IN GERMANY. 

STEVEN SHERWOOD           BJORN STEVENS

 

PART-1: WHAT THE MANUSCRIPT SAYS

  1. ABSTRACT: We assess evidence relevant to Earth’s equilibrium climate sensitivity per doubling of atmospheric CO2, characterized by an effective sensitivity S. This evidence includes feedback process understanding, the historical climate record, and the paleoclimate record. An S value lower than 2 K is difficult to reconcile with any of the three lines of evidence. The amount of cooling during the Last Glacial Maximum provides strong evidence against values of S greater than 4.5 K. Other lines of evidence in combination also show that this is relatively unlikely. We use a Bayesian approach to produce a probability density function for S given all the evidence, including tests of robustness to difficult-to-quantify uncertainties and different priors. The 66% range is 2.6-3.9 K for our Baseline calculation and remains within 2.3-4.5 K under the robustness tests; corresponding 5-95% ranges are 2.3-4.7 K. This indicates a stronger constraint on S than reported in past assessments, by lifting the low end of the range. The narrowing occurs because the three lines of evidence agree and are judged to be largely independent, and because of greater confidence in understanding feedback processes and in combining evidence. We identify promising avenues for further narrowing the range in S, in particular using comprehensive models and process understanding to address limitations in the traditional forcing-feedback paradigm for interpreting past changes.
  2. PLAIN LANGUAGE SUMMARY: Earth’s global climate sensitivity is a fundamental quantitative measure of the susceptibility of Earth’s climate to human influence. A landmark report in 1979 (Jules Charney) concluded that it probably lies between 1.5℃ and 4.5℃ per doubling of atmospheric carbon dioxide, assuming that other influences on climate remain unchanged. In the 40 years since, it has appeared difficult to reduce this uncertainty range. In this report we thoroughly assess all lines of evidence including some new developments. We find that a large volume of consistent evidence now points to a more confident view of a climate sensitivity near the middle or upper part of this range. In particular, it now appears extremely unlikely that the climate sensitivity could be low enough to avoid substantial climate change well in excess of 2℃ warming under a high-emissions future scenario. We remain unable to rule out that the sensitivity could be above 4.5℃ per doubling of carbon dioxide levels, although this is not likely. Continued research is needed to further reduce the uncertainty, and we identify some of the more promising possibilities in this regard.
  3. INTRODUCTION: The ECS, defined as the steady-state global temperature increase for a doubling of CO2, has long been taken as the starting point for understanding global climate changes. Charney in 1979 quantified it specifically as the equilibrium warming seen in a model with ice sheets and vegetation fixed at present-day values, and proposed a range of 1.5-4.5 K based on the information available at the time, but did not attempt to quantify the probability that the sensitivity was inside or outside this range. The IPCC 2013 report asserted the same now-familiar range, but more precisely dubbed it a >66% likely credible interval, implying an up to one-in-three chance of being outside that range. It has been estimated that, in an ideal world where the information would lead to optimal policy responses, halving the uncertainty in a measure of climate sensitivity would lead to an average savings of US$10 trillion in today’s dollars. Apart from this, the sensitivity of the world’s climate to external influence is a key piece of knowledge that humanity should have at its fingertips. So how can we narrow this range? Quantifying ECS is challenging because the available evidence consists of diverse strands, none of which is conclusive by itself. This requires that the strands be combined in some way. Yet, because the underlying science spans many disciplines within the Earth Sciences, individual scientists generally only fully understand one or a few of the strands. Moreover, the interpretation of each strand requires structural assumptions that cannot be proven, and sometimes ECS measures have been estimated from each strand that are not fully equivalent. This complexity and uncertainty thwarts rigorous, definitive calculations and gives expert judgment and assumptions a potentially large role.
Our assessment was undertaken under the auspices of the World Climate Research Programme’s Grand Science Challenge on Clouds, Circulation and Climate Sensitivity {2015 workshop at Ringberg Castle in Germany}. It tackles the above issues, addressing three questions: (1) Given all the information we now have, acknowledging and respecting the uncertainties, how likely are very high or very low climate sensitivities outside the presently accepted likely range of 1.5-4.5 K? (2) What is the strongest evidence against very high or very low values? (3) Where is there potential to reduce the uncertainty? In addressing these questions, we follow Stevens et al. (2016, hereafter SSBW16) who laid out a strategy for combining lines of evidence and transparently considering uncertainties. The lines of evidence we consider, as in SSBW16, are modern observations and models of system variability and feedback processes; the rate and trajectory of historical warming; and the paleoclimate record. The core of the combination strategy is to lay out all the circumstances that would have to hold for the climate sensitivity to be very low or high given all the evidence (which SSBW16 call “storylines”). A formal assessment enables quantitative probability statements given all evidence and a prior distribution, but the “storyline” approach allows readers to draw their own conclusions about how likely the storylines are, and points naturally to areas with greatest potential for further progress. Recognizing that expert judgment is unavoidable, we attempt to incorporate it in a transparent and consistent way. Combining multiple lines of evidence will increase our confidence and tighten the range of likely ECS if the lines of evidence are broadly consistent. If uncertainty is underestimated in any individual line of evidence, inappropriately ruling out or discounting part of the ECS range, this will make an important difference to the final outcome (see example in Knutti et al., 2017). {Blogger’s note: There are two citations for Knutti et al 2017 in the list of citations}. Therefore it is vital to seek a comprehensive estimate of the uncertainty of each line of evidence that accounts for the risk of unexpected errors or influences on the evidence. This must ultimately be done subjectively. We will therefore explore the uncertainty via sensitivity tests and by considering ‘what if’ cases in the sense of Bjorn Stevens, including what happens if an entire line of evidence is dismissed. The most recent reviews (Collins et al., 2013; Knutti et al., 2017 (which one?)) have considered the same three main lines of evidence considered here, and have noted they are broadly consistent with one another, but did not attempt a formal quantification of the probability distribution function of ECS. Formal Bayesian quantifications have been done based on the historical warming record (see Bodman and Jones 2016 for a recent review), the paleoclimate record (PALAEOSENS, 2012), a combination of historical and last millennium records (Hegerl et al., 2006), and multiple lines of evidence from instrumental and paleo records (Annan and Hargreaves, 2006). An assessment based only on a subset of the evidence will yield too wide a range if the excluded evidence is consistent (e.g. Annan and Hargreaves, 2006), but if both subsets rely on similar information or assumptions, this co-dependence must be considered when combining them (Knutti and Hegerl 2008). Therefore, an important aspect of our assessment is to explicitly assess how uncertainties could affect more than one line of evidence and to assess the sensitivity of calculated PDFs to reasonable allowance for interdependencies of the evidence {blogger’s note: i.e. violation of the independence assumption}. Another key aspect of our assessment is that we explicitly consider process understanding via modern observations and process models as a newly robust line of evidence. Such knowledge has occasionally been incorporated implicitly (via the prior on ECS) based on the sample distribution of ECS in available climate models (Annan and Hargreaves, 2006) or expert judgments (Forest et al., 2002), but climate models and expert judgments do not fully represent existing knowledge or uncertainty relevant to climate feedbacks, nor are they fully independent of other evidence (in particular that from the historical temperature record, see Kiehl, 2007). Process understanding has recently blossomed, however, to the point where substantial statements can be made without simply relying on climate model representations of feedback processes, creating a new opportunity exploited here. Climate models (GCMs) nonetheless play an increasing role in calculating what our observational data would look like under various hypothetical ECS values, in effect translating from evidence to ECS. Their use in this role is now challenging long-held assumptions, for example showing that 20th-century warming could have been relatively weak even if ECS were high, that paleoclimate changes are strongly affected by factors other than CO2, and that climate may become more sensitive to greenhouse gases in warmer states. GCMs are also crucial for confirming how modern observations of feedback processes are related to ECS.
Accordingly, another novel feature of this assessment will be to use GCMs to refine our expectations of what observations should accompany any given value of ECS and thereby avoid biases now evident in some estimates of ECS based on the historical record using simple energy budget or energy balance model arguments. GCMs are also used to link global feedback strengths to observable phenomena. However, for reasons noted above, we avoid relying on GCMs to tell us what values to expect for key feedbacks except where the feedback mechanisms can be calibrated against other evidence. Since we use GCMs in some way to help interpret all lines of evidence, we must be mindful that any errors in doing this could reinforce across lines. We emphasize that this assessment begins with the evidence on which previous studies were based, including new evidence not used previously, and aims to comprehensively synthesize the implications for climate sensitivity both by drawing on key literature and by doing new calculations. In doing this, we will identify structural uncertainties that have caused previous studies to report different ranges of ECS from (essentially) the same evidence, and account for this when assessing what that underlying evidence can tell us. An issue with past studies is that different or vague definitions of ECS may have led to perceived, un-physical discrepancies in estimates of ECS that hampered abilities to constrain its range and progress understanding. Bringing all the evidence to bear in a consistent way requires using a specific measure of ECS, so that all lines of evidence are linked to the same underlying quantity. We denote this quantity as S. The implications for S of the three strands of evidence are examined separately in sections 3-5, and anticipated dependencies between them are discussed in section 6.
To obtain a quantitative probability distribution function of S, we follow Bjorn Stevens and many other studies by adopting a Bayesian formalism, which is outlined in sections 2.2-2.6. The results of applying this to the evidence are presented in section 7, along with the implications of our results for other measures of climate sensitivity and for future warming. The overall conclusions of our assessment are presented in section 8.
  4. SECTION 8: PREAMBLE TO CONCLUSIONS:  There are subjective elements in this study but there are also objective ones, in particular, enforcing mathematical rules of probability to ensure that our beliefs about climate sensitivity are internally consistent and consistent with our beliefs about the individual pieces of evidence. All observational evidence must be interpreted using some type of model that relates underlying quantities to the data, hence there is no such thing as a purely observational estimate of climate sensitivity. Uncertainty associated with any evidence therefore comes from three sources: observational uncertainty, potential model error, and unknown influences on the evidence such as unpredictable variability. By comparing past studies that used different models for interpreting similar evidence we find that the additional uncertainty associated with the model itself is considerable compared with the stated uncertainties typically obtained in such studies assuming one particular model. When numerical Global Climate Models (GCMs) {blogger’s note: The acronym GCM stands for General Circulation Model} are used to interpret evidence, they reveal deficiencies in the much simpler models used traditionally—in particular the failure of these models to adequately account for the effects of non-homogeneous warming. This insight is particularly important for the historical temperature record, which is revealed by GCMs to be compatible with higher climate sensitivities than previously inferred using simple models. In general, many published studies appear to have overestimated the ability of a particular line of evidence to constrain sensitivity, leading to contradictory conclusions. When additional uncertainties are accounted for, single lines of evidence can sometimes offer only relatively weak constraints on the sensitivity. 
The effective sensitivity S analyzed here is defined based on the behavior during the first 150 years after a step change in forcing, which is chosen for several practical reasons. While our study also addresses other measures of sensitivity (the Transient Climate Response TCR) and long-term equilibrium sensitivity, the calculations of these were not optimal, and future studies could apply a methodology similar to that used here to quantify them, or other quantities perhaps more relevant to medium-term warming, more rigorously. After extensively examining the evidence qualitatively and quantitatively, we followed a number of past studies and used Bayesian methods to attempt to quantify the implications and probability distribution function for S. It must be remembered that every step of this process involves judgments or models, and results will depend on assumptions and assessments of structural uncertainties that are hard to quantify. Thus we emphasize that a solid qualitative understanding of how the evidence stacks up is at least as important as any probabilities we assign. Nonetheless, sensitivity tests suggest that our results are not very sensitive to reasonable assumptions in the statistical approach.
  5. SECTION 8: THE CONCLUSIONS:   (1) Each line of evidence considered here—process knowledge, the historical warming record, and the paleoclimate record—accords poorly with values outside the traditional “Charney” range of 1.5-4.5 K for climate sensitivity. (2) But when these lines of evidence are taken together, because of their mutual reinforcement, we find the “outside” possibilities for S to be substantially reduced. Whatever the true value of S is, it must be reconcilable with all pieces of evidence; if any one piece of evidence effectively rules out a particular value of S, that value does not become likely again just because it is consistent with some other, weaker, piece of evidence as long as there are other S values consistent with all the evidence. If on the other hand every value of S appeared inconsistent with at least one piece of evidence, the evidence would need reviewing to look for mistakes. But we do not find this situation. Instead we find that the lines are broadly consistent in the sense that there is plenty of overlap between the ranges of S each supports. This strongly affects our judgment of S: if the true S were 1 K, it would be highly unlikely for each of several lines of evidence to independently point toward values around 3 K. And this statement holds even when each of the individual lines of evidence is thought to be prone to errors. We asked the following question (following Bjorn Stevens): what would it take, in terms of errors or unaccounted-for factors, to reconcile an outside value of S with the totality of the evidence? A very low sensitivity (S ~ 1.5 K or less) would require all of the following: Negative low-cloud feedback. This is not indicated by evidence from satellite or process model studies and would require emergent constraints on GCMs to be wrong. 
Or, a strong and unanticipated negative feedback from another cloud type such as cirrus, which is possible due to poor understanding of these clouds but is neither credibly suggested by any model, nor by physical principles, nor by observations. Cooling of climate by anthropogenic aerosols over the instrumental period at the extreme weak end of the plausible range (near zero or slight warming) based both on direct estimates and attribution results using warming patterns. Or, that forced ocean surface warming will be much more heterogeneous than expected and cooling by anthropogenic aerosols is from weak to middle of the assessed range.  Warming during the mid-Pliocene Warm Period well below the low end of the range inferred from observations, and cooling during the Last Glacial Maximum also below the range inferred from observations. Or, that S is much more state-dependent than expected in warmer climates and forcing during these periods was higher than estimated. In other words, each of the three lines of evidence strongly discounts the possibility of S around 1.5 K or below: the required negative feedbacks do not appear achievable, the industrial-era global
    warming of nearly 1 K could not be fully accounted for, and large global temperature changes through Earth history would also be inexplicable. A very high sensitivity (S > 4.5 K) would require all of the following to be true: Total cloud feedback stronger than suggested by process-model and satellite studies; Cooling by anthropogenic aerosols near the upper end of the plausible range, Or, that future feedbacks will be much more positive than they appear from this historical record because the mitigating effect of recent SST patterns on planetary albedo has been at the high end of expectations; Much weaker-than-expected negative forcing from dust and ice sheets during the Last Glacial Maximum, Or, a strong asymmetry in feedback state-dependence (significantly less positive feedback in cold climates than in the present, but relatively little difference in warmer paleoclimates). Thus, each of the three lines of evidence also argues against very high S, although not as strongly as they do against low S. This is mainly because of uncertainty in how strongly “pattern effects” may have postponed the warming from historical forcing, which makes it difficult to rule out the possibility of warming accelerating in the future based on what has happened so far. Indeed, we find that the paleoclimate record (in particular, the Last Glacial Maximum) now provides the strongest evidence against very high S, while all lines provide more similar constraints against low S (paleo slightly less than the others). An important question governing the probability of low or high S is whether the lines of evidence are independent, such that multiple chance coincidences would be necessary for each of them to be wrong in the same direction. For the most part, the various elements in low- and high-S scenarios do appear superficially independent.
For example, while possible model errors are identified that (if they occurred) could affect historical or paleo evidence, they mostly appear unrelated to each other or to global cloud feedback or model-predicted S. Some key unknowns act in a compensating fashion, i.e., where an unexpected factor would oppositely affect two lines of evidence, effectively cancelling out most of its contributed uncertainty. Even in the one identified possibility where an unknown could affect more than one line of evidence in the same direction, modelling indicates a relatively modest impact on the probability distribution function. The IPCC AR5 concluded that climate sensitivity is likely (≥ 66% probability) in the range 1.5-4.5 K. The probability of S being in this range is 93% in our Baseline calculation, and is no less than 82% in all other “plausible” calculations considered as indicators of reasonable structural uncertainty. Although consistent with IPCC’s “likely” statement, this indicates considerably more confidence than the minimum implied by the statement. We also find asymmetric probabilities outside this range, with negligible probability below 1.5 K but up to an 18% chance of being above 4.5 K. This is consistent with all three lines of evidence arguing against low sensitivity fairly confidently, which strengthens in combination. Given this consensus, we do not see how any reasonable interpretation of the evidence could assign a significant chance to S < 1.5 K. Moreover our plausible sensitivity experiments indicate a less-than 5% chance that S is below 2 K: our Baseline 5-95% range is 2.3-4.7 K and remains within 2.0 and 5.7 K under reasonable structural changes. Since the extreme tails of the probability distribution function of S are more uncertain and possibly sensitive to “unknown unknowns” and mathematical choices, it may be safer to focus on 66% ranges (the minimum for what the IPCC terms “likely”).
This range in our Baseline case is 2.6-3.9 K, a span less than half that of AR5’s likely range, and is bounded by 2.3 and 4.5 K in all plausible alternative calculations considered. Although we are more confident in the central part of the distribution, the upper tail is important for quantifying the overall risk associated with climate change and so does need to be considered. We also note that allowing for “surprises” in individual lines of evidence via “fat-tailed” likelihoods had little effect on results, as long as such surprises affect the evidence lines independently. Our S is not the true equilibrium sensitivity ECS, which is expected to be somewhat higher than S due to slowly emerging positive feedback. Values are similar, however, because we define S for a quadrupling of CO2 while ECS is defined for a doubling, which cancels out most of the expected effect of these feedbacks. We find that the 66% ECS range, at 2.6-4.1 K bounded by 2.4 and 4.6 K, is not very different from that of S, though slightly higher. Thus, our constraint on the upper bound of the “likely” range for ECS is close to that of the IPCC AR5 and previous assessments, which formally adopt an equilibrium definition. The constraint on the lower bound of the “likely” range is substantially stronger than that of AR5 regardless of the measure used. The uncertainties in ECS and S assessed here are similar because each is somewhat better constrained than the other by some subset of the evidence. Among the plausible alternate calculations, the one producing the weakest high-end constraint on S uses a uniform-S-inducing prior, which shifts the ranges upward to 2.8-4.5 K (66%) and 2.4-5.7 K (90%). Our Baseline calculation assumes feedbacks are independent (or that dependence is unknown), which predicts a non-uniform prior probability distribution function for S; to predict a uniform one requires instead assuming a known, prior dependence structure among the feedbacks. 
Although lack of consensus on priors remains a leading-order source of spread in possible results, we still find that sensitivity to this is sufficiently modest that strong constraints are possible, especially at the low end of the S range. The main reason for the stronger constraints seen here in contrast to past assessments is that new analysis and understanding have led us to combine lines of evidence in a way the community was not ready to do previously. We also find that the three main lines of evidence are more consistent than would be expected were the true uncertainty to be as large as in previous assessments. While some individual past studies have assigned even narrower ranges, as discussed above, past studies have often been overconfident in assigning uncertainty, so not too much weight should be given to any single study. We note that although we did not use GCM “emergent constraint” studies using present-day climate system variables in our base results, our results are nonetheless similar to what those studies suggest in the aggregate. New models run for CMIP6 are showing a broader range of S than previous iterations of CMIP. Our findings are not sensitive to GCM S distributions since we do not directly rely on them. The highest and lowest CMIP6 S values are much less consistent with the evidence analyzed here than those near the middle of the range. Some of the effects quantified in this paper with the help of GCMs were looked at only with pre-CMIP6 models, and interpretations of evidence might therefore shift in the future upon further analysis of newer models, but we would not expect such shifts to be noteworthy.
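The role of independence in the combination of the three lines of evidence can be illustrated with a toy Bayesian calculation. This is not the paper's actual method, and all means and standard deviations below are invented purely for illustration:

```python
import numpy as np

# Toy illustration (not the paper's method): if three lines of evidence are
# independent, their likelihoods multiply, and the combined estimate of S is
# narrower than any individual line. All parameters here are invented.
s = np.linspace(0.5, 8.0, 2000)  # grid of candidate sensitivities (K)

def gauss(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2)

process = gauss(s, 3.1, 1.2)     # process understanding (assumed)
historical = gauss(s, 2.9, 1.5)  # historical record (assumed)
paleo = gauss(s, 3.3, 1.3)       # paleoclimate record (assumed)

posterior = process * historical * paleo   # independence => product
posterior /= posterior.sum()               # normalize on the grid

mean = (s * posterior).sum()
sd = np.sqrt(((s - mean) ** 2 * posterior).sum())
print(f"combined estimate: mean {mean:.2f} K, sd {sd:.2f} K")
```

Because the likelihoods multiply, the combined standard deviation comes out smaller than that of any single line, which is the sense in which combining independent evidence narrows the range, and also why an error common to all three lines would undermine the combination.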

 

 

PART-2: CRITICAL COMMENTARY

  1. The extremely verbose and confusing stream of statements about climate sensitivity, intermingled with a multiplicity of interpretations and disclaimers and with a dual definition of climate sensitivity as both ECS and S, does not provide useful information on the subject.
  2. Also, a research question framed as what the width of the climate sensitivity confidence interval should be, and whether it can be narrowed from the width suggested by Charney, is inappropriate in an unbiased and objective scientific inquiry. The issue is not the width of the confidence interval or what probability it should contain, but only what the mean and variance of the estimate are. A confidence interval is simply a way of expressing a mean and a variance. An extreme form of bias is contained in the research question stated as “We identify promising avenues for further narrowing the range in S”.
  3. The Equilibrium Climate Sensitivity described by Charney is derived from climate model simulations of CO2 forcing only and refers specifically to the correlation between temperature and the natural logarithm of atmospheric CO2 concentration in the absence of other forcings. However, temperature forecasts are made with a portfolio of forcings that includes but is not restricted to CO2 forcing. Some of the complexity of the presentation appears to derive from a lack of clarity in this distinction.
  4. As suggested at the end of the paper, but not used in the analysis that precedes it, the understanding of warming and the forecast of future warming should be based on the complete portfolio of forcings, of which CO2 forcing (characterized by the ECS) is only one component. The forcings portfolio can then be tested against observed temperatures and evaluated according to its fit, as demonstrated in related posts at this site: [LINK] [LINK] .
  5. The analysis also displays the oddity in climate science of understanding variance not as a degradation of the information content of the mean but as a measure of how extreme the values that define the confidence interval are, such that the low information content of a large variance is interpreted not as uncertainty but as certainty about how extreme the values COULD be. Such odd interpretations of variance likely derive from confirmation bias in climate science, which treats confidence intervals not as measures of uncertainty (not knowing) but as measures of knowing how extreme it COULD be [LINK] .
  6. The extensive research efforts presented in the manuscript, and their interpretation, appear to be products of inappropriate research questions that derive from a flawed interpretation of the confidence interval and from the confirmation bias of the researchers, expressed as a research objective not of discovering an unbiased estimate of the mean and variance of climate sensitivity to atmospheric CO2 but of finding ways to reduce the width of the confidence interval from the large interval proposed by Jule Charney.
  7. The authors mention the relevance of the TCRE (transient climate response to cumulative emissions) but do not address its many anomalous interpretations. For example, the TCRE shows that cumulative emissions of one teratonne of carbon will cause 1.5C of warming within a small uncertainty band. The corresponding increase in atmospheric CO2 implies a climate sensitivity, and the corresponding uncertainty in the TCRE implies an uncertainty in that sensitivity and its 95% confidence interval. A study of climate sensitivity and its uncertainty should be able to explain the TCRE.
  8. The authors do not do that, writing only that “The Transient Climate Response (TCR, or warming at the time of CO2 doubling in an idealized 1% per year increase scenario) has been proposed as a better measure of warming over the near- to medium-term; it may be more generally related to peak warming, and better constrained by historical warming, than S. It may also be better at predicting high-latitude warming. But 21st-century global-mean trends under high emissions are better predicted by S than by TCR, perhaps because of non-linearities in forcing or response or because TCR estimates are affected by noise. TCR is less directly related to the other lines of evidence than is S”.
  9. With this brief and mysterious assessment, the authors dismiss the topic altogether. In fact, the TCRE is none of these things, and it is not affected by noise; the TCRE coefficient is derived from a near perfect correlation between temperature and cumulative emissions. The authors cite the Knutti 2017 paper, in which Reto Knutti and co-authors extol the virtues of the TCRE and propose that it should replace climate sensitivity as our way of understanding the warming effect of fossil fuel emissions. It is clear from the authors’ language that they either did not study the TCRE sufficiently or chose to dismiss it without a sufficient explanation of why it was dismissed.
  10. In summary, we find that this study derives from biased research questions and a poor understanding of variance as a measure of uncertainty. It does not present a useful analysis of the climate sensitivity issue, specifically with respect to understanding climate sensitivity in the context of all forcings and relating the sensitivity question to the TCRE.
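The point made in item 2, that a confidence interval is merely a re-expression of a mean and a variance, can be sketched numerically. The 66% range used below is the paper's Baseline range for S; the assumption of a normal distribution is ours, for illustration only:

```python
# Sketch: a 66% confidence interval is just another way of expressing a mean
# and a standard deviation (assuming, for illustration, a normal distribution).
# The interval below is the paper's Baseline 66% range for S, 2.6-3.9 K.
lo, hi = 2.6, 3.9
z66 = 0.954  # two-sided 66% z-score for a normal distribution

mean = (lo + hi) / 2          # implied mean estimate of S
sd = (hi - lo) / (2 * z66)    # implied standard deviation

print(f"implied mean = {mean:.2f} K, implied sd = {sd:.2f} K")

# And back again: the same mean and sd reproduce the interval exactly.
assert abs(mean - z66 * sd - lo) < 1e-9
assert abs(mean + z66 * sd - hi) < 1e-9
```

The round trip works in both directions, which is why a research program aimed at the width of the interval rather than at the mean and variance themselves adds nothing statistically.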
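The ln(CO2)-temperature relationship behind the ECS described in item 3 can be sketched with a simple regression. The data below are synthetic, and the assumed sensitivity, baseline concentration, and noise level are all invented for illustration:

```python
import numpy as np

# Sketch of the ln(CO2)-temperature relationship behind ECS. If temperature
# follows T = S * ln(C/C0) / ln(2), regressing T on ln(C) recovers S.
# S_true, C0, and the noise level are assumptions for this synthetic example.
rng = np.random.default_rng(0)
S_true, C0 = 3.0, 280.0                      # assumed K per doubling, baseline ppm
co2 = np.linspace(280.0, 560.0, 100)         # CO2 rising through one doubling
temp = S_true * np.log(co2 / C0) / np.log(2) + rng.normal(0, 0.1, co2.size)

slope, intercept = np.polyfit(np.log(co2), temp, 1)
S_est = slope * np.log(2)                    # convert slope to K per doubling
print(f"estimated sensitivity = {S_est:.2f} K per doubling")
```

With CO2 forcing as the only driver the regression recovers the assumed sensitivity; with a full portfolio of forcings acting at once, a regression on ln(CO2) alone would no longer isolate it, which is the distinction the commentary says the paper leaves unclear.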
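The claim in item 7, that the TCRE implies a climate sensitivity, can be sketched as back-of-envelope arithmetic. The airborne fraction and the GtC-per-ppm conversion used below are rough illustrative assumptions, not figures from the paper:

```python
import math

# Back-of-envelope sketch: converting the TCRE (1.5 K per teratonne of carbon)
# into an implied climate sensitivity. The airborne fraction (~0.45) and the
# conversion of ~2.13 GtC per ppm of CO2 are illustrative assumptions.
emissions_gtc = 1000.0      # one teratonne of carbon, cumulative emissions
warming = 1.5               # TCRE warming for that teratonne (K)
airborne_fraction = 0.45    # assumed share of emissions staying in the air
gtc_per_ppm = 2.13          # assumed conversion, GtC per ppm of CO2

delta_ppm = emissions_gtc * airborne_fraction / gtc_per_ppm
c0, c1 = 280.0, 280.0 + delta_ppm           # preindustrial and resulting CO2

implied_s = warming * math.log(2) / math.log(c1 / c0)
print(f"CO2 rises {delta_ppm:.0f} ppm; implied sensitivity ~ {implied_s:.1f} K")
```

A study of climate sensitivity should be able to reconcile a number of this kind, and its uncertainty band, with its own estimate of S, which is the consistency check the commentary finds missing.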