THIS POST IS A CRITICAL REVIEW OF A LECTURE ON CARBON BUDGETS FOR THE CLIMATE ACTION OF REDUCING AND ELIMINATING FOSSIL FUEL EMISSIONS SO THAT HUMANS CAN CONTROL THE CLIMATE AND SAVE THE PLANET. LINK TO SOURCE: https://futureoflife.org/2019/10/29/not-cool-ep-18-glen-peters-on-the-carbon-budget-and-global-carbon-emissions/

PART-1: WHAT CLIMATE SCIENTIST DR. GLEN PETERS SAYS
GLEN PETERS, CLIMATE SCIENTIST, AND CARBON BUDGET EXPERT. Topic: We’ll learn what the carbon budget is and why it’s hard to calculate, why some causes of carbon emissions are harder to address than others, how the phrase “carbon footprint” is so often misused and why it is also hard to calculate, and how responsibility for emissions is attributed to different countries. Glen Peters is a Research Director at CICERO, the Center for International Climate Research in Oslo. Most of his research is on past, current, and future trends in energy consumption and greenhouse gas emissions. He studies human drivers of global change, the global carbon cycle, bioenergy, scenarios, sustainable consumption, international trade and climate policy, and emission metrics. THE LECTURE WAS GIVEN IN A QUESTION AND ANSWER FORMAT AND IS PRESENTED IN THAT FORMAT BELOW.
QUESTION: WHAT IS THE GLOBAL CARBON BUDGET? ANSWER: In the global carbon budget, we try and look at all the sources of carbon into the atmosphere and the sort of sinks of carbon and try and understand where carbon is going. You could think about it a bit like a bathtub, where you try and look what’s going into the bathtub and see what’s going out of the bathtub and make sure they match. The carbon budget generally has two components: the source component, so what’s going into the atmosphere; and the sink component, so the components which are more or less going out of the atmosphere. So in terms of sources, we have fossil fuel emissions; so we dig up coal, oil, and gas and burn them and emit CO2. We have cement, which is a chemical reaction, which emits CO2. That’s sort of one important component on the source side. We also have land use change, so deforestation. We’re chopping down a lot of trees, burning them, using the wood products and so on. And then on the other side of the equation, sort of the sink side, we have some carbon coming back out of the atmosphere, in a sense. So the land sucks up about 25% of the carbon that we put into the atmosphere and the ocean sucks up about 25%. So for every ton we put into the atmosphere, only about half a ton of CO2 remains in the atmosphere. So in a sense, the oceans and the land are cleaning up half of our mess, if you like. The other half just stays in the atmosphere. Half a ton stays in the atmosphere; the other half is cleaned up. It’s that carbon that stays in the atmosphere which is causing climate change and temperature increases and changes in precipitation and so on. This 50% is a pretty robust number, and this is one of the great mysteries of the carbon cycle. Mystery is not really the right word, but it’s quite curious that no matter how much we’re putting in the atmosphere, this fraction that stays in the atmosphere, about 50%, has remained relatively constant. So if we go back 50 years in time for example, we still had about 50% stay in the atmosphere. Today when we emit, there’s about 50% that stays in the atmosphere, but then there’s a question of will this continue forever if we start to see the impacts of climate change — changing precipitation, maybe for example the land sink is not as good, so maybe tropical forests don’t take up as much carbon. Then we may see this share drop and more stay in the atmosphere. There is some evidence that it may change: when there’s an El Nino and it causes hotter and drier weather in the tropics, less carbon is taken up by the forests, and so we see a greater increase in the atmosphere. This is sort of natural variability, but this natural variability gives us an idea of what may happen if temperatures increase.
QUESTION: WHAT IS THE CARBON BUDGET IMBALANCE ISSUE IN CLIMATE SCIENCE? HOW DOES THAT RELATE TO WHAT YOU JUST SAID? ANSWER: The carbon budget is like a balance, so you have something coming in and something going out, and in a sense by mass balance, they have to equal. So if we go out and we take an estimate of how much carbon have we emitted by burning fossil fuels or by chopping down forests and we try and estimate how much carbon has gone into the ocean or the land, then we can measure quite well how much carbon is in the atmosphere. So we can add all those measurements together and then we can compare the two totals — they should equal. But they don’t equal. And this is sort of part of the science, if we overestimated emissions or if we over or underestimated the strength of the land sink or the oceans or something like that. And we can also cross check with what our models say. So this carbon imbalance is basically the balance between what we think is happening and whether those two things agree. And they don’t, but the good thing is that sometimes we overestimate the balance, sometimes we underestimate it — which means that they’re sort of bouncing above and below zero, if you like, so it averages out. Just like the weather, we can have hot years, dry years, and so on and so forth. So when you think about the global climate, we also have some years that are a little bit warmer, some years that are a little bit cooler, and this is propagating to natural variability in the carbon cycle. First of all, we can’t perfectly measure everything; and second, our models can’t predict all the natural variability that happens. But if you average over a longer time period, over decades or whatever, this carbon imbalance averages out to zero, which is nice. It means that on average, we’ve got the science right. It’s just some of the details we’re missing. It’s like we’re not sure whether it’s going to rain next Thursday or not. So it’s that sort of variability that we can’t detect. But the big scale changes in the system we can detect well.
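The bathtub accounting described in the lecture can be written as a budget identity. A minimal formulation in the standard notation of the annual global carbon budget (the symbols are illustrative, not quoted from the lecture):

```latex
\underbrace{E_{FF} + E_{LUC}}_{\text{sources}} \;=\;
\underbrace{G_{ATM}}_{\text{stays in air}} \;+\;
\underbrace{S_{OCEAN} + S_{LAND}}_{\text{sinks}} \;+\;
\underbrace{B_{IM}}_{\text{imbalance}}
```

Here E_FF is fossil fuel and cement emissions, E_LUC is land use change emissions, G_ATM is the observed growth of atmospheric CO2, S_OCEAN and S_LAND are the ocean and land sinks, and B_IM is the budget imbalance discussed in the second answer. The airborne fraction discussed in the next part is then AF = G_ATM / (E_FF + E_LUC), or in some formulations G_ATM / E_FF.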

PART-2: THE AIRBORNE FRACTION ISSUE IN CLIMATE SCIENCE
THE THEORY OF AGW: A foundational relationship in the theory of AGW (Anthropogenic Global Warming) is that fossil fuel emissions and atmospheric CO2 concentration are causally related such that the observed rise in atmospheric CO2 concentration is explained as the result of fossil fuel emissions. The climate action of zero fossil fuel emissions demanded by climate scientists to stop this rise in CO2 {and thereby to stop the warming} is understood in this context.
THE EVIDENCE FOR THIS CAUSATION: The evidence for this causation, which also serves as the evidence for the effectiveness of the climate action demanded, is that since the Industrial Revolution humans have been burning fossil fuels, and over the same period atmospheric CO2 concentration has been going up. In climate science, this concurrence of emissions and rising atmospheric CO2 over a centennial time scale is taken as evidence of causation in support of the hypothesis that fossil fuel emissions cause atmospheric CO2 concentration to rise; and it serves as the rationale for the climate action logic that if we stop burning fossil fuels it will stop the warming. The empirical evidence for this causation is provided by the Airborne Fraction of approximately 50%, that is to say that about half of the fossil fuel emissions are removed by the carbon cycle and the other half remains airborne in the atmosphere.
CIRCULAR REASONING IN THIS CAUSATION HYPOTHESIS TEST: In terms of research methodology, what we see here is that the airborne fraction hypothesis was derived from the data and was then tested with the same data. This kind of empirical test is not possible and its results are not credible because the procedure involves circular reasoning. A hypothesis derived from the data cannot be tested with the same data. That kind of hypothesis test suffers from an extreme form of circular reasoning called the Texas Sharpshooter Fallacy, where you shoot first and draw the target circle later. Therefore, the Airborne Fraction argument of climate science is not credible and can be rejected on this basis.

UNCERTAINTY IN THE AIRBORNE FRACTION: As seen in the chart below, although the airborne fraction averages out to about 50% or a little higher in the long run, the annual mass balance yields a large range for the airborne fraction, from 20% to 90%. This variance is defended in the climate science lecture presented above as something that “balances out in the long run” and therefore as a statistically valid form of evidence for the causation – that fossil fuel emissions cause atmospheric CO2 to rise. BUT the long run balance is not the issue. In climate science, fossil fuel emissions cause atmospheric CO2 to rise at an annual time scale and therefore this issue must be studied at an annual time scale.
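As a concrete illustration of how the annual airborne fraction in such a chart is computed, here is a minimal sketch. The emissions and concentration values below are placeholders, not the source data; the factor of about 2.12 GtC per ppm of CO2 is the standard conversion for the mass of carbon in the atmosphere.

```python
import numpy as np

PPM_TO_GTC = 2.12  # approximate GtC added to the atmosphere per 1 ppm of CO2

# Hypothetical annual values for illustration only:
emissions = np.array([9.1, 9.5, 9.6, 9.8, 9.7])   # fossil fuel emissions, GtC/y
delta_co2 = np.array([1.2, 2.9, 1.6, 3.0, 2.0])   # annual change in MLO CO2, ppm/y

# Airborne fraction: share of emitted carbon that matches the observed rise
airborne_fraction = (delta_co2 * PPM_TO_GTC) / emissions
print(np.round(airborne_fraction, 2))      # varies widely from year to year
print(round(airborne_fraction.mean(), 2))  # long run average near 0.5
```

The year-to-year spread produced by even these few hypothetical values shows why the annual time scale, and not the long run average, is the relevant test.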

THE AIRBORNE FRACTION: In the lecture above, the climate scientist concedes that there is a mass balance problem with the causation hypothesis that fossil fuel emissions cause atmospheric CO2 concentration to rise. The mass balance shows that the assumed equality of annual fossil fuel emissions and the annual rise in atmospheric CO2 is not found in the data. What we find instead is that annual emissions tend to be greater than the amount needed to explain the observed annual change in atmospheric CO2. The explanation for this paradox offered by climate science is that the excess emissions are somehow removed from the atmosphere by carbon cycle flows, so that not all the emissions end up in the atmosphere; but no mechanism and no empirical evidence for this removal are offered. The portion of annual emissions used to explain the annual change in atmospheric CO2 concentration is called the “Airborne Fraction”.
The argument that the excess annual emissions not explained by the annual change in atmospheric CO2 must therefore go somewhere else, and that if we look through the large carbon cycle flows we may find a way to explain this paradox with the possibility, but not the evidence, that the missing CO2 goes into carbon cycle flows, is a case of circular reasoning and confirmation bias.
These data are interpreted as evidence that about half of the annual emissions stays in the atmosphere {the Airborne Fraction} and causes atmospheric CO2 to rise {to cause warming} and that the other half must therefore be absorbed by nature’s carbon cycle flows to one or more of the sinks in the carbon cycle system.
CIRCULAR REASONING AND CONFIRMATION BIAS: The problem is that this airborne fraction explanation of the emissions mass balance anomaly is a case of circular reasoning and confirmation bias as follows. The airborne fraction was not independently determined from theory but found in the data. A hypothesis was then derived from the data that the excess emissions not explained by the change in atmospheric CO2 are removed by the carbon cycle. This hypothesis was then tested with the same data used to construct the hypothesis. This kind of hypothesis test contains the circular reasoning fallacy. A hypothesis derived from the data cannot be tested with the same data.


PART-3: DETRENDED CORRELATION ANALYSIS OF ANNUAL CHANGES IN MAUNA LOA CO2 CONCENTRATIONS AGAINST ANNUAL FOSSIL FUEL EMISSIONS.
LINK: https://tambonthongchai.com/2020/11/11/annual-changes-in-mlo-co2/


CHART: CORRELATION = 0.75

DISCUSSION AND CONCLUSION: THE SOURCE DATA SHOW A STRONG STATISTICALLY SIGNIFICANT CORRELATION OF CORR=0.75 BETWEEN ANNUAL CHANGES IN MLO CO2 AND ANNUAL EMISSIONS. THIS CORRELATION APPEARS TO SUPPORT THE USUAL ASSUMPTION THAT CHANGES IN ATMOSPHERIC CO2 CONCENTRATION ARE CAUSED BY FOSSIL FUEL EMISSIONS AND THAT THEREFORE THESE CHANGES CAN BE MODERATED WITH CLIMATE ACTION TO CONTROL AND REDUCE THE RATE OF WARMING.
HOWEVER, IT IS KNOWN THAT SOURCE DATA CORRELATION BETWEEN TIME SERIES DERIVES FROM TWO SOURCES. THESE ARE (1) SHARED TRENDS WITH NO CAUSATION IMPLICATION AND (2) RESPONSIVENESS AT THE TIME SCALE OF INTEREST. HERE THE TIME SCALE OF INTEREST IS ANNUAL BECAUSE THE THEORY REQUIRES THAT ANNUAL CHANGES IN ATMOSPHERIC CO2 CONCENTRATION ARE CAUSED BY ANNUAL FOSSIL FUEL EMISSIONS. THIS TEST IS MADE BY REMOVING THE SHARED TREND THAT IS KNOWN TO HAVE NO CAUSATION INFORMATION OR IMPLICATION. HERE WE FIND THAT WHEN THE SHARED TREND IS REMOVED THE OBSERVED CORRELATION DISAPPEARS. THE APPARENT CORRELATION BETWEEN EMISSIONS AND CHANGES IN ATMOSPHERIC CO2 CONCENTRATION IS THUS FOUND TO BE SPURIOUS.
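A minimal sketch of the detrended correlation procedure described above, assuming annual series for emissions and for changes in atmospheric CO2, and using residuals from an OLS fit against time as the detrending step (one common way to carry out such a test):

```python
import numpy as np

def detrend(series):
    """Return residuals after removing the OLS linear trend."""
    t = np.arange(len(series))
    slope, intercept = np.polyfit(t, series, 1)
    return series - (slope * t + intercept)

def detrended_correlation(x, y):
    """Pearson correlation of the two detrended series."""
    return np.corrcoef(detrend(x), detrend(y))[0, 1]

# Two hypothetical series that share a trend but have unrelated fluctuations:
rng = np.random.default_rng(1)
t = np.arange(60)
x = 0.10 * t + rng.normal(0, 0.5, 60)
y = 0.05 * t + rng.normal(0, 0.5, 60)

print(round(np.corrcoef(x, y)[0, 1], 2))     # high: driven by the shared trend
print(round(detrended_correlation(x, y), 2)) # near zero: nothing at the annual scale
```

When the source data correlation is high but the detrended correlation is near zero, the source data correlation is attributed to the shared trend and carries no causation information at the time scale of interest.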
THE DATA FOR ANNUAL FOSSIL FUEL EMISSIONS AND ANNUAL CHANGES IN ATMOSPHERIC CO2 CONCENTRATION DO NOT SHOW THAT FOSSIL FUEL EMISSIONS CAUSE ATMOSPHERIC CO2 CONCENTRATION TO CHANGE. THE FINDING IMPLIES THAT THERE IS NO EMPIRICAL EVIDENCE IN SUPPORT OF THE THEORY OF CLIMATE ACTION. THIS THEORY HOLDS THAT MOVING THE GLOBAL ENERGY INFRASTRUCTURE FROM FOSSIL FUELS TO RENEWABLES WILL MODERATE THE RATE OF INCREASE IN ATMOSPHERIC CO2 AND THEREBY MODERATE THE RATE OF WARMING.

DETRENDED CORRELATION ANALYSIS#2: RESPONSIVENESS OF ATMOSPHERIC COMPOSITION TO FOSSIL FUEL EMISSIONS. LINK: https://tambonthongchai.com/2020/06/14/responsiveness-of-atmospheric-co2-to-fossil-fuel-emissions/

CONCLUSIONS: A key relationship in the theory of anthropogenic global warming (AGW) is that between annual fossil fuel emissions and annual changes in atmospheric CO2. The proposed causation sequence is that annual fossil fuel emissions cause annual changes in atmospheric CO2 which in turn intensifies the atmosphere’s heat trapping property. It is concluded that global warming is due to changes in atmospheric composition attributed to human activity and is therefore a human creation and that therefore we must reduce or eliminate fossil fuel emissions to avoid climate catastrophe (Parmesan, 2003) (Stern, 2007) (IPCC, 2014) (Flannery, 2006) (Allen, 2009) (Gillett, 2013) (Meinshausen, 2009) (Canadell, 2007) (Solomon, 2009) (Stocker, 2013) (Rogelj, 2016). A testable implication of the proposed causation sequence is that annual changes in atmospheric CO2 must be related to annual fossil fuel emissions at an annual time scale. This work is a test of this hypothesis. We find that detrended correlation analysis of annual emissions and annual changes in atmospheric CO2 does not support the anthropogenic global warming hypothesis because no evidence is found that changes in atmospheric CO2 are related to fossil fuel emissions at an annual time scale. These results are consistent with prior works that found no evidence to relate the rate of warming to the rate of emissions (Munshi, The Correlation between Emissions and Warming in the CET, 2017) (Munshi, Long Term Temperature Trends in Daily Station Data: Australia, 2017) (Munshi, Generational Fossil Fuel Emissions and Generational Warming: A Note, 2016) (Munshi, Decadal Fossil Fuel Emissions and Decadal Warming: A Note, 2015) (Munshi, Effective Sample Size of the Cumulative Values of a Time Series, 2016) (Munshi, The Spuriousness of Correlations between Cumulative Values, 2016). The finding raises important questions about the validity of the IPCC carbon budget which apparently overcomes a great uncertainty in much larger natural flows to describe with great precision how flows of annual emissions are distributed to gains in atmospheric and oceanic carbon dioxide (Bopp, 2002) (Chen, 2000) (Davis, 2010) (IPCC, 2014) (McGuire, 2001). These carbon budget conclusions are inconsistent with the findings of this study and are the likely result of insufficient attention to uncertainty, excessive reliance on climate models, and the use of “net flows” (Plattner, 2002) that are likely to be subject to assumptions and circular reasoning (Edwards, 1999) (Ito, 2005) (Munshi, 2015a) (Munshi, 2016) (Munshi, An Empirical Study of Fossil Fuel Emissions and Ocean Acidification, 2015). However, it should be mentioned that though computed values of the “retained fraction” vary from 0.226 to 1.06, the mean value of 0.565 is statistically significant.
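The significance claim at the end of the paragraph can be checked with a one-sample t-test of the retained fractions against zero. A minimal sketch, using placeholder values spanning the reported range of 0.226 to 1.06 rather than the study's actual series:

```python
import numpy as np
from scipy import stats

# Hypothetical annual retained fractions for illustration only:
rf = np.array([0.23, 0.30, 0.38, 0.45, 0.52, 0.58, 0.63, 0.70, 0.80, 1.06])

t_stat, p_value = stats.ttest_1samp(rf, popmean=0.0)
print(round(rf.mean(), 3), round(t_stat, 1), p_value)
# A mean near 0.565 can be statistically significant even when the
# individual values vary widely, which is the point being conceded here.
```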
DETAILS OF THIS ANALYSIS PROVIDED IN A RELATED POST ON THIS SITE: LINK: https://tambonthongchai.com/2020/06/14/responsiveness-of-atmospheric-co2-to-fossil-fuel-emissions/

DETRENDED CORRELATION ANALYSIS#3: LONGER TIME SCALES. LINK: https://tambonthongchai.com/2018/12/19/co2responsiveness/
TIME SCALES FROM ANNUAL TO 5-YEARS ARE STUDIED. CHART: AIRBORNE FRACTION AT DIFFERENT TIME SCALES.

CHART: DETRENDED CORRELATION AT DIFFERENT TIME SCALES
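A minimal sketch of how the time scale can be varied in this kind of test: aggregate both series into non-overlapping N-year blocks before detrending, for N from 1 to 5 years. This is an assumed construction based on the description above, not necessarily the exact procedure of the linked post.

```python
import numpy as np

def aggregate(series, n):
    """Sum an annual series into non-overlapping n-year blocks."""
    k = (len(series) // n) * n
    return series[:k].reshape(-1, n).sum(axis=1)

def detrended_corr(x, y):
    """Pearson correlation after removing OLS linear trends."""
    t = np.arange(len(x))
    rx = x - np.polyval(np.polyfit(t, x, 1), t)
    ry = y - np.polyval(np.polyfit(t, y, 1), t)
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical annual emissions and annual CO2 changes for illustration:
rng = np.random.default_rng(2)
years = np.arange(60)
emissions = 0.10 * years + rng.normal(0, 0.4, 60)
dco2 = 0.05 * years + rng.normal(0, 0.4, 60)

for n in range(1, 6):
    r = detrended_corr(aggregate(emissions, n), aggregate(dco2, n))
    print(f"time scale {n} years: detrended correlation = {r:.2f}")
```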

FINDING: We conclude that atmospheric composition, specifically the CO2 concentration, is not responsive to the rate of fossil fuel emissions. This finding is a serious weakness in the theory of anthropogenic global warming by way of rising atmospheric CO2 attributed to the use of fossil fuels in the industrial economy; and in the “Climate Action” proposition of the UN that reducing fossil fuel emissions will moderate the rate of warming by slowing the rise of atmospheric CO2. The finding also establishes that the climate action project of creating Climate Neutral Economies, that is economies that have no impact on atmospheric CO2, is unnecessary because the global economy is already Climate Neutral.

PART-4: MONTE CARLO SIMULATION OF MIXING CARBON CYCLE FLOWS & FOSSIL FUEL EMISSIONS
The Airborne Fraction issue is studied with Monte Carlo Simulation in two related posts on this site:
SIMULATION#1: https://tambonthongchai.com/2020/06/10/a-monte-carlo-simulation-of-the-carbon-cycle/
SIMULATION#2: https://tambonthongchai.com/2018/05/31/the-carbon-cycle-measurement-problem/
SIMULATION#1: In SIMULATION#1 we show that the Airborne Fraction anomaly is a serious issue in climate science. As explained above and as seen in the chart below, this critical parameter is not a constant but tends to vary a lot. Specifically, it is not always 50% as claimed in climate science.

The issue in this regard is that to understand the impact of fossil fuel emissions on atmospheric CO2 in terms of carbon cycle flows, we need to be able to measure carbon cycle flows with great precision. These flows are an order of magnitude greater than the relatively small flow of fossil fuel emissions, so even a small random variance in the large carbon cycle flows will make the relatively small fossil fuel emissions undetectable net of that variance.
But the reality is that carbon cycle flows cannot be measured. They can only be inferred from relevant data on the biota, changes in oceanic chemistry, and so on. Listed below are some IPCC estimates of flow and variance in carbon cycle flows. Other than land use, where a mean of 1.1 and a standard deviation of 0.8 pretty much make that flow an unknown, the only useful carbon cycle flow uncertainty data are found for the photosynthesis CO2 flow from the atmosphere to the biota. Here, a standard deviation of 8 in a flow estimated to be 123 implies a standard deviation of 6.5% of the flow. We propose that photosynthesis is a low uncertainty case, and therefore, to be on the safe side in a Monte Carlo Simulation test of the causation relationship assumed by climate science, we use this percentage standard deviation for all IPCC declared carbon cycle flows to set up a Monte Carlo Simulation of the dynamics of combining the carbon cycle flows and fossil fuel emissions and thereby to assess the net impact of fossil fuel emissions on atmospheric composition. A code sketch of this setup follows the list below.
The IPCC list of carbon cycle flows (GtC/y):
Natural: Ocean surface to atmosphere: Mean=78.4, SD=N/A
Natural: Atmosphere to ocean surface: Mean=80.0, SD=N/A
Human: Fossil fuel emissions, surface to atmosphere: Mean=7.8, SD=0.6
Human: Land use change, surface to atmosphere: Mean=1.1, SD=0.8
Natural: Photosynthesis, atmosphere to surface: Mean=123.0, SD=8.0
Natural: Respiration/fire, surface to atmosphere: Mean=118.7, SD=N/A
Natural: Freshwater to atmosphere: Mean=1.0, SD=N/A
Natural: Volcanic emissions, surface to atmosphere: Mean=0.1, SD=N/A
Natural: Rock weathering, surface to atmosphere: Mean=0.3, SD=N/A
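A minimal sketch of how such a simulation might be set up, applying the 6.5% proportional standard deviation to every flow in the list above. The structure is an assumption based on the description in the text, not the exact code behind the linked post.

```python
import numpy as np

rng = np.random.default_rng(42)
REL_SD = 0.065   # 6.5% of mean, taken from the photosynthesis flow (8/123)
N = 100_000      # number of Monte Carlo trials

# (flow, mean GtC/y, +1 into the atmosphere / -1 out of the atmosphere)
FLOWS = [
    ("ocean surface to atmosphere",  78.4, +1),
    ("atmosphere to ocean surface",  80.0, -1),
    ("fossil fuel emissions",         7.8, +1),
    ("land use change",               1.1, +1),
    ("photosynthesis",              123.0, -1),
    ("respiration/fire",            118.7, +1),
    ("freshwater to atmosphere",      1.0, +1),
    ("volcanic emissions",            0.1, +1),
    ("rock weathering",               0.3, +1),
]

# Net annual flow to the atmosphere in each trial, with proportional noise:
net = np.zeros(N)
for name, mean, sign in FLOWS:
    net += sign * rng.normal(mean, REL_SD * mean, N)

print(round(net.mean(), 1), round(net.std(), 1))
# The mean lands near the deterministic 4.4 GtC/y, but the standard
# deviation of the net flow is roughly three times larger, so the
# ~8 GtC/y human flow is small relative to the noise in the account.
```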
RESULTS OF THE MONTE CARLO SIMULATION#1: THE MONTE CARLO SIMULATION SHOWS THAT THE COMPUTED AIRBORNE FRACTION PROPOSED BY CLIMATE SCIENCE HAS NO INTERPRETATION IN TERMS OF ATMOSPHERIC COMPOSITION BECAUSE IT DOES NOT TAKE UNCERTAINTIES INTO ACCOUNT.
WHEN UNCERTAINTIES ARE TAKEN INTO ACCOUNT, NO STATISTICALLY SIGNIFICANT AIRBORNE FRACTION IS FOUND IN THE DATA. THE AIRBORNE FRACTION ARGUMENT PROPOSED BY CLIMATE SCIENCE IS REJECTED ON THE BASIS THAT IT DOES NOT TAKE UNCERTAINTY IN CARBON CYCLE FLOWS INTO ACCOUNT. WHEN THESE UNCERTAINTIES ARE INCLUDED, THE DATA DO NOT SHOW EVIDENCE OF HUMAN CAUSE IN THE OBSERVED CHANGES IN ATMOSPHERIC COMPOSITION.
THEREFORE THERE IS NO EVIDENCE OF HUMAN CAUSE AND NO EVIDENCE FOR THE EFFECTIVENESS OF CLIMATE ACTION.

SIMULATION#2: MONTE CARLO SIMULATION IN REVERSE.
THIS MONTE CARLO SIMULATION IS CARRIED OUT IN REVERSE TO DETERMINE THE MAXIMUM UNCERTAINTY IN CARBON CYCLE FLOWS AT WHICH FOSSIL FUEL EMISSIONS CAN STILL BE DETECTED AND MEASURED. LINK: https://tambonthongchai.com/2018/05/31/the-carbon-cycle-measurement-problem/ THIS ANALYSIS IS REPRODUCED BELOW IMMEDIATELY FOLLOWING THE LIST OF CARBON CYCLE FLOWS.
LIST OF CARBON CYCLE FLOWS PROVIDED BY THE IPCC (GtC/y)
- Natural: Ocean surface to atmosphere: Mean=78.4, SD=N/A
- Natural: Atmosphere to ocean surface: Mean=80.0, SD=N/A
- Human: Fossil fuel emissions, surface to atmosphere: Mean=7.8, SD=0.6
- Human: Land use change, surface to atmosphere: Mean=1.1, SD=0.8
- Natural: Photosynthesis, atmosphere to surface: Mean=123.0, SD=8.0
- Natural: Respiration/fire, surface to atmosphere: Mean=118.7, SD=N/A
- Natural: Freshwater to atmosphere: Mean=1.0, SD=N/A
- Natural: Volcanic emissions, surface to atmosphere: Mean=0.1, SD=N/A
- Natural: Rock weathering, surface to atmosphere: Mean=0.3, SD=N/A

- A simple flow accounting of the mean values without consideration of uncertainty shows a net CO2 flow from surface to atmosphere of 4.4 GTC/y. The details of this computation are as follows. In the emissions and atmospheric composition data we find that during the decade 2000-2009 total fossil fuel emissions were 78.1 GTC and that over the same period atmospheric CO2 rose from 369.2 to 387.9 ppm, an increase of 18.7 ppm, equivalent (at roughly 2.12 GTC per ppm) to 39.6 GTC of carbon added to the atmosphere, or 4.4 GTC/y. The ratio of the observed increase in atmospheric carbon to emitted carbon is thus 39.6/78.1=0.51. This computation is the source of the claim that the so called “Airborne Fraction” is about 50%; that is to say that about half of the emitted carbon accumulates in the atmosphere on average and the other half is absorbed by the oceans, by photosynthesis, and by terrestrial soil absorption. The Airborne Fraction of AF=50% later had to be made flexible in light of a range of observed values.
- The charts above show a large range of values of the decadal mean Airborne Fraction, 0<DMAF<4.5, for decades ending in 1860 to 2017. This sample period includes ice core CO2 data from the Law Dome for years prior to 1958. However, when the sample period is restricted to the more precise Mauna Loa data from 1958, a much smaller range of values is seen in the right frame of Figure 1, with 0.45<DMAF<0.65. These data appear to support the usual assumption in climate science that fossil fuel emissions have contributed about half of the decadal mean increase in atmospheric CO2 concentration since 1958; but as demonstrated in a related post [LINK], without a correlation between emissions and changes in atmospheric CO2 concentration, airborne fractions can be computed but they have no interpretation in terms of cause and effect in the phenomenon being studied [LINK].
- When uncertainties are not considered, the flow accounting appears to show an exact match of the predicted and computed carbon balance. It is noted, however, that this exact accounting balance is achieved, not with flow measurements, but with estimates of unmeasurable flows constrained by the circular reasoning that assigns flows according to an assumed flow balance.
- However, a very different picture emerges when uncertainties are included in the balance. Published uncertainties for three of the nine flows are available in the IPCC reports. Uncertainties for the other six flows are not known. However, we know that they are large because no known method exists for the direct measurement of these flows. They can only be grossly inferred based on assumptions that exclude geological carbon flows.
- Here, we set up a Monte Carlo simulation to estimate the highest value of the unknown standard deviations of carbon cycle flows at which we can still detect the presence of human emissions within the portfolio of carbon cycle flows. For the purpose of this test we propose that an uncertain flow account is in balance as long as the Null Hypothesis that the sum of the flows is zero cannot be rejected. The alpha error rate for the test is set to a high value of alpha=0.10 to ensure that any reasonable ability to discriminate between the flow account WITH Anthropogenic Emissions and the flow account WITHOUT Anthropogenic Emissions is taken as evidence that the relatively small fossil fuel emissions can be detected in the presence of much larger and uncertain natural flows. The spreadsheet used in this determination is available for download from an online data archive [Data Archive Link].
- In the simulation we assign different levels of uncertainty to the flows for which no uncertainty data are available and test the null hypothesis that the flows balance with anthropogenic emissions (AE) included and again with AE excluded. If the flows balance when AE are included and they don’t balance when AE are excluded, then we conclude that the presence of the AE can be detected at that level of uncertainty. However, if the flows balance both with and without AE, then we conclude that the stochastic flow account is not sensitive to AE at that level of uncertainty because it is unable to detect their presence. If the presence of AE cannot be detected, no role for their effect on climate can be deduced from the data at that level of uncertainty in natural flows. A code sketch of this WITH/WITHOUT test appears after this list.
- The balance is computed from the atmospheric perspective as Balance=Input-Output where Input is flow to the atmosphere and Output is flow from the atmosphere. The p-values for hypothesis tests for uncertainties in the natural flows from 1% of mean to 6.5% of mean are presented below both as a tabulation and as a line chart.


- In the tabulation the PCT column shows the assumed percent standard deviation in the natural flows for which no uncertainty information is available. In the “base case”, the blanket statement by the IPCC that the uncertainty is 20% is interpreted to mean that the width of the 95% confidence interval is 20% of the mean, and the corresponding standard deviation computed as (20/2)/1.96 is almost identical to that in the 5% (5PC) row. The data in each row show the p-values of two hypothesis tests labeled as WITH and WITHOUT. The WITH column shows p-values when the AE are included in the balance computation. The WITHOUT column shows the p-values when the AE are left out of the balance computation.
- We use a rather high critical p-value of alpha=0.1 for the test of the null hypothesis that Balance=0 to ensure that any reasonable chance that the effect of fossil fuel emissions can be detected is considered. Balance=0 means that the stochastic flow account is in balance. If the p-value is less than alpha we reject the null hypothesis and conclude that the stochastic flow account is not in balance. If we fail to reject the null then we conclude that the stochastic flow account is in balance.
- The p-values for WITH and WITHOUT in each row taken together tell us whether the stochastic flow system is sensitive to AE, that is whether the relatively small AE flow can be detected in the context of uncertainty in much larger natural flows. If we fail to reject the null hypothesis that Balance=0 in both WITH and WITHOUT columns, the stochastic flow account balances with and without the AE flows. In these cases the stochastic flow account is not sensitive to AE, that is it is unable to detect the presence of the AE flows. This is true for the five rows in which the uncertainty in natural flows is 3% of mean or higher.
- For the two lower uncertainty levels of 2% and 1% we find that the null hypothesis Balance=0 is not rejected when AE are included (the stochastic flow account is in balance) but rejected when AE are not included (the stochastic flow account is not in balance). Under these uncertainty conditions, the stochastic flow account is sensitive to the presence of AE, that is the flow account can detect the presence of the relatively small AE flows. The chart shows that the crossover uncertainty lies somewhere between 2% and 3% and in fact it is found by trial and error that the crossover occurs at 2.3%.
- These results imply that the IPCC carbon cycle stochastic flow balance is not sensitive to the presence of the relatively low flows from human activity involving fossil fuel emissions and land use change. The large natural flows of the carbon cycle cannot be directly measured and they can only be indirectly inferred. These inferred values contain uncertainties much larger than 2.3% of the mean. It is not possible to carry out a balance of the carbon cycle under these conditions.
- In the case of the conclusion by climate scientists that the observed increase in atmospheric CO2 concentration is caused by fossil fuel emissions, natural flows in the carbon cycle that are an order of magnitude larger than fossil fuel emissions and that cannot be directly measured are inferred with the implicit assumption that the increase in atmospheric CO2 comes from fossil fuel emissions. The flow balance can then be carried out and it does of course show that the increase in atmospheric CO2 derives from fossil fuel emissions. The balance presented by the IPCC with inferred flows thus forces an exact balance by way of circular reasoning. Therefore, the IPCC carbon cycle balance does not contain useful information that may be used to ascertain the impact of fossil fuel emissions on the carbon cycle or on the climate system.
- A rationale for the inability to relate changes in atmospheric CO2 to fossil fuel emissions is described by geologist James Edward Kamis in terms of natural geological emissions due to plate tectonics [LINK]. The essential argument is that, in the context of significant geological flows of carbon dioxide and other carbon based compounds, it is a form of circular reasoning to describe changes in atmospheric CO2 only in terms of human activity. It is shown in a related post that, in the context of large uncertainties in natural flows, changes in atmospheric CO2 are not responsive to the rate of emissions [LINK].
- Circular reasoning in this case can be described in terms of the “Assume a spherical cow” fallacy [LINK] which refers to the use of simplifying assumptions needed to solve a problem that change the context of the problem so that the solution no longer answers the original research question. WE CONCLUDE THAT THE UNCERTAINTY IN CARBON CYCLE FLOWS ARE TOO LARGE TO MEASURE THE EFFECT OF RELATIVELY SMALL FLOWS OF FOSSIL FUEL EMISSIONS ON ATMOSPHERIC COMPOSITION.
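A minimal sketch of the WITH/WITHOUT detection test described in the bullets above. The assignment of the assumed percentage standard deviation to all natural flows, and the use of the empirical distribution of the balance for the p-value, are interpretations of the description in the text, not the spreadsheet itself; with these assumptions the crossover lands close to, though not exactly at, the reported 2.3%.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000
ALPHA = 0.10
ATM_GROWTH = 4.4  # observed accumulation in the atmosphere, GtC/y

# Natural flows: (mean GtC/y, +1 into / -1 out of the atmosphere)
NATURAL = [(78.4, +1), (80.0, -1), (123.0, -1), (118.7, +1),
           (1.0, +1), (0.1, +1), (0.3, +1)]
# Anthropogenic flows with published standard deviations:
ANTHRO = [(7.8, +1, 0.6), (1.1, +1, 0.8)]

def p_balance_zero(pct, include_ae):
    """Empirical two-sided p-value for the null Balance = 0, where
    Balance = inflows - outflows - observed atmospheric growth, and
    pct is the assumed SD (fraction of mean) for the natural flows."""
    balance = np.full(N, -ATM_GROWTH)
    for mean, sign in NATURAL:
        balance += sign * rng.normal(mean, pct * mean, N)
    if include_ae:
        for mean, sign, sd in ANTHRO:
            balance += sign * rng.normal(mean, sd, N)
    frac_below = (balance < 0).mean()
    return 2 * min(frac_below, 1 - frac_below)

for pct in (0.01, 0.02, 0.03, 0.05, 0.065):
    p_with = p_balance_zero(pct, True)       # account WITH human emissions
    p_without = p_balance_zero(pct, False)   # account WITHOUT human emissions
    detected = p_with > ALPHA and p_without <= ALPHA
    print(f"SD={pct:.1%}: WITH p={p_with:.2f}, WITHOUT p={p_without:.2f}, "
          f"emissions detectable: {detected}")
```

With this construction the account balances WITH the anthropogenic flows at every uncertainty level, but fails to balance WITHOUT them only when the assumed uncertainty in the natural flows is small; above roughly 2 to 3 percent of mean the two accounts become indistinguishable.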
CONCLUSION
DETRENDED CORRELATION ANALYSIS AND MONTE CARLO SIMULATION ARE USED TO STUDY THE RESPONSIVENESS OF ATMOSPHERIC COMPOSITION TO FOSSIL FUEL EMISSIONS. NO EVIDENCE IS FOUND TO SUPPORT THE ASSUMED CAUSATION IN CLIMATE SCIENCE WHERE THE OBSERVED RISE IN ATMOSPHERIC CO2 CONCENTRATION IS ATTRIBUTED TO FOSSIL FUEL EMISSIONS. THE FINDINGS PRESENTED ABOVE IMPLY THAT THE AIRBORNE FRACTION IS A CREATION OF CIRCULAR REASONING AND CONFIRMATION BIAS.


IT IS NOTED IN A RELATED POST THAT GEOLOGICAL FLOWS OF CARBON AND CO2 ARE EXCLUDED IN THE STUDY OF THE RISE IN ATMOSPHERIC CO2 SINCE PRE-INDUSTRIAL TIMES, SUCH THAT THE INDUSTRIAL CAUSE IS SUBSUMED INTO THE STUDY METHODOLOGY. LINK: https://tambonthongchai.com/2019/08/27/carbonflows/ With respect to the argument that the absence of 13C and 14C isotopes identifies fossil fuel carbon, it should be noted that fossil fuel carbon and geological carbon cannot be distinguished from each other on this basis because neither contains these carbon isotopes.

RELATED POST ON CONFIRMATION BIAS IN SUPERSTITION. LINK: https://tambonthongchai.com/2018/08/03/confirmationbias/

THE HOLE IN THE SKY
Posted on December 27, 2020
THIS POST IS A CRITICAL REVIEW OF AN AUSTRALIAN GEOGRAPHIC ARTICLE ON FIXING THE HOLE IN THE SKY: LINK: https://www.australiangeographic.com.au/topics/science-environment/2020/01/fixing-the-hole-in-our-sky/

OVERVIEW:
WHAT WE SEE IN THE CLIMATE CHANGE MOVEMENT IS A PATTERN OF REPEATEDLY CITING THE SUCCESS OF THE MONTREAL PROTOCOL AS THE REASON THAT THE CLIMATE CHANGE MOVEMENT SHOULD ALSO SUCCEED. AN INTERPRETATION OF THIS LOGIC IS THAT THE MONTREAL PROTOCOL IS EVIDENCE THAT ENVIRONMENTAL MOVEMENTS WITHOUT SUFFICIENT EVIDENCE DO SUCCEED GIVEN A SUFFICIENT FEAR LEVEL, AND THAT IT THEREFORE PROVIDES REASON TO CONTINUE THE CLIMATE BATTLE IN THE HOPE OF THE SAME KIND OF SUCCESS.

PART-1: WHAT THE AUSTRALIAN GEOGRAPHIC ARTICLE SAYS
Lessons from the Montreal Protocol, or how the people of the world can successfully respond as one to avert a global environmental catastrophe.
Balloons carrying sensors have been launched every week since 2003 at Davis research station by the Australian Antarctic Division to track ozone levels in the atmosphere. Just as it seems that humans have done irreversible damage to the planet’s atmosphere and set Earth on an inexorable path to a climate-based Armageddon, there’s a sign high in the sky over Antarctica that offers hope. Kids during the last quarter of the 20th century grew up with the ozone hole looming large and ominously over their lives. When this potentially catastrophic ‘tear’ in the planet’s stratosphere, more than 10km above Antarctica, was discovered by scientists in 1985, it quickly set global alarm bells ringing.
Ozone is a gas that forms a kind of atmospheric blanket around the Earth to keep out much of the Sun’s UV radiation. Without it, life as we know it would never have evolved on this planet. The Antarctic hole quickly became recognised as the most extreme sign of a phenomenon scientists began finding evidence for worldwide during the 1980s: stratospheric ozone was being destroyed across the planet by human-produced gaseous chemicals, notably chlorofluorocarbons (CFCs), which were being used as propellants in aerosols and as highly effective refrigerants in refrigerators and air conditioners. It was feared that destruction of Earth’s ozone layer would mean we’d be bombarded during the 21st century by such high levels of cancer-causing UV radiation that life outdoors would be almost impossible for our species as well as, of course, for most other species on the planet.
The US space agency NASA has been keeping a close watch on stratospheric ozone since the 1970s, producing these images, based on satellite data of the hole over Antarctica when it forms each year. They are finally showing a trend towards the hole closing. Dr Paul Fraser, now a CSIRO honorary fellow, has been involved with the ozone crisis since it began. He was just embarking on his career in the 1970s as an atmospheric chemist when he was drawn to what was then still a scientific theory that CFCs could damage Earth’s stratosphere. He became pivotal in setting up Australia’s atmospheric CFC monitoring station at Cape Grim in Tasmania in 1976 and it was samples he collected there, at the Mawson research station in Antarctica, and elsewhere that helped, in the 1980s, to confirm global ozone depletion was occurring due to CFCs.
The governments of the world reacted swiftly and almost universally. In 1987 the first of an eventual 197 nations signed an agreement – the Montreal Protocol – to stop producing and using ozone-depleting substances (ODS), such as CFCs. There were some voices of dissent who questioned the science. But even for governments not convinced by what the research was showing, the outcome of doing nothing was so potentially diabolical it was seen as better to err on the side of caution and heed the most widespread expert advice. Notably, two of the world’s long-term conservative governments of the day who wielded much influence worldwide – those of Margaret Thatcher in the UK and Ronald Reagan in the USA – were both outspoken supporters of the Montreal Protocol and continued to be through various amendments that gradually extended its reach and led to faster phasing out of ODS.
In the very early days the first significant step to reducing CFCs was banning their use in aerosols, and that industry quickly transferred to using non–ozone depleting propellants. The decision made by everyday people around the world to not use aerosols powered by CFCs rapidly helped reduce emissions. Overall, however, the actions of the Montreal Protocol took time to kick in. Through the 1990s ODS continued to increase in the atmosphere, and the Antarctic hole – which appears over the continent in early spring each year due to extreme winter stratospheric conditions there that accelerate the chemical reactions that destroy ozone – kept growing.
But the rise in stratospheric ODS eventually peaked in the late 1990s and has since been falling. Every four years, a report is issued by the UN about the status of the ozone hole and ozone depletion science. The recovery began around 2000, says Australian scientist Dr Matt Tully, who’s been working on the planet’s ozone issue for 15 years and has contributed to worldwide scientific assessments of ozone depletion since 2009. He’s responsible for the ongoing ozone program at the Australian Bureau of Meteorology (BOM) and is a member of the International Ozone Commission.
The healing of the ozone hole has been a slow trend, he says, and there is a lot of variation in its status from year to year, depending on conditions over Antarctica. But in the most recent assessment, released last year, scientists finally confirmed that long-term recovery of the hole is underway. “The assessments before weren’t prepared to call it,” Matt says. “There was too much variation from year to year, but in 2018 for the first time the assessment declared there were signs of recovery in the Antarctic ozone hole.” And last year had the smallest ozone hole in more than 20 years, although due to its fluctuating nature it may open wider again while still continuing on its overall downward trajectory. Measurements taken by scientists with the BOM, CSIRO and Australian Antarctic Division (AAD) are crucial in compiling these four-yearly assessments and also in keeping watch that all countries maintain their obligations, under the Montreal Protocol, not to use CFCs.
Australia’s Cape Grim Baseline Air Pollution Station has been critical to studies of the Earth’s atmosphere since the 1970s. These days Cape Grim is one of only three premier baseline air pollution stations in the world – the others are in Hawaii and the Canadian Arctic. In 2018–19 monitoring by this network documented and located a rogue producer of CFCs in China, flouting the terms of the Montreal Protocol. They were dealt with swiftly and shut down.
While the Montreal Protocol is widely regarded as the most successful environmental treaty ever, Paul warns that the world still has a way to go in reducing global levels of ODS. “We’ve only just gone through the worst part of it,” he says. It’s expected that it won’t be until the 2060s that levels of ODS in the stratosphere will be brought back to pre-1970s levels.
In the meantime, there are still impacts caused by the problem that are only just beginning to emerge. The AAD has been collecting ozone data in Antarctica on a weekly basis now for two decades. “The measurements themselves are very important in helping to give information on the amount of ozone in the stratosphere,” says AAD atmospheric scientist Dr Andrew Klekociuk. “But AAD researchers are also tracking the impact on Antarctic ecosystems of the ozone hole, and the higher levels of UV radiation that it allows in. We know from AAD research, for example, that krill are influenced by elevated UV, and go down deeper to cope in the water column.” While there is still much work to be done on evaluating the impact of this, these tiny crustaceans drive food webs in Antarctica and beyond so the impacts are potentially enormous.
Although there are significant differences between how and what causes rising emissions of ODS and greenhouse gases, it’s widely thought that the success of the Protocol offers hope for global warming and provides lessons in how that can also be brought under control. Leader of CSIRO’s Earth Systems and Climate Change Hub, Professor David Karoly, has worked across research into the consequences of both elevated emissions of greenhouse gases and ODS. He says three key things made the Montreal Protocol work. First, it took a top-down approach, with the governments of all countries agreeing unilaterally to restrict their emissions. “Second, the companies that were making and selling ODS realised they could also make and sell the replacement chemicals,” David continues. “So they were potentially going to make more money because they had new patents for the replacements. And third, the industry-led ‘merchants of doubt’ about the links between stratospheric ozone depletion and the ODS weren’t as successful [as they are being in regard to climate change] because the companies didn’t support them when they realised they could make money with replacement chemicals.”

PART-2: CRITICAL COMMENTARY
(1) THE HISTORICAL CONTEXT: Since 1969, alarmed by the space program and the development of high elevation supersonic aircraft, environmentalism had clutched onto a failed obsession with ozone depletion activism. It was almost comical in many respects – a case of the 1960s environmental movement gone berserk. Below is a brief history of this tragi-comical chapter in environmentalism that serves as the backdrop to the ozone depletion environmentalism described by the Australian Geographic.
- 1969: A plan to develop high altitude supersonic airliners with the Boeing 2707 as a concept vehicle. The very high cruising altitude of the SST raised environmental alarms that included both climate change and ozone depletion. First, an alarm is raised that chemicals and aerosols in the exhaust of the SST jet engines will cause climate change.
- 1970: The climate change theory is quietly shelved after critical reviews by skeptics and deniers, and a new alarm is raised. Water vapor in the SST jet exhaust will cause a 4% depletion of ozone in the ozone layer, causing 40,000 additional cases of skin cancer every year in the USA alone. The water vapor theory is quietly forgotten after critical reviews by skeptics and deniers with data showing that higher levels of water in the stratosphere are coincident with higher levels of ozone.
- 1970: A new ozone depletion theory emerges. Nitric oxide (NOx) in the SST jet exhaust will cause ozone depletion because NOx acts as a catalyst to destroy ozone without being consumed in the process.
- 1971: A computer model is developed to assess the impact of NOx in SST exhaust on the ozone layer. The model predicts that there will be a 50% ozone depletion and a worldwide epidemic of skin cancer. Animals that venture out during daylight will become blinded by UV radiation. It was an apocalyptic scenario.
- 1971: NOx in the fireball of open air nuclear tests provide a ready laboratory to test the ozone depletion properties of NOx. The computer model predicted 10% ozone depletion by NOx from nuclear testing. Measurements showed no ozone depletion; but the model won and the ozone depletion scare endured.
- 1972: Death of the SST: We were so frightened by the ozone depletion scare that the SST program was canceled although America’s skies soon became filled with supersonic fighters and bombers spewing NOx without any evidence of ozone depletion or of skin cancer or of blindness in animals.
- 1973: Space Shuttle: Unperturbed by the skeptics and emboldened by their SST success, fear mongering ozone depletion scientists turned their attention to the proposed Space Shuttle program. The shuttle design included two solid fuel rockets that emit hydrogen chloride (HCl). The scientists calculated that 50 flights per year would deposit 5000 tons of HCl per year in the stratosphere that could cause a 10% ozone depletion over Florida and 1% to 2% elsewhere. Although the scare was hyped it never got to the SST levels and the space shuttle miraculously survived the ozone scare.

(2) THE ROWLAND MOLINA THEORY OF OZONE DEPLETION AND THE MONTREAL PROTOCOL
- 1973: James Lovelock discovered that air samples taken from the middle of the Atlantic Ocean contained CFCs. He then published his now famous paper in which he said that these man-made chemicals did not otherwise occur in nature and that they were inert, meaning that they could accumulate in the atmosphere indefinitely.
- 1974: The Rowland Molina Theory of Ozone Depletion (RMTOD). Based on the work of Lovelock, Rowland and Molina propose the theory that the long life in the atmosphere of CFCs means that, given enough time, they could end up in the stratosphere. Once up there, the CFC molecule could be disintegrated by UV radiation to release highly reactive radicals that could destroy ozone. This theory forms the basis of the Montreal Protocol, with the assumption, but without the evidence, that any of the long lived CFCs found by Lovelock did in fact make it all the way up to the stratosphere.
- 1985: Farman et al 1985: This research paper published by the British Antarctic Survey serves as the empirical evidence for RMTOD. What Farman found was that over a 5-year period ozone levels above the South Pole had dropped precipitously during the months of October and November and then recovered. These data were taken as empirical verification of RMTOD and to this day serve as the only empirical evidence in support of RMTOD, although what RMTOD implies is a long term decline in global mean total column ozone.
- FARMAN ET AL 1985: The only empirical evidence presented in support of RMTOD is Farman et al 1985. The Farman study showed only that there was a brief and localized 5-year period of low ozone in the months of October and November above the South Pole that had recovered to normal levels after that 5-year period, and this was taken as evidence of RMTOD. Yet, this episodic and localized low ozone event does not serve as evidence of the RMTOD theory of ozone depletion. RMTOD implies a long term declining trend in global mean total column ozone. No evidence for this trend has ever been presented and we show in a related post that none exists. LINK: https://tambonthongchai.com/2019/03/12/ozone1966-2015/
- THE OZONE HOLE: The South Polar periodic low ozone event that quickly recovers back to normal levels was sold to the general public as an “ozone hole” and claimed as evidence of RMTOD human-caused global ozone depletion that could cause skin cancer in humans and blindness in animals all over the world from its perch above the South Pole. This is an outrageous lie. There is no “hole” in the ozone layer. There is no hole in the sky. What NASA has dubbed a “hole” is only a brief episodic blip in ozone concentration that comes and goes without a trend of any kind in a tiny corner of the world constituting less than 3% of the world in terms of surface area. This event has been sold as a dangerous hole in the sky that lets in dangerous UV radiation that causes skin cancer in humans and blindness in animals around the world.
- THE MONTREAL PROTOCOL: Yet, this is all they had to go on to pass the Montreal Protocol and thereby to usher in a new world order where the UN is installed as a global environmental authority of some kind without legal authority and without constraints or oversight. Then at some point, it was declared with great fanfare that the UN brokered Montreal Protocol had solved the ozone depletion problem and that the ozone had recovered. No explanation is offered for the continuation of the South Polar ozone events that had been named ozone holes and that had served as the only empirical evidence of ozone depletion.
- OZONE DISTRIBUTION TO THE POLAR REGIONS: In a related post LINK: https://tambonthongchai.com/2020/11/30/the-unep-healed-the-ozone-hole/ we show that these South Polar events should be understood as ozone distribution events and not as ozone depletion. Ozone is both created and destroyed by UV radiation but ozone is created only above the tropics where sunlight is direct. It is distributed to the greater latitudes by the Brewer Dobson circulation and episodic changes in ozone levels at the higher latitudes can be understood in terms of the dynamics of this distribution but not in terms of long term ozone depletion due to the presence of ozone depleting substances in the stratosphere. The role of the Montreal Protocol is illusory. The only significant impact of what is claimed to be finally a proven case of ozone depletion and its apparent resolution by the UN is that it served to expand the role of the UN into global environmentalism. The Montreal Protocol story as told is best understood in this context.
- FEAR BASED ACTIVISM: This seemingly impossible task was made possible with fear based activism. Here are examples of the kind of fear that was constantly in the media in those days. September 26, 1974, NY Times, a big day for Doomsday journalism: ozone depletion of 18% by 1990 and 50% by 2030 by CFCs will cause an epidemic of skin cancer, mutation of frogs, and blindness in animals and humans. March 10, 1987: Skin cancer is increasing in the United States at a near epidemic rate, outstripping predictions made as recently as five years ago, a research physician testified Monday before a House panel examining threats to the Earth’s protective ozone layer. Malignant melanoma, the deadliest form of skin cancer, has increased 83 percent in the last seven years alone. Melanoma is increasing faster than any other cancer except lung cancer in women. March 12, 1987: Consensus among scientists: If harmful UV radiation reached the Earth, it would cause monumental problems, including rampant skin cancer and eye cataracts, retarded crop growth, impairment of the human immune system and damaging radiation doses to all forms of life. Although many Americans and the people of other nations are still not listening or taking the ozone threat seriously, the Earth’s protective shield is getting thinner and developing mysterious holes. October 1, 1987: Ozone levels above Antarctica reached an all-time low since measurements began and scientists said that man-made Freon-type gases are to blame. Ozone filters out harmful amounts of ultraviolet radiation. For every one percent of ozone decrease there could be 20,000 more skin cancer cases annually in the United States alone. The whole world is frightened. The scare was very successful and it appeared in various forms almost every day in newspapers and on television.

OZONE DEPLETION CHEMISTRY AND GLOBAL TOTAL COLUMN OZONE DATA
RELATED POST#1: OZONE CHEMISTRY: https://tambonthongchai.com/2018/04/01/ozone-depletion-and-ozone-holes/ :
The Montreal Protocol presumes that without human intervention the amount of ozone in the stratosphere is invariant, and that a decline in ozone over time is a trend and not part of a long run cyclical phenomenon. All observed depletions, even localized and time constrained events, are therefore assumed to be man-made, and the causative agent is identified as CFC. Observed changes are thus interpreted as anomalies that require an explanation in terms of human cause, but the data are more complicated than this simplistic model suggests.
The ultraviolet spectrum in incident solar radiation comes in three frequency bands. The high energy band (200-240 nanometers) and the medium energy band (240-300 nanometers) are harmful to living matter and are absorbed in the ozone layer while the low energy band (300-480 nanometers) reaches the earth’s surface and causes tanning. With respect to the absorption of harmful UV radiation in the ozone layer, ozone is both created and destroyed in the absorption process. The high-energy band UV is absorbed by oxygen molecules causing them to break apart into extremely reactive oxygen atoms. A subsequent chance collision of these atomic particles with other oxygen molecules forms ozone which then absorbs the medium-energy UV band and disintegrates back into oxygen. The UV absorption process is cyclical. It begins and ends with oxygen. Ozone is a transient intermediate product of this process.
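The cycle just described can be written in standard reaction notation. This is the textbook Chapman mechanism in simplified form; the third body M (any nearby molecule that carries away excess energy) is a standard detail not mentioned in the paragraph above:

```latex
\begin{aligned}
O_2 + h\nu\ (\text{200--240 nm}) &\rightarrow O + O \\
O + O_2 + M &\rightarrow O_3 + M \\
O_3 + h\nu\ (\text{240--300 nm}) &\rightarrow O_2 + O
\end{aligned}
```

The second reaction is the slow, collision-limited step, while the photochemical first and third reactions are fast; this rate asymmetry is what the next paragraph builds on.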
The reason that there is any ozone accumulation at all in the stratosphere is that, of the three reactions, the second is the slowest. Sunset finds the stratosphere with an excess of single oxygen atoms still looking for a date with an oxygen molecule. Overnight, with no radiation to destroy their product, these particles build up an inventory of ozone whose destruction will begin anew at sunrise. There is, therefore, a diurnal cycle in the ozone content of the stratosphere whose amplitude is of the same order of magnitude as in the “ozone hole”, the reported ozone depletion over the South Pole that was taken to confirm RMTOD and that caused the Montreal Protocol to be invoked.
A longer but irregular cyclical pattern in stratospheric ozone coincides with the sunspot cycle. The period is 8 to 17 years with an average of 11 years. High-energy band UV increases by 6% to 10% during periods of high sunspot activity but the medium-energy UV emission is largely unaffected. High sunspot activity favors ozone accumulation and low sunspot activity is coincident with ozone depletion. A somewhat similar pattern exists in the case of polar ozone holes. The UV induced reactions described above occur only over the tropics where sunlight is direct. Ozone is formed over the equator and not over the poles. Equatorial ozone is distributed to the higher latitudes by the Brewer-Dobson Circulation (BDC) shown below.
BREWER DOBSON CIRCULATION (BDC):

The shape and position of the BDC changes seasonally and also shifts over a longer time cycle. Therefore, the efficiency of the BDC in transporting ozone to the greater latitudes changes seasonally and also over longer time cycles. These changes do not have an ozone depletion interpretation but they can create the polar ozone hole phenomenon. When the distribution of ozone is not efficient, localized “ozone depletion” appears to occur in the extreme latitudes in the form of what has come to be called an ozone hole. These holes come and go in natural cyclical changes; they are not the creation of chemical ozone depletion and they do not serve as empirical evidence of the Rowland Molina theory of ozone depletion by CFCs.
THE CASE AGAINST CFCs.

The case against CFCs is that when they get to the stratosphere by diffusion, they absorb high-energy band UV and form unstable and reactive chlorine atoms. The chlorine atom particles then participate as catalytic agents to convert ozone back to oxygen. In other words they mediate the reaction between atomic oxygen particles and ozone. It is alleged that the destruction of ozone by this mechanism exposes the surface of the earth to dangerous levels of medium-band UV because there is not enough ozone left in the stratosphere to absorb them.
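The alleged mechanism can be written as a catalytic cycle, shown here with CFC-11 (CFCl3) as the example source of chlorine; because the chlorine atom is regenerated in the second step, a single atom can in principle destroy many ozone molecules:

```latex
\begin{aligned}
CFCl_3 + h\nu &\rightarrow CFCl_2 + Cl \\
Cl + O_3 &\rightarrow ClO + O_2 \\
ClO + O &\rightarrow Cl + O_2 \\
\text{net:}\quad O_3 + O &\rightarrow 2\,O_2
\end{aligned}
```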

Although these reactions can be carried out in the chemistry lab, there are certain rate constraints that make them irrelevant in the stratosphere. The air up there in the stratosphere is rather thin, containing less than one percent (1%) of the molecular density of air at sea level. It is not easy for a molecular particle in random thermal motion to find another particle to react with. Photochemical reactions occur instantaneously but those that require a collision of two particles take much, much longer. This difference in reaction rates is the reason that ozone accumulates overnight and why there is an inventory of ozone in the ozone layer.
The atomic oxygen particles that react with oxygen molecules to form ozone could in theory react with an ozone molecule instead and cause its destruction, or react with another atomic oxygen particle and form oxygen without ever forming any ozone. Some of the oxygen atoms do behave in this manner, but these reactions proceed too slowly to be important to the chemistry of the stratosphere. The reason is that the stratospheric chemicals in question exist in minute quantities. One in a million particles is an ozone molecule or an atomic oxygen particle, and one in a billion is a CFC or a chlorine atom generated from CFC. Accidental collisions between chlorine atoms and ozone molecules, or between chlorine atoms and oxygen atoms, are rarer than those between two oxygen atoms or between an oxygen atom and an ozone molecule. Therefore the latter collisions involving oxygen atoms are more important to ozone depletion than those mediated by chlorine. Considering that more than 200,000 out of a million molecular particles in the stratosphere are oxygen, it is far more likely that charged oxygen atoms will collide with oxygen molecules than with each other or with ozone. Therefore ozone rather than oxygen is formed. Ozone formation is a rate phenomenon.
Chlorine atoms are a thousand times rarer in the stratosphere than atomic oxygen particles. It is not likely that chlorine’s mediation in short circuiting ozone generation will occur sufficiently fast to be important. Nature already contains an ozone destruction mechanism that is more efficient than the CFC mechanism but ozone forms anyway.
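To make the relative-abundance argument above concrete, here is a minimal order-of-magnitude sketch in Python. It assumes, as the text does, that encounter rates scale with the product of the two mixing ratios; actual rate coefficients, temperature, and pressure dependence are ignored, so only relative encounter frequencies are compared. The mixing ratios are the illustrative figures quoted in the text, not measured values.

```python
# Relative encounter rates in the stratosphere, using the illustrative
# mixing ratios quoted in the text (not measured kinetics data).

mixing_ratio = {          # particles per million particles of air
    "O2": 209_000,        # molecular oxygen
    "O3": 1,              # ozone: about one particle in a million
    "O": 1,               # atomic oxygen: about one in a million
    "Cl": 0.001,          # CFC-derived chlorine: about one in a billion
}

# Bimolecular encounter rate is assumed proportional to the product
# of the two mixing ratios; O + O2 (ozone formation) is the baseline.
baseline = mixing_ratio["O"] * mixing_ratio["O2"]

for a, b in [("O", "O2"), ("O", "O3"), ("O", "O"), ("Cl", "O3"), ("Cl", "O")]:
    relative = mixing_ratio[a] * mixing_ratio[b] / baseline
    print(f"{a} + {b}: relative encounter rate = {relative:.1e}")
```

Under these assumptions the ozone-forming O + O2 encounters outnumber the ozone-destroying Cl + O3 encounters by a factor of roughly 2 x 10^8, which is the sense in which ozone formation is described above as a rate phenomenon.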
However, the argument can be made that overnight, after sunset, as oxygen atoms are used up, chlorine atoms take on a greater role in ozone destruction. Likewise, when these chemicals are distributed to the higher latitudes, where sunlight is less direct and too weak to photodissociate oxygen, the only ozone destruction chemistry left is that of chlorine atoms colliding with ozone. The relative importance of these overnight and higher-latitude reactions in changing latitudinally weighted mean global ozone can be checked only by examining its overall long term trends as well as its trend profiles. These data are shown in the data analysis summary of related posts on ozone depletion (links provided at the end of this post).
The essential data are displayed graphically below. What we find in the data for total column ozone across the whole world, from pole to pole, is that global mean total column ozone over a long multidecadal time span does not show a long term declining trend. Moreover, the patterns in the data suggest that the occasional low levels of ozone seen over the South Pole, which have been interpreted as evidence of ozone depletion and of a hole in the ozone layer, are an artifact of natural variability in the distribution of ozone by the Brewer-Dobson circulation.
We also find in the data that the range of observed ozone levels is a strong function of latitude. It reaches a minimum of about 20 DU in the tropics and increases asymmetrically toward the two poles. The hemispheric asymmetry has two dimensions. The northward increase in range is gradual and the southward increase is steep. Also, the northward increase in range is achieved mostly with rising maximum values, while the southward increase is achieved mostly with falling minimum values. The midpoint between the HIGH and LOW values is symmetrical within ±45° of the equator but diverges sharply beyond 45°, with the northern leg continuing a steady rise while the southern leg changes to a steep decline.
Hemispheric asymmetry in atmospheric circulation patterns is well known (Butchart, 2014) (Smith, 2014), and the corresponding asymmetry in ozone levels is also recognized (Crook, 2008) (Tegtmeier, 2008) (Pan, 1997). These asymmetries are also evident when comparing seasonal cycles among the ground stations (Figure 29). The observed asymmetries are attributed to differences in land-water patterns in the two hemispheres, with specific reference to the existence of a large ice covered land mass at the South Pole (Oppenheimer, 1998) (Kang, 2010) (Turner, 2009). The climatic uniqueness of Antarctica is widely recognized (Munshi, Mass Loss in the Greenland and Antarctica Ice Sheets, 2015) (NASA, 2016) (NASA, 2015).
The only empirical evidence for anthropogenic ozone depletion is the finding by Farman et al. in 1985 that ozone levels at HLB (Halley Bay, Antarctica) fell more than 100 DU from the average value for October in 1957-1973 to the average value for October in 1980-1984.
In comparison, changes of ±5 DU from lustrum to lustrum (five-year period to five-year period) seem inconsequential. On this basis, if we take ±5 DU per lustrum as representative of random natural variability, what we see in the data is that, except for the two Antarctica stations (AMS and HLB), no average change in monthly mean ozone from lustrum to lustrum falls outside this range. It is therefore not likely that the HLB data reported by Farman et al. can be generalized globally. We conclude from this analysis that the Farman et al. study, the only empirical evidence thought to validate the Rowland-Molina theory of ozone depletion, is flawed and therefore does not serve as evidence of anthropogenic ozone depletion. And yet, Farman et al. 1985 served, and still serves to this day, as the sole empirical support for the ozone crisis that created the role for the UN in global environmentalism. A sketch of this lustrum test appears below.
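Here is a hedged sketch of the lustrum test just described. The ozone series below is synthetic; monthly mean total column ozone in Dobson Units for an actual ground station (for example the AMS or HLB record) would be substituted, and the ±5 DU threshold is the illustrative natural-variability band proposed above.

```python
# Lustrum-to-lustrum change test on a monthly mean ozone series.
# The data here are synthetic placeholders for a station record.

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1960, 2020)                                   # 60 years = 12 lustra
monthly_ozone = 300 + rng.normal(0, 10, size=(len(years), 12))  # DU, synthetic

# Mean ozone for each 5-year block (5 years x 12 months = 60 values per block)
lustrum_means = monthly_ozone.reshape(-1, 5 * 12).mean(axis=1)
changes = np.diff(lustrum_means)

for i, d in enumerate(changes):
    flag = "OUTSIDE the +/-5 DU band" if abs(d) > 5 else "within the band"
    print(f"lustrum {i} -> {i + 1}: change = {d:+.1f} DU ({flag})")
```

Applied to real station records, the claim above is that only the AMS and HLB series produce lustrum-to-lustrum changes outside the band.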
These relationships imply that there is no empirical evidence to support the Rowland-Molina theory of ozone depletion and therefore no evidence of human-caused ozone depletion by way of CFC emissions. The occasional low ozone level over the South Pole, described as an "ozone hole" and presented as evidence of ozone depletion, is neither a hole in the ozone layer nor evidence of ozone depletion, but natural variability understood in terms of the data presented above.
YET WHAT WE SEE IN THE CLIMATE CHANGE MOVEMENT IS A PATTERN OF REPEATEDLY CITING THE SUCCESS OF THE MONTREAL PROTOCOL AS THE REASON THAT THE CLIMATE CHANGE MOVEMENT SHOULD ALSO SUCCEED. AN INTERPRETATION OF THIS LOGIC IS THAT THE MONTREAL PROTOCOL IS EVIDENCE THAT ENVIRONMENTAL MOVEMENTS WITHOUT SUFFICIENT EVIDENCE CAN SUCCEED GIVEN A SUFFICIENT LEVEL OF FEAR, AND THAT IT THEREFORE PROVIDES A REASON TO CONTINUE THE CLIMATE BATTLE IN THE HOPE OF THE SAME KIND OF SUCCESS.

EMPIRICAL TEST OF OZONE DEPLETION: GROUND STATION OZONE DATA: LIST OF GROUND STATIONS: AND TOTAL COLUMN OZONE DATA FROM AMS, HLB, LDR, PTH [charts not reproduced]

TOTAL COLUMN OZONE DATA FROM SMO, MLO, WAI, BDR [charts not reproduced]

TOTAL COLUMN OZONE DATA FROM CAR, BIR, FBK, BRW [charts not reproduced]

THIS POST IS A LITERATURE REVIEW OF SUPERSTITION IN HUMANS IN THE CONTEXT OF CLIMATE CHANGE AND THE EVALUATION OF WEATHER AND WILDFIRE EVENTS AS CLIMATE CHANGE IMPACTS.

PART-1: THE PRECAUTIONARY PRINCIPLE IN CLIMATE CHANGE
The invocation of the precautionary principle in the climate change issue derives from its acceptance and use in environmentalism, where the greater-harm argument has been used to legitimize it. The Martin et al. 1997 paper in the bibliography below lays out the case for its use in climate change by describing climate change as an environmental issue in which fossil fuel emissions are the pollution and climate action is the remedy. There are two significant issues in this logic.
First, the proposition that fossil fuel emissions are a form of air pollution, and therefore an environmental issue, was already established back in the 1970s, when strict air quality regulations were enacted and enforced by the newly formed EPA, itself a creation of this issue. The fossil fuel industry and the internal combustion engine industry responded and, at great cost to the consumer, were able to meet the new EPA emissions standards in terms of both the smog issue and the acid rain issue.
Climate change was not invoked as an issue in that earlier matter of the environmental impact of fossil fuels. It should be noted that the history of climate change research linked to fossil fuels goes back to 1938, when the first research paper on fossil fueled climate change was published by the Royal Meteorological Society in England. It was more than 20 years after the clean air laws were passed, enforced by the EPA, and complied with by the fossil fuel industry that climate science environmentalism proposed climate change as a further environmental issue for fossil fuels. This came in the form of the Hansen 1988 research paper and Congressional testimony, in which climate science demanded not just clean air but the elimination of fossil fuels altogether.
The world was told, and is still being told, that we must stop using fossil fuels because their combustion, no matter how clean, injects into the atmosphere carbon dioxide that is millions of years old and not part of the current account of the carbon cycle, and that this causes atmospheric carbon dioxide concentration to go up. The environmental harm of that change is proposed as rising surface temperature, because surface temperature is a logarithmic function of atmospheric CO2 concentration. The warming of the earth began at the end of the Little Ice Age, a 500-year cold period that ended sometime in the 18th century when a 500-year cooling trend reverted to a warming trend. This change coincided roughly with the Industrial Revolution, when humans had begun to use fossil fuels with a sharply rising trend in production and consumption. It was on this basis that the causal connection from fossil fuels to warming was proposed in Callendar 1938 and then 50 years later in Hansen 1988. The further assessment of the destructiveness of climate change in terms of the collapse of civilizations, mass extinctions, and the destruction of the planet itself led to the invocation of the precautionary principle, which implied that rigorous scientific evidence of the harm must not be demanded because of the greater harm of being wrong if the harm cannot be proven. Second, and this is the argument made here, the invocation of the precautionary principle facilitates the use of superstition and confirmation bias in climate change and in the demand for climate action. That in turn weakens the credibility of the claimed harm of climate change and of the demand for climate action.
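As an aside on the logarithmic relationship just mentioned, the sketch below shows its standard simplified form. The radiative forcing expression dF = 5.35 ln(C/C0) W/m^2 is the widely used simplified expression of Myhre et al. 1998; the climate sensitivity value used here is an assumed illustrative number, not a settled one.

```python
# The logarithmic CO2-temperature relationship in its standard simplified form.

import math

C0 = 280.0   # assumed pre-industrial CO2 concentration, ppm
C = 410.0    # an illustrative recent CO2 concentration, ppm
S = 3.0      # assumed equilibrium sensitivity, degC per doubling of CO2

forcing = 5.35 * math.log(C / C0)               # radiative forcing, W/m^2
warming = S * math.log(C / C0) / math.log(2)    # implied equilibrium warming, degC

print(f"forcing = {forcing:.2f} W/m^2, implied warming = {warming:.2f} degC")
```

Because the relationship is logarithmic, each doubling of CO2 contributes the same increment of warming regardless of the starting concentration.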

PART-2: THE RELEVANT BIBLIOGRAPHY
Ono, Koichi. “Superstitious behavior in humans.” Journal of the experimental analysis of behavior 47.3 (1987): 261-271. Twenty undergraduate students were exposed to single response‐independent schedules of reinforcer presentation, fixed‐time or variable‐time, each with values of 30 and 60 s. The reinforcer was a point on a counter accompanied by a red lamp and a brief buzzer. Three color signals were presented, without consistent relation to reinforcer or to the subjects’ behavior. Three large levers were available, but the subjects were not asked to perform any particular behavior. Three of the 20 subjects developed persistent superstitious behavior. One engaged in a pattern of lever‐pulling responses that consisted of long pulls after a few short pulls; the second touched many things in the experimental booth; the third showed biased responding called sensory superstition. However, most subjects did not show consistent superstitious behavior. Reinforcers can operate effectively on human behavior even in the absence of a response‐reinforcer contingency and can, in some cases, shape stable superstitious patterns. However, superstitious behavior is not a consistent outcome of exposure of human subjects to response‐independent reinforcer deliveries.
Martin, Philippe H. "'If You Don't Know How to Fix It, Please Stop Breaking It!' The Precautionary Principle and Climate Change." Foundations of Science 2.2 (1997): 263-292. Taking precautions to prevent harm: whether principe de précaution, Vorsorgeprinzip, føre-var prinsippet, or försiktighetsprincip, etc., the precautionary principle embodies the idea that public and private interests should act to prevent harm. Furthermore, the precautionary principle suggests that action should be taken to limit, regulate, or prevent potentially dangerous undertakings even in the absence of scientific proof. Such measures also naturally entail taking economic costs into account. With the environmental disasters of the 1980s, the precautionary principle established itself as an operational concept. On the eve of the 1997 Climate Summit in Kyoto, precaution, as the precautionary principle is often referred to, has become a key legal principle in environmental law in general and in current international climate negotiations in particular. The paper attempts to understand why. It examines in turn the natural affinity between the precautionary principle and climate change, reviews a series of issues which the principle raises, and discusses avenues which it opens. In the paper, climate change fulfills the theoretical requirements set for the application of the precautionary principle. It comes as no surprise that the actual application of the precautionary principle in the context of climate change raises high political stakes. As a result, climate change science in particular, and science in general, is under the fire of politically motivated scientific skeptics. Thus, by way of the counter-measures which must be put into effect, the precautionary principle calls for a greater sense of responsibility on the part of scientists and the public at large. Specifically, from scientists, it demands perseverance in rigor, excellence in communication, and commitment to education. However, even if special efforts are made to implement the precautionary principle in the context of climate change, the success of climate change mitigation will constitute no test of the validity, the usefulness, or the efficiency of the precautionary principle. Indeed, the degree to which climate change mitigation succeeds only provides a measure of our kind's ability to manage responsibly the global commons which we inherited from our ancestors and which our generation enjoys, the global commons which we will pass on to today's children and to generations to come.
Endfield, Georgina H., and David J. Nash. “Drought, desiccation and discourse: missionary correspondence and nineteenth‐century climate change in central southern Africa.” Geographical Journal 168.1 (2002): 33-47. This paper examines the role that representatives of the London Missionary Society in central southern Africa during the nineteenth century may have played in the development of geographical debates concerning the long‐term desiccation of the African continent. Observations on climate included within missionary documents are used to reconstruct a chronology of intra‐decadal climatic variability for the period 1815–1900. This reveals six drought periods and seven wet phases that affected large areas of the region, but identifies no evidence for progressive desiccation. The chronology is then used as a framework within which to view missionary perspectives on drought and desiccation. Major influences upon the development of desiccationist theory appear to include the prevalence of contemporary moral economic explanations of climatic variability, as well as the uptake and acceptance of indigenous understanding of climate change. Significantly, many of the key observations by eminent missionaries used as supporting evidence for progressive desiccation are identified as having been made during periods of severe drought. This is used to suggest that the most widely propagated evidence for desiccation may, therefore, simply be the end‐product of periods of short‐term drought rather than long‐term climatic deterioration.
Gosselin, Frédéric, and Philippe G. Schyns. “Superstitious perceptions reveal properties of internal representations.” Psychological science 14.5 (2003): 505-509. Everyone has seen a human face in a cloud, a pebble, or blots on a wall. Evidence of superstitious perceptions has been documented since classical antiquity, but has received little scientific attention. In the study reported here, we used superstitious perceptions in a new principled method to reveal the properties of unobservable object representations in memory. We stimulated the visual system with unstructured white noise. Observers firmly believed that they perceived the letter S in Experiment 1 and a smile on a face in Experiment 2. Using reverse correlation and computational analyses, we rendered the memory representations underlying these superstitious perceptions.
Case, Trevor I., et al. “Coping With Uncertainty: Superstitious Strategies and Secondary Control 1.” Journal of Applied Social Psychology 34.4 (2004): 848-871. The aim of the present studies was to investigate the relationship between primary and secondary control and the use of superstitious strategies under conditions of uncertainty and stress. In the first study, 78 participants completed a chance‐determined card‐guessing task in which they were permitted to use a psychic’s card selections instead of making their own card selections. Participants’ use of a superstitious strategy (a psychic’s selections) increased significantly with the perceived likelihood of failure, regardless of belief in psychic ability. A second study (N= 102) replicated these findings using a skill task. Overall, these data suggest that as the need to control outcomes becomes increasingly salient, the use of superstitious strategies may represent attempts at secondary control.
Beck, Jan, and Wolfgang Forstmeier. “Superstition and belief as inevitable by-products of an adaptive learning strategy.” Human Nature 18.1 (2007): 35-46. The existence of superstition and religious beliefs in most, if not all, human societies is puzzling for behavioral ecology. These phenomena bring about various fitness costs ranging from burial objects to celibacy, and these costs are not outweighed by any obvious benefits. In an attempt to resolve this problem, we present a verbal model describing how humans and other organisms learn from the observation of coincidence (associative learning). As in statistical analysis, learning organisms need rules to distinguish between real patterns and randomness. These rules, which we argue are equivalent to setting the level of α for rejection of the null hypothesis in statistics, are governed by risk management as well as by comparison to previous experiences. Risk management means that the cost of a possible type I error (superstition) has to be traded off against the cost of a possible type II error (ignorance). This trade-off implies that the occurrence of superstitious beliefs is an inevitable consequence of an organism’s ability to learn from observation of coincidence. Comparison with previous experiences (as in Bayesian statistics) improves the chances of making the right decision. While this Bayesian approach is found in most learning organisms, humans have evolved a unique ability to judge from experiences whether a candidate subject has the power to mechanistically cause the observed effect. Such “strong” causal thinking evolved because it allowed humans to understand and manipulate their environment. Strong causal thinking, however, involves the generation of hypotheses about underlying mechanisms (i.e., beliefs). Assuming that natural selection has favored individuals that learn quicker and more successfully than others owing to (1) active search to detect patterns and (2) the desire to explain these patterns mechanistically, we suggest that superstition has evolved as a by-product of the first, and that belief has evolved as a by-product of the second.
MacLean, Jason. “Principle 5–Precautionary Principle.” Jason MacLean (2009): 347-416. An absence of conclusive scientific evidence that serious and irreversible environmental harm will occur within their sphere of influence must not deter corporations from taking cost-effective precautionary measures. Furthermore, corporations bear the burden of proof of socially acceptable safety when they advocate potentially harmful projects.
LaFollette, Marcel Chotkowski. “Track Conditions: Upon Revisiting How Superstition Won and Science Lost.” Isis 110.4 (2019): 755-757. John Burnham’s decision in How Superstition Won and Science Lost to frame science popularization as a cultural competition with winners and losers—rather than as an organic process responding to external forces and evolving over time—reflected both his high regard for science and his disdain for those who promoted magic and superstition over scientific authority and authenticity. To Burnham, the outcome of such rivalry mattered, for reasons beyond whether popular culture offered transitory distraction or fact-based enlightenment. Medical and health knowledge could save lives, discoveries in physics and chemistry fueled social progress, while pseudoscientists and fakers peddled false hopes and cures, manipulating their auditors’ emotions for the sake of profit or power rather than the greater good. By abdicating responsibility for communicating to the public, and eventually abandoning center stage to professional popularizers and journalists, the scientific community had also, Burnham believed, left audiences vulnerable to superstition’s seductive attractions. It was a compelling argument, buttressed by daunting arrays of evidence, and persuasive within the context of the 1980s.
Ungar, Sheldon. “Knowledge, ignorance and the popular culture: climate change versus the ozone hole.” Public Understanding of Science 9.3 (2000): 297-312. This paper begins with the “knowledge-ignorance paradox”—the process by which the growth of specialized knowledge results in a simultaneous increase in ignorance. It then outlines the roles of personal and social motivations, institutional decisions, the public culture, and technology in establishing consensual guidelines for ignorance. The upshot is a sociological model of how the “knowledge society” militates against the acquisition of scientific knowledge. Given the assumption of widespread scientific illiteracy, the paper tries to show why the ozone hole was capable of engendering some public understanding and concern, while climate change failed to do so. The ozone threat encouraged the acquisition of knowledge because it was allied and resonated with easy-to-understand bridging metaphors derived from the popular culture. It also engendered a “hot crisis.” That is, it provided a sense of immediate and concrete risk with everyday relevance. Climate change fails at both of these criteria and remains in a public limbo.
Cruikshank, Julie. “Glaciers and climate change: perspectives from oral tradition.” Arctic (2001): 377-393. In northwestern North America, glaciers figure prominently in both indigenous oral traditions and narratives of geophysical sciences. These perspectives intersect in discussions about global warming, predicted to be extreme at Arctic and Subarctic latitudes and an area of concern for both local people and scientists. Indigenous people in northwestern North America have experienced climate variability associated with the latter phases of the Little Ice Age (approximately 1550-1850). This paper draws on oral traditions passed down from that period, some recorded between 1900 and the early 1950s in coastal Alaska Tlingit communities and others recorded more recently with elders from Yukon First Nations. The narratives concern human travel to the Gulf of Alaska foreshore at the end of the Little Ice Age from the Copper River, from the Alaska panhandle, and from the upper Alsek-Tatshenshini drainage, as well as observations about glacier advances, retreats, and surges. The paper addresses two large policy debates. One concerns the incorporation of local knowledge into scientific research. The second addresses the way in which oral tradition contributes another variety of historical understanding in areas of the world where written documents are relatively recent. Academic debates, whether in science or in history, too often evaluate local expertise as data or evidence, rather than as knowledge or theory that might contribute different perspectives to academic questions.
Hanekamp, Jaap C., and S. Wybren Verstegen. "The problem of the precautionary principle: The paternalism of the precautionary coalition." Science vs Superstition: The Case for a New Scientific Enlightenment (2006). In recent years, the traditional wisdom that 'one can never be too careful' has been formalized as a dominant legal doctrine, enshrined in international law as the Precautionary Principle. The first international endorsement of the precautionary principle was the acceptance in 1982 by the United Nations General Assembly of The World Charter for Nature, and it first appeared in an international treaty in the 1987 Montreal Protocol. It can now be found in a host of diverse national and international legislative treaties. In terms of international policy-making, the most influential enshrinement of the precautionary principle was its insertion into the 1992 Rio Declaration on Environment and Development. Although the principle has been defined in a host of different ways, leading to a variety of interpretations, its essence is expressed quite clearly in the Rio Declaration, which states that in relation to a given action or state of affairs:
“Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.” In other words, the precautionary principle suggests that if the result of a given action may be to cause irreversible damage of some sort, in the absence of scientific consensus that such harm will not ensue, we must proceed as if there is evidence that such harm will indeed ensue. The result is that the burden of proof falls not on the regulator, but on those who advocate taking the action.
Demeritt, David. “Science studies, climate change and the prospects for constructivist critique.” Economy and society 35.3 (2006): 453-479. Starting from the debates over the ‘reality’ of global warming and the politics of science studies, I seek to clarify what is at stake politically in constructivist understandings of science and nature. These two separate but related debates point to the centrality of modern science in political discussions of the environment and to the difficulties, simultaneously technical and political, in warranting political action in the face of inevitably partial and uncertain scientific knowledge. The case of climate change then provides an experimental test case with which to explore the various responses to these challenges offered by Ulrich Beck’s reflexive modernization, the normative theory of expertise advanced by Harry Collins and Robert Evans, and Bruno Latour’s utopian vision for decision-making by the ‘collective’ in which traditional epistemic and institutional distinctions between science and politics are entirely superseded.
*Foster, Kevin R., and Hanna Kokko. “The evolution of superstitious and superstition-like behaviour.” Proceedings of the Royal Society B: Biological Sciences 276.1654 (2009): 31-37. Superstitious behaviours, which arise through the incorrect assignment of cause and effect, receive considerable attention in psychology and popular culture. Perhaps owing to their seeming irrationality, however, they receive little attention in evolutionary biology. Here we develop a simple model to define the condition under which natural selection will favour assigning causality between two events. This leads to an intuitive inequality—akin to an amalgam of Hamilton’s rule and Pascal’s wager—-that shows that natural selection can favour strategies that lead to frequent errors in assessment as long as the occasional correct response carries a large fitness benefit. It follows that incorrect responses are the most common when the probability that two events are really associated is low to moderate: very strong associations are rarely incorrect, while natural selection will rarely favour making very weak associations. Extending the model to include multiple events identifies conditions under which natural selection can favour associating events that are never causally related. Specifically, limitations on assigning causal probabilities to pairs of events can favour strategies that lump non-causal associations with causal ones. We conclude that behaviours which are, or appear, superstitious are an inevitable feature of adaptive behaviour in all organisms, including ourselves.
Fforde, Adam. “Confirmation bias: methodological causes and a palliative response.” Quality & Quantity 51.5 (2017): 2319-2335. The paper advocates for changes to normative aspects of belief management in applied research. The central push is to argue for methodologically-required choice to include the possibility of adopting the view that a given dataset contains insufficient regularities for predictive theorising. This is argued to be related to how we should understand differences between predictive and non-predictive knowledges, contrasting Crombie and Nisbet. The proposed direction may also support management practices under conditions of uncertainty.
Druckman, James N., and Mary C. McGrath. “The evidence for motivated reasoning in climate change preference formation.” Nature Climate Change 9.2 (2019): 111-119. Despite a scientific consensus, citizens are divided when it comes to climate change — often along political lines. Democrats or liberals tend to believe that human activity is a primary cause of climate change, whereas Republicans or conservatives are much less likely to hold this belief. A prominent explanation for this divide is that it stems from directional motivated reasoning: individuals reject new information that contradicts their standing beliefs. In this Review, we suggest that the empirical evidence is not so clear, and is equally consistent with a theory in which citizens strive to form accurate beliefs but vary in what they consider to be credible evidence. This suggests a new research agenda on climate change preference formation, and has implications for effective communication.
Rivera-Ramírez, José Domingo. "Precautionary Principle of Science: Guideline of Ethics in Chemistry." Open Journal of Philosophy 10.03 (2020): 374. ABSTRACT: Considering the two most applied ethical ideologies in science, Value Neutrality and the Precautionary Principle, the latter is the ethical criterion that best fits the way in which chemistry has been developed and is currently practiced. This work begins with a historical description of each ideology and a comparison of their fundamental statutes. After an analysis of the main problems that humanity has experienced through the chemical sciences (massive accidents, environmental pollution, and public health problems), an evaluation is made of how chemistry has applied the Precautionary Principle to evaluate every scientific and technological development and thus reestablish new criteria for the remediation and prevention of scenarios harmful to humanity and the environment. The work concludes that chemistry has established a basis for ethical practice applying the Precautionary Principle, and this is reflected in pragmatic and objective developments such as Green Chemistry, remediation and substitution technologies, and sanitary and environmental regulation.
PART-3: SUMMARY AND CONCLUSIONS
Significant evidence is found in the literature survey that human beings, whether or not they are scientists, are subject to superstition. This superstitious tendency likely derives from the survival criteria of early humans, for whom there was harm in the absence of superstition but little or no harm in superstitious beliefs that turned out to be wrong.
The superstitious nature of humans is the likely source of confirmation bias in climate change research, particularly in the identification of all bad weather events as creations of climate change. Yet the application of the precautionary principle is not possible here, because the principle assumes a near-zero cost of being wrong, whereas the climate change movement demands an extremely costly overhaul of the world's energy infrastructure based on the climate change hypothesis.
The cost of a wrong superstition assessment in climate change is not zero but staggeringly large. The rationalization of confirmation bias and superstition is not possible under these conditions. A further analysis of the human traits of confirmation bias and superstition in the assessment of climate change under the precautionary principle is presented in a related post on this site. LINK: https://tambonthongchai.com/2018/08/03/confirmationbias/


THIS POST IS A CRITICAL REVIEW OF A TED CLIMATE SCIENCE LECTURE BY KATHARINE HAYHOE ON YOUTUBE. THE TITLE OF THE LECTURE IS “WHAT IF CLIMATE CHANGE IS REAL?”
LINK TO YOUTUBE VIDEO: https://youtu.be/PtrYNGs9oRM
PART-1: A TRANSCRIPT OF THE LECTURE
We're known for a lot of things here in West Texas. One of the things we're very well known for is our fields of white cotton and the friendly farmers. We don't just grow cotton; we also raise cows, and we're very well known for what we do with those cows (BARBECUE TRI TIP RIBS). And of course no talk at Texas Tech would be complete without mentioning something else we're very well known for here in West Texas, and that is our world class football. But there is something we are not so well known for here in West Texas, and it's something we've seen a lot of lately. And I have to admit that when I was scraping the snow and ice off my car, I was tempted to think "Where's global warming now? I'd like a little of that please! This is not what our winters should look like."
SO HERE WE ARE TALKING ABOUT WHAT IF THOUGH?
WHAT IF THIS GLOBAL WARMING THING IS … REAL?
WHAT IF, DESPITE OUR HANDS FREEZING OFF AND OUR TOES FEELING LIKE THEY HAVE TO BE AMPUTATED, WHAT IF THIS CLIMATE CHANGE THING IS REALLY REAL????

Well, if we look at what our temperatures look like recently, we see that yes, it was really cold where we are. BUT … if we look at other parts of the world we see that it was really warm. In fact, Alaska was just as much warmer than average as we have been colder than average. The lesson here is that it isn’t just what we see in front of our own eyes that can tell us whether global warming is real.
I want to tell you about a fun little experiment we did a couple of years ago. You take a group of people and divide them in half. Half of the people got put in a room that was quite warm (81F), uncomfortably warm. The other half of the people were put in a room at a comfortable room temperature (73F). And then the people were asked this question: IS CLIMATE CHANGE REAL? What do you think happened? Yes! It's actually what you think. The group in the uncomfortably warm room said YES, climate change is real, global warming is real, I feel warm. The group in the comfortable room said NO, I actually feel that it isn't. THAT'S HOW SENSITIVE WE ARE TO OUR ENVIRONMENT. Our physical experience informs our brain, sometimes in a conscious way and sometimes in an unconscious way. SO WHEN WE LOOK AT THIS ISSUE OF GLOBAL WARMING AND CLIMATE CHANGE WE HAVE TO LOOK GLOBALLY. We can't just look in the place where we live. Not only that, but we have to look over a sufficiently long period of time. Weather is what we all know and remember – what happens at a single place at a single time. Weather is like a tree, and climate on the other hand is like a forest. Climate is long term average weather over 20 to 30 years. So (in the climate change issue) we're not talking about weather, we are talking about WHETHER OUR CLIMATE IS CHANGING. That means we have to look over multiple decades.

So if we look here, this is the temperature of the earth starting in 1900, and we see that one year may be colder and another warmer, but over the century our planet is getting warmer and warmer. That's what we see in the data around the world. When we look closer at recent times we see not only that, but that we are starting to break all kinds of records. Last year was, believe it or not, the warmest year on record, even while it was slightly below average across the Eastern USA. If you just look at the Eastern USA, people are saying "Oh it's freezing in Atlanta, it's freezing in Chicago", but the rest of the world is like "Oh it's so hot, come visit us".
But there are people saying that "this global warming has stopped". Some of these people even show me data for selected time spans such as 2001-2011, and it is true that if you pick and choose specific geographies and/or time spans you can find data that will support whatever hypothesis you're looking to support (DATA SELECTION BIAS). But if you look at all the data without a data selection bias, you get an unbiased view that is very different. So be suspicious of people touting trends with a data selection bias to prove something. One thing we know in science is that we can't just throw out the data we don't like because we don't like it.

If you go to the doctor and the doctor says you really need this procedure and you say, well, I don't like it, so no, I will not do it. You are allowed to do that, but it is not very wise. (He's the doctor and you aren't.) And in fact, when you include 2014 in the data series you will see that it was the highest temperature on record. BUT you may say, OK, that's just a thermometer, and humans have only been using these thermometers for a while. Is there any other evidence that it was the warmest? There is this. If you go up to the Arctic you will see permanently frozen ground melting and the homes of hundreds of Native villages crumbling and falling into the ocean. We see that trees and flowers are blooming weeks earlier than they have over hundreds of years in the data record. We see that our heavy rainfall and even our heavy snowfall are getting worse as the humidity of the atmosphere increases. Warmer air means more water evaporates from our oceans, our lakes, and our rivers. And we see that our hurricanes are getting stronger because hurricanes are powered by warm ocean water and our oceans are warming.
If we look around the world we see 26,500 indicators of a warming planet. It is telling us that the planet is warming. So, is climate changing? Yes it is.

BUT … hasn't the climate always changed before? Yes! It has changed before. It has changed because of different amounts of energy from the sun, and it has changed because of natural cycles. That is actually a large part of what we climate scientists do: we spend a lot of time studying the natural reasons why climate changes on this planet. And when we study those natural reasons, this is what we see. Let's look at the sun first. This is the temperature of the earth. Now let's look at the energy from the sun, but before we do I just want to ask you this question: if our earth's temperature was going up because of the sun, would that mean we are getting more energy from the sun or less energy from the sun?

The answer is "more": if the sun were driving the warming, we would have to be getting more energy from the sun. Now let's look at the sun's energy and the earth's temperature together. The sun's energy was going up until the 1970s, but since then it's been going down while the temperature went up. It can't be the sun that's controlling our current temperature, because if it were, we'd be getting cooler, not warmer. So it can't be the sun, but maybe it's a natural cycle. We know all the natural cycles here in West Texas, don't we? We know about El Nino, we know about the droughts that La Nina brings, and we're certainly hoping that we're going to get a nice wet year in the future. But we have to recognize one other thing: all that these cycles do is move heat around the earth. They don't make the entire planet hotter or colder; they just move heat from north to south or east to west, and also between atmosphere and ocean.

So when we look at the whole planet, there is only one type of natural cycle that causes the whole planet to get warmer and cooler, and that's the natural cycle that creates glaciation cycles ("ice ages"). So we know that we had a glaciation long ago, and we know that the glaciation ended about 10,000 years ago. So the natural question is: aren't we still getting warmer from deglaciation? The answer is no, because the warming trend of deglaciation peaked about 8,000 years ago. And if we look at the earth's temperature for the last 6,000 years, this is what we see. Do you believe that? Our planet was actually getting cooler, and our carbon dioxide level was stable, until when?

Something happened about 300 years ago, and we can see that in the earth's record. Then and only then do we as climate scientists have intellectual permission to ask, "OK then, what happened at that time? Why is our climate changing?" And when we do, it's pretty easy to figure out where to point the finger.

We know that about 300 years ago we figured out how to take massive amounts of coal, oil, and natural gas out of the ground and burn them. It was a tremendous technological innovation. We would not be sitting here today if it were not for the industrial revolution (1760). I would not be alive today if it were not for the medical advances that came with the industrial revolution. The industrial revolution was an amazing thing for humankind. It brought untold benefits to our lives and our society… BUT… what we did not know at the time of the industrial revolution was that when we burn fossil fuels it produces a heat trapping gas called carbon dioxide (CO2).

Why do we care about carbon dioxide? We care about it because our planet has this amazing natural blanket. That natural blanket is totally transparent to energy from the sun. The energy from the sun comes in and hits the earth, and the earth heats up. The warm earth then gives off heat, and those little invisible molecules of carbon dioxide, methane, and water vapor trap that heat inside the earth system. Now this is entirely natural and a good thing, because we'd be 60F colder today if it weren't for this natural blanket. We'd be a frozen ball of ice if it weren't for this blanket. So what's the problem?
The problem is that we have been adding to that blanket, just like my grandma used to do: she'd sneak into our room on a cold night and put some extra blankets over us because she was afraid we'd freeze to death. We did have central heating. And so we'd wake up in the middle of the night saying "Grandma, I did not need this blanket!" That's what we're doing to our planet. We are sneaking up on our planet and wrapping an extra blanket around it. And our planet is heating up.

SO, IS CLIMATE CHANGING? – YES IT IS!
DO WE KNOW WHY IT’S CHANGING? – YES, WE DO.
AND FOR THE FIRST TIME IN HUMAN CIVILIZATION, IT’S US. WE ARE MAKING THE CLIMATE CHANGE.
AND THAT LEADS US TO OUR FINAL QUESTION –
SO WHAT? WHAT DO WE DO?
THIS IS THE MOST IMPORTANT QUESTION BECAUSE THIS IS WHY WE CARE ABOUT CLIMATE CHANGE
IT'S BECAUSE OUR CIVILIZATION IS BUILT ON THE ASSUMPTION THAT WE CAN HAVE UPS AND DOWNS, HEAT AND DROUGHT, COLD AND WET, BUT IT WILL ALL AVERAGE OUT IN THE END. WHAT HAPPENS IF IT DOESN'T AVERAGE OUT IN THE END?

A THOUSAND YEARS AGO WHAT WOULD HAPPEN IF SEA LEVELS WENT UP 3 FEET IN HOUSTON? OR IF THE CARIBOU MOVED TO NORTHERN ALASKA? NOT MUCH. BUT TODAY SUCH CHANGES WOULD HAVE A MAJOR IMPACT AND THAT IS WHY WE CARE ABOUT CLIMATE CHANGE. WE ARE ALREADY SEEING THE IMPACTS OF CLIMATE CHANGE HERE IN TEXAS. WE’RE SEEING INCREASED RISK OF WILDFIRES LIKE THE 2011 WILDFIRES. WE’RE SEEING INCREASED RISK OF DROUGHT AS HOTTER TEMPERATURES BAKE OUR SOIL. AND WE’RE EVEN SEEING INCREASED RISK OF FLOODS AS WARMER AIR MEANS MORE WATER AVAILABLE IN THE ATMOSPHERE THAT THE STORMS COULD PICK UP AND DUMP ON US. CLIMATE MAKES WEATHER MORE EXTREME.

SO WHAT IF CLIMATE CHANGE IS REAL, WHAT ARE WE SUPPOSED TO DO? TWO THINGS. THE FIRST THING WE NEED TO DO IS TO PREPARE FOR A CHANGING CLIMATE. FOR YEARS WE HAVE BEEN CONDUCTING OUR SOCIETY AS IF WE'RE DRIVING DOWN A DEAD STRAIGHT ROAD FROM LUBBOCK TO PLAINVIEW, LOOKING IN OUR REAR VIEW MIRROR TO KEEP US ON THE ROAD. THAT WORKS AS LONG AS THE ROAD HAS NOT CHANGED SINCE WE WERE LAST ON IT. BUT IF THERE IS A NEW CURVE IN THE ROAD UP AHEAD THAT WASN'T THERE BEFORE AND YOU ARE STILL LOOKING IN YOUR REAR VIEW MIRROR, YOU WILL HAVE A BAD ACCIDENT. THE ANALOGY IS THAT THE BUSINESS AS USUAL OF PLANNING FOR THE FUTURE WITH WHAT WE KNOW ABOUT THE PAST IS NO LONGER GOING TO WORK. WE NOW HAVE TO RELY ON UNCERTAIN FORECASTS OF THE FUTURE AND PREPARE FOR DIFFERENT OUTCOMES THAN WHAT WE ARE USED TO. THE PAST IS A GOOD PREDICTOR OF THE FUTURE ONLY UNDER STEADY STATE CONDITIONS. THIS METHOD OF PLANNING FOR THE FUTURE IS NO LONGER VALID BECAUSE OF OUR CHANGING CLIMATE.

THIS IS WHY WE NEED CLIMATE SCIENTISTS TO LOOK DOWN THE ROAD FOR 20, 30, OR 40 YEARS TO TELL US THIS IS WHAT OUR WATER IS GOING TO LOOK LIKE, THIS IS WHAT OUR ENERGY DEMAND IS GOING TO LOOK LIKE, AND THESE ARE THE PLACES WHERE WE CAN GROW OUR FOOD AND OUR CROPS. BUT THERE IS ONE MORE THING WE NEED TO DO. WE ARE PUTTING THIS EXTRA BLANKET AROUND THE EARTH, AND THAT BLANKET IS MADE UP OF HEAT TRAPPING GASES LIKE CARBON DIOXIDE. IF WE KEEP WRAPPING THICKER AND THICKER BLANKETS AROUND THE PLANET, IT WILL HEAT UP FASTER AND FASTER AND OUR EXTREME WEATHER DAMAGE IS GOING TO GET WORSE AND WORSE. AND THAT IS WHY IT IS ESSENTIAL TO TRANSITION FROM OUR OLD, DIRTY, AND INEFFICIENT SOURCES OF ENERGY TO NEW CLEAN SOURCES OF ENERGY THAT WILL NEVER RUN OUT ON US. EVERY TIME WE FLY INTO LUBBOCK WE SEE THE ENORMOUS WINDFARMS THAT WE PASS OVER. LAST YEAR WE SET A RECORD HERE IN TEXAS WHEN IN THE MONTH OF MARCH WE GOT A THIRD OF OUR ELECTRICITY FROM WIND. SO THE ANSWER TO WHAT IF CLIMATE CHANGE IS REAL IS A CLEAN ENERGY ECONOMY. WE HAVE ENOUGH SOLAR POTENTIAL HERE IN TEXAS TO POWER THE ENTIRE WORLD TWO TIMES OVER. THIS IS WHY MY ANSWER TO THE QUESTION OF "WHAT IF CLIMATE CHANGE IS REAL" IS THAT IF CLIMATE CHANGE IS REAL, TEXAS CAN LEAD THE WORLD TO A BETTER FUTURE.


PART-2: CRITICAL COMMENTARY
ITEM#1: SO WHEN WE LOOK AT THIS ISSUE OF GLOBAL WARMING AND CLIMATE CHANGE WE HAVE TO LOOK GLOBALLY. We can't just look in the place where we live. Not only that, but we have to look over a sufficiently long period of time. Weather is what we all know and remember – what happens at a single place at a single time. Weather is like a tree, and climate on the other hand is like a forest. Climate is long term average weather over 20 to 30 years. So (in the climate change issue) we're not talking about weather, we are talking about WHETHER OUR CLIMATE IS CHANGING. That means we have to look over multiple decades.
COMMENT#1: This is a very relevant and important principle in climate science that is apparently not consistently applied by climate scientists themselves. In the literature it is described as the "Internal Climate Variability" (ICV) issue, discussed in a related post: LINK: https://tambonthongchai.com/2020/07/16/the-internal-variability-issue/ . Briefly, anthropogenic global warming and climate change (AGW) is a theory about long term trends in global mean temperature, and therefore the data can only be understood in this context: over a sufficiently long time span of 30 years or more, and over a sufficient geographical span that is global or a significant latitudinal section thereof. Specifically, the ICV issue implies that geographically localized data over brief time spans have no interpretation in terms of AGW. Although climate scientists preach this principle with great force and clarity, as seen in the lecture being reviewed, it is frequently violated by climate scientists themselves, as seen in later parts of this very lecture. Example#1: The hottest-year-on-record claim found in this lecture is a one-year event without the long multi-decadal time span needed for an interpretation in terms of AGW. Example#2: Another violation of the ICV principle in this lecture is the usual practice in climate science of claiming individual localized extreme weather and wildfire events post hoc as climate change impacts, with the implication that the climate action demanded by climate science will prevent such events from occurring in the future. This part of climate science, described as science, as in "Event Attribution Science", violates the ICV principle and the long-time-span and global-span requirements explained so well in this lecture. The lecture's own logic implies that for climate science to claim an impact of AGW on extreme weather, a long term trend in a summation of all global extreme weather events must be shown, and a rationale for the observed trend, if any, must be provided a priori. Details of the methodological flaws in the extreme weather event attribution procedure of climate science are explored in a number of related posts on this site (a brief sketch of the short-window problem follows the links):
LINK#1: https://tambonthongchai.com/2020/10/18/climate-change-causes-extreme-weather-events/
LINK#2: https://tambonthongchai.com/2020/06/29/diffenbaugh-2017-extreme-weather-of-climate-change/
LINK#3: https://tambonthongchai.com/2018/07/10/event-attribution-science-a-case-study/
LINK#4: https://tambonthongchai.com/2020/07/16/the-internal-variability-issue/
LINK#5: https://tambonthongchai.com/2020/07/26/climate-change-kills/
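The sketch below, using synthetic data, illustrates the short-window problem: on a noisy series with a modest long term warming trend, trends computed over brief windows (such as the 2001-2011 span mentioned in the lecture) scatter widely around the long term value. The trend and noise magnitudes are assumptions for illustration only.

```python
# Why short time spans have no AGW interpretation: trends over brief windows
# of a noisy warming series scatter widely around the long-term trend.
# All numbers here are synthetic illustrations, not observed data.

import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1900, 2021)
temp = 0.01 * (years - 1900) + rng.normal(0, 0.15, len(years))  # anomaly, degC

def trend_per_century(y, t):
    """Least-squares linear trend, expressed in degC per century."""
    return np.polyfit(y, t, 1)[0] * 100

print(f"full-span trend    : {trend_per_century(years, temp):+.2f} C/century")
for start in (1970, 1980, 1990, 2001):
    w = (years >= start) & (years <= start + 10)   # an 11-year window
    print(f"{start}-{start + 10} trend : "
          f"{trend_per_century(years[w], temp[w]):+.2f} C/century")
```

Some 11-year windows show cooling and others show exaggerated warming even though the underlying synthetic trend is constant, which is also the data selection hazard discussed in ITEM#2 below.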

ITEM#2: DATA SELECTION BIAS: The lecture rightly points out that climate denial arguments that commit the fallacy of data selection bias are not credible. For example, CLIMATE CHANGE DENIERS claiming that "GLOBAL WARMING HAS STOPPED" select specific geographical regions or time spans where the data appear to support their hypothesis. Such claims do not show that climate change is not real or that climate change has stopped, because of the flawed and biased methodology employed.
COMMENT#2: It is true that data selection bias invalidates the conclusions drawn from research, but that principle applies equally to deniers and climate scientists. In the lecture we are told that "So when we look at the whole planet there is only one type of natural cycle that causes the whole planet to get warmer and cooler. And that's the natural cycle that creates glaciation cycles". This statement is false. There are significant temperature cycles within the Holocene interglacial, and the one we are in is simply the most recent of them. That climate science has chosen to explain only one of these cycles is itself a form of data selection bias, found in the work of the same climate science that states in this lecture that data selection bias invalidates research findings. The most significant example of data selection bias in climate science is the selection of the warming since the end of the Little Ice Age (LIA) as a cause and effect phenomenon that can be explained in terms of human activity since the Industrial Revolution, which appears to be timed just right for that change in the temperature trend. Significantly, this warming cycle comes late in the progress of the interglacial, about 9,000 years in, and was preceded by eight other temperature cycles at centennial and millennial time scales, both warming and cooling, described in a related post: LINK: https://tambonthongchai.com/2019/06/11/chaoticholocene/ . There we provide an extensive literature review and find that "glaciation is not a linear and well behaved period of cooling and ice accumulation, and deglaciation is not a linear and well behaved period of warming and ice dissipation; rather, both glaciation and deglaciation are chaotic events consisting of both processes and differentiated only by a slight advantage to ice accumulation in glaciation and a slight advantage to ice dissipation in deglaciation and interglacials". In this context we propose that if climate science can explain Holocene temperature cycles as deterministic cause and effect phenomena, it must explain all of them and not just pick one, because that kind of empirical research is subject to data selection bias, confirmation bias, and circular reasoning. Therefore the climate science of Anthropogenic Global Warming and Climate Change, which has selected only the post-LIA warming cycle to explain as a cause and effect phenomenon, contains a fatal methodological flaw. In a related post, we note that Charles David Keeling, of Keeling Curve fame, made a similar argument against climate science and proposed a theory to explain all of the temperature cycles of the Holocene: LINK: https://tambonthongchai.com/2018/08/05/tidalcyclesbiblio/ . The Keeling paper is discussed there in the context of a bibliography of other theories to explain Holocene temperature cycles. This is what climate science must do: a theory of Holocene temperature cycles must explain all of them. The selection of the most recent such cycle to explain as a cause and effect phenomenon is a case of data selection bias and confirmation bias.

ITEM#3: CHANGES OF CONVENIENCE TO FIXED PARAMETERS: The lecturer shows a chart of atmospheric CO2 data and says that throughout history the CO2 level had been steady, with some random up and down movements without a trend, but that something happened 300 years ago that coincides with the rising trend in atmospheric CO2 seen in the chart. We are told that the something that happened 300 years ago was the Industrial Revolution. The theory of AGW climate change as proposed is that it is a creation of the industrial economy, such that it can be traced to the Industrial Revolution when the industrial economy got started.
COMMENT#3: The Industrial Revolution is thought to have started in England and spread to Europe and America over the period 1760 to 1840, with a median of 1800. The importance of the determination of this start year lies in the importance in climate science of "warming since pre-industrial". This is because climate science has determined that the amount of warming since pre-industrial must not be allowed to exceed a benchmark value beyond which natural feedback processes will take over and make it impossible for humans to contain climate change with the climate action of not using fossil fuels. Therefore it is critical for us to know the reference pre-industrial year and the temperature in that reference year. In this lecture we are shown that the CO2 level began rising "300 years ago", which we can take to be the year 2020-300=1720, but there was no industrial economy in 1720, so perhaps it is a rounding error and the real start of the curve was 260 years ago in 1760. This is consistent with the IPCC report of 2001, where the pre-industrial reference year is set to 1760. However, in the 2015 IPCC report this pre-industrial year, when CO2 began to rise and where we must find the reference temperature for measuring warming since pre-industrial, was moved up to 1850: 170 years ago, not 300 years ago. This change is inconsistent with the claim in the lecture that what got this global warming started was something that happened 300 years ago. We also find that the identification of this critical pre-industrial year differs among climate scientists. For example, NASA and its premier climate scientist Dr. James Hansen have determined that the critical reference year when global warming began is 1950, not the 1850 found in the 2015 IPCC report. Yet another complication in this matter is an unsettled issue in climate science called the ETCW, or Early Twentieth Century Warming, discussed in a related post:
LINK: https://tambonthongchai.com/2020/10/09/the-etcw-issue-in-climate-science/
This refers to the determination by climate science that the observed warming from 1850 to 1950 is anomalous and cannot be explained in terms of rising CO2. See for example Knutson 2000, where he writes: "The observed global warming of the past century occurred primarily in two distinct 20-year periods, from 1925 to 1944 and from 1978 to the present. Although the latter warming is often attributed to a human-induced increase of greenhouse gases, causes of the earlier warming are less clear because this period precedes the time of strongest increases in human-induced greenhouse gas (radiative) forcing. Results from a set of six integrations of a coupled ocean-atmosphere climate model suggest that the warming of the early 20th century could have resulted from a combination of human-induced radiative forcing and an unusually large realization of internal multidecadal variability of the coupled ocean-atmosphere system. This conclusion is dependent on the model's climate sensitivity, internal variability, and the specification of the time-varying human-induced radiative forcing". The use of 1950 as the reference pre-industrial year by NASA and others may be understood in this context. The conclusion we draw about the CO2 curve presented by the lecturer, a long term curve with no trend that "began to rise 300 years ago because something happened 300 years ago", is that this curve has no consistent interpretation in the climate change science of James Hansen, NASA, and the IPCC.

ITEM#4: The problem is that we have been adding to that blanket, just like my grandma used to do: she'd sneak into our room on a cold night and put some extra blankets over us because she was afraid we'd freeze to death. And so we'd wake up in the middle of the night saying "Grandma, I did not need this blanket!" That's what we're doing to our planet. We are sneaking up on our planet and wrapping an extra blanket around it. And our planet is heating up.
COMMENT#4: This is the case against fossil fuels in climate change science. It holds that since the Industrial Revolution atmospheric CO2 concentration has been going up (ignoring the data, noted in item 3 above, showing that the rise in atmospheric CO2 began prior to the Industrial Revolution) and that what is different about the industrial economy is that we are burning fossil fuels. Based on this concurrence, climate science concluded that there must be a causal relationship between fossil fuel emissions and atmospheric CO2 concentration, such that fossil fuel emissions of the industrial economy cause atmospheric CO2 to rise. This is the critical link between fossil fuels and global warming. Yet, as seen in the many humorous examples of spurious correlations by Tyler Vigen, concurrence does not prove causation.

Although it is true that correlation does not prove causation, at a minimum it must be shown that a statistically significant relationship exists at the time scale of the proposed causation. In climate science the assumed time scale is annual, such that annual fossil fuel emissions cause annual changes in atmospheric CO2 concentration. The time scale can be imposed on the correlation test by first detrending the two time series and then computing the correlation between the detrended series at the time scale of interest, in this case the annual time scale. This work has been carried out in two related posts on this site where we find no detrended correlation at an annual time scale (a minimal computational sketch of the test appears after the links below):
LINK#1: https://tambonthongchai.com/2018/12/19/co2responsiveness/
LINK#2: https://tambonthongchai.com/2020/11/11/annual-changes-in-mlo-co2/
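As an illustration of the detrended correlation test described above, here is a minimal sketch in Python. The variable names and the synthetic data are ours, for illustration only; the linked posts apply the test to the actual emissions and Mauna Loa CO2 series.

```python
# Sketch of the detrended correlation test at an annual time scale.
import numpy as np

def detrended_correlation(x, y):
    """Remove each series' linear trend, then correlate the residuals."""
    t = np.arange(len(x))
    rx = x - np.polyval(np.polyfit(t, x, 1), t)  # residuals of x
    ry = y - np.polyval(np.polyfit(t, y, 1), t)  # residuals of y
    return np.corrcoef(rx, ry)[0, 1]

# Illustrative synthetic data: two series that share a trend but have
# independent year-to-year fluctuations.
rng = np.random.default_rng(0)
t = np.arange(60)
emissions = 0.1 * t + rng.normal(0, 1, 60)     # trending series
co2_change = 0.1 * t + rng.normal(0, 1, 60)    # same trend, independent noise

print(np.corrcoef(emissions, co2_change)[0, 1])      # high: driven by the shared trend
print(detrended_correlation(emissions, co2_change))  # near zero: no annual-scale link
```

The point of the sketch: two series that share a trend show a high raw correlation, but once each is detrended the year-to-year correlation collapses unless a genuine annual-scale relationship exists.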
A more comprehensive study of the relationship between emissions and changes in atmospheric CO2 concentration is presented in yet another related post on this site:
LINK: https://tambonthongchai.com/2020/11/21/the-case-against-fossil-fuels/
The results of the analysis presented in these related posts show that there is no evidence that atmospheric CO2 concentration is responsive to fossil fuel emissions at an annual time scale. The observation by climate science that atmospheric CO2 has been rising at a time of fossil fuel emissions is a case of concurrence, and concurrence does not serve as evidence for causation. No evidence for this critical causation is found in climate science or in this lecture.


ITEM#5: CLEAN ENERGY: At the end of a lecture on how fossil fuel emissions cause warming, with the implication that warming can be hazardous and that the rate of warming can and must be attenuated by giving up fossil fuel energy and moving the world energy infrastructure to renewables, the lecturer notes with great pleasure that the world, and the lecturer’s home state of Texas, are moving away from fossil fuels to CLEAN ENERGY.
COMMENT#5: CLEAN ENERGY: That the relevant issue here is cleanliness and not climate change suggests that the underlying activism in climate activism is anti-fossil-fuel activism and not climate science. This aspect of climate science is explored in a related post: LINK: https://tambonthongchai.com/2020/03/23/anti-fossil-fuel-activism-disguised-as-climate-science/



FOOTNOTE: THE SOLAR CYCLE

In a sunspot cycle presentation, Professor Hayhoe displays the chart shown above, where a span of 40 years is presented to demonstrate the absence of responsiveness of temperature to the sunspot cycle. We show in a related post, both in terms of data and in terms of bibliography, that 40 years is not a sufficient time span for this analysis. LINK: https://tambonthongchai.com/2019/06/19/sunspots/ . There we show that the temperature relevance of the sunspot cycle must be studied over very long time spans, longer than 100 years.
BIBLIOGRAPHY: PRECAUTIONARY PRINCIPLE IN THE ENVIRONMENTAL SCIENCES
- Foster, Kenneth R., Paolo Vecchia, and Michael H. Repacholi. “Science and the precautionary principle.” Science 288.5468 (2000): 979-981. The Precautionary Principle has become enshrined in international law, and is the basis for European environmental legislation. However, “precautionary” decisions have been controversial, and the principle itself lacks clear definition. A recent commentary by the European Commission offers guidelines for politically transparent application of the principle, while emphasizing the need for careful review of relevant scientific data. Recent precautionary policies for limiting public exposure to radio-frequency energy show that the principle can be applied in a way that does not conflict with traditional exposure guidelines. Major uncertainties still remain in the standard of proof needed to invoke the principle.
- Kriebel, David, et al. “The precautionary principle in environmental science.” Environmental health perspectives 109.9 (2001): 871-876. Environmental scientists play a key role in society’s responses to environmental problems, and many of the studies they perform are intended ultimately to affect policy. The precautionary principle, proposed as a new guideline in environmental decision making, has four central components: taking preventive action in the face of uncertainty; shifting the burden of proof to the proponents of an activity; exploring a wide range of alternatives to possibly harmful actions; and increasing public participation in decision making. In this paper we examine the implications of the precautionary principle for environmental scientists, whose work often involves studying highly complex, poorly understood systems, while at the same time facing conflicting pressures from those who seek to balance economic growth and environmental protection. In this complicated and contested terrain, it is useful to examine the methodologies of science and to consider ways that, without compromising integrity and objectivity, research can be more or less helpful to those who would act with precaution. We argue that a shift to more precautionary policies creates opportunities and challenges for scientists to think differently about the ways they conduct studies and communicate results. There is a complicated feedback relation between the discoveries of science and the setting of policy. While maintaining their objectivity and focus on understanding the world, environmental scientists should be aware of the policy uses of their work and of their social responsibility to do science that protects human health and the environment. The precautionary principle highlights this tight, challenging linkage between science and policy.
- Smith, Roy L. “Risk-based concentrations: prioritizing environmental problems using limited data.” Toxicology 106.1-3 (1996): 243-266. A difficult task faced by regulatory agencies is that of choosing, on the basis of limited data, which environmental problems to address. This paper incorporates USEPA risk assessment methods into a quantitative approach for prioritizing locations, contaminants and media according to potential health risk. USEPA has developed either a reference dose (a chronic dose without adverse effect) or slope factor (upper bound lifetime cancer risk per mg · kg−1 · d−1) for many substances. This work combines these ‘toxicological constants’ with predetermined risk levels (either a 10−6 cancer risk or a chronic intake equal to the reference dose) and protective human exposure assumptions (e.g. 70-kg body mass, 30-year exposure, 2-l · d−1 drinking water ingestion, etc.) to produce risk-based concentrations for 596 contaminants in air, drinking water, edible fish and soil. Because USEPA designed its methods to estimate upper bound risks, these risk-based concentrations are likely to be protective of human health. Regulatory officials can use this information to calculate numerical ratios between measured environmental levels and risk-based concentrations. These ratios serve as a surrogate for potential health impacts and can be used to prioritize problems for attention. Ratio calculation and ranking can be automated for searches of computerized environmental databases.
- Goldstein, Bernard D. “The precautionary principle also applies to public health actions.” American Journal of Public Health 91.9 (2001): 1358-1361. The precautionary principle asserts that the burden of proof for potentially harmful actions by industry or government rests on the assurance of safety and that when there are threats of serious damage, scientific uncertainty must be resolved in favor of prevention. Yet we in public health are sometimes guilty of not adhering to this principle. Examples of actions with unintended negative consequences include the addition of methyl tert-butyl ether to gasoline in the United States to decrease air pollution, the drilling of tube wells in Bangladesh to avoid surface water microbial contamination, and villagewide parenteral antischistosomiasis therapy in Egypt. Each of these actions had unintended negative consequences. Lessons include the importance of multidisciplinary approaches to public health and the value of risk–benefit analysis, of public health surveillance, and of a functioning tort system—all of which contribute to effective precautionary approaches. PUBLIC HEALTH ADVOCATES around the world have increasingly invoked the precautionary principle as a basis for preventive actions.1–9 This has been particularly true for environmental and food safety issues, in which the precautionary principle has moved from being a rallying cry for environmental advocates to a legal principle embodied in international treaties.2,6,8–11 Definitional issues have become more important as the term has made the transition from a noble goal to a component of legal requirements. For the purposes of this commentary, a useful definition is one that is contained in the 1989 Rio Declaration12: “Nations shall use the precautionary approach to protect the environment. Where there are threats of serious or irreversible damage, scientific uncertainty shall not be used to postpone cost-effective measures to prevent environmental degradation.” The upsurge in use of the term “precautionary principle” has been relatively sudden. For example, changes in the approach to hazardous air pollutants in the 1990 US Clean Air Act Amendments embody the precautionary principle. Until then, control of individual air pollutants in this category depended on a risk-based approach in which the burden of proof was on the US Environmental Protection Agency (EPA) to demonstrate that environmental levels of the air pollutant were likely to produce adverse effects. Further, the extent of imposed control measures was based on the feasibility of reducing risk. Instead, the 1990 amendments state that maximal available control technology is to be used on each of more than 180 pollutants unless the pollutant can be clearly shown to be harmless. Shifting the burden of proof and moving away from risk science to a technology-based approach were much debated at the time, but the term “precautionary principle” was not part of the debate. Now it certainly would be, although whether this precautionary approach will be more successful than the previous risk-based approach is still open to debate. For example, germane to the broader issue of the value of the precautionary principle is the question of whether regulating specific air pollutant emission control technology will stifle the invention and application of newer, more effective technology. 
At its core, the precautionary principle contains many of the attributes of good public health practice, including a focus on primary prevention and a recognition that unforeseen and unwanted consequences of human activities are not unusual. Yet there are at least 3 recently reported examples of actions taken in the name of improving public health that would better have been avoided or at least considered more carefully beforehand. I argue that the precautionary principle needs to be applied to public health actions as well as to actions pursued by government and industry for competitive and economic reasons. It is not my intention to provide a well-rounded critique of the precautionary principle, which is discussed by Kriebel and Tickner13 and by Jamieson and Wartenburg14 in this issue of the Journal.
BIBLIOGRAPHY: DATA SELECTION BIAS
- Ambroise, Christophe, and Geoffrey J. McLachlan. “Selection bias in gene extraction on the basis of microarray gene-expression data.” Proceedings of the national academy of sciences 99.10 (2002): 6562-6566. In the context of cancer diagnosis and treatment, we consider the problem of constructing an accurate prediction rule on the basis of a relatively small number of tumor tissue samples of known type containing the expression data on very many (possibly thousands) genes. Recently, results have been presented in the literature suggesting that it is possible to construct a prediction rule from only a few genes such that it has a negligible prediction error rate. However, in these results the test error or the leave-one-out cross-validated error is calculated without allowance for the selection bias. There is no allowance because the rule is either tested on tissue samples that were used in the first instance to select the genes being used in the rule or because the cross-validation of the rule is not external to the selection process; that is, gene selection is not performed in training the rule at each stage of the cross-validation process. We describe how in practice the selection bias can be assessed and corrected for by either performing a cross-validation or applying the bootstrap external to the selection process. We recommend using 10-fold rather than leave-one-out cross-validation, and concerning the bootstrap, we suggest using the so-called .632+ bootstrap error estimate designed to handle overfitted prediction rules. Using two published data sets, we demonstrate that when correction is made for the selection bias, the cross-validated error is no longer zero for a subset of only a few genes.
- Reid, J. Leighton, Matthew E. Fagan, and Rakan A. Zahawi. “Positive site selection bias in meta-analyses comparing natural regeneration to active forest restoration.” Science advances 4.5 (2018): eaas9143. Several recent meta-analyses have aimed to determine whether natural regeneration is more effective at recovering tropical forests than active restoration (for example, tree planting). We reviewed this literature and found that comparisons between strategies are biased by positive site selection. Studies of natural forest regeneration are generally conducted at sites where a secondary forest was already present, whereas tree planting studies are done in a broad range of site conditions, including non-forested sites that may not have regenerated in the absence of planting. Thus, a level of success in forest regeneration is guaranteed for many studies representing natural regeneration, but not for those representing active restoration. The complexity of optimizing forest restoration is best addressed by paired experimentation at the same site, replicated across landscapes. Studies that have taken this approach reach different conclusions than those arising from meta-analyses; the results of paired experimental comparisons emphasize that natural regeneration is a highly variable process and that active restoration and natural regeneration are complementary strategies.
- Bevan, Shaun, et al. “Understanding selection bias, time-lags and measurement bias in secondary data sources: Putting the Encyclopedia of Associations database in broader context.” Social science research 42.6 (2013): 1750-1764. Secondary data gathered for purposes other than research play an important role in the social sciences. A recent data release has made an important source of publicly available data on associational interests, the Encyclopedia of Associations (EA), readily accessible to scholars (www.policyagendas.org). In this paper we introduce these new data and systematically investigate issues of lag between events and subsequent reporting in the EA, as these have important but under-appreciated effects on time-series statistical models. We further analyze the accuracy and coverage of the database in numerous ways. Our study serves as a guide to potential users of this database, but we also reflect upon a number of issues that should concern all researchers who use secondary data such as newspaper records, IRS reports and FBI Uniform Crime Reports.
- Whitehead, John C. “Environmental interest group behavior and self‐selection bias in contingent valuation mail surveys.” Growth and Change 22.1 (1991): 10-20. This paper analyzes the behavior of a general sample and an environmental interest group sample in a contingent market for wetlands preservation. Mail survey response rates and environmental values for wetlands preservation are significantly greater in the environmental interest group sample than in the general population sample. An estimate of the potential self‐selection bias in the benefits of wetlands preservation is made. These results suggest that self‐selection bias in contingent valuation mail surveys could upwardly bias aggregate benefit estimates as much as 50 percent. Potential, but costly, solutions to the problem of self‐selection bias are suggested.
ANTHROPOCENE: THE EVIDENCE
Posted December 24, 2020
THIS POST IS A CRITICAL REVIEW OF AN ARTICLE IN THE CONVERSATION WHERE MARK MASLIN PRESENTS A REVISED VERSION OF HIS ANTHROPOCENE THEORY IN WHICH THE BEGINNING OF THE ANTHROPOCENE IS MOVED FROM THE INDUSTRIAL REVOLUTION BACK TO THE 15TH CENTURY HORSE AND BUGGY DAYS WHEN THE GLOBAL NORTH HAD STARTED ITS COLONIZATION OF THE GLOBAL SOUTH. LINK: https://theconversation.com/why-the-anthropocene-began-with-european-colonisation-mass-slavery-and-the-great-dying-of-the-16th-century-140661 . THE RATIONALE FOR THIS CHANGE IS PROVIDED IN TERMS OF THE TOPPLING OF STATUES IN THE USA DURING THE BLACK LIVES MATTER RIOTS. QUOTE: “The toppling of statues at Black Lives Matter protests has powerfully articulated that the roots of modern racism lie in European colonization and slavery”. A further argument is offered as follows: “Racism will be more forcefully opposed once we acknowledge this history and learn from it“. This is how racism and the Black Lives Matter movement have changed the theory of the Anthropocene.

Why the Anthropocene began with European colonisation, mass slavery and the ‘great dying’ of the 16th century. June 25, 2020 8.23pm AEST. Authors: Mark Maslin, Professor of Earth System Science, UCL, and Simon Lewis, Professor of Global Change Science at the University of Leeds and UCL.
About the authors:
Mark Maslin is a Founding Director of Rezatec Ltd, Co-Director of The London NERC Doctoral Training Partnership and a member of the Cheltenham Science Festival Advisory Committee. He is an unpaid member of the Sopra-Steria CSR Board. He has received grant funding in the past from the NERC, EPSRC, ESRC, Royal Society, DFID, DECC, FCO, Innovate UK, Carbon Trust, UK Space Agency, European Space Agency, Research England, Wellcome Trust, Leverhulme Trust and British Council. He has received research funding in the past from The Lancet, Laithwaites, Seventh Generation, Channel 4, JLT Re, WWF, Hermes, CAFOD and the Royal Institute of Chartered Surveyors.
Simon Lewis has received funding from Natural Environment Research Council, the Royal Society, the European Union, the Leverhulme Trust, the Centre for International Forestry, National Parks Agency of Gabon, Microsoft Research, the Gordon and Betty Moore Foundation, the Greenpeace Fund, and the David and Lucile Packard Foundation.
PART-1: WHAT THE SOURCE PAPER SAYS
The toppling of statues at Black Lives Matter protests has powerfully articulated that the roots of modern racism lie in European colonisation and slavery. Racism will be more forcefully opposed once we acknowledge this history and learn from it. Geographers and geologists can help contribute to this new understanding of our past, by defining the new human-dominated period of Earth’s history as beginning with European colonialism.
Today our impacts on the environment are immense: humans move more soil, rock and sediment each year than is transported by all other natural processes combined. We may have kicked off the sixth “mass extinction” in Earth’s history, and the global climate is warming so fast we have delayed the next ice age. We’ve made enough concrete to cover the entire surface of the Earth in a layer two millimetres thick. Enough plastic has been manufactured to clingfilm it as well. We annually produce 4.8 billion tonnes of our top five crops and 4.8 billion livestock animals. There are 1.4 billion motor vehicles, 2 billion personal computers, and more mobile phones than the 7.8 billion people on Earth. All this suggests humans have become a geological superpower.
The evidence of our impact will be visible in rocks millions of years from now. This is a new geological epoch that scientists are calling the Anthropocene, combining the words for “human” and “recent-time”. But debate still continues as to when we should define the beginning of this period. When exactly did we leave behind the Holocene – the 10,000 years of stability that allowed farming and complex civilisations to develop – and move into the new epoch? Five years ago we published evidence that the start of capitalism and European colonisation meet the formal scientific criteria for the start of the Anthropocene.
Our planetary impacts have increased since our earliest ancestors stepped down from the trees, at first by hunting some animal species to extinction. Much later, following the development of farming and agricultural societies, we started to change the climate. Yet Earth only truly became a “human planet” with the emergence of capitalism, which itself grew out of European expansion in the 15th and 16th century and the era of colonisation and subjugation of indigenous peoples all around the world.
In the Americas, just 100 years after Christopher Columbus first set foot on the Bahamas in 1492, 56 million indigenous Americans were dead, mainly in South and Central America. This was 90% of the population. Most were killed by diseases brought across the Atlantic by Europeans, which had never been seen before in the Americas: measles, smallpox, influenza, the bubonic plague. War, slavery and wave after wave of disease combined to cause this “great dying”, something the world had never seen before, or since. In North America the population decline was slower but no less dramatic, due to slower colonisation by Europeans. US census data suggest the Native American population may have been as low as 250,000 people by 1900, down from a pre-Columbus level of 5 million, a 95% decline. This depopulation left the continents dominated by Europeans, who set up plantations and filled a labour shortage with enslaved workers. In total, more than 12 million people were forced to leave Africa and work for Europeans as slaves.
One further impact of the great dying was that there were at first very few farmers left to manage the fields and forests. Our image of the Native American hunting buffalo on horseback is false – those who adopted this new lifestyle only did so because they had been forced off their land by the European invaders, who also brought with them the horse. Most pre-Columbus indigenous Americans were farmers. In their absence, previously managed landscapes returned to their natural states, with new trees absorbing carbon from the atmosphere. So large was this carbon uptake that there is a drop in atmospheric carbon dioxide recorded in Antarctic ice cores, centred around the year 1610.
The deadly diseases hitched a ride on new shipping routes, as did many other plants and animals. This reconnecting of the continents and ocean basins for the first time in 200 million years has set Earth on a new developmental trajectory. The ongoing mixing and re-ordering of life on Earth will be seen in future rocks millions of years in the future. The drop in carbon dioxide at 1610 provides a first marker in a geological sediment associated with this new global, more homogeneous, ecology, and so provides a sensible start date for the new Anthropocene epoch.
In addition to the critical task of highlighting and tackling the racism within science, perhaps geologists and geographers can also make a small contribution to the Black Lives Matter movement by unflinchingly compiling the evidence showing that when humans started to exert a huge influence on the Earth’s environment was also the start of the brutal European colonisation of the world.
In her insightful book, A Billion Black Anthropocenes or None, the geography professor Kathryn Yusoff makes it very clear that predominantly white geologists and geographers need to acknowledge that Europeans decimated indigenous and minority populations whenever so-called progress occurred.
Defining the start of the human planet as the period of colonisation, the spread of deadly diseases and transatlantic slavery, means we can face the past and ensure we deal with its toxic legacy. If 1610 marks both a turning point in human relations with the Earth and our treatment of each other, then maybe, just maybe, 2020 could mark the start of a new chapter of equality, environmental justice and stewardship of the only planet in the universe known to harbour any life. It’s a struggle nobody can afford to lose.

PART-2: CRITICAL COMMENTARY
(1) THE DROP IN CARBON DIOXIDE IN 1610 IS ATTRIBUTED TO A DECLINE IN NATIVE FARMING AS A RESULT OF A RAPID DECLINE IN THE POPULATION OF NATIVE FARMERS BECAUSE OF A LARGE FATALITY RATE FROM DISEASES BROUGHT TO THE NEW WORLD BY THE EUROPEANS. THE LARGE AREAS OF LAND LEFT FALLOW TO RETURN TO FOREST ARE SAID TO HAVE CAUSED A DROP IN ATMOSPHERIC CO2 BY WAY OF PHOTOSYNTHESIS. AN ANALYSIS OF THIS HYPOTHESIS IS PRESENTED IN A RELATED POST: LINK: https://tambonthongchai.com/2019/04/23/european-colonization-of-america-the-lia/ . THE AREA OF LAND INVOLVED IS 556,847 SQ KM, EQUAL TO ABOUT 0.374% OF THE WORLD’S LAND AREA. SINCE ABOUT 60% OF THE WORLD’S PHOTOSYNTHESIS IS IN THE OCEAN, LAND ACCOUNTS FOR ONLY 40%, SO 0.374% OF THE WORLD’S LAND REPRESENTS ABOUT 0.374% x 0.4 = 0.15% OF GLOBAL PHOTOSYNTHESIS AND COULD EXPLAIN A CHANGE OF NO MORE THAN 0.15%. IT IS NOT POSSIBLE FOR THAT PHOTOSYNTHESIS EFFECT TO EXPLAIN THE CHANGE SEEN IN THE LAW DOME CO2 DATA.
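The percentages above can be checked with back-of-envelope arithmetic. A sketch follows, using the land-area figure quoted in the text plus a standard reference value of about 149 million sq km for the world's land area (our assumption):

```python
# Back-of-envelope check of the photosynthesis percentages quoted above.
reforested_km2 = 556_847            # area of abandoned farmland (from the text)
world_land_km2 = 149_000_000        # standard value for Earth's land area (assumed)
land_share = reforested_km2 / world_land_km2            # ~0.374% of land area
land_fraction_of_photosynthesis = 0.40                  # ~60% of photosynthesis is oceanic
effect = land_share * land_fraction_of_photosynthesis   # ~0.15% of global photosynthesis
print(f"{land_share:.3%}  {effect:.3%}")                # 0.374%  ~0.15%
```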


(2) THE PROPOSITION THAT COLONIZATION AND SLAVERY FROM THE 15TH TO THE 18TH CENTURIES BY THE EUROPEANS, AND THE OPPOSITION TO SLAVERY BY THE BLACK LIVES MATTER MOVEMENT OF THE 21ST CENTURY, CREATED A HUMAN CONTROLLED PLANET AND THE ERA OF THE ANTHROPOCENE OVERLOOKS EARLIER COLONIZATION AND SLAVERY EVENTS IN THE LONG HISTORY OF HUMAN CIVILIZATION, A HISTORY THAT GOES BACK THOUSANDS OF YEARS TO THE BRONZE AGE AND WELL BEYOND TO MESOPOTAMIA. THOSE INVASIONS, CONQUESTS, AND ENSLAVEMENT EVENTS WENT WELL BEYOND THE COMPARATIVELY MILD AND HUMANE VERSION OF THEM WE SEE IN OUR ERA, THE LATE IRON AGE.
CONQUEST AND SLAVERY ARE NOT AN INNOVATION OF LATE IRON AGE EUROPEAN CIVILIZATION, AND THE MOVEMENT AGAINST THEIR INHUMANE CRUELTY IS NOT AN INNOVATION OF THE BLACK LIVES MATTER MOVEMENT. IF THESE THINGS CREATE AN ANTHROPOCENE, THEN THE ANTHROPOCENE IS THOUSANDS OF YEARS OLD.


(3) IN EVALUATING THE CLAIM THAT HUMANS MOVE MORE EARTH THAN NATURE DOES, WE FIND THAT HUMANS MOVE ABOUT 45 GIGATONS OF EARTH PER YEAR. A SINGLE VOLCANIC ERUPTION MOVES ABOUT 3 TO 4 GIGATONS OF TEPHRA, SAY 3.5 GIGATONS ON AVERAGE. TYPICALLY THERE ARE ABOUT 60 VOLCANIC ERUPTIONS PER YEAR, FOR A TOTAL AMOUNT OF TEPHRA MOVED OF ABOUT 210 GIGATONS. CONCLUSION: HUMANS DO NOT MOVE MORE DIRT THAN NATURE DOES. AND THAT’S ONLY LAND VOLCANOES: MORE THAN 80% OF THE WORLD’S VOLCANIC ACTIVITY IS SUBMARINE.
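The earth-moving comparison, as arithmetic (a sketch using only the figures quoted above):

```python
# The tephra comparison above, as arithmetic (figures from the text).
human_gt_per_year = 45               # gigatons of earth moved by humans per year
tephra_per_eruption_gt = 3.5         # midpoint of the 3-4 gigaton range
eruptions_per_year = 60
volcanic_gt_per_year = tephra_per_eruption_gt * eruptions_per_year
print(volcanic_gt_per_year)                      # 210 gigatons per year
print(volcanic_gt_per_year / human_gt_per_year)  # ~4.7 times the human figure
```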

(4) AS FOR CONCRETE, HUMANS MAKE 10 GIGATONS OF CONCRETE A YEAR, BUT NATURE MAKES 210 GIGATONS OF TEPHRA FROM VOLCANOES, AND AN EVEN GREATER FLOW OF MATERIAL IS FOUND IN MUD VOLCANOES. GEOLOGY IS BIGGER THAN HUMANS BY MANY ORDERS OF MAGNITUDE AND HAS BEEN AROUND A LOT LONGER BY MANY ORDERS OF MAGNITUDE. HUMAN ACTIVITY IS INSIGNIFICANT IN THE CONTEXT OF THE GEOLOGICAL FORCES THAT HAVE CREATED THE WORLD AS WE KNOW IT. WE JUST LIVE HERE. WE ARE NOT THE MASTERS AND CARETAKERS OF THE PLANET, ALTHOUGH IT IS TRUE THAT THE BIBLE GIVES HUMANS DOMINION OVER NATURE.

CONCLUSIONS:
IF BLACK LIVES MATTER IT IS UP TO THE BLACKS TO MAKE BLACK LIVES MATTER. THE WHITE MAN’S IDEA EXPRESSED IN THIS PAPER BY TWO WHITE MEN THAT IT IS THE WHITE MAN’S JOB TO TAKE CARE OF THE BLACKS AND MAKE SURE THAT THEIR LIVES MATTER IS ITSELF RACIST. IT CONTAINS THE ASSUMPTION THAT BLACK LIVES DON’T MATTER BUT THAT’S OK BECAUSE THE WHITE MAN WILL TAKE CARE OF THEM AND CODDLE THEM WITH WEIRD WHITE MAN GRANDEUR LIKE THE ANTHROPOCENE WHERE THE WHITE MAN IMAGINES THAT HE IS BIGGER THAN GEOLOGY.
ALSO, COLONIZATION OF THE WORLD BY THE WHITES WAS NOT LIMITED TO BLACKS BUT INCLUDED BROWNS AND REDS AND YELLOWS. IF THE CRITERION OF WHETHER THE LIVES OF PEOPLE OF A CERTAIN SKIN COLOR MATTER IS DETERMINED BY EUROPEAN COLONIZATION, THEN WHY WERE THE OTHER SKIN COLORS LEFT OUT?
RACISM RUNS DEEP AND IT MAKES ITS CASE IN SUBTLE WAYS WHEREIN THE WHITE MAN CAN IMAGINE THAT HE CONTROLS THE CLIMATE AND EVEN THE GEOLOGICAL FORCES OF THE PLANET BUT SO DEEP IS HIS KINDNESS AND SO UNLIMITED IS HIS POWER AND LARGESSE THAT HE CAN CODDLE AND TAKE CARE OF THE LESSER RACES.

RELATED POSTS ON HOW THE WHITE MAN USES THE RACISM ISSUE TO PUSH THE CLIMATE CHANGE AGENDA
RELATED POSTS ON RACISM
#1: RACISM & CLIMATE CHANGE: https://tambonthongchai.com/2020/06/11/racism-and-climate-change/

#2: CLIMATE CHANGE RACISM: https://tambonthongchai.com/2021/03/19/climate-change-racism/

#3: CLIMATE MIGRATION RACISM: https://tambonthongchai.com/2021/04/03/divine-climate-activism/


THIS POST IS A CRITICAL REVIEW OF AN ARTICLE BY THE INTERNATIONAL SUPPORT NETWORK FOR AFRICAN DEVELOPMENT (ISNAD) ON “HEALING THE OZONE LAYER”. LINK: https://isnad-africa.org/2020/09/17/healing-the-ozone-layer/

PART-1: WHAT THE SOURCE ARTICLE SAYS
The third law of motion by Sir Isaac Newton says “For every action there is an equal and opposite reaction.” However true this law might be, it makes for sad reading when applied in the context of man’s activities in the environment. What the globe is experiencing today stems from man’s past activities in the environment, undertaken without consideration of the consequences that might follow.
The effects of climate change, which are slowly taking their toll in many different parts of the globe today, exist because man has been a bad steward of the environment. Man’s actions, such as deforestation, charcoal burning and removing vegetation from the environment, have brought this impasse before the world, which was once a safe haven.
Natural disasters are now more prominent than ever because the effects of not having balanced atmospheric conditions have led to rising sea levels, which result in storms, floods and cyclones, and to erratic rains, which result in partial or full droughts. The floods and storms that from time to time besiege the world today would have been preventable had the warnings of the climate experts been heeded. As things stand, a year hardly goes by without a natural disaster attributed to the effects of change in climatic conditions. Poor environmental management can be termed a dormant volcano waiting to erupt at any moment.
One of the most critical topics in the management of the environment is that of deforestation for charcoal burning purposes. These processes produce gases that contribute to the weakening of the earthly shield against the dangerous rays from the sun. The more trees are cut, the more the earth loses its ability to stop the rise in earth temperatures. When the trees are cut, the earth loses its shield from direct sunlight, making it easy for the sun’s rays to penetrate and hit the ground, causing a rise in earthly temperatures. This has weakened the ability of the ozone layer to shield the earth from the high sun rays. The ozone layer is an important part of the atmosphere. This layer helps protect life on Earth from harmful radiation given off by the Sun.
The ozone layer or ozone shield is a region of Earth’s stratosphere that absorbs most of the Sun’s ultraviolet radiation. It contains a high concentration of ozone (O3) in relation to other parts of the atmosphere, although still small in relation to other gases in the stratosphere. The ozone layer contains less than 10 parts per million of ozone, while the average ozone concentration in Earth’s atmosphere as a whole is about 0.3 parts per million. The ozone layer is mainly found in the lower portion of the stratosphere, from approximately 15 to 35 kilometers above Earth, although its thickness varies seasonally and geographically.
The natural disasters that have besieged man and the world at large have made man come to the realisation that action needs to be taken to start the healing of the ozone layer. The adverse effects of the changing climate are a sad reality right before our eyes. For example, industrialization is credited for the massive developments the world has seen over the years. Since the beginning of the Industrial Revolution in the mid-1700s, however, human activities have added more and more of these gases into the atmosphere. The levels of carbon dioxide, a powerful greenhouse gas, have risen by 35 percent since 1750, largely from the burning of fossil fuels such as coal, oil, and natural gas. With more greenhouse gases in the mix, the atmosphere acts like a thickening blanket and traps more heat. This has led to rising temperatures the world over. More actions have been taken to protect the earth by focusing on the protection of the ozone layer. The Montreal Protocol on Substances that Deplete the Ozone Layer, also known simply as the Montreal Protocol, is an international treaty designed to protect the ozone layer by phasing out the production of numerous substances that are responsible for ozone depletion.
The ozone layer can be depleted by free radical catalysts, including nitric oxide, nitrous oxide, hydroxyl, atomic chlorine and atomic bromine. While there are natural sources for all of these species, the concentrations of chlorine and bromine have increased markedly in recent decades because of the release of large quantities of man-made organohalogen compounds, especially chlorofluorocarbons and bromofluorocarbons. These highly stable compounds are capable of surviving the rise to the stratosphere, where chlorine (Cl) and bromine (Br) radicals are liberated by the action of ultraviolet light. Each radical is then free to initiate and catalyze a chain reaction capable of breaking down over 100,000 ozone molecules.
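For reference, the chain reaction described here is the standard chlorine catalytic cycle of textbook stratospheric chemistry (our summary, not text from the source article):

```latex
\begin{align*}
\mathrm{Cl} + \mathrm{O_3} &\rightarrow \mathrm{ClO} + \mathrm{O_2} \\
\mathrm{ClO} + \mathrm{O} &\rightarrow \mathrm{Cl} + \mathrm{O_2} \\
\text{net:}\quad \mathrm{O_3} + \mathrm{O} &\rightarrow 2\,\mathrm{O_2}
\end{align*}
```

Because the chlorine atom is regenerated in the second step, a single radical can cycle through and destroy a very large number of ozone molecules.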
Signed on 16 September 1987, it was made pursuant to the 1985 Vienna Convention for the Protection of the Ozone Layer, which established the framework for international cooperation in addressing ozone depletion. The Montreal Protocol entered into force on 1 January 1989, and has since undergone nine revisions, in 1990 (London), 1991 (Nairobi), 1992 (Copenhagen), 1993 (Bangkok), 1995 (Vienna), 1997 (Montreal), 1998 (Australia), 1999 (Beijing) and 2016 (Kigali). The concern of the world over the years has been to secure the recovery of the ozone layer through the constant revision of the Montreal Protocol.
Since the Montreal Protocol came into effect, the atmospheric concentrations of the most important chlorofluorocarbons and related chlorinated hydrocarbons have either leveled off or decreased. Halon concentrations have continued to increase, as the halons presently stored in fire extinguishers are released, but their rate of increase has slowed and their abundances are expected to begin to decline by about 2020. Going forward, the concern has been to ensure that existing restrictions on ozone-depleting substances are properly implemented and that global use of ozone-depleting substances continues to be reduced. This reduction is what the world needs for the full recovery of the ozone layer. Some of the efforts that have been implemented include: ensuring that banks of ozone-depleting substances (both in storage and contained in existing equipment) are dealt with in an environmentally friendly manner and are replaced with climate-friendly alternatives; ensuring that permitted uses of ozone-depleting substances are not diverted to illegal uses; and reducing the use of ozone-depleting substances in applications that are not considered as consumption.
The theme for 2020 is ’32 Years and Healing’, celebrating over three decades of remarkable international cooperation to protect the ozone layer and the climate under the Montreal Protocol. This needs the continued efforts of everyone. Therefore, the United Nations saw it fit to reserve a day for people from all walks of life to celebrate Ozone Layer Day and to raise awareness of the critical duty we have of helping in the healing of the ozone layer. The world is currently in the intensive care unit of the climate crisis, but with constant and careful implementation of the Montreal Protocol, full recovery becomes ever more feasible.
RELATED POST ON YET ANOTHER ATTEMPT TO PRESENT THE MONTREAL PROTOCOL AS A TRIUMPH OF ENVIRONMENTALISM THAT MUST NOW CONTINUE ON TO THE CLIMATE CHANGE ISSUE: LINK: https://tambonthongchai.com/2020/12/21/ozone-hole-environmentalism/

PART-2: CRITICAL COMMENTARY
THE THEORY OF ANTHROPOGENIC OZONE DEPLETION BY WAY OF CFC OR HFC IMPLIES A LONG TERM DECLINING TREND IN GLOBAL MEAN TOTAL COLUMN OZONE, BUT NO EMPIRICAL EVIDENCE FOR SUCH A TREND HAS EVER BEEN PRESENTED BECAUSE NO SUCH TREND IS FOUND IN THE DATA. INSTEAD, THE ONLY DATA PRESENTED ARE PERIODIC, LOCALIZED, AND EPISODIC OZONE DEPLETION EVENTS ABOVE THE SOUTH POLE. THESE EVENTS ARE DESCRIBED AS “OZONE HOLES” AND PRESENTED AS EVIDENCE OF GLOBAL OZONE DEPLETION THAT CAN CAUSE SKIN CANCER ANYWHERE IN THE WORLD. NONE OF THIS IS TRUE. THE OZONE HOLE PHENOMENON HAS NO OZONE DEPLETION INTERPRETATION.
IN RELATED POSTS ON THIS SITE WE PRESENT DATA FOR GLOBAL MEAN TOTAL COLUMN OZONE FOR TIME SPANS OF 50 YEARS OR MORE. THERE IS NO EVIDENCE OF OZONE DEPLETION IN THE DATA. BELOW WE PRESENT A LIST OF LINKS TO OZONE DEPLETION POSTS ON THIS SITE FOLLOWED BY THE DATA FROM ONE OF THOSE POSTS AND A BRIEF SUMMARY OF OZONE DEPLETION CHEMISTRY.

PART-3: LINKS TO POSTS AT THIS SITE ON THE OZONE DEPLETION ISSUE
(1): HISTORY OF THE OZONE DEPLETION SCARE: LINK: https://tambonthongchai.com/2018/08/07/history-of-the-ozone-depletion-scare/
(2): CLIMATE CRISIS CONNECTED TO OZONE CRISIS: LINK: https://tambonthongchai.com/2020/03/03/climate-ozone-crisis/
(3): THE OZONE MYSTERY DEEPENS: LINK: https://tambonthongchai.com/2020/03/27/ozone-mystery/
(4): OZONE DEPLETION PART1: LINK: https://tambonthongchai.com/2020/09/30/ozone-depletion-part-1/
(5): OZONE DEPLETION PART2: LINK: https://tambonthongchai.com/2020/09/30/ozone-depletion-part-2/
(6): OZONE DEPLETION PART3: LINK: https://tambonthongchai.com/2020/10/01/ozone-depletion-part-3/
(7): THE OZONE HOLE OF 2020: LINK: https://tambonthongchai.com/2020/11/04/the-ozone-hole-of-2020/
(8): LEARNING FROM A HEALING OZONE HOLE: LINK: https://tambonthongchai.com/2020/11/30/the-unep-healed-the-ozone-hole/
(9): OZONE HOLE ENVIRONMENTALISM: LINK: https://tambonthongchai.com/2020/12/21/ozone-hole-environmentalism/
(10): OZONE DEPLETION CHEMISTRY: LINK: https://tambonthongchai.com/2018/04/01/ozone-depletion-and-ozone-holes/
(11): REMEMBERING MARIO MOLINA: LINK: https://tambonthongchai.com/2020/11/24/an-ode-to-mario-molina/
PLASTIC POLLUTION #2
Posted December 24, 2020
THIS POST IS A CRITICAL REVIEW OF THE EUROPEAN SCIENCE ARTICLE OF 12/24/2020 THAT THE OCEAN AND THE PLANET ARE ENDANGERED BY HUMANS DUMPING 8 MILLION TONNES OF PLASTIC INTO THE OCEAN EVERY YEAR. LINK TO SOURCE: https://sciencemediahub.eu/2019/01/16/plastisphere-the-oceans-of-plastic/

PART 1: WHAT THE EUROPEAN SCIENCE ARTICLE SAYS
The Oceans of Plastics – Plastisphere: A new term for the marine environment
Every minute, the equivalent of one garbage truck of plastic waste enters the oceans. Plastic pollution is so ubiquitous that scientists have coined a new term for the marine environment: the PLASTISPHERE – AN OCEAN OF PLASTIC. For the last few years, the plastic crisis has been making news headlines around the world. The plastic problem in the oceans and seas is so severe that the United Nations has called it a ‘planetary crisis’ that is ruining marine ecosystems around the world. A total of 8 million (metric) tons of plastic pollution is dumped into the ocean every year by humans. This pollution litters the seafloor and floats on the surface in vast plastic patches, poisoning seabirds and other marine life. By 2050, there will be more plastic than fish in the world’s oceans. Single-use plastic items such as plates, cutlery, straws and cotton buds will be banned in the EU under plans provisionally agreed between the European Parliament and Council on 19 December 2018. Europe now has a legislative model to defend and promote at the international level, given the global nature of the issue of marine pollution involving plastics. This is essential for the planet and this is what millions of concerned Europeans are asking us to do.
The precise danger of microplastics in the deep ocean: Plastic pollution has even reached the deep seas. The biggest issue is the creation of a new habitat where there was none before. The open ocean gyres are the equivalent of deserts in terms of life, so adding a substrate with a nutrient-rich biofilm changes this deeply. We do not understand what the domino effect will be. In the deep sea, microplastics become a food source for organisms large enough to ingest them. And so the same issues regarding microplastics biomagnifying up the food chain apply here as elsewhere in the water column. Organisms and animals on the sea floor are also ingesting microplastics. The result astonished me. Plastic pollution has reached the ends of the Earth.

THE MISSING PLASTIC PROBLEM
Scientists are also trying to solve the so-called ‘missing plastic problem’. In the TOPIOS project, they are creating state-of-the-art hydrodynamic ocean models able to track the movement of plastic through the ocean. We know that the plastic floating on the surface of the ocean represents only a very small percentage of the total amount of plastic going into the oceans every year. This means that as much as 99% of ocean plastic is missing: we don’t know exactly where it is or what damage it is doing. We must reduce the use of plastics.
The Dutch Ocean Cleanup initiative extracts plastic from a giant garbage patch in the Pacific. Although the approach has its merits, it also raises questions, because we should stop the pollution rather than pollute and then clean up. The EU policy is to reduce the use of plastics 25% by 2025, and to strongly stimulate the collection and recycling of beverage packaging. In May 2013, Canadian entrepreneur David Katz founded The Plastic Bank to tackle the plastic crisis with a combination of two components: individual financial incentives and modern technology. Much of the plastic originates in underdeveloped countries with minimal waste management infrastructure, where citizens often survive on less than a dollar a day. I realized we had to challenge our perception of plastic and make it too valuable to simply throw away into a river or stream. The Plastic Bank encourages citizens of poor countries like Haiti and the Philippines to collect plastic waste and deliver it to local processing centers.

PART-2: CRITICAL COMMENTARY
- MASS BALANCE AGAINST FISH, WITH AND WITHOUT CONSIDERATION OF THE “MISSING PLASTIC” ISSUE: THE WEIGHT OF ALL THE WATER IN THE OCEAN IS 1.4E18 TONNES. THERE ARE 2E9 TONNES OF FISH IN THE OCEAN. IF HUMANS DUMP 8 MILLION TONNES OF PLASTIC INTO THE OCEAN EVERY YEAR, IT WILL TAKE 250 YEARS OF CONTINUOUS DUMPING AT THIS RATE TO BREAK EVEN WITH THE WEIGHT OF FISH, WITHOUT CONSIDERATION OF THE MISSING PLASTIC ISSUE. WHEN WE CONSIDER THE MISSING PLASTIC ISSUE, WHERE OCEAN BIOTA ARE ALLOWED TO EAT MICROPLASTICS SO THAT ONLY 1% OF THE DUMPED PLASTIC ACCUMULATES, IT WILL TAKE 25,000 YEARS TO BREAK EVEN WITH THE WEIGHT OF FISH IN THE OCEAN.
- MASS BALANCE AGAINST THE OCEAN, WITH AND WITHOUT CONSIDERATION OF THE “MISSING PLASTIC” ISSUE: THE WEIGHT OF ALL THE WATER IN THE OCEAN IS 1.4E18 TONNES. IF HUMANS DUMP 8 MILLION TONNES OF PLASTIC INTO THE OCEAN EVERY YEAR, IT WILL TAKE 1.75 BILLION YEARS OF CONTINUOUS DUMPING AT THIS RATE FOR THE WEIGHT OF PLASTIC TO REACH 1% OF THE WEIGHT OF THE OCEAN, WITHOUT CONSIDERATION OF THE MISSING PLASTIC ISSUE. WHEN WE CONSIDER THE MISSING PLASTIC ISSUE, WHERE ONLY 1% OF THE DUMPED PLASTIC ACCUMULATES, IT WILL TAKE 175 BILLION YEARS FOR THE WEIGHT OF PLASTIC TO REACH 1% OF THE WEIGHT OF THE OCEAN.
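Both mass balances can be reproduced with a few lines of arithmetic. The sketch below uses the figures quoted above, with the “missing plastic” case modeled as only 1% of the dumped plastic accumulating:

```python
# Both mass balances above, as arithmetic (figures from the text).
ocean_mass_t = 1.4e18       # tonnes of seawater in the ocean
fish_mass_t = 2e9           # tonnes of fish in the ocean
dumping_rate_t = 8e6        # tonnes of plastic dumped per year
retained = 0.01             # "missing plastic": only 1% accumulates

print(fish_mass_t / dumping_rate_t)                       # 250 years to match the fish
print(fish_mass_t / (dumping_rate_t * retained))          # 25,000 years
print(0.01 * ocean_mass_t / dumping_rate_t)               # 1.75e9 years to reach 1% of ocean
print(0.01 * ocean_mass_t / (dumping_rate_t * retained))  # 1.75e11 years (175 billion)
```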
THE HUMANS HAVE OVERESTIMATED THEMSELVES. THERE IS NO ANTHRO-POCENE BECAUSE THE ANTHROS AND ALL THEIR ANTHROPOGENIC ACTIVITIES ARE INSIGNIFICANT ON A PLANETARY SCALE AS DESCRIBED IN A RELATED POST: LINK: https://tambonthongchai.com/2020/07/31/planetary-environmentalism-in-the-anthropocene/
OUR ILLUSORY AND EXAGGERATED SELF IMAGE LIKELY DERIVES FROM THE BIBLE WHERE GOD HAD GIVEN MAN DOMINION OVER NATURE. THE REALITY IS VERY DIFFERENT. WE DON’T EVEN HAVE DOMINION OVER THE MONKEYS.


1.5C BY 2027?
Posted December 23, 2020
THIS POST IS A CRITICAL REVIEW OF AN ARTICLE BY DOWNTOEARTH.ORG FORECASTING THAT THE IPCC 2018 CRITICAL WARMING THRESHOLD OF 1.5C SINCE PRE-INDUSTRIAL WILL BE BREACHED IN 2027-2042, MUCH SOONER THAN THE IPCC ESTIMATE OF 2052.

PART-1: WHAT THE DOWNTOEARTH.ORG ARTICLE SAYS
The world will heat up more than it can take much earlier than anticipated, a group of researchers has said. The planet will breach the threshold of 1.5 degrees Celsius above pre-industrial levels between 2027 and 2042, they said. The United Nations Intergovernmental Panel on Climate Change (IPCC) had estimated that breach to occur between now and 2052. Researchers from McGill University, however, claimed to have introduced a more precise way to project the Earth’s temperature based on historical climate data instead of theoretical relationships, thereby increasing the scope for more accurate calculations as well as predictions. The study was published in Climate Dynamics on December 18, 2020, and according to it the prediction model deployed reduced uncertainties by half compared to the approach used by the IPCC. The IPCC uses General Circulation Models (GCMs), which express wide ranges in overall temperature projections. This makes it difficult to circle outcomes in different climate mitigation scenarios. For example, an IPCC model would predict a temperature increase within a massive range, between 1.9°C and 4.5°C, if carbon dioxide in the atmosphere is doubled. Climate models are mathematical simulations of different factors that interact to affect Earth’s climate, such as the atmosphere, ocean, ice, land surface and the sun. The data is tricky, and predictions can more often than not be inaccurate. “Our approach allows climate sensitivity and its uncertainty to be estimated from direct observations with few assumptions,” said co-author Raphaël Hébert, a former graduate researcher at McGill University. The researchers also found that expected warming was a little lower in this period, by about 10 to 15 per cent. “Climate sceptics have argued that global warming projections are unreliable because they depend on faulty supercomputer models. While these criticisms are unwarranted, they underscore the need for independent and different approaches to predicting future warming,” said co-author Bruno Tremblay, a professor in the Department of Atmospheric and Oceanic Sciences at McGill University. Shaun Lovejoy, professor in the Physics Department at McGill University, said world leaders must stop claiming that their governments’ weak policies could avert climate change.

PART-2: WHAT THE CITED RESEARCH PAPER SAYS
CITATION: Hébert, Raphaël, Shaun Lovejoy, and Bruno Tremblay. “An observation-based scaling model for climate sensitivity estimates and global projections to 2100.” Climate Dynamics (2020): 1-25. ABSTRACT: We directly exploit the stochasticity of the internal variability, and the linearity of the forced response, to make global temperature projections based on historical data and a Green’s function, or Climate Response Function (CRF). To make the problem tractable, we take advantage of the temporal scaling symmetry to define a scaling CRF characterized by the scaling exponent H, which controls the long-range memory of the climate, i.e. how fast the system tends toward a steady-state, and an inner scale τ ≈ 2 years below which the higher-frequency response is smoothed out. An aerosol scaling factor and a non-linear volcanic damping exponent were introduced to account for the large uncertainty in these forcings. We estimate the model and forcing parameters by Bayesian inference, which allows us to analytically calculate the transient climate response and the equilibrium climate sensitivity as 1.7 (+0.3/−0.2) K and 2.4 (+1.3/−0.6) K respectively (likely range). Projections to 2100 according to the RCP 2.6, 4.5 and 8.5 scenarios yield warmings with respect to 1880–1910 of 1.5 (+0.4/−0.2) K, 2.3 (+0.7/−0.5) K and 4.2 (+1.3/−0.9) K. These projection estimates are lower than the ones based on a Coupled Model Intercomparison Project phase 5 multi-model ensemble; more importantly, their uncertainties are smaller and only depend on historical temperature and forcing series. The key uncertainty is due to aerosol forcings; we find a modern (2005) forcing value of [−1.0, −0.3] W m−2 (90% confidence interval) with median at −0.7 W m−2. Projecting to 2100, we find that to keep the warming below 1.5 K, future emissions must undergo cuts similar to RCP 2.6, for which the probability to remain under 1.5 K is 48%. RCP 4.5 and RCP 8.5-like futures overshoot with very high probability.

THE FULL TEXT PDF OF THIS PAPER
The full text of the paper is available online: https://link.springer.com/article/10.1007/s00382-020-05521-x
It has also been made available at this site in pdf format.
PART-3: CRITICAL COMMENTARY
As seen in the image above labeled THE ANALYTICAL RELATIONSHIP BETWEEN TCR AND ECS, the authors of Hebert et al 2020 have determined that neither the ECS (equilibrium climate sensitivity) nor the TCR (transient climate response) by itself is a sufficiently accurate measure of the rate of anthropogenic global warming caused by fossil fuel emissions. In determining the rate of warming for any given RCP scenario, they have departed from standard climate science and constructed their own unique theory and mathematical relationship between an RCP emissions scenario and the rate of warming. Among other things, this construction involves a mathematical combination of the TCRE and ECS parameters into a single unique relationship between any RCP emissions scenario and the corresponding rate of warming caused by the emissions in that scenario.

In this context, we refer to a related post on this site: LINK: https://tambonthongchai.com/2020/08/26/a-mathematical-inconsistency/ . There we argue that the ECS is a logarithmic relationship between atmospheric CO2 concentration and temperature, where temperature is proportional to the logarithm of atmospheric CO2 concentration. Since atmospheric CO2 concentration is created by cumulative emissions in a linear relationship, surface temperature is a logarithmic function of cumulative emissions. But in the TCRE, surface temperature is a linear function of cumulative emissions. Therefore the ECS and the TCRE are mathematically inconsistent. A proposed theory of AGW that relates warming to fossil fuel emissions must decide whether this relationship is linear or logarithmic, because it cannot be both. Yet this is what we see in Hebert et al 2020, where a combination of the TCRE and ECS parameters is used to construct the relationship between emissions and warming used in the analysis.
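In symbols (our notation, summarizing the argument of the linked post): the ECS relationship and its implication for cumulative emissions are

```latex
T = S \log_2\!\left(\frac{C}{C_0}\right), \qquad C = C_0 + kE
\;\;\Rightarrow\;\; T = S \log_2\!\left(1 + \frac{kE}{C_0}\right)
```

where T is temperature, C is atmospheric CO2 concentration, C0 its pre-industrial value, E is cumulative emissions, and S and k are constants; whereas the TCRE asserts T = λE. The first form is logarithmic in E and the second is linear in E, and no single theory can assert both.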
An added consideration is a statistical flaw in the TCRE, shown in the video above and described in a related post: LINK: https://tambonthongchai.com/2018/05/06/tcre/ . Briefly, a time series of the cumulative values of another time series contains neither time scale nor degrees of freedom, and therefore it does not contain useful information. A further consideration is that, as shown in the related post, the correlation between temperature and cumulative emissions derives not from causation but from a sign pattern: annual emissions are always positive, and in a time of warming the annual warming data are mostly positive. A demonstration of this relationship is shown in yet another related post, where we show that any variable that contains only positive values works just as well as fossil fuel emissions: LINK: https://tambonthongchai.com/2018/12/03/tcruparody/
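A minimal numerical illustration of the sign-pattern argument (synthetic data, our sketch; the linked parody post makes the same point with real series):

```python
# Sketch of the sign-pattern argument: the cumulative sum of ANY
# all-positive series rises monotonically, so it correlates strongly
# with a trending temperature series regardless of causation.
import numpy as np

rng = np.random.default_rng(1)
n = 120
temperature = 0.01 * np.arange(n) + rng.normal(0, 0.1, n)  # warming trend + noise
random_positive = rng.uniform(0.5, 1.5, n)                 # pure noise, but always positive
cumulative = np.cumsum(random_positive)                    # rises monotonically

# Near-perfect correlation, although the underlying series is noise.
print(np.corrcoef(temperature, cumulative)[0, 1])
# At the annual time scale (year-to-year changes) there is nothing.
print(np.corrcoef(np.diff(temperature), random_positive[1:])[0, 1])
```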

WE CONCLUDE FROM THE ARGUMENTS PRESENTED ABOVE THAT THE HEBERT ET AL 2020 PAPER IS NOT CREDIBLE AND THAT IT DOES NOT PRESENT A SERIOUS CHALLENGE TO THE WORK OF THE IPCC AND ITS FORECAST OF 1.5C OF WARMING SINCE PRE-INDUSTRIAL BY THE YEAR 2052.
BIOCHAR AND CLIMATE CHANGE
Posted December 22, 2020
THIS POST IS A CRITICAL REVIEW OF AN ARTICLE IN “ANALYTICAL SCIENTIST” ABOUT FIGHTING CLIMATE CHANGE WITH BIOCHAR. LINK TO SOURCE: https://theanalyticalscientist.com/fields-applications/biochar-to-combat-climate-change

PART-1: WHAT THE SOURCE ARTICLE SAYS
A huge issue the world faces right now is climate change; we’ve burnt fossil fuels for over a century, greenhouse gas levels have climbed and temperatures have followed suit. The global temperature has risen by 1C since pre-industrial. The climate is extremely sensitive to such changes. That explains the sharp rise in storms, droughts, hurricanes and floods that we have witnessed these past few years.
There are looming tipping points as climate change progresses. These include the melting permafrost in the Arctic, which would release large quantities of greenhouse gases. We must find a solution quickly; otherwise we risk irreversible climate change that humans may not survive.
Plants absorb carbon dioxide throughout their life and release it back into the atmosphere upon dying. The production of biochar – primarily carbon extracted from dead plants – can block this cycle, preventing carbon dioxide release and limiting eventual increases in atmospheric temperature. Around 200 companies are operating globally in this area already – analyzing and improving their biochar with techniques like Fourier-transform infrared spectroscopy and solid-state nuclear magnetic resonance.
Our aim is to expand the industry further. Providing sales support to these companies could provide a much-needed push into the mainstream. Growth of this industry to a size at which it could process most of the world’s dead plants could produce positive change. We’re currently working with five biochar companies across America and Canada, and our next steps will be to build networks between these businesses, farmers, and the US Department of Agriculture. These networks will be essential to open up communication with those who will benefit from using biochar for their specific soil and environmental conditions. In short, as in all areas of science, communication and collaboration will be key to our success.

PART-2: CRITICAL COMMENTARY
There are two different issues here. 1. The first issue is that the climate action being proposed is inconsistent with the theory of anthropogenic global warming (AGW). 2. The second issue is that AGW is presented in the context of environmentalism, which holds that humans must not interfere with nature.
(1) The first issue: The theory of anthropogenic global warming and climate change is not a theory about how CO2 in general causes global warming. It is specifically about the industrial economy and its use of fossil fuels. This is why global warming is always measured “from pre-industrial”. AGW is a theory that fossil fuels cause warming, and the solution demanded in the form of “climate action” is to reduce and eventually eliminate the use of fossil fuels. A proposal to interfere with nature’s carbon cycle so as to compromise nature’s ability to add CO2 to the atmosphere, as an offset against fossil fuel emissions, cannot be presented as climate action. The climate change evil is not carbon, nor carbon dioxide, but fossil fuel emissions. It is true that fossil fuel emissions are carbon dioxide, but carbon dioxide in general, as part of nature’s carbon cycle, is not an unnatural and external perturbation of the carbon cycle. Therefore, human intervention in the carbon cycle is not climate action.

(2) The second issue: The second issue is the environmentalism interpretation of AGW as a case of humans interfering with nature by burning fossil fuels and causing unnatural changes on a global scale. At the very foundation of environmentalism is the principle that humans must not interfere with nature. And yet the proposed biochar solution to climate change is a grotesque human interference with nature’s carbon cycle, in which the carbon flow between atmosphere and plants is intercepted and prevented from completing its cycle. Humans are thus interfering with nature ostensibly to fix a problem described as humans interfering with nature.
(3) Related post on the distinction between fossil fuel emissions and the carbon cycle in climate science: here Dr. Peter Griffith of NASA explains the unique role of fossil fuel emissions in AGW science, a role that makes it impossible to equate carbon cycle carbon with fossil fuel carbon. https://tambonthongchai.com/2020/06/19/vegandiet/
(4) Related post on a statistical test of the assumption that the observed changes in atmospheric CO2 levels are caused by fossil fuel emissions, where no evidence for this causation is found (a minimal sketch of this kind of test appears after this list). https://tambonthongchai.com/2020/11/21/the-case-against-fossil-fuels/
(5) The proposed biochar solution to climate change assumes that making biochar removes atmospheric carbon from the climate system, but no technology or explanation is offered for keeping the biochar sequestered from the carbon cycle.
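As a rough illustration of the kind of statistical test described in item (4), here is a minimal sketch of a detrended correlation analysis in Python. The variable names (emissions, d_co2) and the numbers are hypothetical placeholders, not the data of the related post, which describes the actual series and procedure.

```python
import numpy as np
from scipy import stats

def detrended_correlation(x, y):
    # Remove each series' OLS linear trend and correlate the residuals.
    # A correlation between the raw series can be spurious when both
    # share a long-term trend; detrending tests responsiveness at the
    # annual time scale.
    t = np.arange(len(x))
    x_resid = x - np.polyval(np.polyfit(t, x, 1), t)
    y_resid = y - np.polyval(np.polyfit(t, y, 1), t)
    return stats.pearsonr(x_resid, y_resid)

# Hypothetical annual series, both rising, for illustration only
rng = np.random.default_rng(0)
n_years = 60
emissions = np.linspace(5, 10, n_years) + rng.normal(0, 0.1, n_years)  # GtC/yr
d_co2 = np.linspace(2.5, 5, n_years) + rng.normal(0, 0.8, n_years)     # GtC/yr

r, p = detrended_correlation(emissions, d_co2)
print(f"detrended correlation r = {r:.3f}, p-value = {p:.3f}")
```

With real data, an insignificant detrended correlation is the “no evidence of causation” result the related post refers to, while a significant one would support the causation assumption.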
SUMMARY AND CONCLUSION: The issue in AGW climate change is fossil fuels, and the only solution offered and demanded by climate science is to stop using fossil fuels. The interpretation of this theory in terms of carbon cycle flows proposed here by The Analytical Scientist is inconsistent with climate science and with the statistical details provided in the related post linked above in item (4).
OZONE HOLE ENVIRONMENTALISM
Posted December 21, 2020
THIS POST IS A CRITICAL REVIEW OF A BBVA-OPEN-MIND-SCIENCE-AND-ENVIRONMENT ARTICLE ON THE OZONE HOLE PUBLISHED IN JANUARY 2020. LINK TO SOURCE: https://www.bbvaopenmind.com/en/science/environment/whatever-happened-to-the-ozone-layer-hole/

PART-1: WHAT THE OPEN-MIND ARTICLE SAYS
When researchers Frank Sherwood Rowland and Mario Molina told the world in 1974 that aerosol hairsprays damaged the part of the atmosphere that protects us from solar ultraviolet radiation, the reactions were not simply disbelief: a senior chemist at DuPont called the theory a “science fiction tale,” “a load of rubbish” and “utter nonsense.”
However, soon after, the so-called ozone hole became not only a global concern, but also one of the symbols of the green activism of the 1980s. The rapid reaction to tackle the problem by banning harmful compounds represents the greatest success achieved by an international environmental agreement. But it is also an example of how technological progress is seeking more sustainable solutions to the problems that technological progress itself has caused.
The success of Rowland and Molina, chemists at the University of California, Irvine, was to piece together ideas that had gone unnoticed by others. In the early 1970s, it was known that chlorine and other substances can catalyse the destruction of ozone, a compound composed of three oxygen atoms that is present in a greater proportion in a layer of the earth’s stratosphere, and which blocks much of the harmful UV radiation. However, no one had linked this phenomenon to chlorofluorocarbons (CFCs), gases that began to be produced industrially in the 1930s and were used extensively as aerosol propellants, refrigerants and to make plastic foams. CFCs are inert and long-lived, so they can remain in the atmosphere for decades. Rowland and Molina theorized that the breakdown of CFCs by sunlight releases chlorine, which could result in significant damage to the ozone layer.
Despite the initial negative reaction to the study by the two chemists, subsequent experiments and atmospheric measurements soon confirmed that they were correct. In 1985, a study by the British Antarctic Survey discovered something that surprised the scientific community, a particularly sharp decline in ozone concentration over Antarctica, when the decline was expected to be equally distributed across the planet. The following year, US National Oceanic and Atmospheric Administration (NOAA) researcher Susan Solomon provided the explanation: the cold winter temperatures at the poles form stratospheric polar clouds, which encourage the breakdown of CFCs and other halocarbons —composed of carbon and halogen elements such as chlorine, fluorine, bromine or iodine— generating more free chlorine, which in the southern spring accentuates the destruction of ozone.
THE MOST SUCCESSFUL ENVIRONMENTAL AGREEMENT IN HISTORY
The scientific consensus on the ozone hole led some countries to adopt unilateral measures, and in 1987 a total of 46 nations signed the Montreal Protocol, aimed at phasing out the production of ozone-depleting substances. However, industry was still reluctant to throw in the towel; in 1988, DuPont’s president, Richard Heckert, wrote to the U.S. Senate: “At the moment, scientific evidence does not point to the need for dramatic CFC emission reductions. There is no available measure of the contribution of CFCs to any observed ozone change.”
The Montreal Protocol, in force since 1989, is often regarded as the most successful international environmental agreement in history. In fact, according to the UN, it is to date the only UN treaty that has been ratified by every country on the planet, with 197 parties in all. On a transitional basis, CFCs have been replaced by hydrochlorofluorocarbons (HCFCs), which are supposed to be less harmful to the ozone layer, with the aim of replacing them entirely with hydrofluorocarbons (HFCs) and other compounds. These are more unstable in the lower atmosphere, so their impact on stratospheric ozone is assumed to be low or zero. Thanks to the reduction of CFCs, ozone destruction has decreased by 20% since 2005, according to NASA, and the hole is expected to disappear almost completely between 2060 and 2080.
MORE SUSTAINABLE AND VIABLE SOLUTIONS
But the implications are more complex: in addition to their effect on ozone, CFCs are also much more powerful greenhouse gases than CO2. A recent study conducted by Columbia University (USA) atmospheric and climate dynamics expert Lorenzo Polvani has determined that ozone-depleting substances such as CFCs have been responsible for half of the warming of the Arctic and the melting of North Pole ice during the second half of the 20th century. “Banning of CFCs by the Montreal Protocol will mitigate Arctic warming and sea ice loss in the coming decades,” Polvani told OpenMind, although he made it clear that the overall trend will not be reversed without the necessary reductions in CO2, the main culprit in climate change.
One problem is that alternative solutions to CFCs must not only be more sustainable, but also economically viable. In 2018, a team led by NOAA researcher Stephen Montzka discovered an unexpected 25% increase in emissions of CFC-11 (the second most abundant CFC) starting from 2012, slowing the decline in the concentration of this gas by 50%, even though the Montreal Protocol had established the cessation of global production by 2010. “We identified China as being responsible for about half of that global emissions increase,” Montzka told OpenMind. The researcher notes that studies are still under way to determine the causes and impact of this clear violation of the international agreement, but some experts suggest that the alternatives to CFCs may be too expensive or unaffordable for some countries.
There is also another complication: ozone-friendly HFCs also contribute to climate change. Recently, researchers have detected an increase in emissions of HFC-23, the greatest cause of global warming among HFCs and a compound that should have been drastically reduced under the current version of the Montreal Protocol. In short, finding a practical way to cool ourselves and propel our aerosols without destroying the ozone layer or aggravating climate change is still an ongoing technological challenge.
Javier Yanes
PART-2: CRITICAL COMMENTARY
- The background to the Rowland-Molina theory of ozone depletion (RMTOD) is that since 1969 multiple failed theories of ozone depletion had been proposed, with claims that supersonic airliners, the space shuttle, and various other proposed technologies would cause ozone depletion and, with it, epidemics of blindness and skin cancer. RMTOD was simply the latest in that line of an obsession with ozone depletion fearology, and it can only be understood in that context.
- RMTOD 1974 is not a work in isolation that can be credited solely to Rowland and Molina. Firstly, as explained above, it was just another ozone depletion fear in a long line of ozone depletion fears going back to 1969. Even more important is that RMTOD is a product of the Lovelock 1973 paper. In 1973, James Lovelock discovered that air samples taken from the middle of the Atlantic Ocean contained CFCs. He then published his now famous paper in which he said that these man-made chemicals, which did not otherwise occur in nature, were inert and could therefore accumulate in the atmosphere indefinitely. It was from this work that Rowland and Molina surmised that given enough time, maybe 40 to 100 years, the inert and long-lived CFCs could, by random molecular movement, end up in the stratosphere, where they could be disintegrated by UV radiation to produce radical agents of ozone destruction. What Rowland and Molina proved in their lab is that UV radiation would indeed break down the CFCs and that the radicals thus produced would indeed destroy ozone, but no evidence has ever been produced, and none exists, that CFCs did in fact end up in the stratosphere. That part of RMTOD is simply imagined in a “what if” logic.
- The only empirical evidence presented in support of RMTOD is Farman et al 1985. The Farman study showed only that there was a brief and localized five-year period of low ozone in the months of October and November above the South Pole, and that ozone levels had since recovered to normal; yet this was taken as evidence of RMTOD. This episodic and localized low-ozone event does not serve as evidence of the RMTOD theory of ozone depletion, because the theory implies a long-term declining trend in global mean total column ozone. No evidence for this trend has ever been presented, and we show in a related post that none exists (a minimal sketch of this kind of trend test appears after this list). LINK: https://tambonthongchai.com/2019/03/12/ozone1966-2015/
- Instead, the South Polar periodic low-ozone event that quickly recovers back to normal levels was sold to the general public as an “ozone hole” and claimed as evidence of RMTOD human-caused global ozone depletion that could cause skin cancer in humans and blindness in animals as far away as North America. Then, at some point, it was declared with great fanfare that the UN-brokered Montreal Protocol had solved the ozone depletion problem and that the ozone layer had recovered. No explanation is offered for the continuation of these South Polar ozone events that had been named ozone holes.
- In a related post, LINK: https://tambonthongchai.com/2020/11/30/the-unep-healed-the-ozone-hole/, we show that these South Polar events should be understood as ozone distribution events and not ozone depletion. Ozone is both created and destroyed by UV radiation. It is created only above the tropics, where sunlight is direct, and is distributed to the higher latitudes by the Brewer-Dobson circulation. Episodic changes in ozone levels at the higher latitudes can therefore be understood in terms of the dynamics of this distribution, not in terms of long-term ozone depletion due to the presence of ozone depleting substances in the stratosphere.
- The only significant impact of what is claimed, after all those failures, to be a finally proven case of ozone depletion is that it served to expand the role of the UN into global environmentalism.
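As a rough illustration of the trend test referred to above, here is a minimal sketch in Python. The ozone series below is synthetic and trendless by construction; the related post on ozone 1966-2015 describes the actual station data and analysis.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for an annual series of global mean total column
# ozone in Dobson Units (DU); real data would come from station records.
rng = np.random.default_rng(1)
years = np.arange(1966, 2016)
ozone_du = 300 + rng.normal(0, 6, years.size)  # no built-in trend

# OLS regression of ozone against time tests for a long-term trend
res = stats.linregress(years, ozone_du)
print(f"OLS trend: {res.slope:+.3f} DU/yr, p-value: {res.pvalue:.3f}")

# A statistically significant negative slope would support long-term
# global ozone depletion; an insignificant slope is consistent with
# the "no trend" finding claimed in the related post.
```

With real total column ozone data, it is the slope and its p-value, rather than episodic localized events, that would constitute evidence for or against long-term depletion.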


