Thongchai Thailand

1C AGW Since Pre-Industrial Times

Posted on: October 24, 2018

  1. This work is a critical evaluation of the claim in the IPCC SR15 that by the year 2017 human activity, in the form of fossil fuel emissions, had caused a warming of 1°C since pre-industrial times. Five global temperature series, four reconstructions and the theoretical RCP8.5 series, are used to frame the context of this claim and to test its validity.
  2. RCP8.5 is a temperature series predicted by climate models with CMIP5 forcings for the “business as usual” emission scenario (no climate action taken). The four temperature anomaly reconstructions are the HadCRUT4 anomalies 1850-2017 from the UK Met Office Hadley Centre and the Climatic Research Unit, the GISTEMP anomalies 1880-2017 from NASA-GISS, and the Berkeley Earth (B.E.S.T.) reconstruction 1850-2017. There are two versions of the Berkeley Earth reconstruction depending on how sea ice temperatures are estimated. Both are used here, labeled Berkeley1 and Berkeley2.
  3. The temperature datasets are studied one calendar month at a time because the warming trend behaviors of the calendar months have been shown to differ significantly, and their combination into an annual temperature risks losing a great deal of trend information [RELATED POST]. For each of the five temperature datasets and each of the twelve calendar months we compute the total amount of warming from all possible start years separated by ten-year increments. The amount of warming is computed as the linear OLS regression trend in °C/year for each time span, multiplied by the length of the time span in years. The analysis consists of a study of these warming amounts in the context of the IPCC claim that human emissions of carbon dioxide in the industrial economy generated a warming of 1°C “since pre-industrial times”.
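The warming-amount computation described above can be sketched in a few lines of Python. This is a minimal illustration, not the author's code: the function names are hypothetical and the data passed in would be one calendar month's anomaly series from one of the datasets.

```python
import numpy as np

def warming_amount(years, anomalies):
    # OLS trend in deg C/year multiplied by the span length in years
    slope = np.polyfit(years, anomalies, 1)[0]
    return slope * (years[-1] - years[0])

def warming_by_start_year(years, anomalies, start_years, end_year=2017):
    # Total warming from each candidate start year to the fixed end year
    years = np.asarray(years, dtype=float)
    anomalies = np.asarray(anomalies, dtype=float)
    result = {}
    for start in start_years:
        mask = (years >= start) & (years <= end_year)
        result[start] = warming_amount(years[mask], anomalies[mask])
    return result
```

For the analysis described here, `start_years` would be the decade-spaced values 1850, 1860, …, 1970, applied separately to each of the twelve calendar-month series.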
  4. The HadCRUT4, GISTEMP, Berkeley1, Berkeley2, and RCP8.5 data are presented in Figure 1, Figure 2, Figure 3, Figure 4, and Figure 5 respectively. Each presentation consists of a tabulation of the computed warming amounts in degrees Celsius and their graphical display. The end of the time span for each start year from 1850 to 1970 is fixed at 2017. Below the tabulated total warming values for each calendar month is a chart that plots these values for each calendar month (in red) against the average of all twelve calendar months (in blue).
  5. As the number of years in the warming period decreases linearly from 168 years for start year 1850 to 48 years for start year 1970, the amount of warming does not show a corresponding linear decrease but rather a complex non-linear pattern of rising and falling warming amounts. This behavior is driven by extreme short-term changes in the rate of warming, including both warming and cooling periods, with a greater influence of cooling in the earlier part of the time series, as seen in the 30-year trend profile shown in Figure 7. As a result, the greatest amount of warming is seen for start years around 1900, falling thereafter until 1940 and then rising again until the end of the time series, likely driven by higher rates of warming. This is an example of the kind of trend information that is lost when seasonal temperatures are combined into annual means. In the chart, the red curve shows the amount of warming computed for each calendar month separately and then averaged, while the blue curve shows the result of the more conventional procedure of averaging monthly temperatures into annual means prior to trend analysis. The trend behaviors of the calendar months are very different, and this information is lost when monthly mean temperatures are combined into annual mean temperatures, as shown in a related post. [LINK] AVERAGING-ANOMALY
  6. Figure 6 is a correlation analysis of the total warming amounts presented in Figure 1 to Figure 5. It shows fairly good agreement among the observational data series but little or no correlation between the observational data and the theoretical temperature series created by climate models with CMIP5 forcings. Underneath the tabulation of these correlations are two charts. The chart on the left shows the correlation between warming amounts in the observational series and the RCP8.5 climate model series; the black horizontal line marks zero correlation. Most of the correlations are negative, and the few positive correlations found (with the GISS temperature reconstruction) are not statistically significant. The chart on the right shows correlations among the observational data. Mostly strong correlations are seen except where the GISS data are involved: all low correlations in this chart involve GISS, and all correlations that do not involve GISS are strong and statistically significant. The correlation behavior of GISS is anomalous in ways that imply that its construction may have been influenced by climate models.
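The correlation test in Figure 6 amounts to computing Pearson correlations between the warming-amount vectors of pairs of series. The sketch below illustrates the idea with made-up stand-in values, not the tabulated numbers from the figures.

```python
import numpy as np

def warming_correlation(amounts_a, amounts_b):
    # Pearson correlation coefficient between two warming-amount vectors
    return float(np.corrcoef(amounts_a, amounts_b)[0, 1])

# Two hypothetical observational-style vectors that rise and fall together ...
obs_a = [1.05, 0.98, 0.80, 0.65, 0.90, 1.00]
obs_b = [1.08, 1.00, 0.82, 0.60, 0.88, 1.02]
# ... versus a hypothetical model-style vector that moves independently
model = [0.70, 0.95, 1.10, 0.85, 0.60, 1.05]
```

In this toy example `warming_correlation(obs_a, obs_b)` is close to 1 while `warming_correlation(obs_a, model)` is not, mirroring the contrast between the two charts described above.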
  7. A direct comparison of the five temperature time series is shown graphically in Figure 8. The four observational data series are shown in thin lines of various colors while the RCP8.5 climate model series appears as a thick black line. The left frame compares temperatures directly. It appears to show good agreement among all five series, with the RCP8.5 theoretical series tracking the middle of the distribution. The right frame compares the “trend profiles” of the five temperature series, computed as trends in a moving 30-year window and expressed in °C/century equivalent (a 30-year period is recognized as the appropriate span for the study of short-term trends; see references below). Here the homogeneity seen among the source temperatures is not found: significant differences appear between the RCP8.5 series and the observational data, and also among the observational data themselves. The charts cycle through the twelve calendar months in a GIF animation, demonstrating differences in the comparison among calendar months. This comparison implies that short-term trends cannot be generalized across the full span of the data or across calendar months, and that the homogeneity among the source temperature data seen in the left frame is illusory.
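The moving 30-year trend profile used in the right frame of Figure 8 can be sketched as follows. This is an illustrative implementation, assuming a single calendar month's anomaly series; each window's OLS slope is rescaled from °C/year to °C/century.

```python
import numpy as np

def trend_profile(years, anomalies, window=30):
    # OLS trend in each moving window, expressed as deg C per century
    years = np.asarray(years, dtype=float)
    anomalies = np.asarray(anomalies, dtype=float)
    profile = []
    for i in range(len(years) - window + 1):
        slope = np.polyfit(years[i:i + window], anomalies[i:i + window], 1)[0]
        # label each window by its final year
        profile.append((int(years[i + window - 1]), slope * 100.0))
    return profile
```

Applied to the 1850-2017 series this yields one trend value per window end-year from 1879 onward, which is the profile plotted against the RCP8.5 equivalent.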
  8. The “total warming” data in Figure 1 to Figure 5 contain 61 average values (averaged across all twelve calendar months). Of these, warming of 1°C or greater is found in 16 cases, an overall rate of 26%. The largest warming amount found is 1.11°C in the RCP8.5 theoretical series, and the largest value in the observational data is 1.08°C in the Berkeley1 series.
  9. A more extensive assessment of the amount of warming across datasets is presented in Figure 7, where the warming amounts seen in the five datasets are summarized as averages across datasets. There are two tables in Figure 7, one atop the other. The top table contains averages among the four observational data series, while the averages in the bottom table also include the RCP8.5 climate model series. The chart below the two tables shows the average of the averages across calendar months for the four observational datasets (in blue) and the corresponding averages that include the theoretical RCP8.5 climate model series (in red). The horizontal purple line marks the grand average of approximately 0.91°C of warming across all calendar months, all time spans, and all datasets. The dark horizontal line at the top of the chart marks the 1°C warming claimed by the IPCC. We conclude from this analysis that although the claimed 1°C warming (or greater) can be found in specific instances of the data, it is not representative, since most of the data show lower warming amounts. Thus the most generous assessment possible is that the IPCC claim of 1°C warming since pre-industrial times is an exaggeration, possibly motivated by the needs of advocacy for climate action.
  10. However, a more serious issue is the reference to pre-industrial times as the baseline from which the anthropogenic effect of fossil fuel emissions should be measured. In the AR5 and other publications, the IPCC states that “Human-induced warming reached approximately 1°C (±0.2°C likely range) above pre-industrial levels in 2017. Warming is expressed relative to the period 1850-1900, used as an approximation of pre-industrial temperatures in AR5”. Yet, in the matter of identifying human cause, the IPCC writes that “The current warming trend is of particular significance because most of it is extremely likely (greater than 95 percent probability) to be the result of human activity since the mid-20th century” (citations below). In other words, the whole of the 1°C warming from pre-industrial times cannot be shown to be human-caused, because only the warming since the mid-20th century is human-caused. This raises the question of how much of the warming can actually be shown to be human-caused in this context.
  11. That only the warming since the mid-20th century contains a “fingerprint” of human cause is found elsewhere in climate science. Figure 9 contains a graphic showing that “human drivers of climate” become detectable at some time after 1960 using a fingerprinting method in which climate models are run with and without the human forcings. A demonstration of this fingerprinting methodology by climate scientist Peter Cox of the University of Exeter is shown in the video that appears in the bottom panel of Figure 9. Here, the HadCRUT temperatures since 1850 are plotted in red and then overlaid with two sets of climate model runs, one after the other in the video sequence. The first climate model run contains only natural factors, and its output is plotted in green. It shows good agreement with the observational data until some point after 1960, where the green curve and the red curve begin to diverge. A second climate model run is then made with human factors included, and its output is plotted in yellow. With human factors included, the model output and the data no longer diverge, showing that “from about 1970 onwards” the climate model and the data agree only when human factors are included. This analysis and its conclusions are consistent with the IPCC’s identification of the mid-20th century as the time when a human hand becomes detectable in the climate.
  12. The analysis presented above implies that only the amount of warming since 1970 can be ascribed to human cause. The average of the warming amounts in the observational data found in Figure 7 for start year 1970 is 0.847°C. This is the best unbiased estimate of the total amount of warming caused by human activity. The standard error is 0.02°C, which yields a 90% confidence interval of [0.814, 0.880].
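The interval arithmetic above follows directly from the normal approximation, using the 90% two-sided critical value z = 1.645:

```python
# 90% confidence interval from the mean warming estimate and its
# standard error; values taken from the analysis described above
mean_warming = 0.847     # deg C, average across observational datasets
standard_error = 0.02    # deg C
z_90 = 1.645             # normal critical value for a 90% interval

lower = mean_warming - z_90 * standard_error
upper = mean_warming + z_90 * standard_error
```

Rounding to three decimals gives the interval [0.814, 0.880] quoted in the text.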
  13. We conclude that the IPCC claim of human-caused warming of 1°C or greater “since pre-industrial times” is not an unbiased assessment and that it is inconsistent with the data.

  1. [2018: IPCC SR15 SPECIAL REPORT] Human-induced warming reached approximately 1°C (±0.2°C likely range) above pre-industrial levels in 2017, increasing at 0.2°C (±0.1°C) per decade (high confidence). Global warming is defined in this report as an increase in combined surface air and sea surface temperatures averaged over the globe and a 30-year period. Unless otherwise specified, warming is expressed relative to the period 1850-1900, used as an approximation of pre-industrial temperatures in AR5. For periods shorter than 30 years, warming refers to the estimated average temperature over the 30 years centered on that shorter period, accounting for the impact of any temperature fluctuations or trend within those 30 years. Accordingly, warming up to the decade 2006-2015 is assessed at 0.87°C (±0.12°C likely range). Since 2000, the estimated level of human-induced warming has been equal to the level of observed warming with a likely range of ±20% accounting for uncertainty due to contributions from solar and volcanic activity over the historical period (high confidence). {1.2.1} Warming greater than the global average has already been experienced in many regions and seasons, with average warming over land higher than over the ocean (high confidence). Most land regions are experiencing greater warming than the global average, while most ocean regions are warming at a slower rate. Depending on the temperature dataset considered, 20-40% of the global human population live in regions that, by the decade 2006-2015, had already experienced warming of more than 1.5°C above pre-industrial in at least one season (medium confidence). {1.2.1 & 1.2.2}
  2. 2018: NASA, Global climate change, vital signs of the planet, [SOURCE DOCUMENT] The Earth’s climate has changed throughout history. Just in the last 650,000 years there have been seven cycles of glacial advance and retreat, with the abrupt end of the last ice age about 7,000 years ago marking the beginning of the modern climate era — and of human civilization. Most of these climate changes are attributed to very small variations in Earth’s orbit that change the amount of solar energy our planet receives. The current warming trend is of particular significance because most of it is extremely likely (greater than 95 percent probability) to be the result of human activity since the mid-20th century and proceeding at a rate that is unprecedented over decades to millennia. Earth-orbiting satellites and other technological advances have enabled scientists to see the big picture, collecting many different types of information about our planet and its climate on a global scale. This body of data, collected over many years, reveals the signals of a changing climate. The heat-trapping nature of carbon dioxide and other gases was demonstrated in the mid-19th century. Their ability to affect the transfer of infrared energy through the atmosphere is the scientific basis of many instruments flown by NASA. There is no question that increased levels of greenhouse gases must cause the Earth to warm in response. Ice cores drawn from Greenland, Antarctica, and tropical mountain glaciers show that the Earth’s climate responds to changes in greenhouse gas levels. Ancient evidence can also be found in tree rings, ocean sediments, coral reefs, and layers of sedimentary rocks. This ancient, or paleoclimate, evidence reveals that current warming is occurring roughly ten times faster than the average rate of ice-age-recovery warming.
  3. 2015: Trenberth, Kevin E., John T. Fasullo, and Theodore G. Shepherd. “Attribution of climate extreme events.” Nature Climate Change 5.8 (2015): 725. There is a tremendous desire to attribute causes to weather and climate events that is often challenging from a physical standpoint. Headlines attributing an event solely to either human-induced climate change or natural variability can be misleading when both are invariably in play. The conventional attribution framework struggles with dynamically driven extremes because of the small signal-to-noise ratios and often uncertain nature of the forced changes. Here, we suggest that a different framing is desirable, which asks why such extremes unfold the way they do. Specifically, we suggest that it is more useful to regard the extreme circulation regime or weather event as being largely unaffected by climate change, and question whether known changes in the climate system’s thermodynamic state affected the impact of the particular event. Some examples briefly illustrated include ‘snowmaggedon’ in February 2010, superstorm Sandy in October 2012 and supertyphoon Haiyan in November 2013, and, in more detail, the Boulder floods of September 2013, all of which were influenced by high sea surface temperatures that had a discernible human component.
  4. [2014: IPCC AR5] Scientific evidence for warming of the climate system is unequivocal according to the IPCC. The current warming trend is of particular significance because most of it is extremely likely (greater than 95 percent probability) to be the result of human activity since the mid-20th century and proceeding at a rate that is unprecedented over decades to millennia. [Source: IPCC Fifth Assessment Report, Summary for Policymakers]
  5. 2011: Hegerl, Gabriele, and Francis Zwiers. “Use of models in detection and attribution of climate change.” Wiley interdisciplinary reviews: climate change 2.4 (2011): 570-591. Most detection and attribution studies use climate models to determine both the expected ‘fingerprint’ of climate change and the uncertainty in the estimated magnitude of this fingerprint in observations, given the climate variability. This review discusses the role of models in detection and attribution, the associated uncertainties, and the robustness of results. Studies that use observations only make substantial assumptions to separate the components of observed changes due to radiative forcing from those due to internal climate variability. Results from observation‐only studies are broadly consistent with those from fingerprint studies. Fingerprint studies evaluate the extent to which patterns of response to external forcing (fingerprints) from climate model simulations explain observed climate change in observations. Fingerprints are based on climate models of various complexities, from energy balance models to full earth system models. Statistical approaches range from simple comparisons of observations with model simulations to multi‐regression methods that estimate the contribution of several forcings to observed change using a noise‐reducing metric. Multi‐model methods can address model uncertainties to some extent and we discuss how remaining uncertainties can be overcome. The increasing focus on detecting and attributing regional climate change and impacts presents both opportunities and challenges. Challenges arise because internal variability is larger on smaller scales, and regionally important forcings, such as from aerosols or land‐use change, are often uncertain. Nevertheless, if regional climate change can be linked to external forcing, the results can be used to provide constraints on regional climate projections
  6. 2010: Stott, Peter A., et al. “Detection and attribution of climate change: a regional perspective.” Wiley Interdisciplinary Reviews: Climate Change 1.2 (2010): 192-211. The Intergovernmental Panel on Climate Change fourth assessment report, published in 2007 came to a more confident assessment of the causes of global temperature change than previous reports and concluded that ‘it is likely that there has been significant anthropogenic warming over the past 50 years averaged over each continent except Antarctica.’ Since then, warming over Antarctica has also been attributed to human influence, and further evidence has accumulated attributing a much wider range of climate changes to human activities. Such changes are broadly consistent with theoretical understanding, and climate model simulations, of how the planet is expected to respond. This paper reviews this evidence from a regional perspective to reflect a growing interest in understanding the regional effects of climate change, which can differ markedly across the globe. We set out the methodological basis for detection and attribution and discuss the spatial scales on which it is possible to make robust attribution statements. We review the evidence showing significant human‐induced changes in regional temperatures, and for the effects of external forcings on changes in the hydrological cycle, the cryosphere, circulation changes, oceanic changes, and changes in extremes. We then discuss future challenges for the science of attribution. To better assess the pace of change, and to understand more about the regional changes to which societies need to adapt, we will need to refine our understanding of the effects of external forcing and internal variability
  7. 2010: Gleick, Peter H., et al. “Climate change and the integrity of science.” Science 328.5979 (2010): 689-690. Climate change falls into the category of undeniable science along with the big bang theory, the theory of the earth, and the theory of evolution. There is compelling, comprehensive, and consistent objective evidence that humans are changing the climate in ways that threaten our societies and the ecosystems on which we depend. The planet is warming due to increased concentrations of heat-trapping gases in our atmosphere. A snowy winter in Washington does not alter this fact. Most of the increase in the concentration of these gases over the last century is due to human activities, especially the burning of fossil fuels and deforestation. Natural causes always play a role in changing Earth’s climate, but are now being overwhelmed by human-induced changes.(iv) Warming the planet will cause many other climatic patterns to change at speeds unprecedented in modern times, including increasing rates of sea-level rise and alterations in the hydrological cycle. Rising concentrations of carbon dioxide are making the oceans more acidic. The combination of these complex climate changes threatens coastal communities and cities, our food and water supplies, marine and freshwater ecosystems, forests, high mountain environments, and far more. Much more can be, and has been, said by the world’s scientific societies, national academies, and individuals, but these conclusions should be enough to indicate why scientists are concerned about what future generations will face from business-as-usual practices. We urge our policy-makers and the public to move forward immediately to address the causes of climate change, including the unrestrained burning of fossil fuels.
  8. 2009: Shindell, Drew T., et al. “Improved attribution of climate forcing to emissions.” Science 326.5953 (2009): 716-718. Evaluating multicomponent climate change mitigation strategies requires knowledge of the diverse direct and indirect effects of emissions. Methane, ozone, and aerosols are linked through atmospheric chemistry so that emissions of a single pollutant can affect several species. We calculated atmospheric composition changes, historical radiative forcing, and forcing per unit of emission due to aerosol and tropospheric ozone precursor emissions in a coupled composition-climate model. We found that gas-aerosol interactions substantially alter the relative importance of the various emissions. In particular, methane emissions have a larger impact than that used in current carbon-trading schemes or in the Kyoto Protocol. Thus, assessments of multigas mitigation policies, as well as any separate efforts to mitigate warming from short-lived pollutants, should include gas-aerosol interactions.
  9. 2003: Parmesan, Camille, and Gary Yohe. “A globally coherent fingerprint of climate change impacts across natural systems.” Nature 421.6918 (2003): 37. Causal attribution of recent biological trends to climate change is complicated because non-climatic influences dominate local, short-term biological changes. Any underlying signal from climate change is likely to be revealed by analyses that seek systematic trends across diverse species and geographic regions; however, debates within the Intergovernmental Panel on Climate Change (IPCC) reveal several definitions of a ‘systematic trend’. Here, we explore these differences, apply diverse analyses to more than 1,700 species, and show that recent biological trends match climate change predictions. Global meta-analyses documented significant range shifts averaging 6.1 km per decade towards the poles (or metres per decade upward), and significant mean advancement of spring events by 2.3 days per decade. We define a diagnostic fingerprint of temporal and spatial ‘sign-switching’ responses uniquely predicted by twentieth century climate trends. Among appropriate long-term/large-scale/multi-species data sets, this diagnostic fingerprint was found for 279 species. This suite of analyses generates ‘very high confidence’ (as laid down by the IPCC) that climate change is already affecting living systems.
  10. 1999: Allen, Myles R., and Simon FB Tett. “Checking for model consistency in optimal fingerprinting.” Climate Dynamics 15.6 (1999): 419-434. Current approaches to the detection and attribution of an anthropogenic influence on climate involve quantifying the level of agreement between model-predicted patterns of externally forced change and observed changes in the recent climate record. Analyses of uncertainty rely on simulated variability from a climate model. Any numerical representation of the climate is likely to display too little variance on small spatial scales, leading to a risk of spurious detection results. The risk is particularly severe if the detection strategy involves optimisation of signal-to-noise because unrealistic aspects of model variability may automatically be given high weight through the optimisation. The solution is to confine attention to aspects of the model and of the real climate system in which the model simulation of internal climate variability is adequate, or, more accurately, cannot be shown to be deficient. We propose a simple consistency check based on standard linear regression which can be applied to both the space-time and frequency domain approaches to optimal detection and demonstrate the application of this check to the problem of detection and attribution of anthropogenic signals in the radiosonde-based record of recent trends in atmospheric vertical temperature structure. The influence of anthropogenic greenhouse gases can be detected at a high confidence level in this diagnostic, while the combined influence of anthropogenic sulphates and stratospheric ozone depletion is less clearly evident. 
Assuming the time-scales of the model response are correct, and neglecting the possibility of non-linear feedbacks, the amplitude of the observed signal suggests a climate sensitivity range of 1.2–3.4 K, although the upper end of this range may be underestimated by up to 25% due to uncertainty in model-predicted response patterns
  11. 1998: North, Gerald R., and Mark J. Stevens. “Detecting climate signals in the surface temperature record.” Journal of Climate 11.4 (1998): 563-577. Optimal signal detection theory has been applied in a search through 100 yr of surface temperature data for the climate response to four specific radiative forcings. The data used comes from 36 boxes on the earth and was restricted to the frequency band 0.06–0.13 cycles yr−1 (16.67–7.69 yr) in the analysis. Estimates were sought of the strengths of the climate response to solar variability, volcanic aerosols, greenhouse gases, and anthropogenic aerosols. The optimal filter was constructed with a signal waveform computed from a two-dimensional energy balance model (EBM). The optimal weights were computed from a 10000-yr control run of a noise-forced EBM and from 1000-yr control runs from coupled ocean–atmosphere models at Geophysical Fluid Dynamics Laboratory (GFDL) and Max-Planck Institute; the authors also used a 1000-yr run using the GFDL mixed layer model. Results are reasonably consistent across these four separate model formulations. It was found that the component of the volcanic response perpendicular to the other signals was very robust and highly significant. Similarly, the component of the greenhouse gas response perpendicular to the others was very robust and highly significant. When the sum of all four climate forcings was used, the climate response was more than three standard deviations above the noise level. These findings are considered to be powerful evidence of anthropogenically induced climate change.
  12. 1997: Hegerl, Gabriele C., et al. “Multi-fingerprint detection and attribution analysis of greenhouse gas, greenhouse gas-plus-aerosol and solar forced climate change.” Climate Dynamics13.9 (1997): 613-634. A multi-fingerprint analysis is applied to the detection and attribution of anthropogenic climate change. While a single fingerprint is optimal for the detection of climate change, further tests of the statistical consistency of the detected climate change signal with model predictions for different candidate forcing mechanisms require the simultaneous application of several fingerprints. Model-predicted climate change signals are derived from three anthropogenic global warming simulations for the period 1880 to 2049 and two simulations forced by estimated changes in solar radiation from 1700 to 1992. In the first global warming simulation, the forcing is by greenhouse gas only, while in the remaining two simulations the direct influence of sulfate aerosols is also included. From the climate change signals of the greenhouse gas only and the average of the two greenhouse gas-plus-aerosol simulations, two optimized fingerprint patterns are derived by weighting the model-predicted climate change patterns towards low-noise directions. The optimized fingerprint patterns are then applied as a filter to the observed near-surface temperature trend patterns, yielding several detection variables. The space-time structure of natural climate variability needed to determine the optimal fingerprint pattern and the resultant signal-to-noise ratio of the detection variable is estimated from several multi-century control simulations with different CGCMs and from instrumental data over the last 136 y. 
Applying the combined greenhouse gas-plus-aerosol fingerprint in the same way as the greenhouse gas only fingerprint in a previous work, the recent 30-y trends (1966–1995) of annual mean near surface temperature are again found to represent a significant climate change at the 97.5% confidence level. However, using both the greenhouse gas and the combined forcing fingerprints in a two-pattern analysis, a substantially better agreement between observations and the climate model prediction is found for the combined forcing simulation. Anticipating that the influence of the aerosol forcing is strongest for longer term temperature trends in summer, application of the detection and attribution test to the latest observed 50-y trend pattern of summer temperature yielded statistical consistency with the greenhouse gas-plus-aerosol simulation with respect to both the pattern and amplitude of the signal. In contrast, the observations are inconsistent with the greenhouse-gas only climate change signal at a 95% confidence level for all estimates of climate variability. The observed trend 1943–1992 is furthermore inconsistent with a hypothesized solar radiation change alone at an estimated 90% confidence level. Thus, in contrast to the single pattern analysis, the two pattern analysis is able to discriminate between different forcing hypotheses in the observed climate change signal. The results are subject to uncertainties associated with the forcing history, which is poorly known for the solar and aerosol forcing, the possible omission of other important forcings, and inevitable model errors in the computation of the response to the forcing. Further uncertainties in the estimated significance levels arise from the use of model internal variability simulations and relatively short instrumental observations (after subtraction of an estimated greenhouse gas signal) to estimate the natural climate variability. 
The resulting confidence limits accordingly vary for different estimates using different variability data. Despite these uncertainties, however, we consider our results sufficiently robust to have some confidence in our finding that the observed climate change is consistent with a combined greenhouse gas and aerosol forcing, but inconsistent with greenhouse gas or solar forcing alone.
  13. 1996: Santer, Benjamin D., et al. “A search for human influences on the thermal structure of the atmosphere.” Nature 382.6586 (1996): 39. The observed spatial patterns of temperature change in the free atmosphere from [1963 to 1987] are similar to those predicted by state-of-the-art climate models incorporating various combinations of changes in carbon dioxide, anthropogenic sulphate aerosol and stratospheric ozone concentrations. The degree of pattern similarity between models and observations increases through this period. It is likely that this trend is partially due to human activities, although many uncertainties remain, particularly relating to estimates of natural variability.
  14. 1996: Hegerl, Gabriele C., et al. “Detecting greenhouse-gas-induced climate change with an optimal fingerprint method.” Journal of Climate 9.10 (1996): 2281-2306. A strategy using statistically optimal fingerprints to detect anthropogenic climate change is outlined and applied to near-surface temperature trends. The components of this strategy include observations, information about natural climate variability, and a “guess pattern” representing the expected time–space pattern of anthropogenic climate change. The expected anthropogenic climate change is identified through projection of the observations onto an appropriate optimal fingerprint, yielding a scalar-detection variable. The statistically optimal fingerprint is obtained by weighting the components of the guess pattern (truncated to some small-dimensional space) toward low-noise directions. The null hypothesis that the observed climate change is part of natural climate variability is then tested. This strategy is applied to detecting a greenhouse-gas-induced climate change in the spatial pattern of near-surface temperature trends defined for time intervals of 15–30 years. The expected pattern of climate change is derived from a transient simulation with a coupled ocean-atmosphere general circulation model. Global gridded near-surface temperature observations are used to represent the observed climate change. Information on the natural variability needed to establish the statistics of the detection variable is extracted from long control simulations of coupled ocean-atmosphere models and, additionally, from the observations themselves (from which an estimated greenhouse warming signal has been removed). While the model control simulations contain only variability caused by the internal dynamics of the atmosphere-ocean system, the observations additionally contain the response to various external forcings (e.g., volcanic eruptions, changes in solar radiation, and residual anthropogenic forcing). 
The resulting estimate of climate noise has large uncertainties but is qualitatively the best the authors can presently offer. The null hypothesis that the latest observed 20-yr and 30-yr trend of near-surface temperature (ending in 1994) is part of natural variability is rejected with a risk of less than 2.5% to 5% (the 5% level is derived from the variability of one model control simulation dominated by a questionable extreme event). In other words, the probability that the warming is due to our estimated natural variability is less than 2.5% to 5%. The increase in the signal-to-noise ratio by optimization of the fingerprint is of the order of 10%–30% in most cases. The predicted signals are dominated by the global mean component; the pattern correlation excluding the global mean is positive but not very high. Both the evolution of the detection variable and also the pattern correlation results are consistent with the model prediction for greenhouse-gas-induced climate change. However, in order to attribute the observed warming uniquely to anthropogenic greenhouse gas forcing, more information on the climate’s response to other forcing mechanisms (e.g., changes in solar radiation, volcanic, or anthropogenic sulfate aerosols) and their interaction is needed. It is concluded that a statistically significant externally induced warming has been observed, but our caveat that the estimate of the internal climate variability is still uncertain is emphasized.
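The significance test described in the Hegerl et al. abstract, rejecting the null hypothesis that an observed multidecadal trend is part of natural variability, can be sketched as a Monte Carlo comparison against trends from a noise-only control run. Everything below is synthetic and for illustration only: the series lengths, imposed trend, and noise level are invented, and real studies use spatial patterns and model control simulations rather than a single white-noise series.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_trend(series):
    """OLS slope of a time series (degrees C per year)."""
    t = np.arange(len(series))
    return np.polyfit(t, series, 1)[0]

# Synthetic stand-ins: an "observed" 30-year record with an imposed
# warming trend, and a long noise-only control run (internal
# variability only, no forced signal).
observed = 0.015 * np.arange(30) + rng.normal(0.0, 0.1, 30)
control = rng.normal(0.0, 0.1, 3000)

obs_trend = linear_trend(observed)

# Null distribution: 30-year trends from overlapping control-run windows.
null_trends = np.array([linear_trend(control[i:i + 30])
                        for i in range(len(control) - 30)])

# One-sided p-value: probability that internal variability alone yields
# a trend at least as large as the observed one.
p_value = np.mean(null_trends >= obs_trend)
print(f"observed trend = {obs_trend:.4f} C/yr, p = {p_value:.4f}")
```

A p-value below the chosen risk level (2.5% to 5% in the paper) corresponds to rejecting the natural-variability null hypothesis; the caveat in the abstract applies here too, since the test is only as good as the control run's estimate of natural variability.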
  15. 1995: Santer, B. D., K. E. Taylor, and J. E. Penner. A search for human influences on the thermal structure of the atmosphere. No. UCRL-ID-121956. Lawrence Livermore National Lab., CA (United States), 1995. Several recent studies have compared observed changes in near-surface temperature with patterns of temperature change predicted by climate models in response to combined forcing by carbon dioxide and anthropogenic sulphate aerosols. These results suggest that a combined carbon dioxide + sulphate aerosol signal is easier to identify in the observations than a pattern of temperature change due to carbon dioxide alone. This work compares modelled and observed patterns of vertical temperature change in the atmosphere. Results show that the observed and model-predicted changes in the mid- to low troposphere are in better accord with greenhouse warming predictions when the likely effects of anthropogenic sulphate aerosols and stratospheric ozone reduction are incorporated in model calculations, and that the level of agreement increases with time. This improved correspondence is primarily due to hemispheric-scale temperature contrasts. If current model-based estimates of natural internal variability are realistic, it is likely that the level of time-increasing similarity between modelled and observed patterns of vertical temperature change is partially due to human activities.
  16. 1995: North, Gerald R., et al. “Detection of forced climate signals. Part 1: Filter theory.” Journal of Climate 8.3 (1995): 401-408. This paper considers the construction of a linear smoothing filter for estimation of the forced part of a change in a climatological field such as the surface temperature. The filter is optimal in the sense that it suppresses the natural variability or “noise” relative to the forced part or “signal” to the maximum extent possible. The technique is adapted from standard signal processing theory. The present treatment takes into account the spatial as well as the temporal variability of both the signal and the noise. In this paper we take the signal’s waveform in space-time to be a given deterministic field in space and time. Formulation of the expression for the minimum mean-squared error for the problem together with a no-bias constraint leads to an integral equation whose solution is the filter. The problem can be solved analytically in terms of the space-time empirical orthogonal function basis set and its eigenvalue spectrum for the natural fluctuations and the projection amplitudes of the signal onto these eigenfunctions. The optimal filter does not depend on the strength of the assumed waveform used in its construction. A lesser mean-square error in estimating the signal occurs when the space-time spectral characteristics of the signal and the noise are highly dissimilar; for example, if the signal is concentrated in a very narrow spectral band and the noise in a very broad band. A few pedagogical exercises suggest that these techniques might be useful in practical situations.
  17. 1993: Hasselmann, Klaus. “Optimal fingerprints for the detection of time-dependent climate change.” Journal of Climate 6.10 (1993): 1957-1971. An optimal linear filter (fingerprint) is derived for the detection of a given time-dependent, multivariate climate change signal in the presence of natural climate variability noise. Application of the fingerprint to the observed (or model simulated) climate data yields a climate change detection variable (detector) with maximal signal-to-noise ratio. The optimal fingerprint is given by the product of the assumed signal pattern and the inverse of the climate variability covariance matrix. The data can consist of any, not necessarily dynamically complete, climate dataset for which estimates of the natural variability covariance matrix exist. The single-pattern analysis readily generalizes to the multipattern case of a climate change signal lying in a prescribed (in practice relatively low dimensional) signal pattern space: the single-pattern result is simply applied separately to each individual base pattern spanning the signal pattern space. Multipattern detection methods can be applied either to test the statistical significance of individual components of a predicted multicomponent climate change response, using separate single-pattern detection tests, or to determine the statistical significance of the complete signal, using a multivariate test. Both detection modes make use of the same set of detectors. The difference in direction of the assumed signal pattern and computed optimal fingerprint vector allows alternative interpretations of the estimated signal associated with the set of optimal detectors. The present analysis yields an estimated signal lying in the assumed signal space, whereas an earlier analysis of the time-independent detection problem by Hasselmann yielded an estimated signal in the computed fingerprint space. 
The different interpretations can be explained by different choices of the metric used to relate the signal space to the fingerprint space (inverse covariance matrix versus standard Euclidean metric, respectively). Two simple natural variability models are considered: a space-time separability model, and an expansion in terms of POPs (principal oscillation patterns). For each model the application of the optimal fingerprint method is illustrated by an example.
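The central formula in Hasselmann's abstract, the optimal fingerprint as the product of the inverse climate-variability covariance matrix and the assumed signal pattern, can be sketched in a few lines of NumPy. The covariance matrix, guess pattern, and dimensions below are all synthetic assumptions; the sketch only illustrates that the optimized detector attains a higher signal-to-noise ratio than naive projection of the data onto the signal pattern itself.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 20                        # number of spatial grid points (synthetic)
g = np.linspace(0.5, 1.5, n)  # assumed signal ("guess") pattern

# Synthetic natural-variability covariance matrix C: noise is strong in
# a few preferred directions, so projecting straight onto g is noisy.
A = rng.normal(size=(n, n))
C = A @ A.T / n + 0.1 * np.eye(n)

# Optimal fingerprint: inverse covariance times the guess pattern
# (solve C f = g rather than inverting C explicitly).
f = np.linalg.solve(C, g)

def snr(pattern):
    """Signal-to-noise ratio of the scalar detector d = pattern . x."""
    signal = pattern @ g                    # detector response to the signal
    noise = np.sqrt(pattern @ C @ pattern)  # detector std under pure noise
    return signal / noise

print(f"raw fingerprint SNR:     {snr(g):.3f}")
print(f"optimal fingerprint SNR: {snr(f):.3f}")
```

By the Cauchy-Schwarz inequality the optimized detector's signal-to-noise ratio is never below that of the raw pattern, which is the sense in which the fingerprint is "optimal"; the 10%-30% gains quoted by Hegerl et al. above are realizations of this same weighting toward low-noise directions.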
