THE LESS WE KNOW THE SCARIER IT GETS
Posted November 3, 2019
SUMMARY: At current sea level rise forecasts, it had been projected that 110 million people would be affected by coastal high tide flooding events by the year 2100. New, improved DEM data for coastal land elevation now indicates that coastlines are not as high as previously thought, so that at the same rate of sea level rise the number of people affected by high tide flood events will be higher: perhaps 190 million, or somewhere between 150 and 250 million. A problem with this assessment is that the large uncertainty in coastal land elevation data may mean that we don’t really know what the coastal land elevation actually is.
- How the media sees it: November 1, 2019 at 6:37 a.m. GMT+7: HERE’S ANOTHER piece of evidence that climate change might be worse than scientists previously predicted. The seas are rising, and will continue to rise, because hotter temperatures melt land-based ice and expand the volume existing ocean water takes up. But while much study has examined the shift in amount and warmth of seawater humans will face, there is another variable scientists must get right to assess the risk to humanity: just how many people live in low-lying areas. A new paper suggests previous estimates of land elevation — and, therefore, the number of at-risk people — were wrong. The study, published Tuesday in the journal Nature Communications, corrects satellite elevation data, and it “reveals a developed global coastline three times more exposed to extreme coastal water levels than previously thought,” the authors warn. Even under an optimistic scenario in which heat-warming greenhouse emissions are restrained and Antarctic ice sheets remain stable, “the global impacts of sea-level rise and coastal flooding this century will likely be far greater than indicated by the most pessimistic past analyses.” [LINK]
- TRANSLATION: Sadly, it looks like AGW climate change driven sea level rise won’t be as high and as scary as we were hoping for but there is still hope for us. What if coastal lands are not as high as we think they are? That would cause the same degree of devastation at the lower sea level rise that we now have to live with. All those people in Bangladesh and elsewhere living close to sea level will die and it will all be your fault for using fossil fuels. [RELATED POST].
- What their source paper says: “New elevation data triple estimates of global vulnerability to sea-level rise and coastal flooding,” Scott A. Kulp & Benjamin H. Strauss, Nature Communications, volume 10, Article number 4844, published 29 October 2019 (Open Access). ABSTRACT: Most estimates of global mean sea-level rise this century fall below 2 m. This quantity is comparable to the positive vertical bias of the principal digital elevation model (DEM) used to assess global and national population exposures to extreme coastal water levels, NASA’s SRTM. CoastalDEM is a new DEM utilizing neural networks to reduce SRTM error. Here we show – employing CoastalDEM – that 190 M people (150–250 M, 90% CI) currently occupy global land below projected high tide lines for 2100 under low carbon emissions, up from 110 M today, for a median increase of 80 M. These figures triple SRTM-based values. Under high emissions, CoastalDEM indicates up to 630 M people live on land below projected annual flood levels for 2100, and up to 340 M for mid-century, versus roughly 250 M at present. We estimate one billion people now occupy land less than 10 m above current high tide lines, including 250 M below 1 m.
- TRANSLATION: Using a new, improved digital elevation model (CoastalDEM, developed by the authors to correct the vertical bias in NASA’s SRTM data), the paper finds that coastal land is not as high as we thought it was. With the old elevation data, only 110 million people appeared to be at risk of coastal flooding at current SLR projections for the year 2100. But the corrected elevation data show that land is lower than we thought, so that at the same sea level rise, 190 million people are at risk of coastal flooding at current SLR projections for the year 2100. The uncertainty in this projection spans a range of {190−40=150} to {190+60=250} million people affected, for a 90% confidence interval. Subtracting 40 million at the low end of the confidence interval and adding 60 million at the high end makes it a little scarier, we thought.
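The arithmetic behind the asymmetric confidence interval can be checked in a few lines. A minimal sketch in Python, using only the figures quoted from the paper’s abstract:

```python
# Figures quoted from the Kulp & Strauss (2019) abstract.
median = 190           # million people exposed by 2100, low-emissions scenario
srtm_based = 110       # million, the older SRTM-based estimate
ci_low, ci_high = 150, 250  # 90% confidence interval, millions

# The interval is asymmetric about the median estimate:
down = median - ci_low    # width below the median
up = ci_high - median     # width above the median
increase = median - srtm_based  # the paper's "median increase"

print(f"90% CI: -{down}/+{up} million about a median of {median} million")
print(f"Median increase over the SRTM-based estimate: {increase} million")
```

Note that the upper half of the interval (60 million) is wider than the lower half (40 million), which is what the post means by the high end being “a little scarier.”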
- UNCERTAINTY: The Zhang et al. 2019 paper shows that the error in DEM elevations may be even higher, with RMSE values ranging from 1.74 m to 14.29 m depending on the data source, while the Dai et al. 2019 paper demonstrates a much more precise 2-meter-resolution coastline product with correspondingly less scary implications. Although uncertainty measures the extent of our ignorance, the advantage of uncertainty in climate alarmism is that the less we know, the wider the 90% confidence interval gets, and the scarier climate change becomes, as explained in paragraph #5 of this related post [LINK]. {Activism needs of researchers also corrupt how the statistical property of variance is viewed and interpreted. In statistics, and also in information theory, high variance implies low information content. In other words, the higher the variance, the less we know. In this context high variance is undesirable because it degrades the information we can derive from the data. However, high variance also yields large confidence intervals, and where either end of the confidence interval is extreme enough to support the activism needs of climate researchers, high variance is interpreted not as an absence of information but as information about how extreme the danger could be. This interpretation, in conjunction with the precautionary principle, leads to a perverse reading of uncertainty such that uncertainty becomes transformed into certainty of extreme values.} In other words, THE LESS WE KNOW THE SCARIER IT GETS.
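The variance-versus-information point above can be made concrete with textbook formulas. For a normal distribution, differential entropy (a standard measure of how little we know) grows with the standard deviation, and the central 90% interval widens in direct proportion. A sketch in Python, with illustrative sigma values chosen here for demonstration only:

```python
import math

def gaussian_entropy(sigma):
    # Differential entropy of N(mu, sigma^2) in nats: 0.5 * ln(2*pi*e*sigma^2).
    # Larger entropy means less information in the estimate.
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def ci90_halfwidth(sigma):
    # Half-width of the central 90% interval: z(0.95) * sigma, z(0.95) ~ 1.645.
    return 1.645 * sigma

# Doubling sigma doubles the 90% interval and raises the entropy:
# less information, yet a more "extreme" upper bound to point at.
for sigma in (1.0, 2.0, 4.0):
    print(f"sigma={sigma}: entropy={gaussian_entropy(sigma):.3f} nats, "
          f"90% CI = +/-{ci90_halfwidth(sigma):.2f}")
```

This is the mechanism the bracketed passage describes: the same widening of the interval that signals lower information content also pushes the scary end of the interval further out.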
- An alternative methodology that bypasses the uncertainty problem in SRTM v4.1 and MERIT DEM data was suggested in a WUWT comment by Hans Erren as follows: “November 3, 2019 at 4:46 am. A simple solution springs to my mind: use sea level gauges in coastal areas and do not use satellites at all” [LINK]. The Hans Erren insight is that an unnecessary complexity is imposed on the study of high tide floods at a list of specific locations by the climate science reliance on global mean eustatic sea level (GMES). It is true that sea level rise and its overall impacts should be studied in terms of GMES, but studying localized events in terms of global data creates an unnecessary complication that introduces layers of uncertainty that do not exist in local data. To understand localized high tide floods as a function of GMES requires land elevation data on a standardized global scale. By contrast, understanding localized high tide flood events as a function of local sea level is a very simple exercise that does not require uncertain satellite measures of land elevation. Since the coastal areas at risk have already been identified, and their tidal gauge data and high tide flood events are recorded, much greater precision in forecasting future high tide floods can be realized if these at-risk areas are studied separately instead of translating them into global data and then back again to local data. The number of people at risk in each coastal area can then be assessed, and the global number estimated by summation. The study of global GMES and DEM data in order to understand localized phenomena is the source of the large uncertainties that ultimately erode the utility of such findings.
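The bottom-up procedure described above can be sketched as a simple aggregation over local tide-gauge records: project each station’s own relative sea level trend forward, assess the locally exposed population, and sum. All station names and numbers below are hypothetical placeholders for illustration, not data from any study:

```python
# Hypothetical per-station records: local relative sea level trend (mm/yr)
# from the tide gauge, and population (thousands) living below the local
# high-tide flood threshold. Illustrative values only.
stations = {
    "Station A": {"rsl_trend_mm_yr": 3.2, "pop_below_threshold_k": 120},
    "Station B": {"rsl_trend_mm_yr": 5.1, "pop_below_threshold_k": 480},
    "Station C": {"rsl_trend_mm_yr": 1.8, "pop_below_threshold_k": 60},
}

def projected_local_rise_m(trend_mm_yr, years):
    # Linear extrapolation of the local tide-gauge trend.
    # No satellite elevation data (and none of its uncertainty) is involved.
    return trend_mm_yr * years / 1000.0

horizon = 80  # years from roughly 2020 to 2100
total_k = 0
for name, rec in stations.items():
    rise = projected_local_rise_m(rec["rsl_trend_mm_yr"], horizon)
    total_k += rec["pop_below_threshold_k"]
    print(f"{name}: projected local rise {rise:.2f} m, "
          f"{rec['pop_below_threshold_k']}k people at risk")

# The global figure is then obtained by summation over stations.
print(f"Global estimate by summation: {total_k}k people")
```

The design choice matches the Erren comment: each at-risk area is assessed against its own gauge, and the global number falls out of the sum, rather than being derived from GMES and a global DEM and then mapped back to localities.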
BIBLIOGRAPHY
- Kulp, Scott A., and Benjamin H. Strauss. “CoastalDEM: A global coastal digital elevation model improved from SRTM using a neural network.” Remote Sensing of Environment 206 (2018): 231-239. Positive vertical bias in elevation data derived from NASA’s Shuttle Radar Topography Mission (SRTM) is known to cause substantial underestimation of coastal flood risks and exposure. Previous attempts to correct SRTM elevations have used regression to predict vertical error from a small number of auxiliary data products, but these efforts have been focused on reducing error introduced solely by vegetative land cover. Here, we employ a multilayer perceptron artificial neural network to perform a 23-dimensional vertical error regression analysis, where in addition to vegetation cover indices, we use variables including neighborhood elevation values, population density, land slope, and local SRTM deviations from ICESat altitude observations. Using lidar data as ground truth, we train the neural network on samples of US data from 1–20 m of elevation according to SRTM, and assess outputs with extensive testing sets in the US and Australia. Our adjustment system reduces mean vertical bias in the coastal US from 3.67 m to less than 0.01 m, and in Australia from 2.49 m to 0.11 m. RMSE is cut by roughly one-half at both locations, from 5.36 m to 2.39 m in the US, and from 4.15 m to 2.46 m in Australia. Using ICESat data as a reference, we estimate that global bias falls from 1.88 m to −0.29 m, and RMSE from 4.28 m to 3.08 m. The methods presented here are flexible and effective, and can be effectively applied to land cover of all types, including dense urban development. The resulting enhanced global coastal DEM (CoastalDEM) promises to greatly improve the accuracy of sea level rise and coastal flood analyses worldwide.
- Kulp, Scott Andrew, and B. Strauss. “Improved elevation data more than doubles estimates of global coastal vulnerability to sea level rise.” AGU Fall Meeting Abstracts. 2018. As sea levels rise and damaging storm surge becomes more intense and frequent, accurate flooding vulnerability assessments are essential to prepare coastal communities for the growing impacts and damage these threats may bring. A digital elevation model (DEM) is the foundation of such analyses, but large numbers of assessments performed outside of the United States use NASA’s SRTM, which has a multimeter mean vertical bias in the coastal zone globally – more than most sea level projections for this century. Here, we apply an improved global coastal elevation model we have developed using artificial neural networks, CoastalDEM, that reduces mean vertical bias to on the order of 10 cm. A global vulnerability assessment with our new model suggests that SRTM has gravely underestimated coastal threats from sea level rise. Across multiple carbon emission pathways and sea level rise projection models, CoastalDEM predicts more than twice as many people living on land at risk of permanent inundation this century as SRTM does.
- Hirt, Christian. “Artefact detection in global digital elevation models (DEMs): The Maximum Slope Approach and its application for complete screening of the SRTM v4.1 and MERIT DEMs.” Remote Sensing of Environment 207 (2018): 27-41. Despite post-processing efforts by space agencies and research institutions, contemporary global digital elevation models (DEMs) may contain artefacts, i.e., erroneous features that do not exist in the actual terrain, such as spikes, holes and line errors. The goal of the present paper is to illuminate the artefact issue of current global DEM data sets that might be an obstacle for any geoscience study using terrain information. We introduce the Maximum Slope Approach (MSA) as a technique that uses terrain slopes as indicator to detect and localize spurious artefacts. The MSA relies on the strong sensitivity of terrain slopes for sudden steps in the DEM that is a direct feature of larger artefacts. In a numerical case study, the MSA is applied for globally complete screening of two SRTM-based 3 arc-second DEMs, the SRTM v4.1 and the MERIT-DEM. Based on 0.1° × 0.1° sub-divisions and a 5 m/m slope threshold, 1341 artefacts were detected in SRTM v4.1 vs. 108 in MERIT. Most artefacts spatially correlate with SRTM voids (and thus with the void-filling) and not with the SRTM-measured elevations. The strong contrast in artefact frequency (factor ~12) is attributed to the SRTM v4.1 hole filling. Our study shows that over parts of the Himalaya Mountains the SRTM v4.1 data set is contaminated by step artefacts where the use of this DEM cannot be recommended. Some caution should be exercised, e.g., over parts of the Andes and Rocky Mountains. The same holds true for derived global products that depend on SRTM v4.1, such as gravity maps. Primarily over the major mountain ranges, the MERIT model contains artefacts, too, but in smaller numbers.
As a conclusion, globally complete artefact screening is recommended prior to the public release of any DEM data set. However, such a quality check should also be considered by users before using DEM data. MSA-based artefact screening is not only limited to DEMs, but can be applied as quality assurance measure to other gridded data sets such as digital bathymetric models or gridded physical quantities such as gravity or magnetics.
- Jain, Akshay O., et al. “Vertical accuracy evaluation of SRTM-GL1, GDEM-V2, AW3D30 and CartoDEM-V3.1 of 30-m resolution with dual frequency GNSS for lower Tapi Basin India.” Geocarto International 33.11 (2018): 1237-1256. Shuttle Radar Topography Mission (SRTM-GL1), Advanced Space Borne Thermal Emission and Reflection Radiometer Global DEM (GDEM-V2), the recently released Advanced Land Observing Satellite (‘DAICHI’) DEM (AW3D30) and the Indian National Cartosat-1 DEM v3 (CartoDEM-V3.1) provide free topographic data at a 30-m resolution for the Indian peninsula. In this research study, the vertical accuracy of these DEM data-sets is evaluated and compared with high accuracy dual frequency GNSS of millimetre accuracy. The extensive field investigation is carried out using a stratified random fast static DGPS survey for collecting 117 high accuracy ground control points in a predominantly agricultural catchment. Further, the effect of land cover, slope and low-lying coastal zone on DEM vertical accuracy was also analysed and presented in this study. The results for RMSE of terrain elevation are 2.88 m, 5.46 m, 2.45 m and 3.52 m for SRTM-GL1, GDEM-V2, AW3D30 and CartoDEM-V3.1 respectively.
- Zhang, Keqi, et al. “Accuracy assessment of ASTER, SRTM, ALOS, and TDX DEMs for Hispaniola and implications for mapping vulnerability to coastal flooding.” Remote sensing of environment 225 (2019): 290-306. Digital elevation models (DEMs) derived from remote sensing data provide a valuable and consistent data source for mapping coastal flooding at local and global scales. Mapping of flood risk requires quantification of the error in DEM elevations and its effect on delineation of flood zones. The ASTER, SRTM, ALOS, and TanDEM-X (TDX) DEMs for the island of Hispaniola were examined by comparing them with GPS and LiDAR measurements. The comparisons were based on a series of error measures including root mean square error (RMSE) and absolute error at 90% quantile (LE90). When compared with >2000 GPS measurements with elevations below 7 m, RMSE and LE90 values for ASTER, SRTM, ALOS, TDX DEMs were 8.44 and 14.29, 3.82 and 5.85, 2.08 and 3.64, and 1.74 and 3.20 m, respectively. In contrast, RMSE and LE90 values for the same DEMs were 4.24 and 6.70, 4.81 and 7.16, 4.91 and 6.82, and 2.27 and 3.66 m when compared to DEMs from 150 km2 LiDAR data, which included elevations as high as 20 m. The expanded area with LiDAR coverage included additional types of land surface, resulting in differences in error measures. Comparison of RMSEs indicated that the filtering of TDX DEMs using four methods improved the accuracy of the estimates of ground elevation by 20–43%. DTMs generated by interpolating the ground pixels from a progressive morphological filter, using an empirical Bayesian kriging method, produced an RMSE of 1.06 m and LE90 of 1.73 m when compared to GPS measurements, and an RMSE of 1.30 m and LE90 of 2.02 m when compared to LiDAR data. Differences in inundation areas based on TDX and LiDAR DTMs were between −13% and −4% for scenarios of 3, 5, 10, and 15 m water level rise, a much narrower range than inundation differences between ASTER, SRTM, ALOS and LiDAR. 
The TDX DEM delivers high resolution global elevation data with unprecedented accuracy; hence it is recommended for mapping coastal flood risk zones on a global scale, as well as at a local scale in developing countries where data with higher accuracy are unavailable.
- Dai, Chunli, et al. “Coastline extraction from repeat high resolution satellite imagery.” Remote Sensing of Environment 229 (2019): 260-270. This paper presents a new coastline extraction method that improves water classification accuracy by benefitting from an ever-increasing volume of repeated measurements from commercial satellite missions. The widely-used Normalized Difference Water Index (NDWI) method is tested on a sample of around 12,600 satellite images for statistical analysis. The core of the new water classification method is the use of a water probability algorithm based on the stacking of repeat measurements, which can mitigate the effects of translational offsets of images and the classification errors caused by clouds and cloud shadows. By integrating QuickBird, WorldView-2 and WorldView-3 multispectral images, the final data product provides a 2 m resolution coastline, as well as a 2 m water probability map and a repeat-count measurement map. Improvements on the existing coastline (GSHHS-the Global Self-consistent, Hierarchical, High-resolution Shoreline Database, 50 m–5000 m) in terms of resolution (2 m) is substantial, thanks to the combination of multiple data sources.