
Tuesday, April 7, 2020

"How Much Human-Caused Global Warming Should We Expect with Business-As-Usual Climate Policies?"

This 51-page PDF report includes an unusually clear explanation of the questions climate scientists still need answers to. Real climate science has a lot of complex questions, but not many answers.

Junk climate science, which you read elsewhere, claims to have a simple "answer" -- a simple, wrong answer: manmade CO2 is evil. And the climate alarmists promote the obviously false claim of being able to predict the future climate, even after 60+ years of wrong predictions ... so far.

Below are direct quotes from the report, chosen by your editor.


"How Much Human-Caused 
Global Warming Should We Expect 
with Business-As-Usual (BAU) 
Climate Policies? 
A Semi-Empirical Assessment"

Ronan Connolly
Michael Connolly
Robert M. Carter and 
Willie Soon

Published: 15 March 2020


Quotes From The Abstract (edited):
"In order to assess the merits of national climate change mitigation policies, it is important to have a reasonable benchmark for how much human-caused global warming would occur over the coming century with “Business-As-Usual” (BAU) conditions.

However, currently, policymakers are limited to making assessments by comparing the Global Climate Model (GCM) projections of future climate change under various different “scenarios”, none of which are explicitly defined as BAU. 

... the current Intergovernmental Panel on Climate Change (IPCC) “likely” range estimates for TCR of 1.0 to 2.5 C and ECS of 1.5 to 4.5 C have not yet established if human-caused global warming is a 21st century problem."


Quotes From the Report (edited):
"Since the late-1960s and early-1970s, computer model simulations of the Earth’s climate have been predicting that increasing concentrations of “greenhouse gases” (chiefly, carbon dioxide or CO2) in the atmosphere from human activity should be causing substantial global warming at the Earth’s surface and in the lower atmosphere.

 In the 1960s and 1970s, estimates of global surface temperature trends (which mostly were confined to the Northern Hemisphere since there is less long-term data for the Southern Hemisphere) suggested, if anything, a global cooling trend. 

However, during the 1980s, the cooling trend reversed and, by the late-1980s, the long-term linear trend since the (relatively cold) late-19th century was warming.

The term “Anthropogenic (man-made) Global Warming” (AGW) is often used. 

These claims garnered a lot of media attention and public concern. 

Ultimately, this led the United Nations to set up the United Nations Framework Convention on Climate Change (UNFCCC) with the goal of facilitating international negotiations to achieve the “ . . .  stabilization of greenhouse gas concentrations in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system”

To work in parallel with the political work of the UNFCCC, the United Nations also co-founded, with the World Meteorological Organization, a separate body called the Intergovernmental Panel on Climate Change (IPCC) to provide amongst other things “a comprehensive review (  on ) the state of knowledge of the science of climate and climatic change”

In the ensuing years, the computer models have continued to predict that increasing greenhouse gas concentrations should be causing substantial global warming. 

Indeed, based on the results of simulations with one of the NASA Goddard Institute for Space Studies (GISS) computer models, Lacis et al. (2010) concluded that “atmospheric CO2 ( . . . is the) principal control knob governing Earth’s temperature” .

Largely on the basis of comparing the results of such computer models to global temperature trends, the IPCC’s most recent complete Assessment Report (2013) concluded that, “It is extremely likely that human influence has been the dominant cause of the observed warming since the mid-20th century” 

Although a competing series of reports has been published by the Nongovernmental International Panel on Climate Change (NIPCC), which contradicts many of the IPCC’s findings, e.g., NIPCC (2013) and (2019), the IPCC reports are widely cited and have been highly influential among both the scientific community and policymakers.

Meanwhile, the efforts of the UNFCCC have led to a series of major international treaties and agreements to try to reduce greenhouse gas emissions, from the Kyoto Protocol (1997) to the Paris Agreement (2015).

In particular, the Paris Agreement specifically aims to encourage national and international policies to reduce greenhouse gas emissions with the view to, “Holding the increase in the global average temperature to well below 2 C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5 C above pre-industrial levels”

Although the United States has decided to withdraw from the Paris Agreement, most nations remain signed up to the voluntary agreement.

The selection of a specific “global average temperature” in C as an international “goal” is a remarkably arbitrary and subjective process, e.g., see Mahoney (2015). 

If the motivation of the Paris Agreement was to encourage individual nations to significantly alter their national policies relative to “Business-As-Usual”, then it is important to know how much Anthropogenic Global Warming to expect under “Business-As-Usual” conditions.

In other words, what is the “human-caused global warming baseline” against which efforts to meet the Paris Agreement are to be assessed? 

This is the question we will attempt to answer in this paper. 

However, while the question might initially seem fairly reasonable and straightforward, as we will discuss, it is remarkably challenging to answer satisfactorily. 

In essence, it depends on the answers to four separate questions:

Question 1. 
What would future greenhouse gas emissions be over the coming century under “Business-As-Usual” conditions?


Question 2. 
For each of the greenhouse gases, what is the relationship between emissions and actual changes in atmospheric concentrations?


Question 3. 
How different would global average temperatures be at present if greenhouse gases were still at “pre-industrial” concentrations?

In other words, how do we define the “pre-industrial levels” of global average temperatures to which the Paris Agreement refers? 


Question 4. 
How “sensitive” are global average temperatures to increases in the atmospheric concentrations of greenhouse gases?

With each of these questions, there is considerable debate in the scientific literature. 

However, the relevant literature for each subject comes from quite different academic disciplines. 

The first question is typically addressed by economists, political scientists, environmental governance researchers, etc. 

The second question is mostly the realm of biologists, ecologists, geochemists, oceanographers, etc. 

The third and fourth questions are both climate science problems, but even within these topics, there are separate bodies of literature from, e.g., computer modeling research groups, groups evaluating climate records, statisticians evaluating results from a meta-analysis perspective, etc.


... it has not been directly established experimentally how “sensitive” the global average temperature is to changes in greenhouse gas concentrations. 

... the computer model projections consider concentrations twice, four times, or more than that of pre-industrial CO2, while the historical observations cover only a period where CO2 has increased by less than 45% relative to the pre-industrial concentrations implied by the Antarctic ice cores. 
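The "less than 45%" figure can be sanity-checked with simple arithmetic. The concentrations below are illustrative assumptions of your editor, not numbers taken from the report: a pre-industrial CO2 level near 280 ppm (the usual ice-core baseline) and a recent level near 405 ppm.

```python
# Rough check of the "less than 45%" CO2 increase.
# Both ppm values are assumptions for illustration, not the paper's data.
pre_industrial_ppm = 280.0  # assumed Antarctic ice-core baseline
modern_ppm = 405.0          # assumed recent atmospheric concentration

increase_pct = (modern_ppm / pre_industrial_ppm - 1) * 100
print(f"CO2 increase vs. pre-industrial: {increase_pct:.1f}%")
```

With these assumed values the increase comes out a little under 45%, consistent with the report's statement.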

... computer model projections are only describing climate change in a computer model world.

... if you want to understand how the climate changes in the real world, you will get more realistic answers if you base them on actual experimental observations.

In a well-cited National Research Council report in 1979, led by Jule Charney (and hence commonly referred to as “the Charney report”), the results of such ECS simulations from several computer modeling groups were used to conclude that a doubling of atmospheric CO2 would probably lead to 1.5–4.5 C of human-caused global warming. 

More recently, Knutti et al. (2017) confirmed that, still, “the consensus on the ‘likely’ range for climate sensitivity of 1.5 to 4.5 C today is the same as given by Jule Charney in 1979”.

... the most recent IPCC 5th Assessment Report (2013) also argued that the “likely” value for the Equilibrium Climate Sensitivity was probably in the range 1.5–4.5 C.

At any rate, by the mid-1980s, it was already apparent that the long-term global warming since 1880 was at least half of what would have been expected from the ECS values, given the increase in atmospheric CO2 that had already occurred.
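The comparison above can be sketched with the standard logarithmic-forcing approximation, under which equilibrium warming scales as ECS × ln(C/C0) / ln(2). This formula and the ppm values below are your editor's illustrative assumptions, not calculations from the report:

```python
import math

# Expected equilibrium warming under the common logarithmic-forcing
# approximation: dT = ECS * ln(C/C0) / ln(2).
# The ppm values are assumptions for illustration, not the paper's data.
pre_industrial_ppm = 280.0
modern_ppm = 405.0

for ecs in (1.5, 4.5):  # endpoints of the IPCC "likely" ECS range, in C
    dt = ecs * math.log(modern_ppm / pre_industrial_ppm) / math.log(2)
    print(f"ECS = {ecs} C  ->  expected equilibrium warming = {dt:.2f} C")
    # prints roughly 0.80 C for ECS = 1.5 and 2.40 C for ECS = 4.5
```

Even this simple calculation spans a factor of three in expected warming for the same CO2 increase, which illustrates why the wide ECS range makes attribution of the observed warming so uncertain.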

Some (or even all) of the observed global warming may have been due to natural factors and/or other non-greenhouse gas-related factors. 

On the other hand, there may have been additional “global cooling” factors—either natural (e.g., decreases in solar activity) or human-caused (e.g., increases in aerosols)—that led to a reduction in human-caused global warming. 

That is, the amount of human-caused global warming that should have already occurred might be less than or greater than the amount of observed global warming. 

Indeed, this is a major part of the reason why there is still such uncertainty over the actual climate sensitivity to greenhouse gases.


At any rate, for us, probably the most striking result is the sheer range of possible values by the end of our BAU projections in 2100."