Monday, January 31, 2022

Japanese climate modeler Mototaka Nakamura published the Japanese Version of Confessions of a Climate Scientist: The global warming hypothesis is an unproven hypothesis

 Source:

"Unproven Hypothesis:
In July 2018, established climate modeler Mototaka Nakamura published the Japanese version of Confessions of a Climate Scientist:

The global warming hypothesis is an unproven hypothesis.

On July 25, 2020, TWTW reviewed it and discussed it several times thereafter.

It reinforced Richard Lindzen’s invited paper on climate change in The European Physical Journal Plus in June 2020, which TWTW reviewed in June and July of that year. Lindzen stated:


1.    The core of the system consists of two turbulent fluids (the atmosphere and oceans) interacting with each other.

2.    The two fluids are on a rotating planet that is differentially [unevenly] heated by the sun and unevenly absorbing the solar warming.

Solar rays directly hit the equator and skim the earth at the poles, resulting in uneven heating, which drives the circulation of the atmosphere.

The result is heat transport from the equator towards the poles (meridional).

3.    The earth’s climate system is never in equilibrium.

4.    In addition to the oceans, the atmosphere interacts with a hugely irregular land surface that distorts the airflow, causing planetary-scale waves, which are not accurately described in climate models.

5.    A vital component of the atmosphere is water in its liquid, solid, and vapor phases, and the changes in phases have immense dynamic consequences.

Each phase affects incoming and outgoing radiation differently.

Substantial heat is released when water vapor condenses, driving thunder clouds.

Further, clouds consist of water in the form of fine droplets and ice crystals.

Normally, these droplets and crystals are suspended by rising air currents, but when they grow large enough, they fall as rain and snow.

The energies involved in phase changes are important, as well as the fact that both water vapor and clouds strongly affect radiation.

“The two most important greenhouse substances by far are water vapor and clouds.

Clouds are also important reflectors of sunlight.

These matters are discussed in detail in the IPCC WG1 reports, each of which openly acknowledge clouds as major sources of uncertainty in climate modeling.”

[Despite that acknowledgement, the IPCC Summaries to Policymakers largely ignore these uncertainties.]

6.    “The energy budget of this system involves the absorption and reemission of about 240 W/m2 [Watts per square meter].

Doubling CO2 involves a perturbation [deviation] of a bit less than 2% to this budget (4 W/m2).

So do changes in clouds and other features, and such changes are common.

The Earth receives about 340 W/m2 from the sun, but about 100 W/m2 is simply reflected back to space by both the Earth’s surface and, more importantly, by clouds.

This would leave about 240 W/m2 that the Earth would have to emit in order to establish balance.

The sun radiates in the visible portion of the radiation spectrum because its temperature is about 6000 K.

If the Earth had no atmosphere at all (but for purposes of argument still was reflecting 100 W/m2), it would have to radiate at a temperature of about 255 K, and, at this temperature, the radiation is mostly in the infrared.”
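
As a quick arithmetic check on the numbers above (not part of Lindzen's text), a simple Stefan-Boltzmann balance with the quoted 340 W/m2 incoming and 100 W/m2 reflected does give an emission temperature of roughly 255 K. The short Python sketch below makes the assumed calculation explicit.

```python
# Back-of-the-envelope check of the ~255 K effective emission temperature quoted above.
# Assumption (not from the text): a simple Stefan-Boltzmann balance is sufficient here.

SIGMA = 5.670e-8                 # Stefan-Boltzmann constant, W m^-2 K^-4

incoming = 340.0                 # W/m^2 of solar radiation received (quoted figure)
reflected = 100.0                # W/m^2 reflected by the surface and clouds (quoted figure)
absorbed = incoming - reflected  # ~240 W/m^2 that must be re-emitted to establish balance

# Emission temperature from absorbed = SIGMA * T**4
t_effective = (absorbed / SIGMA) ** 0.25

print(f"Flux to be emitted: {absorbed:.0f} W/m^2")              # 240 W/m^2
print(f"Effective emission temperature: {t_effective:.0f} K")   # ~255 K
```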

The oceans and the atmosphere introduce a host of complications, including evaporation, which creates water vapor that strongly absorbs and emits radiation in the infrared.

“The water vapor essentially blocks infrared radiation from leaving the surface, causing the surface and (via conduction) the air adjacent to the surface to heat, and convection sets in.

The combination of the radiative and the convective processes results in decreasing temperature with height [lapse rate].

To make matters more complicated, the amount of water vapor that the air can hold decreases rapidly as the temperature decreases.

Above some height there is so little water vapor remaining that radiation from this level can now escape to space.

It is at this elevated level (around 5 km) that the temperature must be about 255 K in order to balance incoming radiation.

However, because the temperature decreases with height, the surface of the Earth now has to actually be warmer than 255 K.

It turns out that it has to be about 288 K (which is indeed the average temperature of the earth’s surface).

The addition of other greenhouse gases (like CO2) increases further the emission level and causes an additional increase of the ground temperature.

Doubling CO2 is estimated to be equivalent to a forcing of about 4 W/m2, which is a little less than 2% of the net incoming 240 W/m2.
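
A similar rough check, again not from the quoted text: taking the ~255 K emission level at about 5 km together with an assumed typical tropospheric lapse rate of roughly 6.5 K per km gives a surface temperature near 288 K, and the 4 W/m2 forcing is indeed a bit under 2% of the 240 W/m2 budget.

```python
# Rough arithmetic behind the quoted 288 K surface temperature and the "~2%" forcing.
# The 6.5 K/km lapse rate is an assumed typical value, not a figure given in the quote.

t_emission = 255.0         # K, temperature needed at the effective emission level
emission_height_km = 5.0   # km, approximate emission level cited in the quote
lapse_rate = 6.5           # K per km, assumed average tropospheric lapse rate

t_surface = t_emission + lapse_rate * emission_height_km
print(f"Implied surface temperature: {t_surface:.0f} K")   # ~288 K

co2_doubling_forcing = 4.0   # W/m^2, quoted forcing for doubled CO2
energy_budget = 240.0        # W/m^2, quoted net absorbed/emitted flux
print(f"Forcing as a share of the budget: {co2_doubling_forcing / energy_budget:.1%}")  # ~1.7%
```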

“The situation can actually be more complicated if upper-level cirrus clouds are present.

They are very strong absorbers and emitters of infrared radiation and effectively block infrared radiation from below.

Thus, when such clouds are present above about 5 km, their tops, rather than 5 km, determine the emission level.

This makes the ground temperature (i.e., the greenhouse effect) dependent on the cloud coverage.

“Many factors, including fluctuations of average cloud area and height, snow cover, ocean circulations, etc. commonly cause changes to the radiative budget comparable to that of doubling of CO2.

For example, the net global mean cloud radiative effect is of the order of − 20 W/m2 (cooling effect).

A 4 W/m2 forcing, from a doubling of CO2, therefore corresponds to only a 20% change in the net cloud effect.
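
The 20% figure is straightforward to verify from the two quoted numbers:

```python
# Ratio of the quoted CO2-doubling forcing to the quoted net cloud radiative effect.
net_cloud_effect = -20.0     # W/m^2, net global mean cloud radiative effect (cooling)
co2_doubling_forcing = 4.0   # W/m^2, forcing from a doubling of CO2
print(f"{co2_doubling_forcing / abs(net_cloud_effect):.0%}")   # 20%
```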

7.    It is important to note that such a system will fluctuate with timescales ranging from seconds to millennia even in the absence of explicit forcing other than a steady Sun.

Much of the popular literature (on both sides of the climate debate) assumes that all changes must be driven by some external factor.
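
To illustrate the point about unforced fluctuations (this is only a toy sketch, not anything from Lindzen and not a climate model), even a simple red-noise process with fixed statistics and no external driver drifts from decade to decade:

```python
# Toy illustration only: a red-noise (AR(1)) process with constant statistics and no
# external forcing still shows decade-to-decade drift. The "memory" and noise values
# below are arbitrary choices for the demonstration, not physical parameters.
import random

random.seed(0)
memory = 0.98           # persistence from one daily step to the next (arbitrary)
noise_amplitude = 0.1   # size of the random internal "weather" kicks (arbitrary)

state = 0.0
decade_means = []
for decade in range(10):
    total = 0.0
    for day in range(3650):   # roughly one step per day for a decade
        state = memory * state + noise_amplitude * random.gauss(0.0, 1.0)
        total += state
    decade_means.append(total / 3650)

# The decade averages wander even though nothing external changed.
print([round(m, 2) for m in decade_means])
```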

Simply put, global climate modelers have failed to capture the enormous complexity of the climate system.

Further, what the climate modelers produce has major deficiencies.

Among other important changing phenomena, the climate system is largely made up of two fluids in dynamic motion, the ocean and the atmosphere, and we simply do not know enough about fluid dynamics to make long-term predictions about how these fluids interact.

According to Nakamura, climate models are useful tools for academic purposes but useless for prediction.

The beginning of Nakamura’s book states:
“Before pointing out a few of the serious flaws in climate simulation models, in defense of those climate researchers who use climate simulation models for various meaningful scientific projects, I want to emphasize here that climate simulation models are fine tools to study the climate system, so long as the users are aware of the limitations of the models and exercise caution in designing experiments and interpreting their output.

In this sense, experiments to study the response of simplified climate systems, such as those generated by the ‘state-of-the-art’ climate simulation models, to major increases in atmospheric carbon dioxide or other greenhouse gases are also interesting and meaningful academic projects that are certainly worth pursuing.

So long as the results of such projects are presented with disclaimers that unambiguously state the extent to which the results can be compared with the real world, I would not have any problem with such projects.

The models just become useless pieces of junk or worse (worse, in a sense that they can produce gravely misleading output) only when they are used for climate forecasting.

“All climate simulation models have many details that become fatal flaws when they are used as climate forecasting tools, especially for mid- to long-term (several years and longer) climate variations and changes.

These models completely lack some of critically important climate processes and feedbacks and represent some other critically important climate processes and feedbacks in grossly distorted manners to the extent that makes these models totally useless for any meaningful climate prediction.

It means that they are also completely useless for assessing the effects of the past atmospheric carbon dioxide increase on the climate.

I myself used to use climate simulation models for scientific studies, not for predictions, and learned about their problems and limitations in the process.”


Uncertainty:
In his 2021 book, Unsettled: What Climate Science Tells Us, What It Doesn't, and Why It Matters, Steven Koonin writes:

“The process of science is less about collecting pieces of knowledge than it is about reducing the uncertainties in what we know.

Our uncertainties can be greater or lesser for any given piece of knowledge depending upon where we are in that process—today we are quite certain of how an apple will fall from a tree, but our understanding of turbulent fluid flow (such as convection in the atmosphere) remains a work in progress after more than a century of effort.”

“Every measurement of the physical world has an associated uncertainty interval (usually denoted by the Greek letter sigma: σ).

We can’t say what the measurement’s true value is precisely, only that it is likely to be within some range specified by σ.

Thus, we might say the global mean surface temperature in 2016 was 14.85°C with a σ of 0.07°C.

That is, there is a two-thirds chance that the true value is between 14.78 and 14.92°C.”

“For a scientist, knowing the uncertainty in a measurement is as important as knowing the measurement itself,

because it allows you to judge the significance of differences between measurements...” [pp. 18 & 19]
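
Koonin's “two-thirds chance” is simply the probability of a normally distributed value falling within one standard deviation of its mean; a quick check using the numbers he quotes (this sketch assumes a normal error distribution, which his phrasing implies):

```python
# Probability that a normally distributed measurement lies within +/- 1 sigma of its mean,
# using the values quoted by Koonin (mean 14.85 C, sigma 0.07 C).
import math

mean_temp = 14.85   # deg C, quoted 2016 global mean surface temperature
sigma = 0.07        # deg C, quoted uncertainty

# P(|X - mean| < sigma) for a normal distribution = erf(1/sqrt(2)) ~ 0.68
prob_within_one_sigma = math.erf(1.0 / math.sqrt(2.0))

print(f"Range: {mean_temp - sigma:.2f} to {mean_temp + sigma:.2f} deg C")       # 14.78 to 14.92
print(f"Chance the true value is in that range: {prob_within_one_sigma:.0%}")   # ~68%, about two-thirds
```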

As Under Secretary for Science at the United States Department of Energy under President Obama, Koonin was responsible for assuring the nuclear arsenal was properly maintained.

Since the US no longer conducts nuclear tests, the models used to ensure proper maintenance must be reliable.

Perhaps Koonin was prompted to explore the issues further by the exchange he had as moderator of a debate on man-made global warming (sponsored by the American Physical Society).

As Rupert Darwall wrote:
“The ensuing dialogue between Koonin and Dr. William Collins of the Lawrence Berkeley National Laboratory – a lead author of the climate model evaluation chapter in the Fifth Assessment Report – revealed something more troubling and deliberate than holes in scientific knowledge:

•    Dr. Koonin: But if the model tells you that you got the response to the forcing wrong by 30 percent, you should use that same 30 percent factor when you project out a century.

•    Dr. Collins: Yes. And one of the reasons we are not doing that is we are not using the models as [a] statistical projection tool.

•    Dr. Koonin: What are you using them as?

•    Dr. Collins: Well, we took exactly the same models that got the forcing wrong and which got sort of the projections wrong up to 2100.

•    Dr. Koonin: So, why do we even show centennial-scale projections?

•    Dr. Collins: Well, I mean, it is part of the [IPCC] assessment process.

Whether he knows it or not, Collins is stating that there is no integrity in the modeling process that creates the models the IPCC uses to make projections / predictions.

Thus, they are misleading and unreliable.

So much for popular press reports claiming, “scientists say.”