Monday, August 12, 2019

Two European professors reject the concept of a global average temperature

Two European professors wrote that the IPCC projections of future warming are based on huge unknowns.

Projections of the future of the world's climate are very unreliable, according to Samuel Furfari, Professor at the Free University of Brussels, and Henri Masson, Professor (Emeritus), University of Antwerpen.

The NASA GISS Surface Temperature Analysis (GISTEMP v4) is an estimate of global surface temperature change, often used by climate scientists for their reports to the media.

This estimate is computed using data files from NOAA GHCN v4 (land stations) and ERSST v5 (ocean areas).

In June 2019, the number of land stations in the GHCN v4 unadjusted dataset was 8,781; but in June 1880, that figure was a mere 281 stations, the two professors wrote.

Professors Furfari and Masson write: “The climate system, and the way IPCC represents it, is highly sensitive to tiny changes in the value of parameters or initial conditions and these must be known with high accuracy. But this is not the case.

“This puts serious doubt on whatever conclusion that could be drawn from model projections.”

This, they say, opens the door to “fake conclusions” and “manipulations”.

Masson and Furfari say that IPCC scientists ignore that climate change occurs with cyclic behaviour, and that linear trend lines applied to (poly-)cyclic data, with periods similar to the length of the time window considered, open the door to any kind of fake conclusion, if not manipulation, aimed at pushing one political agenda or another.
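The effect the professors describe is easy to reproduce. A minimal sketch (my own illustration, not their code): fit a straight line to a purely cyclic signal and compare the slope when the window is comparable to the cycle period versus many times longer.

```python
import numpy as np

PERIOD = 60.0  # years -- a hypothetical climate cycle

def linear_trend(window_years, samples_per_year=12):
    """Least-squares slope of a unit-amplitude cosine over the window."""
    t = np.arange(0.0, window_years, 1.0 / samples_per_year)
    y = np.cos(2.0 * np.pi * t / PERIOD)  # cyclic signal, zero real trend
    slope, _intercept = np.polyfit(t, y, 1)
    return slope

# Window of half a cycle: a strong spurious slope (roughly -0.08 per year).
print(linear_trend(30))
# Window of ten full cycles: the spurious slope essentially vanishes.
print(linear_trend(600))
```

The short window sees only the falling half of the cycle and reports it as a steep "trend"; the long window averages over whole cycles and the slope collapses towards zero.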

“Until very recently, these (sea surface) temperatures have been only scarcely reported, as the data for SST (Sea Surface Temperature) came from vessels following a limited number of commercial routes,” report Masson and Furfari.

The gaping data holes mean scientists are free to guess whatever numbers they want.

IPCC projections result from mathematical models which need to be calibrated using data from the past.

The accuracy of the calibration data is of paramount importance, as the climate system is highly non-linear, and so are the (Navier-Stokes) equations and (Runge-Kutta) integration algorithms used in the IPCC computer models.
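The sensitivity to initial conditions invoked here is a standard property of non-linear systems. A sketch using the logistic map, a classic toy system (my example, not an actual climate model): two runs whose starting values differ by one part in a million end up on completely different trajectories.

```python
# Sensitivity to initial conditions in a non-linear system, illustrated
# with the logistic map x -> r*x*(1-x) in its chaotic regime (r = 4).
def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map and return the whole trajectory."""
    xs = []
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

# Initial conditions differing by 1e-6.
a = logistic_orbit(0.300000)
b = logistic_orbit(0.300001)

# The per-step gap starts microscopic and grows until the two
# trajectories are unrelated.
gaps = [abs(p - q) for p, q in zip(a, b)]
print(gaps[0], max(gaps))
```

The first gap is of order 1e-6; within a few dozen iterations the gap is of order 1, which is why calibration data for such a system must be known with high accuracy.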

The spatial coverage of the data is also highly questionable, as the temperature over the oceans, representing 70% of the Earth's surface, is mostly neglected or “guesstimated”.

The number and location of land surface weather stations have also changed considerably over time, inducing biases and fake trends.

The global temperature anomaly is obtained by spatially averaging local temperature anomalies.

A local anomaly compares the present local temperature to the average local temperature calculated over a fixed 30-year reference period, which shifts every 30 years (1930-1960, 1960-1990, etc.).
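The computation just described can be sketched in a few lines. The station values below are toy numbers of my own, and the plain average stands in for the area-weighted gridding that real products such as GISTEMP use:

```python
import statistics

def local_anomaly(current_temp_c, baseline_temps_c):
    """Current temperature minus the station's 30-year baseline mean."""
    return current_temp_c - statistics.mean(baseline_temps_c)

# Two hypothetical stations with 30-year baseline means of 10.0 and 25.0 C.
station_a = local_anomaly(10.8, [10.0] * 30)   # roughly +0.8 C
station_b = local_anomaly(25.2, [25.0] * 30)   # roughly +0.2 C

# Naive spatial average of the local anomalies (real products weight
# stations by the area of the grid cell they represent).
global_anomaly = (station_a + station_b) / 2
print(global_anomaly)  # roughly +0.5 C
```

Note that the anomaly, not the raw temperature, is averaged: a +0.8 C departure at a cold station and a +0.2 C departure at a warm one combine meaningfully, whereas averaging 10.8 C with 25.2 C would not.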

The concept of a local anomaly is itself highly questionable: biases and false trends appear when the “measurement window” is shorter than at least 6 times the longest period detectable in the data, which is unfortunately the case with temperature data.

It is highly recommended to abandon the IPCC concept of a single global average temperature anomaly and to focus on local climate data.

A change in a local climate is a physically meaningful concept, because local climates are those in which people actually live!