Sunday, December 9, 2018

Climate Science Controversies

The UN's IPCC says it is “certain that global mean surface temperature (GMST) has increased since the late 19th century” and estimates the increase from 1850–1900 to 2003–2012 at +0.78°C [+0.72 to +0.85], based on the Hadley Centre–Climatic Research Unit dataset (HadCRUT4).

The IPCC itself admits its temperature reconstructions are highly uncertain:

The IPCC’s estimate of +0.78°C [+0.72 to +0.85] from 1850–1900 to 2003–2012 is not an observation but an estimate based on a long chain of judgements about how to handle data, what data to include and what to leave out, and how to summarize it into a single “global temperature.”

The IPCC’s temperature estimate cannot be tested because it has no empirical existence.
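
To make that point concrete: a figure like +0.78°C is the difference between the means of two reference periods of an annual global-anomaly series, not something any thermometer ever recorded. The short Python sketch below illustrates the kind of calculation involved; the synthetic series and the period choices are assumptions for illustration, not IPCC data.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical annual global-mean anomalies for 1850-2012 (°C),
# invented for illustration only.
years = np.arange(1850, 2013)
anomalies = 0.005 * (years - 1850) + rng.normal(0.0, 0.1, years.size)

def period_mean(y0, y1):
    """Mean anomaly over the inclusive year range [y0, y1]."""
    mask = (years >= y0) & (years <= y1)
    return anomalies[mask].mean()

# "Warming since the late 19th century" expressed as a difference of
# period means -- the same kind of derived statistic as the headline
# +0.78°C figure, not a direct measurement.
change = period_mean(2003, 2012) - period_mean(1850, 1900)
print(f"estimated change: {change:+.2f} °C")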

Insight into the flaws of the HadCRUT dataset comes from the emails and documents leaked from the Climatic Research Unit in 2009.

A file titled “Harry Read Me” contained some 247 pages of notes by a programmer responsible for maintaining and correcting errors in the HadCRUT climate data between 2006 and 2009.

Reading only a few of the programmer’s comments reveals the inaccuracies, data manipulation, and incompetence that render the database unreliable:

“Wherever I look, there are data files, no info about what they are other than their names. And that’s useless ...”

“It’s botch after botch after botch” 

“Am I the first person to attempt to get the CRU databases in working order?!!”

“As far as I can see, this renders the [weather] station counts totally meaningless”

“COBAR AIRPORT AWS (data from an Australian weather station) cannot start in 1962, it didn’t open until 1993!”

“What the hell is supposed to happen here? Oh yeah – there is no ‘supposed,’ I can make it up. So I have :-)”

“I’m hitting yet another problem that’s based on the hopeless state of our databases. There is no uniform data integrity, it’s just a catalogue of issues that continues to grow as they’re found.”

“Harry,” the programmer who wrote the file, was in charge of “quality control” at the time.

Relying on such haphazard data violates the Scientific Method.




This is the actual temperature record relied on by the IPCC and by climate scientists who claim to know how much the mean global surface temperature has changed since 1850.

The warming since 1850, if it occurred at all, is meaningful only if it exceeds natural variability.

Proxy temperature records from the Greenland ice core for the past 10,000 years demonstrate a natural range of warming and cooling rates between +2.5°C and −2.5°C per century, significantly greater than the rates measured for Greenland, or the globe, during the twentieth century.

The ice cores also show repeated “Dansgaard–Oeschger” events, when air temperatures rose at rates of about 10°C per century.

There have been about 20 such warming events in the past 80,000 years.




In its Fifth Assessment Report, the IPCC admits the global mean surface temperature stopped rising from 1997 to 2010, reporting an increase of only +0.07°C for that period.

This “pause” extended 18 years before being interrupted by the major El Niño events (Pacific Ocean heat releases) of 2010–2012 and 2015–2016.

During “the pause,” humans released approximately one-third of all the greenhouse gases emitted since the beginning of the Industrial Revolution.

If CO2 concentrations drive global temperatures, their impact should have been visible during this period.

Temperatures quickly fell after each El Niño event, though not to previous levels.



Weather satellite temperature data cover almost the entire planet, something that surface-based temperature stations cannot do.

They are also free from human influences other than greenhouse gas emissions, influences such as changes in land use, urbanization, farming, and land clearing.

Despite the known deficiencies, the IPCC and many government agencies and environmental advocacy groups continue to rely on the surface station temperature record.

The satellite record shows only +0.07°C to +0.13°C per decade of warming since 1979, while the surface station record for approximately the same period (ending in 2016, rather than in 2017) shows +0.16°C to +0.19°C per decade, about 40% higher.
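
For readers unfamiliar with what a “°C per decade” figure means in practice: such trends are typically the slope of a straight-line (least-squares) fit to an annual anomaly series, scaled to ten years. The Python sketch below illustrates the calculation on a purely hypothetical series; the synthetic numbers are assumptions for illustration, not the satellite or surface data themselves.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical annual anomalies for 1979-2016 (°C), invented for
# illustration; not the actual satellite or surface record.
years = np.arange(1979, 2017)
anomalies = 0.012 * (years - 1979) + rng.normal(0.0, 0.12, years.size)

# A "per decade" warming rate is the least-squares slope (°C per year)
# scaled by ten.
slope_per_year, intercept = np.polyfit(years, anomalies, 1)
print(f"trend: {slope_per_year * 10:+.2f} °C per decade")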

Geologists point out how short a period forty years, or even a century, is when studying climate.

Even a century’s worth of data would be a mere 1% of the 10,000 years of the Holocene Epoch.




Earth does not have just one temperature.

It is not in global thermodynamic equilibrium – neither within itself nor with its surroundings.

There is no physically meaningful global temperature for the Earth: any method of deriving a global average temperature must be arbitrary, since there is an infinite number of ways it could be calculated from the available data.

Weather stations are not evenly distributed around the world.

Efforts to manipulate and “homogenize” divergent datasets, fill in missing data, remove outliers, and compensate for changes in sampling technology are all opportunities for subjective decision-making, and for deliberate deception as well.
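
To make the point about averaging choices concrete: even before any homogenization, simply deciding how to weight unevenly distributed stations changes the answer. The Python sketch below uses a small set of hypothetical stations (the latitudes and anomalies are invented for illustration) and contrasts a plain station mean with an area-weighted mean; two equally defensible rules give different “global” values.

import numpy as np

# Hypothetical station latitudes (degrees) and anomalies (°C);
# mid-latitude stations are deliberately over-represented.
lat = np.array([70.0, 60.0, 52.0, 51.0, 50.0, 48.0,
                45.0, 40.0, 10.0, -5.0, -30.0, -60.0])
anom = np.array([1.4, 1.1, 0.9, 0.8, 0.9, 0.7,
                 0.6, 0.5, 0.3, 0.2, 0.4, 0.6])

# Rule 1: every station counts equally.
simple_mean = anom.mean()

# Rule 2: weight each station by the area it roughly represents
# (proportional to the cosine of its latitude).
weights = np.cos(np.radians(lat))
weighted_mean = np.average(anom, weights=weights)

print(f"simple mean:   {simple_mean:+.2f} °C")
print(f"area-weighted: {weighted_mean:+.2f} °C")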



Governments need to assign numbers to the things they seek to regulate or tax.

Saying the world has a single temperature – 14.9°C (58.82°F) in 2017, according to NASA’s Goddard Institute for Space Studies – violates many of the principles of the Scientific Method, but it fills a need expressed by government officials at the United Nations and in many world capitals.