HadCRUT4 Global
Temperature Data,
from 1850 to 2018.
HadCRUT4 is a global
surface temperature
dataset.
Here's what
I've heard
"HadCRUT4"
stands for:
Highly
Unconvincing
Data
Confirming
Unfeasible
Theory
4
The HadCRUT4 dataset
is a joint production
of the UK Met Office’s
Hadley Centre and the
Climatic Research Unit
of the University
of East Anglia.
Just like the surface data
compiled in the US,
it shows
more warming
than alternative
measurement
methodologies --
weather satellites
and weather balloons
(which closely agree
with each other,
and are completely ignored
by the government bureaucrats
with science degrees).
HadCRUT4 provides
temperature anomalies,
versus a base period,
for surface "grids"
(5 degree latitude
by 5 degree longitude
surface areas)
across the world,
averages for
the two hemispheres,
and an average
global temperature.
CRUTEM4 and HadSST3
are the land and ocean
components of HadCRUT4.
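For readers who want to see
how a gridded product like this
gets turned into hemispheric
and global averages,
here is a minimal sketch in Python.
The grid values are made up,
and the cosine-of-latitude weighting
is just the generic textbook approach --
not necessarily the exact
HadCRUT4 recipe.

```python
import numpy as np

# Hypothetical 5-degree grid of monthly temperature anomalies
# (degrees C versus a base period). NaN marks empty grid cells.
lats = np.arange(-87.5, 90, 5.0)    # 36 latitude bands
lons = np.arange(-177.5, 180, 5.0)  # 72 longitude bands
rng = np.random.default_rng(0)
anom = rng.normal(0.0, 1.0, size=(lats.size, lons.size))
anom[rng.random(anom.shape) < 0.4] = np.nan  # pretend 40% of cells are empty

# Weight each cell by the cosine of its latitude
# (cells cover less area toward the poles).
weights = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(anom)
valid = ~np.isnan(anom)

def weighted_mean(mask):
    """Area-weighted mean anomaly over the cells in `mask` that have data."""
    sel = mask & valid
    return np.sum(anom[sel] * weights[sel]) / np.sum(weights[sel])

nh = (lats[:, None] * np.ones_like(anom)) > 0  # Northern Hemisphere cells
print("NH anomaly:    ", round(weighted_mean(nh), 3))
print("SH anomaly:    ", round(weighted_mean(~nh), 3))
print("Global anomaly:", round(weighted_mean(np.ones_like(anom, dtype=bool)), 3))
```

Note that empty cells
simply drop out of the average,
so the fewer cells with data,
the more the "global" number
rests on whatever is left.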
Dr John McLean
has now shown
the surface data
are haphazard,
not worthy of
real scientists ...
( as I have written
so many times
in this blog ),
... with freak data,
empty grid cells (blank fields),
Fahrenheit temps
recorded as Celsius,
mistakes in longitude
and latitude, and
brutal "adjustments" !
Temperature data
used by climate models
have more than 70
different problems.
The most basic quality control
checks have not been done.
Some examples
of OBVIOUS errors
coming in from the
national meteorological
services
(I don't know how many of these local errors,
if any, are caught and fixed before compilation
of the HadCRUT4 average global temperature):
(1)
One town in Colombia
spent three months
in 1978, at an average
daily temperature
of over 80 degrees C.,
which is unbelievably hot !
( Celsius, not Fahrenheit ! )
For April, June and July
of 1978 Apto Uto
(Colombia, ID:800890)
had an average
monthly temperature
of 81.5° C., 83.4° C.
and 83.4° C., respectively.
Obviously wrong !
(2)
One town in Romania
stepped out from summer
in 1953, straight into a month
of Spring at -46° C. !
The monthly
mean temperature
in September 1953
at Paltinis, Romania
is reported
as -46.4° C.
(in other years
the September average
was about 11.5° C.).
Obviously wrong !
These are supposed to be
“average” temperatures
for a full month !
(3)
St Kitts, a Caribbean island,
was recorded at 0° C.
for a whole month ... twice !
At Golden Rock Airport,
on the island of St Kitts
in the Caribbean,
mean monthly temperatures
for December in 1981 and 1984
are reported as 0.0° C.
Obviously wrong !
But from 1971 to 1990
the average
in all the other years
was 26.0° C.
(4)
Boats whose location
was dry land (?)
allegedly recorded
ocean temperatures
from as far as
100 kilometers inland !
Obviously wrong !
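Errors like the four above
would be caught by even
the crudest quality control.
Here is a minimal sketch
of such a check, in Python,
using invented station histories
and thresholds picked purely
for illustration --
not anything prescribed
by McLean's audit.

```python
from statistics import mean

# Hypothetical monthly-mean records: station -> list of (year, month, deg C).
# The suspect values echo the examples above; the rest are invented.
records = {
    "Apto Uto, Colombia": [(1977, 6, 27.1), (1978, 6, 83.4), (1979, 6, 27.4)],
    "Paltinis, Romania": [(1952, 9, 11.3), (1953, 9, -46.4), (1954, 9, 11.8)],
    "Golden Rock Airport, St Kitts": [(1980, 12, 26.1), (1981, 12, 0.0), (1982, 12, 26.2)],
}

ABS_MIN, ABS_MAX = -60.0, 45.0   # crude global limits for a monthly mean
MAX_DEPARTURE = 15.0             # max departure from the station's own average

def flag_suspect(history):
    """Flag values outside absolute limits or far from the station's own mean."""
    station_mean = mean(v for _, _, v in history)
    flags = []
    for year, month, value in history:
        if not (ABS_MIN <= value <= ABS_MAX):
            flags.append((year, month, value, "outside absolute limits"))
        elif abs(value - station_mean) > MAX_DEPARTURE:
            flags.append((year, month, value, "far from station average"))
    return flags

for station, history in records.items():
    for year, month, value, reason in flag_suspect(history):
        print(f"SUSPECT: {station} {year}-{month:02d} = {value} C ({reason})")
```

A few lines of code flag
all three of the cases above,
which is the point:
these are not subtle errors.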
Dr. McLean
audited
the HadCRUT4
global data
from 1850 onwards
for his PhD thesis,
and then continued
afterwards,
until his audit
was complete:
“I was aghast to find
that nothing was done
to remove absurd values
… the whole approach
to the dataset’s
creation is careless
and amateur,
about the standard
of a first-year
university student.”
– John McLean
McLean's supervisor
was Peter Ridd,
famously fired
for saying that
“the science was not
being checked,
tested or replicated”,
and for suggesting
we might not be able
to trust our institutions.
Data are
incredibly sparse:
For two years the entire
Southern Hemisphere
temperature was estimated
from one land-based site
in Indonesia,
and some ship data.
We didn’t get
50% global coverage
until 1906.
We didn’t consistently get
50% Southern Hemisphere
coverage until about 1950.
McLean’s findings:
Almost no quality control
on these data.
Countries include
“Venezuala”,
“Hawaai”, and
the “Republic of K”
( also known as
South Korea ).
One country is
“Unknown”
while other "countries"
are not even countries,
such as “Alaska”.
Yet government bureaucrats
“sell” their temperature trends
as if they are accurate !
Climatic Research Unit data
covers 10,295 stations,
but 2,693 of them
( over 26% )
don’t meet the criteria
for inclusion, described
in Jones et al 2012,
the best description
of what should and
shouldn’t be included.
It is impossible to know
exactly which sites
are included in the final
temperature analysis,
and whether
a site’s records
have been adjusted.
The sub-parts of the datasets
contradict each other.
The land dataset
and the sea dataset
should combine to
equal the global set,
but they don’t
always match.
Which one is right?
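For what it's worth, here is
a minimal sketch of how land and sea
anomalies might be blended
in a single grid cell,
assuming a simple land-fraction weighting.
That weighting is my own
illustrative assumption,
not the documented HadCRUT4 method,
but it shows why the land number,
the sea number, and the combined number
can all be different.

```python
def blend_cell(land_anom, sea_anom, land_fraction):
    """Blend land and sea anomalies for one grid cell.

    A generic land-fraction weighting is assumed here purely for
    illustration; it is not necessarily the HadCRUT4 procedure.
    `None` marks a component with no data in that cell.
    """
    if land_anom is None and sea_anom is None:
        return None
    if land_anom is None:
        return sea_anom
    if sea_anom is None:
        return land_anom
    return land_fraction * land_anom + (1.0 - land_fraction) * sea_anom

# A coastal cell where the two components disagree:
land, sea = 0.9, 0.3
print(blend_cell(land, sea, land_fraction=0.4))  # 0.54 -- neither component's value
```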
In probably the worst
systematic error,
the past is rewritten,
"cooled", in a
haphazard attempt
to correct for site moves.
While some corrections
are necessary,
these adjustments
are brutally sweeping.
As the site “ages”
over time,
buildings and roads
get built nearby,
and sometimes
air conditioners too,
all of them
artificially warming
the nearby
weather station.
So a replacement thermometer
is placed in a nearby open location,
surrounded by grass.
Usually each separate
national meteorology center
compares both sites
for a while, and figures out
the temperature difference
between them.
Then they adjust the readings
from the old locations
down to match the new ones.
But, when a thermometer
is relocated to a new site,
the adjustments assume
that the old station site
was always being “heated”
by nearby concrete and buildings.
In reality, the artificial warming
from buildings and concrete
probably crept in slowly
with economic growth.
"Correcting" (cooling)
1880 to 1950 data,
for example,
for the extra heat
of a nearby building
built in 1950,
causes the old
temperature records,
from 1880 through 1949,
to be artificially cooled.
If you cool the past,
and not the current year,
the trend of global warming
increases,
even if all the adjustments
were "cooling adjustments" !
When the past is rewritten ,
to be colder than it really was,
that makes global warming
look faster than it really was.
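To make that arithmetic concrete,
here is a minimal sketch in Python
with an invented station record:
a single flat "cooling correction"
applied to every pre-1950 value
steepens the fitted warming trend,
even though every adjustment
cooled the data.

```python
import numpy as np

# Invented annual means for one hypothetical station, 1880-2018:
# a gentle 0.5 C/century underlying warming plus some noise.
years = np.arange(1880, 2019)
rng = np.random.default_rng(1)
raw = 14.0 + 0.005 * (years - 1880) + rng.normal(0.0, 0.2, years.size)

# A site-move "correction": cool every year before 1950 by a flat 0.3 C,
# as if nearby buildings had heated the old site for its entire history.
adjusted = raw.copy()
adjusted[years < 1950] -= 0.3

def trend_per_century(series):
    """Least-squares linear trend, in degrees C per century."""
    slope = np.polyfit(years, series, 1)[0]
    return 100.0 * slope

print("Raw trend:      %.2f C/century" % trend_per_century(raw))
print("Adjusted trend: %.2f C/century" % trend_per_century(adjusted))
# Every individual adjustment cooled the record,
# yet the fitted warming trend got steeper.
```

The numbers are made up,
but the effect is purely mechanical:
lowering only the early part of a series
tilts the fitted trend line upward.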
Thousands of
men and women
trudged through
snow, rain and mud
to take temperatures that
a computer “corrected”
a century later.
In Australia
data adjustments
increased
the warming trend
by as much as 40%.
Ken Stewart found
some adjustments
to old historic data
in Australia
reduced the
earliest temperatures
by as much as
2 degrees C. !
Each national
meteorological bureau
supplies “pre-adjusted”
national data.
The Hadley Centre
just accepts them
with no audits,
and no checking.
This systematic error
creates an artificial
warming trend
from incorrect
"adjustments".
In May 1861
the global coverage,
according to the
grid-system method
that HadCRUT4 uses,
was only 12% !
That means in 1861,
there were no data for 88%
of the Earth’s surface,
yet it was used
for a “global average”.
“Until 1906
global coverage
was less than 50%
and coverage
didn’t hit 75%
until 1956”
– John McLean
Note that "coverage"
only means the grid
has at least
one thermometer
-- it does not mean
daily readings
were made
every day of a month,
or even every day
of a week.
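Here is a minimal sketch
of what that "coverage" number means:
the share of 5 x 5 degree cells
with at least one observation
in the month, with or without
weighting by cell area.
The observation mask is invented.

```python
import numpy as np

# Invented mask of 5-degree cells: True = at least one observation this month.
lats = np.arange(-87.5, 90, 5.0)
lons = np.arange(-177.5, 180, 5.0)
rng = np.random.default_rng(2)
has_obs = rng.random((lats.size, lons.size)) < 0.12   # sparse, 1861-style

# Approximate cell area by the cosine of latitude.
cell_area = np.cos(np.deg2rad(lats))[:, None] * np.ones((lats.size, lons.size))

raw_coverage = has_obs.mean()                               # share of cells
area_coverage = cell_area[has_obs].sum() / cell_area.sum()  # share of surface area

print("Cells with any data:    %.0f%%" % (100 * raw_coverage))
print("Surface area 'covered': %.0f%%" % (100 * area_coverage))
# "Covered" says nothing about how many days of the month were observed.
```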
If you look at
how much daily
temperature data
are provided,
even today in 2018,
a majority of grids
in a typical month
have no thermometers
(so obviously no data),
or the grid has
incomplete data.
That means
the monthly
average
temperature
for most grids
requires
wild guessing
by government
bureaucrats,
whose wild guesses
can never be verified,
or falsified !
There are large areas
in central Africa,
which NASA (U.S.) claims
are having record heat,
yet there are
no thermometers there,
meaning NASA
just wild guessed
the temperatures there,
and declared
their guesses
to be a "record" !
In 1850 and 1851,
official data for the
Southern Hemisphere
only included one
sole thermometer
in Indonesia,
and some
random ship data
that covered
only about 15%
of the oceans
in the southern half
of the globe,
( and “covers”
may mean
as little as
one measurement
in one month
in a grid cell,
though it is
usually more. )
Sometimes
there are national data
that could have been used,
but were not.
On February 6, 1851,
Australian
newspaper archives
show temperatures
in the shade
hit 117 F. in Melbourne
(that’s 47 C.),
115 F. in Warrnambool,
and 114 F. in Geelong.
Really hot !
The Australian
Bureau of Meteorology
argued these
were not standard
officially sited
thermometers,
and didn't use the data.
But surely those
unofficial
temperatures
would have been
more useful
than the data
from the one
"officlal"
1851 thermometer
located in Indonesia,
5,000 to 10,000 kilometers
away?
While the Hadley dataset
is not explicitly estimating
the 1851 Melbourne temperature,
they are estimating
the average temperature of
“the Southern Hemisphere”
and “The Globe”
( and Melbourne
is a part of that. )
This article was based
on an excellent summary
of the McLean report at
www.JoNova.com
... because this cheapskate
did not buy the 135-page audit
with more than 70 findings
that was being sold for $8
from Robert Boyle Publishing.