
Monday, October 28, 2019

Escape From Model-land -- quotes from a very interesting study

NOTE:
Imagine that I 
started the year
with a very confident
computer model
global warming
prediction
of +0.3 degrees C.

And at the end 
of that year, 
actual warming
turned out to be
only +0.1 degree C.

Would you still trust
my computer model 
prediction the next year?

Maybe?

How about 
30 consecutive
years of predicting 
+0.3 degrees C.
of global warming,
with actual warming
averaging about 
+0.1 degrees C.?

I hope it wouldn't 
take more than 
a few years of me
over-predicting
global warming,
before you
lost interest in 
my computer 
model!
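Just to put my made-up numbers in one place, here is a tiny Python sketch of the comparison I have in mind. The +0.3 and +0.1 figures are the hypothetical values from the note above, not real measurements.

# Hypothetical numbers from the note above, not real climate data:
# the model predicts +0.3 degrees C. of warming every year,
# while actual warming averages +0.1 degrees C. per year.
predicted = [0.3] * 30
observed = [0.1] * 30

errors = [p - o for p, o in zip(predicted, observed)]
mean_error = sum(errors) / len(errors)

print(f"Average over-prediction: {mean_error:+.2f} degrees C. per year")
print(f"Total over-prediction after 30 years: {sum(errors):+.1f} degrees C.")

A constant bias like that is easy to see once you line up prediction and observation year after year.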

Unfortunately,
the belief in a 
"coming climate crisis"
is "supported" by
a surprisingly small
number of laboratory 
experiments, and 
30 years of wrong 
computer model
predictions!

Yet gullible people,
especially leftists,
just keep believing!




A recent English study 
that I read on the use 
of computer models 
was a little complicated,
16 pages long, and not 
directly about climate
models ... but it made 
many good points 
about computer models 
in general, such as this one:  
"... we must remain careful 
to distinguish model-land 
and model-land quantities 
from the real world."




My summary of the study:

Erica L. Thompson, 
London School of Economics 
and Political Science, 
Centre for the Analysis 
of Time Series, and 

Leonard A. Smith, 
Mathematical Institute, 
University of Oxford, UK

Citation:
"Escape From Model-land" (2019),
Economics: The Open-Access, 
Open-Assessment E-Journal, 
published October 8, 2019




Quotes from the study abstract:
" ... we must remain careful to distinguish model-land and model-land quantities from the real world."


"The authors present a short guide to some of the temptations and pitfalls of model-land, some directions towards the exit, and two ways to escape."



Quotes from the study:
" ... the outputs of these models are again used to inform real-world decision making, sometimes in public view, sometimes not."


"Decision-support in model-land implies taking the output of model simulations at face value ... and then interpreting frequencies in model-land to represent probabilities in the real-world."


"We have found remarkably similar challenges to good model-based decision support in macroeconomics, energy demand, fluid dynamics, hurricane formation, life boat operations, nuclear stewardship, weather forecasting, climate calculators, and sustainable governance of reindeer hunting."


"Our aim is a decision-making process that remains acceptable to all involved regardless of the outcome ... This cannot be accomplished in model-land."


" ... when writing about forecasts of household consumption, energy prices, or global average surface temperature, many authors will use the same name and the same phrasing to refer to effects seen in the simulation, as those used for the real world."


"Within model-land, we cannot even enunciate the possibility of a “Big Surprise”, let alone think about the probability of such an event occurring."


"Yet the possibilities remain of economic surprises, previously-unseen weather events, energy price spikes, or worse-than-expected climate impacts, even where these are not simulated by today’s models."


"Such events, in fact, happen disturbingly often."


"It is comfortable for researchers to remain in model-land as far as possible, since within model-land everything is well-defined, our statistical methods are all valid, and we can prove and utilise theorems (Judd and Smith, 2001)."


"Exploring the furthest reaches of model-land in fact is a very productive career strategy, since it is limited only by the available computational resource."


"Until the outcome is known, the ultimate arbiter must be expert judgement, as a model is always blind to things it does not contain, and thus may experience Big Surprises."


"A complex model is not an end in itself, however, but a stand-in for a complex real-world system such as the Earth’s atmosphere, the economy, or the energy system, then we can say with confidence that our model is not perfect (Smith, 2002)."


"It is sometimes suggested that if a model is only slightly wrong, then its outputs will correspondingly be only slightly wrong."


"The Butterfly Effect (Lorenz, 1963) revealed that in deterministic nonlinear dynamical systems, a “slightly wrong” initial condition can yield wildly wrong outputs." 


"When the justification of the research is to inform some real-world time-sensitive decision, merely employing the best available model can undermine (and has undermined) the notion of the science-based support of decision making ...  .  (Smith, 2002; Frigg et al, 2015; Smith and Petersen, 2014; Beven, 2019; Beven, 2019b)."


"It is helpful to recognise a few critical distinctions regarding pathways out of model-land and back to reality. "


"Is the model used simply the “best available” at the present time, or is it arguably adequate for the specific purpose of interest?"


"How would adequacy for purpose be assessed, and what would it look like?"


"Are you working with a weather-like task, where adequacy for purpose can more or less be quantified, or a climate-like task, where relevant forecasts cannot be evaluated fully? "


"Mayo (1996) argues that severe testing is required to build confidence in a model (or theory); we agree this is an excellent method for developing confidence in weather-like tasks, but is it possible to construct severe tests for extrapolation (climate-like) tasks?"


" ... the most recent IPCC climate change assessment uses an expert judgement that there is only approximately a 2/3 chance that the actual outcome of global average temperatures in 2100 will fall into the central 90% confidence interval generated by climate models 
(IPCC, 2013 – see footnotes c and d to table SPM.2 on page 23 of the Summary for Policymakers)". 


"It is worth noting here that presenting model output at face value as if it were a prediction of the real-world, or interpreting simulation frequencies as real-world probabilities, is equivalent to making an implicit expert judgement that the model structure is perfect. The IPCC does not make this claim."



"You may be living 
in model-land if you:
try to optimise anything 
regarding the future; 

believe that decision-relevant 
probabilities can be extracted 
from models;

believe that there are 
precise parameter values to be found; 

refuse to believe in anything 
that has not been seen in the model; 

think that learning more 
will reduce the uncertainty in a forecast; 

explicitly or implicitly set the 
probability of a Big Surprise to zero;

[believe] that there is nothing 
your model cannot simulate;

want “one model to rule them all”;

treat any failure, no matter how large, 
as a call for further extension to the
existing modeling strategy."