Researchers Claim “Climate Change” Causes Currency Vulnerability: An Instance Of Forgotten Uncertainties

Guest Perspective by Dr. William M. Briggs
Exclusive to The MacIver Institute

 

We know all about models by now, do we not, dear readers? A model, also known as a theory, can be made about any contingent thing. Say, your 401(k), or the fate of a currency, or the weather, or the growth of academics researching “climate change”.

Because the model is of something contingent, also known as an observable, and because of things like limitations in data and, even more important, limitations in thought, there will always be, or at least should always be, uncertainty in the model’s predictions.

This is in no way a limitation or a fault. It is the nature of models.

But what kind of mistake would you be making if you took the output of your model and input it into another stripped of all uncertainty?

And what kind of mistake would the owner of the second model be making if he supplied his output, also stripped of uncertainty, to the creator of a third model?

It’s a trick question. The answer is: a dumb mistake.

We discussed this before under the label multiplication of uncertainties, but maybe a better title is forgotten uncertainties.

Making a Bad Model

The idea is simple.

Model A says the chance of temperatures exceeding some high, critical value (at some place, etc.) is, being generous, 50%.

Model B says the chance of some high-temperature-dependent thing, like the production of a crop falling given that heat, is 60%.

Model C says the chance the GDP falls by some amount, if the crop is low, is 80%.

It would be a tremendous blunder to announce “Eighty percent chance the GDP falls by 6 basis points when climate change hits”, because the uncertainties have not been multiplied as they should have been.

In our example, the proper phrase is this: “The chance the temperature exceeds some value, and the crop production falls, and the GDP falls is 24%”, because 0.5 x 0.6 x 0.8 = 0.24.

What seemed almost certain with one headline has, with proper accounting for the full uncertainty, dropped to something not especially likely.
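For the curious, here is the whole calculation spelled out in a few lines of Python, using the same made-up numbers and reading the 60% and 80% as conditional on the step before. Nothing here comes from anybody’s actual models; it is only the arithmetic made explicit.

```python
# A minimal sketch of the chained uncertainty, using the illustrative numbers above.
# The 60% and 80% are read as conditional on the previous step having happened.

p_hot      = 0.5   # Model A: temperature exceeds the critical value
p_crop_hot = 0.6   # Model B: crop production falls, given that heat
p_gdp_crop = 0.8   # Model C: GDP falls, given the low crop

# Chance that all three happen together
p_all = p_hot * p_crop_hot * p_gdp_crop
print(f"Chance of the full chain: {p_all:.2f}")   # 0.24, not 0.80
```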

I was most generous with these probabilities, making the best case for those who tout “climate change”. In real models the complexity is much greater, the string of models and assumptions (assumptions are models) longer, and the resultant full uncertainty much greater.

So great is the uncertainty that it’s difficult to get worked up over any announcement of doom.

Notre Dame’s Global Adaptation Initiative

Let’s keep this in mind as we look at the peer-reviewed paper “Climate Change Vulnerability and Currency Returns” by Alexander Cheema-Fox and two others, in the Financial Analysts Journal.

They begin by stating only bad things that can happen with “climate change”. The good is forgotten, or never considered. On the other hand, try getting a paper published that says “Look at all these terrific things climate change will bring us.” I won’t wait up for you.

Incidentally, I put the scare quotes around “climate change” to indicate I haven’t any clear idea what they mean by the phrase. And neither do the authors. To them, as to many, it is a scare word, a bad thing, something to be avoided. Precision in the term is not only missing, but unwanted.

Our authors “use data from the Notre Dame Global Adaptation Initiative (ND-GAIN Index) that measures a country’s vulnerability to climate change.” This is a model, and though I won’t prove it here, an over-certain one. (Our hunger for over-certain single-number summaries of vast complexities is not limited to this field.)

Norway (at this writing) is given a 75.4. Not 75.3, or 75.5: 75.4. Denmark is 71.0.

This one single precise number summarizes a country’s “climate change” vulnerability and readiness in terms of: (1) exposure, (2) sensitivity, (3) adaptive capacity, (4) economic readiness, (5) governance readiness, and (6) social readiness. Of the first three, our authors say the index “considers six life-supporting sectors including food, water, health, ecosystem services, human habitat, and infrastructure.”
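To see how much a single number can hide, consider a toy sketch. The component scores below are invented, and the real ND-GAIN weighting is surely more elaborate than a plain average; the point is only that very different profiles can collapse into the same tidy index, with none of the component uncertainty carried along.

```python
# Illustrative only: invented scores and a hypothetical equal-weight average,
# not the actual ND-GAIN methodology.
components = ["exposure", "sensitivity", "adaptive capacity",
              "economic readiness", "governance readiness", "social readiness"]

country_a = [90, 40, 80, 60, 85, 95]   # made-up component scores (0-100)
country_b = [75, 75, 75, 75, 75, 75]   # made-up component scores (0-100)

index_a = sum(country_a) / len(country_a)
index_b = sum(country_b) / len(country_b)
print(index_a, index_b)   # both 75.0: one number, two very different countries
```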

One perfectly certain number to summarize an entire culture and what it might do in an unknown future once “climate change” hits. What can it mean? The question is rhetorical. The index should be laughed at. Be sure, though, that you see that the first hidden model is that of “climate change” itself. We nowhere see it, and it is somehow taken as certain.

Harvard’s Climate Change Study

Back to the authors of “Climate Change Vulnerability and Currency Returns,” who use the time-changing ND-GAIN indexes (or rather a form of them) to study currencies. Basically, they put these indexes into a third model, a regression, along with currencies from around the world (separated into G10 and non-G10 countries).

They then announce what happened to the parameters of that regression model, as if the parameters were the observables themselves, when what they should have done is “integrate out” the uncertainty in the parameters and state the results in terms of uncertainty in the observables, i.e., the currencies.

In their favor, most researchers commit this error, not knowing better. But because certainty in the parameters is necessarily always greater than certainty in the observable, they end up far too certain.
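Here is a toy illustration of that gap, with simulated data that has nothing to do with their paper. The fitted slope of a regression (the parameter) can be pinned down quite precisely while the observable it is supposed to describe stays wildly uncertain:

```python
import numpy as np

# Simulated stand-ins, not the authors' data: x plays the role of a vulnerability
# trend, y the role of a currency return.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = 0.3 * x + rng.normal(scale=1.0, size=n)

# Bootstrap spread of the estimated slope (parameter uncertainty)
slopes = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)                 # resample with replacement
    slopes.append(np.polyfit(x[idx], y[idx], 1)[0])  # refit, keep the slope
print("spread of the slope estimate:", round(float(np.std(slopes)), 3))   # small; shrinks as n grows

# Spread of the observable itself near a typical x (predictive uncertainty)
print("spread of y near x = 0:", round(float(np.std(y[np.abs(x) < 0.25])), 3))  # about 1; does not shrink
```

Announce only the first number and you sound impressively sure; it is the second number that tells you how uncertain the currencies themselves remain.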

And that is not considering the multiplication of uncertainties, which appears nowhere in their paper. Even if they put their model in terms of observables, we still have to consider, at least, the “climate change” models, all the models (assumptions) that go into the ND-GAIN indexes, and the regression model.

Plus, I’m leaving out a lot of detail about their models’ complexities (they fit several models). The Notre Dame index “vulnerability” is a function of “Log GDP per capita”, “Price level”, “Cumulative current account”, “Debt-to-GDP”, “Unemployment”, “Industrial production”, “Retail sales”, some of which are known, and some of which are themselves uncertain. Yet all are used as if they were certain.

Yet they conclude things like “we find trends in climate vulnerability predict currency returns.” There is no indication they are anything but certain about that conclusion, an impossibility.

If there is any predictive value to this model it comes not from “climate change” certainty, but because currencies change in time in modestly predictable ways (such as knowing whether a currency belongs to a G10 country). In essence, they have re-discovered time series modeling. Of course, time series models say nothing about cause, though they are often mistakenly thought to. That is a story for another day.
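A toy example of that kind of predictability, using a made-up persistent series rather than real currency data: the lagged value “predicts” the next one tolerably well, and the correlation tells you nothing about why.

```python
import numpy as np

# A made-up persistent (AR(1)-style) series, not real currency data.
rng = np.random.default_rng(1)
T = 1000
r = np.zeros(T)
for t in range(1, T):
    r[t] = 0.6 * r[t - 1] + rng.normal(scale=0.5)   # persistence plus noise

lagged, current = r[:-1], r[1:]
print("lag-1 correlation:", round(float(np.corrcoef(lagged, current)[0, 1]), 2))   # about 0.6
```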

 

Dr. William M. Briggs has a Ph.D. in Mathematical Statistics and an MS in Atmospheric Physics. Briggs describes himself as a “Data Philosopher, Epistemologist, Probability Puzzler, Unmasker of Over-Certainty”, and statistician to the stars. As you will see, Briggs has a healthy skepticism of the supposedly ironclad findings that modern science purports to find seemingly every week.