The Heart of Climategate–the “fudge factor”

by Julie Walsh on November 25, 2009


You can download the entire zip file of the uncovered emails and documents here, but it is 61 MB of text. The searchable files are here. A good list of the worst emails discovered so far is here. Here’s a very funny video on the whole mess. But here is the best article, by Marc Sheppard, showing that all the data, and everything built on it (Kyoto and the IPCC reports), have to be thrown out. The data has been massaged, corrupted, manipulated, and abused beyond recognition:

One can only imagine the angst suffered daily by the co-conspirators, who knew full well that the “Documents” sub-folder of the CRU FOI2009 file contained more than enough probative program source code to unmask CRU’s phantom methodology.

In fact, there are hundreds of IDL and FORTRAN source files buried in dozens of subordinate sub-folders. And many do properly analyze and chart maximum latewood density (MXD), the growth parameter commonly utilized by CRU scientists as a temperature proxy, from raw or legitimately normalized data. Ah, but many do so much more.

Skimming through the often spaghetti-like code, the number of programs that subject the data to a mixed bag of transformative and filtering routines is simply staggering. Granted, many of these “alterations” run from benign smoothing algorithms (e.g., omitting rogue outliers) to moderate infilling mechanisms (e.g., estimating missing station data from that of closely surrounding stations). But many others fall into the precarious range from highly questionable (removing MXD data that demonstrate poor correlations with local temperature) to downright fraudulent (replacing MXD data entirely with measured data to reverse a disorderly trend-line).
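To make the benign end of that spectrum concrete, here is a minimal IDL sketch of an outlier filter and a neighbor-based infill. The routine names, the threshold, and the NaN missing-value convention are purely illustrative assumptions, not taken from the CRU files:

; Illustrative only: names, threshold, and missing-value convention are hypothetical.
function filter_outliers, x, thresh
  ; replace points more than thresh standard deviations from the series mean
  bad = where(abs(x - mean(x)) gt thresh * stddev(x), nbad)
  if nbad gt 0 then x[bad] = mean(x)
  return, x
end

function infill_station, station, neighbors
  ; estimate missing readings (NaN) from the average of surrounding stations
  miss = where(finite(station) eq 0, nmiss)
  if nmiss gt 0 then $
    station[miss] = total(neighbors[miss, *], 2) / n_elements(neighbors[0, *])
  return, station
end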

In fact, workarounds for the post-1960 “divergence problem”, as described by both RealClimate and Climate Audit, can be found throughout the source code. So much so that perhaps the most ubiquitous programmer’s comment (REM) I ran across warns that the particular module “Uses ‘corrected’ MXD – but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”

What exactly is meant by “corrected” MXD, you ask? Outstanding question — and the answer appears amorphous from program to program. Indeed, while some employ one or two of the aforementioned “corrections,” others throw everything but the kitchen sink at the raw data prior to output.

For instance, in subfolder “osborn-tree6mannoldprog” there’s a program (Calibrate_mxd.pro) that calibrates the MXD data against available local instrumental summer (growing-season) temperatures between 1911 and 1990, then merges that data into a new file. That file is then digested and further modified by another program (Pl_calibmxd1.pro), which creates calibration statistics for the MXD against the stored temperature and “estimates” (infills) figures where such temperature readings were not available. The file created by that program is modified once again by Pl_Decline.pro, which “corrects it” (as described by the author) by identifying and “artificially” removing “the decline.”
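For the first step, a calibration of this sort typically reduces to a least-squares fit of proxy against instrumental temperature over the overlap window. A minimal sketch, with hypothetical variable names throughout (the real Calibrate_mxd.pro is more elaborate than this):

; Hypothetical names; illustrates only the general shape of the calibration step.
cal = where(year ge 1911 and year le 1990)   ; the 1911-1990 calibration window
coeff = linfit(mxd[cal], temp[cal])          ; least-squares fit: temp = a + b*mxd
temp_est = coeff[0] + coeff[1] * mxd         ; temperature "estimated" from MXD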

But oddly enough, the series doesn’t begin its “decline adjustment” in 1960, the supposed year of the enigmatic “divergence.” In fact, all data between 1930 and 1994 are subject to “correction.”

And such games are by no means unique to the folder attributed to Michael Mann.

A Clear and Present Rearranger

In two other programs, briffa_Sep98_d.pro and briffa_Sep98_e.pro, the “correction” is bolder by far. The programmer (Keith Briffa?) titled the “adjustment” routine “Apply a VERY ARTIFICAL correction for decline!!” And he/she wasn’t kidding. Now, IDL is not a native language of mine, but its syntax is similar enough to others I’m familiar with, so please bear with me while I get a tad techie on you.

Here’s the “fudge factor” (notice the brash SOB actually called it that in his REM statement):

yrloc=[1400,findgen(19)*5.+1904]

valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75 ; fudge factor

These two lines of code establish a 20-element array (yrloc) comprising the year 1400 (a base year, though it’s not clear why it’s needed here) and 19 years between 1904 and 1994 in half-decade increments. Then the corresponding “fudge factor” (from the valadj array, scaled by 0.75) is applied to each interval. As you can see, not only are temperatures biased to the upside later in the century (though certainly prior to 1960), but a few mid-century intervals are biased slightly lower. That, coupled with the post-1930 restatement we encountered earlier, would imply that in addition to an embarrassing false decline in their MXD after 1960 (or earlier), CRU’s “divergence problem” also includes a minor false incline after 1930.
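A plausible reading of how those 20 anchor values get applied, consistent with the description above, is that they are linearly interpolated across the intervening years and then added to the series. A self-contained sketch of that interpolate-and-add step (the series name densall is a placeholder, not a claim about the surrounding program):

densall = fltarr(595)                        ; placeholder for the MXD-derived series
timey = findgen(595) + 1400.                 ; years 1400 through 1994
yrloc = [1400, findgen(19)*5. + 1904]
valadj = [0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75
yearlyadj = interpol(valadj, yrloc, timey)   ; interpolate the anchors to every year
densadj = densall + yearlyadj                ; late-century values biased upward by up to 1.95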

And the former apparently wasn’t a particularly well-guarded secret, although the actual adjustment period remained buried beneath the surface.

Plotting programs such as data4alps.pro print this reminder to the user prior to rendering the chart:

IMPORTANT NOTE: The data after 1960 should not be used. The tree-ring density records tend to show a decline after 1960 relative to the summer temperature in many high-latitude locations. In this data set this ‘decline’ has been artificially removed in an ad-hoc way, and this means that data after 1960 no longer represent tree-ring density variations, but have been modified to look more like the observed temperatures.

Others, such as mxdgrid2ascii.pro, issue this warning:

NOTE: recent decline in tree-ring density has been ARTIFICIALLY REMOVED to facilitate calibration. THEREFORE, post-1960 values will be much closer to observed temperatures then (sic) they should be which will incorrectly imply the reconstruction is more skilful than it actually is. See Osborn et al. (2004).

Care to offer another explanation, Dr. Jones?

Gotcha

Clamoring alarmists can and will spin this until they’re dizzy. The ever-clueless mainstream media can and will ignore this until it’s forced upon them as front-page news, and then most will join the alarmists on the denial merry-go-round.

But here’s what’s undeniable: if a divergence exists between measured temperatures and those derived from dendrochronological data after (circa) 1960, then discarding only the post-1960 figures is disingenuous, to say the least. The very existence of a divergence betrays a potentially serious flaw in the process by which temperatures are reconstructed from tree-ring density. If it’s bogus beyond a set threshold, then any honest man of science would instinctively question its integrity prior to that boundary. And only the lowliest would apply a hack in order to produce a desired result.

And to do so without declaring in a footnote on every chart, in every report, in every study, in every book, in every classroom, and on every website that such a corrupt process is relied upon is not just a crime against science; it’s a crime against mankind.

Indeed, miners of the CRU folder have unearthed dozens of email threads and supporting documents revealing much to loathe about this cadre of hucksters and their vile intentions. This veritable goldmine has given us tales ranging from evidence destruction to spitting on the Freedom of Information Act on both sides of the Atlantic. But the now irrefutable evidence that alarmists have indeed been cooking the data for at least a decade may just be the most important strike in human history.

Advocates of the global governance/financial redistribution sought by the United Nations at Copenhagen in two weeks, and of the expanded domestic governance/financial redistribution sought by liberal politicians, both substantiate their drastic proposals with the pending climate emergency predicted in the reports of the Intergovernmental Panel on Climate Change (IPCC). Kyoto, Waxman-Markey, Kerry-Boxer, EPA regulation of the very substances of life: all bad policy concepts enabled solely by IPCC reports. And the IPCC, in turn, bases those reports largely on the data and charts provided by the research scientists at CRU (largely from tree-ring data), who just happen to be editors and lead authors of that same U.N. panel.

Bottom line: CRU’s evidence is now irrevocably tainted. As such — all assumptions based on that evidence must now be reevaluated and readjudicated. And all policy based on those counterfeit assumptions must also be re-examined.

Gotcha. We’ve known they’ve been lying all along, and now we can prove it. It’s time to bring sanity back to this debate.

It’s time for the First IPCC Reassessment Report.

J Carlton November 26, 2009 at 4:19 pm

All of the links in this post point to a YouTube Greenpeace video. Is this the correct link?

Kevin S. Van Horn November 27, 2009 at 9:57 am

Looks like your site has been hacked. Every one of the URLs in this article points to a GreenPeace video.
