The Climate Model is coded to intentionally suppress non-warming results
Phil Jones's leaked email describes using "Mike's Nature trick" (Mike being Michael Mann, popularizer of the Hockey Stick) to cook the data:
"Divergence" is a big problem for the climate change alarmists. The proxy data sets they use (tree rings, etc.) show that for the last 50 years, the temperature indicated by the proxies has been lower than the readings we get from thermometers. How to address this problem? Code it out of the models.
I’ve just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (ie from 1981 onwards) amd [sic] from 1961 for Keith’s to hide the decline.
Check out this quote from the code – kudos to Steve Neil for digging it out.
; THIS WORKS WITH REMTS BEING A 2D ARRAY (nseries,ntime) OF MULTIPLE TIMESERIES
; WHOSE INFLUENCE IS TO BE REMOVED. UNFORTUNATELY THE IDL5.4 p_correlate
; FAILS WITH >1 SERIES TO HOLD CONSTANT, SO I HAVE TO REMOVE THEIR INFLUENCE
; FROM BOTH INDTS AND DEPTS USING MULTIPLE LINEAR REGRESSION AND THEN USE THE
; USUAL correlate FUNCTION ON THE RESIDUALS.
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD – but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
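The workaround the first comment describes (regressing out the confounding series, then correlating the residuals) is the textbook way to compute a partial correlation. A minimal sketch in Python with NumPy; the function and variable names are mine, not CRU's:

```python
import numpy as np

def partial_corr(x, y, confounders):
    """Correlation of x and y after removing the linear influence of
    the confounding series, via multiple-linear-regression residuals."""
    # Design matrix: an intercept column plus each confounding series.
    Z = np.column_stack([np.ones(len(x)), confounders])
    # Residuals of x and y after a least-squares fit against Z.
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Two series that only correlate because both track a third one:
rng = np.random.default_rng(42)
z = rng.normal(size=500)
x = z + 0.3 * rng.normal(size=500)
y = z + 0.3 * rng.normal(size=500)

raw = np.corrcoef(x, y)[0, 1]      # strongly positive
partial = partial_corr(x, y, z)    # near zero once z is removed
```

Nothing sinister about the technique itself; the comment is just documenting a workaround for an IDL 5.4 limitation.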
Poor programming implies no Quality Control
It looks like at least some of the models were written by people who don't know the basics of how to code:
This is, quite frankly, a complete n00b error. Anybody working in industry who made this mistake would find himself in the "bottom 5%" group come annual review time, and would very likely get a suggestion to look for work elsewhere.
The bit that made me laugh was this one. Anyone into programming will burst out laughing before they even reach the table of numbers. Quote:
17. Inserted debug statements into anomdtb.f90, discovered that
a sum-of-squared variable is becoming very, very negative!
For those unfamiliar with this problem, computers use a single "bit" to indicate sign. If that bit is set to "1" you get one sign (often negative, but machine- and language-dependent to some extent) and if it is "0" you get the other (typically positive).
OK, take a zero, and start adding ones onto it. We will use a very short number (only 4 digits long, each can be a zero or a one; the first digit is the "sign bit"). I'll translate each binary number into the decimal equivalent next to it:

0000 zero
0001 one
0010 two
0011 three
0100 four
0101 five
0110 six
0111 seven (the largest value four bits can hold)
1000 NEGATIVE (may be defined as zero, but oftentimes
     defined as being as large a negative number as you can
     have, via something called a 'complement'). So in this
     case NEGATIVE seven
1001 NEGATIVE six
1010 NEGATIVE five (notice the 'bit pattern' is exactly the
     opposite of the "five" pattern... it is 'the complement')
1011 NEGATIVE four
1100 NEGATIVE three
1101 NEGATIVE two
1110 NEGATIVE one
1111 NEGATIVE zero (useful to let you have zero without
     needing a 'sign change' operation)
Sometimes the 1111 pattern will be "special" in some way. And there are other ways of doing the math down at the hardware level (the scheme above is 'one's complement'; most modern machines use 'two's complement', where 1000 means negative eight and there is no negative zero), but this is a useful example.
You can see how repeatedly adding one grows the value to the limit, then "overflows" into a negative value. This is a classic error in computer arithmetic, and something I was taught in the first couple of weeks of my very first programming class ever. Yes, in FORTRAN.
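Harry's "very, very negative" sum of squares is exactly this wraparound. Python integers don't overflow, so the sketch below simulates a 32-bit signed accumulator explicitly; the squared values are invented purely for illustration:

```python
def wrap_int32(x):
    """Reduce x to what a 32-bit two's-complement integer would hold."""
    x &= 0xFFFFFFFF
    return x - 0x100000000 if x >= 0x80000000 else x

# Accumulate squares the way a 32-bit Fortran INTEGER would.
total = 0
for value in [10000] * 22:           # 22 squared terms of 10_000**2 each
    total = wrap_int32(total + value * value)

# 22 * 10_000**2 = 2_200_000_000 exceeds 2**31 - 1 = 2_147_483_647,
# so the "sum of squares" comes out very, very negative:
print(total)   # -2094967296
```

A sum of squares can never legitimately be negative, which is why the debug output is such a giveaway that the accumulator was too small for the data.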
OK, so the University of East Anglia has some bad programmers. So what? Well, this means that large parts of the climate models have never had a design review or code review. This means that the model is essentially unaudited for correctness. This means that there's no assurance that it produces sane output - even setting aside Dr. Jones's code to "fix" divergence.
If I could only ask one question at a Senate hearing, mine would be "What Quality Control processes do you have for climate model software development?" 'Cause it looks like there aren't any.
The programmers don't understand what the data is, and what it is for
Maintenance coding (maintaining a program someone else wrote) isn't any fun, not least because the person who wrote it may not have documented what the parts are and what they do. It looks like things are no different at CRU:
What does this mean, in non-technical terms?

7. Removed 4-line header from a couple of .glo files and loaded them into
Matlab. Reshaped to 360r x 720c and plotted; looks OK for global temp
(anomalies) data. Deduce that .glo files, after the header, contain data
taken row-by-row starting with the Northernmost, and presented as '8E12.4'.
The grid is from -180 to +180 rather than 0 to 360.
This should allow us to deduce the meaning of the co-ordinate pairs used to
describe each cell in a .grim file (we know the first number is the lon or
column, the second the lat or row - but which way up are the latitudes? And
where do the longitudes break?
There is another problem: the values are anomalies, wheras the 'public'
.grim files are actual values. So Tim's explanations (in _READ_ME.txt) are [...]
8. Had a hunt and found an identically-named temperature database file which
did include normals lines at the start of every station. How handy - naming
two different files with exactly the same name and relying on their location
to differentiate! Aaarrgghh!! Re-ran anomdtb:
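For what it's worth, Harry's deduction pins the layout down: 360 rows of 720 half-degree cells, northernmost row first, longitudes running -180 to +180. A sketch of the cell-to-coordinate mapping under that reading (the cell-centre convention is my assumption, not something the file states):

```python
N_ROWS, N_COLS = 360, 720   # 0.5-degree global grid
CELL = 0.5                  # degrees per cell

def cell_center(row, col):
    """Lat/lon of a cell centre: row 0 is the northernmost band and
    column 0 starts at 180W, per the .glo layout deduced above."""
    lat = 90.0 - CELL * (row + 0.5)
    lon = -180.0 + CELL * (col + 0.5)
    return lat, lon

print(cell_center(0, 0))       # (89.75, -179.75)  north-west corner cell
print(cell_center(359, 719))   # (-89.75, 179.75)  south-east corner cell
```

The point is that none of this should have needed deducing; it's the kind of thing a one-page data format document settles for good.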
Uhm... So they don't even KNOW WHAT THE ****ING DATA MEANS?!?!?!?!
What dumbass names **** that way?!
Talk about cluster****. This whole file is a HUGE ASS example of it. If they deal with data this way, there's no ****ing wonder they've lost **** along the way. This is just unbelievable.
And it's not just one instance of not knowing what the hell is going on either:
Quote:
The deduction so far is that the DTR-derived CLD is waaay off. The DTR looks OK, well
OK in the sense that it doesn;t have prominent bands! So it's either the factors and
offsets from the regression, or the way they've been applied in dtr2cld.
Well, dtr2cld is not the world's most complicated program. Wheras cloudreg is, and I
immediately found a mistake! Scanning forward to 1951 was done with a loop that, for
completely unfathomable reasons, didn't include months! So we read 50 grids instead
of 600!!! That may have had something to do with it. I also noticed, as I was correcting
THAT, that I reopened the DTR and CLD data files when I should have been opening the
bloody station files!! I can only assume that I was being interrupted continually when
I was writing this thing. Running with those bits fixed improved matters somewhat,
though now there's a problem in that one 5-degree band (10S to 5S) has no stations! This
will be due to low station counts in that region, plus removal of duplicate values.
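The "50 grids instead of 600" bug is easy to reproduce: the file holds one grid per month, but the skip-ahead loop stepped by year only. A schematic reconstruction in Python (the 1901 start year and the counting helpers are my guesses at the shape of the code, not CRU's actual source):

```python
MONTHS_PER_YEAR = 12

def grids_skipped_buggy(start_year, target_year):
    """Skip-ahead loop that forgets the inner month loop."""
    skipped = 0
    for year in range(start_year, target_year):
        skipped += 1            # reads one grid per *year*: wrong
    return skipped

def grids_skipped_fixed(start_year, target_year):
    """Each year in a monthly dataset holds twelve grids."""
    skipped = 0
    for year in range(start_year, target_year):
        for month in range(MONTHS_PER_YEAR):
            skipped += 1        # reads one grid per *month*: right
    return skipped

print(grids_skipped_buggy(1901, 1951))   # 50 grids skipped
print(grids_skipped_fixed(1901, 1951))   # 600 grids skipped
```

An off-by-a-factor-of-twelve read is exactly the sort of thing a unit test on the record count would catch instantly.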
I've only actually read about 1000 lines of this, but started skipping through it to see if it was all like that when I found that second quote above somewhere way down in the file....
CLUSTER.... ****. This isn't science, it's grade school for people with big data sets.
It explains why CRU would not release their code and data, even under Freedom Of Information Act requests. They knew that the quality was terribly shoddy, and took the chance that they could successfully stonewall, rather than have their climate models be exposed as junk.
And the stonewalling was successful, until someone on the inside leaked their data.