Can we trust the historical temperature data?

The data are mostly in terrible shape. That doesn't mean they are necessarily wrong, but rather that the data are generally not what we hear from climate scientists, and absolutely not what we hear from the environmental press. That's the bad news.
The good news is that a lot of people are looking at the data today, far more than were looking a decade ago. This is why the AGU is whining about mean old Scott Pruitt making them release their data if their studies are going to be used to justify EPA regulations. After all, if the data were solid, they wouldn't hesitate to release it all. But they won't, and so we suspect that it's the kicked dog that yelps.
The data are pretty interesting when you actually look closely at them. This post (which is actually an old post from 2009) will give you a sense of just how big the uncertainties in the climate databases are. Again, this doesn't mean the data are wrong - the data are what they are. But it shows you how big a grain of salt you will want when people tell you that the temperature is the highest it's ever been.
Oh, and the punchline to this post? The scientists who lost the data described here are the same ones who hid the decline.
------------------
The science is settled, but there's no data (originally posted 8/14/09)
Ever wonder how scientists measure temperature? With all the talk about Anthropogenic Global Warming (AGW), have you ever thought about where the data comes from? Somebody has to collect temperature data, right? So how do they do it?
Well, how do you measure temperature? If you're like me, you look at a thermometer.
So how many thermometers are being used to measure all of this AGW? I mean, if "the science is settled", someone should have an answer, right? Well, someone does have an answer, and it's quite interesting.
There are a lot of thermometers at weather stations around the world. There didn't used to be, though. Until World War II there were fewer than a thousand worldwide. Then there was a huge expansion in the number of weather stations, to a peak of over 13,000 during the Cold War. Note that this is the count of thermometers in the 3,000 longest continually operating weather stations. There are a lot (maybe 10,000) of new thermometers, but since the AGW debate is pretty uninteresting if you only look at the last 40 years, this is where the theory will be made or broken. (Actually, the data are very interesting, and suggest strongly that the new thermometers have thrown off the averages and may indeed account for 100% of the warming trend; the toy sketch after the list below shows how that sort of thing can happen. You should read the series of posts here, which include not only data, but source code.)
This tells us several things:
1. There are not many weather stations that have been consistently measuring temperature in the same location for more than about 70 years. Maybe a thousand worldwide, maybe a few more.
2. There are very few weather stations that have been consistently measuring temperature in the same location for over a hundred years. Two or three hundred worldwide, tops.
3. There are almost no weather stations that have been consistently measuring temperature in the same location since the end of the Civil War - essentially around the time when the Industrial Revolution was in full swing in most places in North America and Europe, and everyone started belching CO2 into the air. Maybe 30 or 40 worldwide.
Think about that last one - thirty or forty weather stations. That's what this whole AGW debate is centered on. A few dozen thermometers.
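Here's the toy sketch promised above. It shows, with entirely invented numbers, how adding new thermometers in warmer locations can manufacture a rising "global average" even when no individual station warms at all. The station counts and temperatures are assumptions for illustration, not real data.

```python
import numpy as np

# Toy illustration only - every number here is invented. Two groups of
# stations, each reading a CONSTANT temperature: no individual station
# warms at all. Long-running cold-climate stations read 5 C; newer,
# warmer-sited stations read 15 C.
years = np.arange(1950, 2011)
n_cold = np.full(years.size, 1000)                       # constant old network
n_warm = np.where(years < 1970, 0, (years - 1970) * 50)  # stations added later

# Naive "global average": mean over whatever stations report that year.
naive_mean = (n_cold * 5.0 + n_warm * 15.0) / (n_cold + n_warm)

print(f"1950 naive global mean: {naive_mean[0]:.2f} C")
print(f"2010 naive global mean: {naive_mean[-1]:.2f} C")
# The average climbs by several degrees although no thermometer changed -
# the "trend" comes entirely from the changing station mix. (Real analyses
# use anomalies and gridding precisely to avoid this artifact, which is
# why the methods deserve scrutiny.)
```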
OK, so there really aren't many measurement locations. For those from which we have data, where's the data? Can we look at it? Check it for mistakes? I mean, after all, this is supposed to be the biggest crisis in human history, the very survival of the human race is at stake, and we HAVE TO DO SOMETHING RIGHT NOW. Okay, but first can we look at the data?
Well, no. It's not that the scientists don't want to share it with you. They don't, but that doesn't matter. You see, it seems that the original data sets have gone missing:
Data storage availability in the 1980s meant that we were not able to keep the multiple sources for some sites, only the station series after adjustment for homogeneity issues. We, therefore, do not hold the original raw data but only the value-added (i.e. quality controlled and homogenized) data. The priorities we use when merging data from the same station from different sources are discussed in some of the literature cited below.

Oh, and that last link? It's not to your run-of-the-mill right wing tool/hater site; it's to ground zero of AGW research. Nature has a good post about the refusal to release the data, with a very interesting comments thread. The comments are running around 20 to 1 in favor of releasing the data so it can be verified.
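If you're wondering what "value-added" homogenization actually involves, here is a minimal sketch of one common kind of adjustment: a step-change correction for a station move. The breakpoint, offset, and trend are all invented for illustration; real homogenization algorithms (pairwise neighbor comparisons and the like) are far more elaborate.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented example: 40 annual mean temperatures (C) for one station.
# Suppose the thermometer moved in year 20, adding a spurious +0.8 C
# step that has nothing to do with climate.
years = np.arange(1960, 2000)
raw = 10.0 + 0.01 * (years - 1960) + rng.normal(0.0, 0.3, years.size)
raw[20:] += 0.8  # the station-move artifact

# Crude homogenization: estimate the step as the difference in means
# across the known breakpoint and subtract it from the later segment.
bp = 20
step = raw[bp:].mean() - raw[:bp].mean()
adjusted = raw.copy()
adjusted[bp:] -= step

print(f"estimated step:  {step:+.2f} C")
print(f"raw trend:       {np.polyfit(years, raw, 1)[0] * 100:+.2f} C/century")
print(f"adjusted trend:  {np.polyfit(years, adjusted, 1)[0] * 100:+.2f} C/century")
# Note the crude step estimate also absorbs part of the real trend, so
# the adjustment changes the answer. Without the raw data there is no
# way to check how such choices were made - which is exactly the point.
```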
The story is summed up nicely by Kenneth Green:
In a nutshell, the story is this. Canadian Steve McIntyre, co-demolisher of Michael Mann’s hockey stick chart, has been after the CRU to let him review their original climate data. For those unfamiliar with Steve, he is like a dog with a bone when it comes to data, and to validating statistical methodologies used in data representation. To come to Steve’s analytical attention is a bit like coming to the attention of a 60-Minutes news crew, only a few hundred times worse, particularly if you have anything to hide.

So Steve politely (He is Canadian, after all) requested the climate data from CRU, only to be refused on the grounds that he is not in academia. That’s where the story gets interesting, because Roger Pielke Jr. (who IS in academia), put in his own request, and was also turned down. Not because he didn’t qualify, but because the CRU apparently didn’t bother keeping the original climate data used in compiling the first surface temperature record!

...

In other words, there is now no way to test to see whether any of the “homogenizing” that has been done to the original record biased it in any way, or whether any of the subsequent “adjustments” to the data for things like urban expansion, and such can be validated.

Scientists and politicians are asking for maybe $50 Trillion to "fix" the problem, but don't worry, you can trust them. I mean, it's not like they have data or anything, but it's an emergency. Srlsy.
The questions about data quality are very, very serious, and the entire AGW debate is scientifically meaningless without an examination of the data, the methods, the mathematical models (especially the statistics, which are subtle and easy to mess up), and the source code of the climate models.
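To make the statistics point concrete, here is one classic pitfall, sketched with simulated data: annual temperatures are autocorrelated, and a trend test that assumes independent errors overstates significance. The correction below is the standard lag-1 effective-sample-size adjustment; everything else is an invented example, not an analysis of any real series.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Simulated AR(1) "temperature" series with NO real trend - persistence only.
phi = 0.6
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(0.0, 0.2)

t_idx = np.arange(n)
slope, intercept = np.polyfit(t_idx, x, 1)
resid = x - (slope * t_idx + intercept)

# Naive OLS standard error of the slope assumes independent residuals.
se_naive = np.sqrt((resid**2).sum() / (n - 2) / ((t_idx - t_idx.mean()) ** 2).sum())

# Lag-1 autocorrelation and the usual effective-sample-size inflation
# factor for the standard error (as in Santer et al. 2000).
r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
se_adjusted = se_naive * np.sqrt((1 + r1) / (1 - r1))

print(f"naive t-statistic:    {slope / se_naive:.2f}")
print(f"adjusted t-statistic: {slope / se_adjusted:.2f}")
# Persistence inflates the naive t-statistic; ignore it and you can read
# a "significant" trend out of pure noise.
```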
The source code in particular is an area where I have relevant experience, although in my work we look for errors that cause security vulnerabilities. However, it's a truism that there will be at least one bug per 1,000 lines of source code. How many lines of code are in the models? Simple arithmetic will tell us how many bugs to expect. How many of those would impact the results? You simply can't know until you look at the code.
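The arithmetic is not hard. Taking the one-bug-per-thousand-lines rule of thumb at face value, a quick sketch (the line counts below are hypothetical round numbers, not audits of any actual climate model):

```python
# Back-of-the-envelope bug estimates from the rule-of-thumb defect density
# cited above. Line counts are hypothetical round numbers for illustration.
DEFECTS_PER_KLOC = 1.0  # one bug per 1,000 lines, as stated in the text

for name, loc in [
    ("small research model", 50_000),
    ("mid-size model", 250_000),
    ("large coupled model", 1_000_000),
]:
    print(f"{name} ({loc:,} lines): ~{loc / 1000 * DEFECTS_PER_KLOC:.0f} latent bugs")
```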
You say that the science is settled? Then give me the data.
2 comments:
The easy answer to whether we can trust the information is, "No."
When the climatologists discount the various warming and cooling periods as recorded by actual people of the time, then you know that the fix is in.
Greenland isn't green, we've been told. It was a con job by Leif Eriksson, we've been told. Vinland was false, no grapes in Nova Scotia ever, we've been told. Right. Except that in the 1000s, Greenland was a beautiful pasture, grapes grew up to the Arctic Sea, and our 'betters' are not.
Like the current rises in ocean temperatures that have been noted in such places as the Bering Sea, sections of the Pacific, and the Caribbean. Must all be caused by us stupid humans, right? Has nothing to do with the activity of submerged volcanoes in those areas. Sure, absolutely nothing to do with Mother Nature not being very kind, you know, of course.
If they can't or won't show the data, it's not science.
If they can't or won't show the inner working of the models, it's not science.
If the models give the results the benefactors paid for no matter the inputs, it's not science.
If there are witch hunts and denunciations of "Heretic!", it's not science.