
Monday, December 9, 2019

How the Climate Science sausage is made

Ten years ago I laid this out.  This is where the action is when it comes to actual climate science.

Originally posted 9 December 2009.

How to create a Scientific Consensus on Global Warming

We keep hearing people tell us that there is a "consensus" that the planet is warming, because the "science is settled". Longtime readers know my feelings on the latter, so there's no need to rehash old arguments. Instead, I'd like to look at how one might go about manufacturing a consensus. It's actually not hard.

Step 1: Change the data

There are a very small number of data sets on global temperatures, and these are used by essentially all Climatologists worldwide. One (CRUTEM) is from the Climatic Research Unit in the UK, the second (GHCN) is the Global Historical Climatology Network, and the third is GISS (from NASA's Goddard Institute).

However, not all data sets are created equal: GISS and CRU get almost all of their raw data from GHCN, so that's the one that counts. Meaning, that's the one we'll look at today.

There are two parts to the GHCN data: the raw temperature readings, and adjustments to the readings. The raw numbers are easy - they're just the instrument reported temperature for the weather station. Look outside your house at your thermometer - that's the raw data. Here Chez Borepatch, my thermometer says that it's 39°.

Adjustments are modifications to the readings, to "remove inhomogeneities" in the raw data. You (like me) may look at that and say Whiskey Tango Foxtrot are inhomogeneities? CRU helpfully provides an answer:
Most long-term climate stations have undergone changes that make a time series of their observations inhomogeneous. There are many causes for the discontinuities, including changes in instruments, shelters, the environment around the shelter, the location of the station, the time of observation, and the method used to calculate mean temperature. Often several of these occur at the same time, as is often the case with the introduction of automatic weather stations that is occurring in many parts of the world. Before one can reliably use such climate data for analysis of longterm climate change, adjustments are needed to compensate for the nonclimatic discontinuities.
OK, we don't want a jump in the historical record if you move a station or replace a thermometer with a better one.

But. All the Climatologists in the world will look at this data. How much do the adjustments change the results?

We don't know, but people are starting to look, and they're finding that the adjustments change the data a lot. They change the data so much that the adjusted series show the earth warming when the raw data may show that it's cooling.

Let me say that again: Thermometers may be showing that the Earth is cooling, but adjustments to this data show a rapid temperature rise.

Let me give three examples.

Darwin, Australia: The blue line is the raw data from the five weather stations in Darwin. It shows a 0.7°C cooling over the 20th Century. The black lines are the adjustments to this data, showing a big jump in 1940 and a substantial increase since then. They turn the raw data decline into a 1.2°C increase over the course of the 20th Century.

Woah. So what's with the adjustments? Fortunately, there is an explanation:
They pick five neighboring stations, and average them. Then they compare the average to the station in question. If it looks wonky compared to the average of the reference five, they check any historical records for changes, and if necessary, they homogenize the poor data mercilessly. I have some problems with what they do to homogenize it, but that’s how they identify the inhomogeneous stations. OK … but given the scarcity of stations in Australia, I wondered how they would find five “neighboring stations” in 1941 …
So I looked it up. The nearest station that covers the year 1941 is 500 km away from Darwin. Not only is it 500 km away, it is the only station within 750 km of Darwin that covers the 1941 time period. (It’s also a pub, Daly Waters Pub to be exact, but hey, it’s Australia, good on ya.) So there simply aren’t five stations to make a “reference series” out of to check the 1936-1941 drop at Darwin.
...
Yikes again, double yikes! What on earth justifies that adjustment? How can they do that? We have five different records covering Darwin from 1941 on. They all agree almost exactly. Why adjust them at all? They’ve just added a huge artificial totally imaginary trend to the last half of the raw data! ...
Those, dear friends, are the clumsy fingerprints of someone messing with the data Egyptian style … they are indisputable evidence that the “homogenized” data has been changed to fit someone’s preconceptions about whether the earth is warming.
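To make the mechanics concrete, here is a rough sketch of the reference-series check described in the quote above. This is my own illustration of the general idea, not the GHCN or BOM production code; the neighbor count, the flagging threshold, and the toy numbers are all assumptions.

```python
import numpy as np

def homogeneity_check(candidate, neighbors, threshold=1.0):
    """Flag years where a station departs suspiciously from its neighbors.

    candidate : 1-D array of annual mean temperatures for the station in question
    neighbors : 2-D array, one row per neighboring station, same years as candidate
    threshold : size of the year-to-year shift (deg C) that triggers a closer look
    """
    reference = neighbors.mean(axis=0)       # average of the neighboring stations
    difference = candidate - reference       # candidate minus the reference series
    # A sudden shift in the difference series suggests a non-climatic change
    # (station move, new instrument) at the candidate station.
    jumps = np.abs(np.diff(difference))
    flagged_years = np.where(jumps > threshold)[0] + 1
    return difference, flagged_years

# Toy example: 30 years of data, with an artificial +1.5 C step at year 15
# standing in for an undocumented station move.
rng = np.random.default_rng(0)
years = 30
neighbors = 20.0 + rng.normal(0, 0.3, size=(5, years))   # five well-behaved neighbors
candidate = 20.0 + rng.normal(0, 0.3, size=years)
candidate[15:] += 1.5

diff, flagged = homogeneity_check(candidate, neighbors)
print("Years flagged for review:", flagged)   # should include year 15
```

Note that the whole scheme depends on having well-behaved neighbors to compare against, which is exactly what Darwin didn't have in 1941.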
You might think that this sort of "adjustment" process is incompetent. You might also think that this process is convenient (but only if you are as nasty and suspicious as I). There is much, much more, and it's much, much worse.

It's not just Darwin, either. Brisbane sees exactly the same thing:
Just out of interest I decided to plot the raw temperature data for my home city of Brisbane, Australia from the GISS (ie the raw GHCN data) against the homogenized or adjusted GISS GHCN data. The temperature sensor is located at the Brisbane Eagle Farm Airport which is now our busy main international airport. The data used is the series available from 1950 to 2008. I have animated the result to highlight the difference.

As you can see the raw data shows a downward trend of about -0.6 C per century. The adjusted data however shows an opposite trend of +0.6 C per century. Intuitively as the airport grew from a quiet strip to a busy international jet airport one would think the more recent data would be adjusted downwards for the heat island effect. Instead we see that the data prior to 1978 is adjusted down and the data in recent times was adjusted up.
He helpfully plots the raw data overlaid with the adjusted data. Don't need a weatherman to know which way the wind blows.
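If you want to run this kind of raw-versus-adjusted comparison yourself, the arithmetic is nothing fancier than fitting a straight line to each series and expressing the slope in degrees per century. A minimal sketch, using made-up numbers where you would load the real GHCN station files:

```python
import numpy as np

def trend_per_century(years, temps):
    """Least-squares linear trend, expressed in degrees per century."""
    slope_per_year = np.polyfit(years, temps, 1)[0]
    return slope_per_year * 100.0

# Synthetic stand-ins for one station's raw and adjusted annual means;
# in practice these would come from the unadjusted and homogenized GHCN files.
rng = np.random.default_rng(1)
years = np.arange(1950, 2009)
raw = 25.0 - 0.006 * (years - 1950) + rng.normal(0, 0.3, years.size)   # mild cooling
adjustment = np.where(years < 1978, -0.3, +0.3)                        # step adjustment
adjusted = raw + adjustment

print(f"Raw trend:      {trend_per_century(years, raw):+.2f} C/century")
print(f"Adjusted trend: {trend_per_century(years, adjusted):+.2f} C/century")
```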

And remember, we already know what the adjustments to the lower 48 states look like. New Zealand, too. Take away the adjustments, and all the warming from 1850 disappears. Change the data, and all the Climatologists will discover that the Earth is "warming". [intentional use of scare quotes]

Let me say this explicitly: I used to believe that the planet was warming, and that this was likely due to natural (as opposed to man made) causes. Now I'm not sure that the planet is warming. The data do not show warming over the last 70 [now 80 - Borepatch] years, maybe longer.

Step 2. Fund only scientific research that confirms warming.

We've seen for some time anecdotal evidence suggesting that researchers are afraid to come out publicly against the "consensus" view:
You can almost smell the fear - the article discusses a series of climate changes over the centuries (not a surprise to either of my regular readers), strongly correlated with changes in Solar activity. But the author feels the need to add a non sequitur about Carbon Dioxide. E pur si muove, indeed.
Well, we now are starting to see explicit charges of warming bias in the research grant application process:
Personal anecdote:
Last spring when I was shopping around for a new source of funding, after having my funding slashed to zero 15 days after going public with a finding about natural climate variations, I kept running into funding application instructions of the following variety:

Successful candidates will:
1) Demonstrate AGW.
2) Demonstrate the catastrophic consequences of AGW.
3) Explore policy implications stemming from 1 & 2.

Follow the money — perhaps a conspiracy is unnecessary where a carrot will suffice.
On a personal note, one of the traits that I find the most charmingly naive among the more shrill of the warming alarmists is their claim that the skeptics are funded by the Oil companies. As if "their side" couldn't possibly have an interest in the outcome.

When you consider that the CRU had received at least £14M and was looking at another £75M more, that's real money.

Opportunity and motive. Does this mean there was a conspiracy? Of course not. It does mean that the data are not to be trusted, that we simply don't know whether the planet is warming or not, and that there is a plausible explanation for why someone would want to manufacture a phony consensus.

The science is settled? Don't make me laugh.

UPDATE 9 December 2009 17:37: Looks like West Point, NY is only warming due to adjustments. Here is the raw data. The Machiavellian explains what's happening:
So, after the raw data is run through the homogenization process, in other words, statistical manipulation, temperatures from 1900 through 1980 are depressed at West Point and temperatures thereafter look as if they are increasing in comparison to the average temperatures based on the raw data.
No warming at all in 100 years in the raw data; a degree warming after adjustments.

UPDATE 9 December 2009 17:47: It looks like the CRU emails include this from Prof. Wibjorn Karlen, who was trying unsuccessfully to reproduce the temperatures shown by the IPCC AR4 report for Scandinavia.
In attempts to reconstruct the temperature I find an increase from the early 1900s to ca 1935, a trend down until the mid 1970s and so another increase to about the same temperature level as in the late 1930s.

A distinct warming to a temperature about 0.5 deg C above the level 1940 is reported in the IPCC diagrams. I have been searching for this recent increase, which is very important for the discussion about a possible human influence on climate, but I have basically failed to find an increase above the late 1930s
This is email 1221683947, to Dr. Jones of the CRU. Jones was particularly unhelpful - essentially blowing Karlen off. So add Scandinavia to the list of places where adjustments look particularly dodgy.

UPDATE 9 December 2009 17:56: Wow, Detroit, too:
Once again, the raw, monthly average temperature for the Detroit area, over the last 111 years shows an amazingly constant climate. It seems that only when the data is run through and adjusted by the proponents of global warming do we get an upward tick in temperatures.
The Machiavellian has more: California, Southwest Ohio.  Click through and scroll.

Here is the $64 Trillion Question: Are there any locations where adjustments are net negative over 100 years? Any at all? I'm willing to listen to justifications why all adjustments are long term net positive, but they will have to be good. There are very well known and documented issues that cause thermometers to run hot; adjustments for these should cause raw temperatures to decrease, not increase.

UPDATE 9 December 2009 22:24: Joanne Nova has an information-rich post that dissects the adjustments. If the adjustments come mainly from nearby weather stations, what if none of them show warming, either? Someone at GISS or CRU or GHCN has some 'splaining to do.

Monday, March 20, 2017

A layman's guide to the science of global warming

I haven't posted much on global warming for the last few years, feeling like I'd said most of what I had to say.  I mean, after a hundred or more posts, what's left to say?  What I haven't done is put together a high level overview for the non-scientist who wants to understand what's going on.  Sort of a nutshell guide, if you will.  And so, if you don't care about the current global warming brouhaha, you can skip this post.  If you want to understand what's behind the science, then read on.

The Starting Point: Climate over the last 1000 years

Probably the most famous image from this whole debate is the "Hockey Stick" graph, showing what was said to be the climate over the last 1000 years:


This was from a 1999 paper by Michael Mann (and co-authors Raymond Bradley and Malcolm Hughes; this paper is often referred to as MBH99 after the authors' initials and publication date).  When I first saw this, I was pretty skeptical.  It showed a stable climate (notice how flat the blue line is over most of the time?) until very recently, followed by a sudden spike in temperature - a long flat line with a sudden right-hand hook looks like a hockey stick (hence the name of the graph).

We didn't hear much about an impending heat death of the globe until fairly recently.  Before the late 1990s, the scientific consensus of the day was that climate fluctuated, sometimes hotter and sometimes cooler.  The current climate was not seen as being particularly warm - certainly less warm than the Medieval period (called the "Medieval Warm Period", or MWP) or the Roman era (called the "Roman Climate Optimum").  This was all written up in the first Assessment Report from the UN Intergovernmental Panel on Climate Change (IPCC), which periodically publishes the latest and best scientific understanding on the issue.  Page 202 of that report showed the scientific consensus of climate history over the last thousand years.


You can see the MWP on the left, the "Little Ice Age" where famine ruled Europe in the middle, and then a temperature recovery to the current era on the right.  No hockey stick to be seen anywhere.  Remember, this was the scientific establishment view in 1990.

As it turns out, there's plenty of history that supports this establishment view and disputes the MBH99 hockey stick.  The Domesday Book was a tax survey compiled by William the Conqueror after he invaded England in 1066.  It detailed everything in his kingdom that was worth taxing, and so it was assembled with care.  It documented wine vineyards in the north of England, far to the north of where wine is produced today, implying that the climate was warmer in 1066 than it is in 2017.  There is excellent documentary history that the MWP was followed by a catastrophic cooling - the Little Ice Age: as today's glaciers retreat, archaeologists have discovered the remains of alpine villages that were overrun by glaciers.  And recently, the Vatican announced changes to centuries-old prayers to stop the advance of the glaciers.

The important point here is that there is quite a lot of recorded history from the period that does not square with the climate reconstruction from the Hockey Stick paper.  As it turns out, the MBH99 paper has been conclusively debunked: the data sets used were inappropriate and the statistical algorithms were "novel" (they produced hockey stick shaped output even on completely random data; for example, if you ran the numbers from the telephone directory through the algorithm it would give you a hockey stick).
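For the curious, the "random data gives you a hockey stick" claim refers to the short-centering step in the MBH principal component calculation, where each proxy series was centered on the modern calibration period instead of on its whole length (this is the critique associated with McIntyre and McKitrick's 2005 papers).  Here is a rough, self-contained illustration of the effect; it is not the original MBH code, the red-noise model and the dimensions are simplified assumptions, and how pronounced the effect is depends on how persistent the noise is.

```python
import numpy as np

rng = np.random.default_rng(42)
n_series, n_years, cal_years = 50, 581, 79   # dimensions loosely modeled on MBH99
phi = 0.9                                    # strongly persistent ("red") noise
n_trials = 50

def leading_pc(data, center_slice):
    """Leading temporal principal component, after centering each series on the
    mean of the given period (all years, or just the modern calibration years)."""
    centered = data - data[:, center_slice].mean(axis=1, keepdims=True)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

def end_excursion(pc):
    """How far the calibration-period mean of the PC sits from its overall mean,
    in units of the PC's standard deviation: a crude 'hockey stick' index."""
    return abs(pc[-cal_years:].mean() - pc.mean()) / pc.std()

full_scores, short_scores = [], []
for _ in range(n_trials):
    # Trendless AR(1) red noise standing in for the proxy series.
    noise = rng.normal(size=(n_series, n_years))
    proxies = np.zeros_like(noise)
    for t in range(1, n_years):
        proxies[:, t] = phi * proxies[:, t - 1] + noise[:, t]

    full_scores.append(end_excursion(leading_pc(proxies, slice(None))))
    short_scores.append(end_excursion(leading_pc(proxies, slice(-cal_years, None))))

print(f"Mean end-period excursion, conventional centering: {np.mean(full_scores):.2f}")
print(f"Mean end-period excursion, short centering:        {np.mean(short_scores):.2f}")
```

With short centering, the series that happen to drift away from their calibration-period mean get the heaviest weights, so the leading component tends to bend sharply at the modern end even though the inputs contain no climate signal at all.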

How do we know what the temperature was 1000 years ago?

The thermometer was invented in the early 1600s.  The oldest regularly maintained series of readings is the Central England Temperature (CET) series, which dates to 1659.  So how do we know what the temperature was before that?  Proxies.

A proxy is a measurement that isn't directly a temperature measurement but which maps to what we think the temperature was.  The most famous of these are tree ring widths: rings will be wider in warmer years when growth is faster, and narrower in cold years when growth is slower.  There are a lot of other types of proxies: rings showing growth in coral reefs, layers of sediment from ponds, and most interestingly, layers of ice deposited on glaciers.  Drilling into the glacier results in ice cores which have annual accretions - colder years will have thicker layers and warmer years will have thinner ones.
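As a simplified illustration of what "maps to what we think the temperature was" means in practice: a proxy is normally calibrated by regressing it against thermometer readings over the years where both exist, and that fitted relationship is then pushed back into the pre-thermometer era.  The sketch below uses invented numbers; real reconstructions use many proxies and much more careful statistics.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "truth": 300 years of temperature anomalies, unknown before 1900.
years = np.arange(1700, 2000)
true_temp = 0.3 * np.sin((years - 1700) / 40.0) + rng.normal(0, 0.1, years.size)

# A proxy (say, a tree-ring width index) that responds roughly linearly
# to temperature, plus noise from everything else trees care about.
proxy = 1.0 + 0.8 * true_temp + rng.normal(0, 0.15, years.size)

# Calibrate against the thermometer record, which only exists from 1900 on.
overlap = years >= 1900
slope, intercept = np.polyfit(proxy[overlap], true_temp[overlap], 1)

# Use the calibration to "reconstruct" pre-instrumental temperatures.
reconstructed = slope * proxy + intercept
err = np.abs(reconstructed[~overlap] - true_temp[~overlap]).mean()
print(f"Mean reconstruction error before 1900: {err:.2f} C")
```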

Proxies reflect temperature and some of these records go back a very, very long time.  The Greenland Ice Core Project (GRIP) ice cores date back thousands of years:



Current climate is on the far right.  Moving leftwards we see first the MWP, then a cool period, then the Roman Climate Optimum, and then a generally warmer climate for thousands of years.  There is corroborating archaeological evidence to support this data: retreating glacier uncovers pre-Viking tunic, retreating glacier uncovers 4,000 year old forest (German newspaper translations).

The Vostok ice cores from Antarctica go much further back, hundreds of thousands of years:


You can see the alternation between ice ages (populated by Woolly Mammoths and other cold weather fauna) and warm inter-glacial periods.  We are currently in one of those interglacials.  It's unclear what caused the ice ages, and what caused the warmer inter-glacials.  However, man-made carbon dioxide is not one of the plausible theories for the interglacials.

The Greenhouse Effect

OK, so we know that climate has been up and down for pretty much as long as we can piece together records.  Rather than history, what's going on right now?

We now need to shift from history to Chemistry. We've heard of the "Greenhouse Effect", where sunlight passes through the atmosphere to the ground, the energy is absorbed and re-emitted as heat, and the heat is trapped by the atmosphere. In more precise scientific terms, certain gases are transparent to visible light, but opaque (blocking) to heat (infrared) radiation.

Carbon Dioxide (CO2) is one of a set of greenhouse gases, including methane and water vapor. One justification for the Hockey Stick that proponents of AGW theory used was that the Industrial Revolution began to produce large amounts of CO2 around 1850, which is when we saw the spike in temperature. There are a couple of problems with this:

1. Correlation does not imply causation. Just because something happens at the same time as something else, doesn't mean that it's caused by it. If we see a big increase in, say, the number of lemons imported from Mexico, and simultaneously see a big reduction in the number of traffic fatalities, we shouldn't jump to the conclusion that Mexican lemons reduce traffic deaths. This seems obvious, but is really at the heart of the proposed policy mitigations like Kyoto, Cap and Trade, and Copenhagen.

2. More importantly, CO2 is a very - even surprisingly - weak greenhouse gas. (chart from IPCC AR1)
What this means is that as you put more CO2 into the atmosphere, each additional increment has less and less of a greenhouse effect. This isn't really surprising, because this sort of "exponential decay curve" is the norm in nature - things tend to rapidly achieve equilibrium because this "negative feedback" keeps things from running away out of control. Chemistry (actually spectroscopy) tells us that CO2 is not really opaque to infrared except in a few narrow frequency bands, and therefore "leaks" heat back into outer space at the edges of those bands.

The scientific consensus is that doubling the amount of carbon dioxide in the atmosphere results in warming of around 1°C.  We've gone from around 280 parts per million (ppm) of atmospheric CO2 to around 400 ppm, an increase of a bit over 40% over the last 100 years or so, so there should have been an increase of around half a degree.  So why do we hear all of this about how we are destroying the planet?  I mean, half a degree doesn't sound like much.
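If you want to check that arithmetic yourself, it's a one-liner.  This sketch takes the two figures from the paragraph above as given: roughly 1°C of direct warming per doubling of CO2, and a greenhouse response that is roughly logarithmic in concentration (so each doubling adds about the same increment).

```python
import math

def direct_co2_warming(c_now_ppm, c_then_ppm, degrees_per_doubling=1.0):
    """No-feedback warming from a CO2 change, assuming the response is
    logarithmic in concentration so each doubling adds the same increment."""
    doublings = math.log(c_now_ppm / c_then_ppm, 2)
    return degrees_per_doubling * doublings

# ~280 ppm pre-industrial to ~400 ppm today, the figures used in the text.
print(f"{direct_co2_warming(400, 280):.2f} C")   # about half a degree
```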

Shaky scientific grounds: "Positive Forcings"

Proponents of catastrophic warming know this, and have proposed a theory of "positive feedback", where CO2's greenhouse power is multiplied, or "forced", sort of like Popeye after he opens a can of spinach. This forcing kicks in above a particular CO2 concentration, and causes a "runaway greenhouse effect". There is a fatal problem with this: we simply don't see much of this in nature.  In fact, the universe is stable because of negative feedback, where an increase in one thing results in a decrease in others.

There is, of course, a theoretical justification for positive feedback from the AGW proponents - the details are complex, and I don't particularly want to get into them. Instead, is there a way that we can test the theory? There is indeed. We have measurements of both temperature and CO2 concentrations for at least the 20th Century. How do they match?

Poorly:
Rather than lots of science and math and stuff, the author looks at what the proponents of AGW say and finds a lot left to be desired:
5. The claimed “proof” of positive feedback is a model prediction of a hot spot in the tropics at mid troposphere levels. However all the experimental evidence from many, many measurements has failed to find any evidence of such a hot spot. In science, a clear prediction that is falsified experimentally means the underlying hypothesis on which the prediction is based is wrong.
...
8. If I adopt this 10:1 ratio by looking at the last 100 years worth of data I find 1910-1940 temperatures rising while CO2 was not. 1940 to 1975 temperatures falling while CO2 rising, 1975 to 1998 temperatures rising while CO2 rising and 1998 to 2009 temperatures falling while CO2 rising. Three quarters of the period shows no correlation or negative correlation with CO2 and only one quarter shows positive correlation. I do not understand how one can claim a hypothesis proven when ¾ of the data set disagrees with it. To me it is the clearest proof that the hypothesis is wrong.
What I would add is that we don't just get temperature proxy data from ice cores, we also get CO2 levels from gas bubbles that were trapped in each layer.  CO2 maps very neatly to temperature, so the question is why we didn't see positive forcing during, say, the Roman Climate Optimum.

This is the biggest problem that climate scientists have today, and is actually the center of the whole debate: are there positive forcings, if so how big are they, and how are they measured?  There's actually no consensus at all here among climate scientists.  You can get a good overview of this issue here.

Climate Models seem hopelessly broken
Prediction is hard, especially about the future.
- Yogi Berra
The history is decently clear from proxy evidence, so where do scientists think that the climate is going?  There are a bunch of computer models (enormous, complicated computer programs) that predict what climate will be like in the future.  A lot of the most dire predictions that you hear - that temperatures will rise 4 or 5 degrees, devastating the planet - come from these models.

The problem is that models are not climate - they are programs that contain a bunch of algorithms that produce a set of numbers.  Whether these algorithms are valid predictors is the real question.  As we all know, the proof of the pudding is in the eating of it.  So how accurate have the models been?

Not very:


The latest IPCC report (as of 2017) is Assessment Report 5 (AR5) which includes 102 climate model predictions from CMIP-5.  All but a couple of the models run "hot", meaning that the predicted temperatures are higher than what is observed.  The blue and green data points are from measured temperatures from weather balloons and satellites, but we could as easily add in the surface temperature data set used in AR5 (the CRUTEM series) which would show the same divergence between measured temperature and predicted temperature.  You can get more details on models vs. measured temperature at this post.

Something seems very fishy in Climate Science

This is where we stand regarding the historical record, the theory, the chemistry, and the predictive models.  There is really quite a lot of evidence that climate science as currently practiced doesn't have as solid a grasp on the climate as its practitioners claim.  Indeed, at each stage we see quite a lot of hard evidence that contradicts the so-called "consensus view".  If the theory were as strong as claimed, you'd expect to see the opposite - data everywhere confirming the theory.

For example, the highest temperature ever recorded in the United States was in 1913.  After a century of positive forcing and year after year reported as "the hottest year ever", we find that the hottest day on record was over a century ago.  Does this prove that the climate isn't warming?  Of course not.  However, if the science were as incontrovertible as we are told, you would expect a more recent record.

But let's look at what's going on in the "consensus climate establishment", because there are some very odd things that you see when you turn over some rocks.  We will talk about some of these now.

ClimateGate and "Hide the Decline"

The University of East Anglia (UK) hosts the Climatic Research Unit (CRU), one of the most influential climate research organizations in the UK. CRU works closely with the Hadley Centre, part of the UK Met (Meteorological) Office, the UK's national weather office; together they produce one of the most influential temperature data sets (CRUTEM3). In 2009, CRU controversially refused Freedom Of Information Act (FOIA) requests for the CRUTEM3 raw (uncorrected) data.

Phil Jones was the director of the CRU at the time.

In November 2009, someone posted 61 MB of emails, computer program code, and climate data from CRU servers to an FTP server on the Internet.  One of the most notorious of the emails in this release was from Dr. Jones, and contained the following:
I've just completed Mike's Nature trick of adding in the real temps
 to each series for the last 20 years (ie from 1981 onwards) amd [sic] from
 1961 for Keith's to hide the decline.
Let's unpack this so you understand each piece.  "Mike" refers to Dr. Michael Mann (of Hockey Stick graph fame).  "Nature" refers to Nature Magazine, one of (perhaps the) most prestigious scientific journals.  More specifically, it refers to an article that they published, written by Dr. Mann, in which he presented a temperature reconstruction.  There is a huge amount of dispute over what "trick" means - skeptics allege sleight of hand while Mann said it just referred to a mathematical technique.  So what was the trick?

Dr. Mann's data sets contained many different proxy series.  This is actually a good thing, because you want confirmation of results from different places and types of proxies (say, including ice cores, tree rings, and corals will probably be more reliable than just using tree rings).  Mann's "trick" (call it a mathematical technique if you want) was to remove all proxy data later than 1960 and replace it with measured temperature data.  The result was a hockey stick shaped temperature graph.  This is what Dr. Jones did in the paper referred to in his email.

The $100,000 question is: why go to the trouble to do this if you have proxy data from 1960 up to the present?  Why replace 50 years of perfectly good data?

Hide the decline.

This is a great, detailed video about ClimateGate and hide the decline by Dr. Richard Muller, a physics professor at the University of California, Berkeley and founder of the Berkeley Earth surface temperature project.  He is a high profile climate scientist and he has quite pungent things to say about Dr. Jones and company.  The relevant part about Dr. Jones and the CRU starts around 29 minutes into the lecture.



There's more that I won't go into here (particularly the repeated modification of previously recorded temperature data with little or no justification) but this post is plenty long enough as it is and you have a solid grounding in the key points (with links to original sources so you can check my work).

Thursday, January 22, 2015

2014: One of the coldest years in the last 10,000

Great overview of many of the flaws in the "2014 was the hottest year EVAH" press release from NASA.  It's a good introduction to the problems in the science and you should RTWT, but it ends with this excellent summary:
Evidence keeps contradicting the major assumptions of the anthropogenic global warming (AGW) hypothesis. As T.H. Huxley (1825 – 1895) said,

The great tragedy of science – the slaying of a beautiful hypothesis by an ugly fact.

The problem is the facts keep piling up and the AGW proponents keep ignoring, diverting, or stick-handling (hockey terminology), their way round them. We know the science is wrong because the IPCC projections are wrong. Normal science requires re-examination of the hypothesis and its assumptions. The IPCC removed this option when they set out to prove the hypothesis. It put them on a treadmill of fixing the results, especially the temperature record.
I think it's time to start referring to this as the "Democrats' War On Science" ...

Tuesday, June 19, 2012

Once again, with feeling - the climate temperature databases are lousy

I've said over and over again that the temperature databases are lousy.  Interestingly, the input data for those databases are only a little lousy.  Sure, sometimes there are gaps in a station's data set, and sometimes stations are re-sited (or given different measuring equipment) and the new instrument isn't calibrated against the old.  But we can all live with some of that, because in the long run, it's a minor variation that will still let us see the overall climate change signal.

My problem is the adjustments made to the input (or "raw") data.  These adjustments appear arbitrary, they are poorly explained (if they are indeed explained at all), and the scientific establishment seems to have no intention at all of quality controlling the data.  Indeed, the CRU data set (this is from the "hide the decline" crowd) doesn't even have the original data any more - they threw the backup tapes out some time ago, only keeping the adjusted (or "value added" in their terminology) data.  This is the data set that the IPCC relies on for its reports, and it's entirely impossible to check to see if it's valid.

So for other data sets where the raw data still exists, how does the adjusted data compare?  And here we start to see how the game is played:
A team of independent auditors, bloggers and scientists went through the BOM [Australian Bureau of Meteorology - Borepatch] "High Quality" (HQ) dataset and found significant errors, omissions and inexplicable adjustments. The team and Senator Cory Bernardi put in a Parliamentary request to get our Australian National Audit Office to reassess the BOM records. In response, the BOM, clearly afraid of getting audited, and still not providing all the data, code and explanations that were needed, decided to toss out the old so called High Quality (HQ) record, and start again. The old HQ increased the trends by 40% nationally, and 70% in the cities.
And a picture is worth a thousand words:


"But surely," I hear you say, "this is a one-off, a 'black swan', a one of a kind mistake.  You're cherry picking, Borepatch."  No, I'm not (and please don't call me Shirley):

Darwin, Australia:



Brisbane, Australia:


West Point, NY (raw data):


Detroit, MI (raw data):


New Zealand:


The entire continental US over the entire 20th Century:

The New Zealand case was interesting: an outside group sued the New Zealand Weather Bureau under the Freedom of Information Act to get the data for the official government data set - the same data set that the government had announced with great fanfare showed big, big warming over the course of the last century.  In their Court pleading, the government renounced the database entirely.  It seems that suddenly there is no "official" New Zealand government temperature database.

Hide the Decline, indeed.  In other news from the Antipodes, it seems that we've always been at war with Oceania.

In every one of these cases, the adjustments have made older readings colder, and newer readings warmer.  In each of these cases, the entirety of the 20th Century's warming signal disappears when you remove the adjustments.  This isn't just cherry picking, it's continent-wide readings (Oz + New Zealand and the USA) over an entire century.

So is the climate getting warmer?  Maybe.  But it looks like the data do not show this, or if they do it's with major qualification.  So why make the changes, and how were the changes made?  Nobody will say.

Once again, with feeling: nobody will say.

That's one righteous case of "the science is settled," right there.  And when someone does expose corruption in the scientific establishment, they get fired.  It's no wonder that the first major skeptic blog was named Climate Audit - the establishment won't audit themselves.  And that is the most important reason that you should be skeptical of the whole thing.  Until the establishment comes clean and allows a proper audit of the science and the data, the whole thing should be presumed to be a too-comfortable scheme milking the governments of part of that sweet, sweet $100 Billion in grant funding.  No wonder they're changing the data.

And lest you think that I'm being overly harsh, remember "hide the decline"?  Know why they hid the decline?  Because it showed that the temperature had been falling for the last 50 years.



Anyone who ever uttered the words "Republican war on science" can shut up and sit down in the back of the room.  Grown ups are talking.

Monday, June 4, 2012

Global Warming 101: Adjusted data

I'm highlighting some posts from some time back, that will give you a grounding in the science of climate change.  The data seem scientific, but get curiouser and curiouser the more you look.  And attentive readers will recognize this song, which makes periodic appearances here.  I hadn't realized just how long I've been posting it - this blog was only six months old when I originally did this post.

---------------------------------------

Global Warming caused by lousy data  (December 14, 2008)

You read in the press about how much the temperature has risen in the last 100 years. There's an interesting story in the data, but the press doesn't know it.

The data has two components: the raw measurements themselves, and a set of adjustments.

Adjustments are made for a bunch of reasons: time of observation adjustments (you didn't take a reading at exactly the same time each day), environmental changes, weather station site relocations, urbanization, etc.

An interesting question: how much of the 20th century's temperature change is due to adjustments? As it turns out, the answer is all of it.

This chart shows the before-adjustment and after-adjustment temperatures for the 20th century, superimposed. All of the warming is due to adjustments, rather than raw data. Almost all of the adjustments are for readings after 1970.

Also, via Climate Skeptic, here is all you need to know to understand the Global Warming panic.



Regular readers will recognize one of the songs used here. Heh.

UPDATE 14 December 2008 20:24: You want links, we got links! Well, I don't have any, but The Unpaid Bill has all sorts of links, including a short version of the report. While I was busy snarking, he was, you know, doing useful Intarwebz link-fu.

Sunday, June 3, 2012

Global Warming 101: Ghost Data

I'm highlighting some posts from some time back, that will give you a grounding in the science of climate change.  The data seem scientific, but get curiouser and curiouser the more you look.

-------------------------------------------


Ripogenus Dam  (October 26, 2009)

The science is settled.

OK, so what's with the Ripogenus Dam?

You don't get much more rural than that. Way, way up the west branch of the Penobscot river in Maine, it's the sort of place that Boy Scouts go for week-long canoe voyages through the wilderness. In 1972, with Troop 47, a dozen fellow teenagers and I spent a week a hundred miles from any other living soul.


In 1972, there was a weather station at the Ripogenus Dam. It collected temperature readings every day, and those readings were included in NASA's GISS temperature data set until 2006, along with data from thousands of other weather stations. There's really only one little problem.

The Ripogenus Dam weather station was decommissioned in 1995.

So for ten years, GISS reported temperature readings from a station that didn't exist. How? Filnet.
Part of the USHCN data is created by a computer program called “filnet” which estimates missing values. According to the NOAA, filnet works by using a weighted average of values from neighboring stations. In this example, data was created for a no longer existing station from surrounding stations, which in this case as the same evaluation noted were all subject to microclimate and urban bias, no longer adjusted for. Note the rise in temperatures after this before the best sited truly rural station in Maine was closed.
"Urban bias" is the technical term for when a weather station reads artificially high temperatures because the station is situated in an urban location where there are lots of buildings and parking lots to absorb the heat from the sun. Filnet took temperature readings from other weather stations - stations in urban locations where readings are higher because of the surrounding asphalt heat collectors - and used them for the most rural station in the state.

Remember how 1998 was the "warmest year in a millennium"? Well, it was warmer than it would have been if the Ripogenus Dam's readings hadn't come from Millinocket.
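The "weighted average of values from neighboring stations" that filnet uses is easy to sketch. Here is an illustration of the general idea using inverse-distance weights; that weighting choice, and the stations and numbers, are my own assumptions rather than NOAA's actual filnet implementation.

```python
import numpy as np

def estimate_missing(neighbor_temps, neighbor_distances_km):
    """Estimate a missing station reading as a distance-weighted average of its
    neighbors (the general idea behind infilling programs like filnet)."""
    weights = 1.0 / np.asarray(neighbor_distances_km, dtype=float)
    weights /= weights.sum()
    return float(np.dot(weights, neighbor_temps))

# Invented example: three surviving stations stand in for the closed one.
neighbor_temps = [18.9, 19.4, 18.2]     # deg C, hypothetical annual means
neighbor_distances = [40, 75, 120]      # km, hypothetical distances

print(f"Infilled reading: {estimate_missing(neighbor_temps, neighbor_distances):.1f} C")
```

Garbage in, garbage out: if the neighbors carry an urban warm bias, the infilled "rural" value inherits it.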

The question is: just how unreliable is the data? Lots.
How can we trust NOAA/NASA/Hadley assessment of global changes given these and the other data integrity issues? Given that Hadley has destroyed old original data because they were running out of room in their data cabinet, can we ever hope to reconstruct the real truth?
Given that there are only 30 or 40 stations that have been providing temperature readings from the Civil War up to today, given that "adjustments" are made to the temperature data via an arcane and opaque process and may represent most or all of the warming in the 20th Century, given that scientists refuse to release their data (or the raw - unadjusted - data has been destroyed), given how some of the data sets rely on tree rings from a single tree, how can we trust the data?

The Ripogenus Dam weather station was giving ghost readings for a decade. How many other non-existent weather stations are still generating new data? The World wonders.

UPDATE 31 October 2009 17:20: David linked. Thanks! Take a look around - there's a lot more on this sort of thing here.

Friday, June 1, 2012

Global Warming 101: The Data

I'm highlighting some posts from some time back, that will give you a grounding in the science of climate change.  As with any scientific issue, the questions are more important - and more interesting - than the answers.

And note that this is the first reference to the event that would be described as "hide the decline".

-------------------------------------------

Climate Change Data Suspect (August 28, 2009)

Sometimes when you try to turn an Apple into an Orange, you get a Lemon. We hear that the "Science is settled" about Climate Change, and it's All Our Fault. The proof, we're told, includes the "inconvenient fact" that 1998 was the warmest year in a thousand years.

Interesting. How exactly do we know? After all, the Thermometer was only invented in the early 17th Century. There's a chance - albeit a slender one - that the measurements show that 1998 was the warmest year in 350 years. How do they know what the temperature was before that?

Easy, say the Climate Warming Crowd. There are lots of "proxies" - other measurements that map pretty well to temperature. Tree rings will vary - growth will typically be faster in warm years, slower in cold ones. Ice cores, pollen counts from cores drilled into prehistoric bogs, even harvest records from medieval monasteries or Imperial Chinese court documents. These are reasonable proxies - everyone agrees on this.

So we have direct temperature readings for 100 or 200 years (the data is surprisingly weak when you go back more than 60 or 70 years). There are tree rings that go back maybe a thousand years. Ice cores will take you back tens or hundreds of thousands of years.

Ah, but these are different types of data. How do you put them together? Splicing:
Splicing data sets is a virtual necessity in climate research. Let’s think about how I might get a 500,000 year temperature record. For the first 499,000 years I probably would use a proxy such as ice core data to infer a temperature record. From 150-1000 years ago I might switch to tree ring data as a proxy. From 30-150 years ago I probably would use the surface temperature record. And over the last 30 years I might switch to the satellite temperature measurement record. That’s four data sets, with three splices.
What's tricky is how you join them. You don't want big discontinuities in the record occurring where the data sets are spliced. Data sets are calibrated, or zeroed to try to make sure that the record stays smooth. This can be tricky, and can lead to False Positive results - reporting that something is happening, when in reality it's just an artifact of the data instrumentation:
But there is, obviously, a danger in splices. It is sometimes hard to ensure that the zero values are calibrated between two records (typically we look at some overlap time period to do this). One record may have a bias the other does not have. One record may suppress or cap extreme measurements in some way (example - there is some biological limit to tree ring growth, no matter how warm or cold or wet or dry it is). We may think one proxy record is linear when in fact it may not be linear, or may be linear over only a narrow range.
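Here is a minimal sketch of the "zeroing" step described above: shift one record so that it agrees on average with the other over the years they share, then join them. It is the generic idea only, with invented series, and it deliberately ignores the harder problems (non-linear proxies, capped growth) the quote goes on to list.

```python
import numpy as np

def splice(older, newer, older_years, newer_years):
    """Join two records, shifting the older one so that its mean matches the
    newer record over their overlapping years (the "zeroing" step)."""
    overlap = np.intersect1d(older_years, newer_years)
    offset = (newer[np.isin(newer_years, overlap)].mean()
              - older[np.isin(older_years, overlap)].mean())
    shifted_older = older + offset
    # Keep the older record up to the start of the newer one, then switch over.
    keep = older_years < newer_years.min()
    return (np.concatenate([older_years[keep], newer_years]),
            np.concatenate([shifted_older[keep], newer]))

# Toy example: a proxy series (1800-1950) spliced onto thermometer readings (1900-2000).
proxy_years = np.arange(1800, 1951)
proxy = 0.2 * np.sin((proxy_years - 1800) / 30.0)            # arbitrary proxy anomalies
instr_years = np.arange(1900, 2001)
instr = 0.2 * np.sin((instr_years - 1800) / 30.0) + 0.5      # same shape, offset baseline

years, series = splice(proxy, instr, proxy_years, instr_years)
print(f"Spliced record covers {years.min()}-{years.max()}, {series.size} values")
```

The danger the quote points to is exactly here: if the two records don't really measure the same thing over the overlap, the join itself can manufacture (or hide) a step.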
So back to the "Warmest year in 1000 years" headline. Remember Mann's "Hockey Stick" graph, the one from Al Gore's movie An Inconvenient Truth? The temperature was pretty stable until 150 years ago, and then it spiked, remember? What happened 150 years ago?

Well, say the Climate Change Crowd, the Industrial Revolution started cranking out, well, industrial quantities of Carbon Dioxide into the atmosphere. All that CO2 is what's to blame, and they have computer models to show it. Fair enough. Ignore the many problems with with models. The Industrial Revolution was built on steam power, which was driven by Coal. And it did hit its stride around 150 years ago.

But is there anything else that happened around 150 years ago? Why, yes. The temperature data sets changed from proxies to actual thermometer readings. The sudden upswing in average global temperature is entirely from a different set of measurements than the earlier data sets. Entirely.

So, could this be a False Positive, an artifact of splicing two different data sets together? Yes, it could be. In fact, it's likely that this is the case, and that's why you don't hear the Climate Change Crowd talk about Hockey Sticks and "Global Warming" anymore. Want proof? What if we ignore the thermometer readings, and just look at the proxy temperatures? We have tree ring data that goes right up to the present - why stop 150 years ago? What does it tell us about recent climate?
You can see that almost all of the proxy data we have in the 20th century is actually undershooting gauge temperature measurements. Scientists call this problem divergence, but even this is self-serving. It implies that the proxies have accurately tracked temperatures but are suddenly diverting for some reason this century. What is in fact happening are two effects:
  1. Gauge temperature measurements are probably reading a bit high, due to a number of effects including urban biases
  2. Temperature proxies, even considering point 1, are very likely under-reporting historic variation. This means that the picture they are painting of past temperature stability is probably a false one.
All of this just confirms that we cannot trust any conclusions we draw from grafting these two data sets together.
So, the proxy data is not accurate enough to get reported by the Climate Change Crowd, but it's plenty accurate enough to show that 800 years ago was cooler? Selection Bias, anyone?

And if you think I'm harsh accusing the scientific community of selection bias, how about this little tidbit about one of the proxy-based data sets:
For some reason, the study’s author cut the data off around 1950. Is that where his proxy ended? No, in fact he had decades of proxy data left. However, his proxy data turned sharply downwards in 1950. Since this did not tell the story he wanted to tell, he hid the offending data by cutting off the line, choosing to conceal the problem rather than have an open scientific discussion about it.

The study’s author? Keith Briffa, who the IPCC named to lead this section of their Fourth Assessment.
When you combine this with repeated errors in the reported data - and with total refusals to release the data for scrutiny - you should be very skeptical of any claims about climate change. Any.

There is a massive, ugly problem with data integrity concerning climate change. Rather than being a done deal, things are getting curiouser and curiouser. Settled? You must be kidding. The science is getting very interesting indeed.

UPDATE 26 November 2009 19:01: More about Dr. Briffa here.

Thursday, May 31, 2012

Global Warming 101: What do we mean by "scientific"?

The Czar of Muscovy says (perhaps excessively) complimentary things about my climate science musings.  Me, I think that I've gotten more ranty and sarcastic over the years, and this is a Bad Thing.  As a service to my readers (and a tip of the шляпа, or hat, to our Autocrat), I'm pulling some of the better of my (non-ranty, non-sarcastic) climate science posts.


Plus, I'm on vacation and feeling lazy.  But srlsy, I have almost 300 posts in my junk science category - this is organizing (technically, it's what the medieval monks called glossing).


I'll do one of these a day for the next little bit.  Long time readers will roll their eyes, but it may be new to a bunch of y'all (well, anyone who wasn't reading me 3 years ago).  If you actually want to understand the science, rather than just the arguing, hopefully this will be a starting point.


----------------------------


Falsifiable    (July 23, 2009)


Generally, to be considered "scientific", something has to be falsifiable - there must be some conceivable test that could show it to be wrong, and anyone must be able to try to duplicate your observations or results. If there's no way that this can be done, then the thing cannot be held to be scientific. Carl Sagan used a typically accessible parable that illustrated this critical part of the Scientific Method:
"A fire-breathing dragon lives in my garage"

Suppose (I'm following a group therapy approach by the psychologist Richard Franklin) I seriously make such an assertion to you. Surely you'd want to check it out, see for yourself. There have been  innumerable stories of dragons over the centuries, but no real evidence. What an opportunity!

"Show me," you say. I lead you to my garage. You look inside and see a ladder, empty paint cans, an old tricycle -- but no dragon.

"Where's the dragon?" you ask.

"Oh, she's right here," I reply, waving vaguely. "I neglected to mention that she's an invisible dragon."

You propose spreading flour on the floor of the garage to capture the dragon's footprints.

"Good idea," I say, "but this dragon floats in the air."

[Lots of ingenious tests for the dragon's existence presented and explained away.]

Now, what's the difference between an invisible, incorporeal, floating dragon who spits heatless fire and no dragon at all? If there's no way to disprove my contention, no conceivable experiment that would count against it, what does it mean to say that my dragon exists? Your inability to invalidate my hypothesis is not at all the same thing as proving it true. Claims that cannot be tested, assertions immune to disproof are veridically worthless, whatever value they may have in inspiring us or in exciting our sense of wonder.
So the primary - perhaps singular - requirement of science is data. Access to data (to see if someone made a mistake or to compare it to a different set of data) is simply a given, if something is to be considered scientific. Otherwise, how is the hypothesis falsifiable? The assertions would be immune to disproof.

An interesting thing is going on in the Global Warming debate - one group of scientists (the global warmers) is refusing to release their data. Steve McIntyre asked the UK Meteorological Office to send him their data, so he could check it:
You stated that CRUTEM3 data that you held was the value added data. Pursuant to the Environmental Information Regulations Act 2004, please provide me with this data in the digital form, together with any documents that you hold describing the procedures under which the data has been quality controlled and where deemed appropriate, adjusted to account for apparent non-climatic influences.
They said no. Their reasons were very, very interesting:
The Met Office received the data information from Professor Jones at the University of East Anglia on the strict understanding by the data providers that this station data must not be publicly released.
Well now. Leaving aside whether the University of East Anglia in general, and Professor Jones' projects in particular are publicly funded, doesn't this make it hard to analyze the public policy recommendations related to climate change? The Met Office heartily agrees:
We considered that if the public have information on environmental matters, they could hope to influence decisions from a position of knowledge rather than speculation. However, the effective conduct of international relations depends upon maintaining trust and confidence between states and international organisations. This relationship of trust allows for the free and frank exchange of information on the understanding that it will be treated in confidence. If the United Kingdom does not respect such confidences, its ability to protect and promote United Kingdom interests through international relations may be hampered.
Well, well, well.

So what can we say about any conclusions, recommendations, or reports issued by the UK Met Office, that are based on this data? They are unfalsifiable.

McIntyre is very unpopular indeed among the Global Warming set, because he focuses on their data. He's the reason that you never hear about the "Hockey Stick" any more - he found that the data was cooked and the computer model was buggy, in a way that produced the hockey stick shaped curve. How bad is the data? Some of it no longer exists:
In passing, I mention an important archiving problem. Pete Holzmann identified actual tags from the Graybill program. We found that 50% of the data had not been archived. Was this selective or not? No one knows. Graybill died quite young. His notes were notoriously incomplete. Worse, when the Tree Ring Laboratory moved a few years ago, apparently they forgot to arrange for old samples to be protected. Their former quarters were destroyed. Some of the records were apparently recovered from the trash by one scientist but others are permanently lost.
This is what the IPCC's $50 Trillion recommendation is based on. RTWT. The situation isn't just worse than you think. It's worse than you can possibly imagine. And some of you have quite good imaginations.

The science is settled, you see, but no, you can't have the data. You can't even see what was done to quality control the data, because it might damage a government's ability to protect its national interests.

Oops, gotta go. It's those darn Deniers, back on my lawn again ...

UPDATE: More on the UK Met office here.