Saturday, December 31, 2016

How Trump can pound the final nail into the Global Warming coffin

There's something floating around about how "100% of warming is due to data tampering".  It's worth your time to read it.

Long-time readers know that I've been posting about climate science for quite a long time.  Newer readers who are interested in a condensed view of my opinions can read it here (it's sort of a "Climate Science 101" post for the educated layman).  Readers who want more depth and background (or who are gluttons for punishment) can get a list of climate posts here.

But everyone is familiar with the Global Warming scare machine, which pumps out a never-ending stream of ZOMGTHERMAGEDDON!!!11!!!ELEVENTY!!!  The climate science establishment feeds a stream of "hottest year ever" press releases to a media that is fully on board and pushes the narrative.  Government funding to the tune of $100 billion feeds the whole machine.

And yet the public is (rightly) skeptical of the whole thing.  Trump is making some of the right moves, appointing skeptics to positions like head of the EPA.  Some have proposed cutting funding of climate science research by 80% or more.  These are good ideas, but they won't directly address the problem of corrupted research and bureaucratic pushback.  Immodestly, I believe that I have something that will stop the global warming machine in its tracks in the space of a month, and keep it derailed for good.  And there's nothing that the bureaucracy and the scientists can do about it.

And it would be 100% scientific, which is why it would be so easy and why it would stick.  You clean up the climate databases:
If you look closely at climate data, you will find that all the major data sets consist of two parts:

Raw Data, which is the instrument reading: satellite, thermometer, or proxy (tree ring, ice core, etc.). This is data straight from the sensor.

Adjustments, which are corrections applied to raw data to adjust for inconsistencies. For example, it is important to read the thermometer temperature at the same time every day. If the hottest time of the day is, say, 2:30 PM, but you read the thermometer at 10:00 AM, then the day's reading will be low. Adjustments are also made when weather stations are re-sited, and for other reasons.

An interesting question: how much of the 20th Century's warming came from adjustments, rather than from the raw data?
Spoiler alert: according to the scientists themselves, over 85% of reported warming comes from adjustments to the data.  Re-stated, the data as recorded only show 15% of the ZOMGTHERMAGEDDON!!!11!!!eleventy!!! that is being fed to us.  Or all of it, if you believe the new post that's going around.
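You can check this sort of claim yourself.  Since the reported series is just raw data plus adjustments, the adjustments are simply the difference between the final series and the raw series, and you can compare their trends directly.  Here's a minimal sketch in Python; the two series below are made-up placeholders standing in for the real raw and adjusted annual files:

```python
import numpy as np

# Placeholder annual series (deg C anomalies) -- NOT real data.  In practice,
# load the raw and the final (adjusted) annual means from the USHCN files.
years = np.arange(1900, 2001)
raw = np.random.normal(0.0, 0.1, years.size)        # stand-in raw series
final = raw + np.linspace(0.0, 0.5, years.size)     # stand-in adjusted series

adjustments = final - raw                            # by definition

def trend_per_century(t, y):
    """Least-squares linear trend, expressed in deg C per century."""
    return np.polyfit(t, y, 1)[0] * 100.0

raw_trend = trend_per_century(years, raw)
adj_trend = trend_per_century(years, adjustments)
final_trend = trend_per_century(years, final)

print(f"raw trend:        {raw_trend:+.2f} C/century")
print(f"adjustment trend: {adj_trend:+.2f} C/century")
print(f"reported trend:   {final_trend:+.2f} C/century")
print(f"share of reported warming from adjustments: {adj_trend / final_trend:.0%}")
```

Swap in the real raw and adjusted series and that last line prints the share of the reported trend that comes from the adjustments.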

Now maybe these adjustments are actually correct, but it seems that the scientists should provide very solid and compelling reasons for when and why they adjust the data.  Quite frankly, there are some good reasons to think that they are not doing this.
Anyway, let's look at the specific adjustments.  The lines in the chart below should add up to the overall adjustment line in the chart above.
[Chart: USHCN corrections, broken out by adjustment type]
  • Black line is the time of observation adjustment, adding about 0.3C since 1940
  • Light blue line is a missing data adjustment, which has little effect on the data since 1940
  • Red line is an adjustment for measurement technologies, adding about 0.05C since 1940
  • Yellow line is a station location quality adjustment, adding about 0.2C since 1940
  • Purple line is an urban heat island adjustment, subtracting about 0.05C since 1950
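Back of the envelope, the rough since-1940 numbers above net out to about half a degree of added warming:

```python
# Approximate net effect of each adjustment since ~1940, in deg C,
# using the rough values read off the chart above.
adjustments = {
    "time of observation":      +0.30,
    "missing data":             +0.00,
    "measurement technology":   +0.05,
    "station location quality": +0.20,
    "urban heat island":        -0.05,
}
print(f"net adjustment: {sum(adjustments.values()):+.2f} C")   # about +0.50 C
```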
Let's take each of these in turn.  The time of observation adjustment is defined as follows:
The Time of Observation Bias (TOB) arises when the 24-hour daily summary period at a station begins and ends at an hour other than local midnight. When the summary period ends at an hour other than midnight, monthly mean temperatures exhibit a systematic bias relative to the local midnight standard.
0.3C seems absurdly high for this adjustment, but I can't prove it.  However, if I understand the problem, a month might be picking up a few extra hours from the next month and losing a few hours to the previous month.  How is a few-hour time shift really biasing a 720+ hour month by so large a number?  I will look to see if I can find a study digging into this.
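To get a feel for where the bias is supposed to come from, you can simulate it.  The observer reads and resets a max/min thermometer once a day, and the reading at reset time carries over into the next observational day, so hot afternoons (or cold mornings) can end up counted twice.  Here is a minimal sketch using made-up synthetic hourly temperatures; this is not NOAA's method, it just illustrates the mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up hourly temperatures: a diurnal cycle (coolest around 5 AM, warmest
# around 5 PM) plus random day-to-day weather swings.  Purely illustrative.
n_days = 10_000
day_offsets = rng.normal(0.0, 4.0, n_days)                  # day-to-day variability, deg C
hours = np.arange(24)
diurnal = 10.0 * np.sin(2 * np.pi * (hours - 11) / 24.0)    # min at 5 AM, max at 5 PM
temps = (day_offsets[:, None] + diurnal[None, :]).ravel()   # hourly series

def mean_with_obs_hour(temps, obs_hour):
    """Long-run mean of daily (max+min)/2 when a max/min thermometer is read
    and reset at obs_hour each day.  The reading at reset time carries over
    into the next observational day -- the source of the bias."""
    n_days = temps.size // 24
    daily_means = []
    for d in range(1, n_days):
        start = (d - 1) * 24 + obs_hour        # previous day's reset moment
        end = d * 24 + obs_hour                # this day's reading
        window = temps[start:end + 1]          # inclusive of both resets
        daily_means.append((window.max() + window.min()) / 2.0)
    return np.mean(daily_means)

midnight = mean_with_obs_hour(temps, 0)        # the "local midnight standard"
morning = mean_with_obs_hour(temps, 7)         # ~7 AM observer
afternoon = mean_with_obs_hour(temps, 17)      # ~5 PM observer
print(f"afternoon observer bias: {afternoon - midnight:+.2f} C")   # warm bias
print(f"morning observer bias:   {morning - midnight:+.2f} C")     # cool bias
```

In this toy model an afternoon reading comes out warm relative to the midnight standard and a morning reading comes out cool, which is the direction of the adjustment, even if the size here means nothing.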
I will skip over the missing data and measurement technology adjustments, since they are small.
The other two adjustments are fascinating.  The yellow line says that siting at USHCN stations has improved so much that their locations today average 0.2C cooler than in 1900, due to being near more grass and less asphalt.
During this time, many sites were relocated from city locations to airports and from roof tops to grassy areas. This often resulted in cooler readings than were observed at the previous sites.
OK, without a bit of data, does that make a lick of sense?
Not to me it doesn't, and it shouldn't make sense to the Trump Administration, either.  And so my proposal:

Remove all adjustments from the climate databases and then allow them back only when justified for a single day at a single weather station.  If an adjustment is needed, then have NOAA specify why.  And report the last 100 years without any adjustments.

And this will basically kill the global warming movement.  It will reveal to the public that the data have been manipulated.  Those who complain about this will have to justify why unspecified and unjustified changes to the data should be allowed.  They will have to explain how that is scientific.  Quite frankly, I don't see how the climate science establishment can effectively push back against this without confirming the skeptics' worst accusations.  I mean, do you want honest science or not?

And suddenly all the scientists who use that data set will have a data set without an artificial warming signal.  There will be a "97% consensus" that no warming is seen.

And this can all be done in a week.  No Congressional action needed, just the stroke of Trump's pen.  And then he can tell the EPA to justify all their new carbon rules ...

1 comment:

coconut commando said...

I’m not a climatologist or meteorologist; I’m a trigger puller who spends an insane amount of time outdoors, for both duty and pleasure, all over the world. Here’s what I’ve observed, but remember, my disclaimer is above:

During the 1920s & 1930s the weather concern was global cooling, per Donald Baxter MacMillan, based on his observations of the Arctic over more than 30 expeditions there, and on the changes in glacier ice size, color and composition.

During the 1940s & 1950s the weather concern was an accelerated, oncoming ice age (nuclear winter) spurred by nuclear war; this was proposed by Dr. Harry Wexler, whose observations were based on his research and his experience both in and outside the military and in his career as a meteorologist. Note: in 1944, Dr. Wexler was the first scientist to deliberately fly into a hurricane to collect scientific data. Brass cojones.

During the 1960s, at a conference on climate change held in Boulder, Colorado, Dr. Paul Ehrlich provided evidence supporting Milankovitch cycles and triggered speculation on how the calculated small changes in sunlight might somehow trigger ice ages. His assessment was based on rapid population growth coupled with limited resources. His position was that “The greenhouse effect is being enhanced now by the greatly increased level of carbon dioxide” and that the greenhouse effect would “cook us,” so that decade’s concern was global warming.

During the 1970s, “The 1970 Study of Critical Environmental Problems” reported the possibility of warming from increased carbon dioxide, but no concerns about cooling, setting a lower bound on the beginning of interest in "global cooling". This was also the time scientists were looking into lead additives in fuel, DDT, asbestos and other FDA-approved products that were actually causing harm to both people and the environment.

During the 1980s, we had a hole in the ozone layer and were going to end up as crispy critters from over-exposure to UV rays, or popsicles because of sudden-onset nuclear winter. Because of the prevalent use of chlorofluorocarbons (CFCs), which contributed to ozone depletion in the upper atmosphere, the manufacture of such compounds was phased out under the Montreal Protocol, and they are being replaced with other products such as hydrofluorocarbons (HFCs). Lucky for us (sarcasm voice) we had the Chernobyl disaster to irradiate most of the world in one fell swoop. After that, there were several theories as to what would happen to global weather patterns as a result of the released radiation; none of them came to fruition.

During the 1990s, global cooling regained some of its momentum, in part due to multiple volcanic eruptions throughout the world. Keep in mind that in 1991, a prediction that massive oil well fires in Kuwait would cause significant effects on climate proved incorrect. Although it was, in some cases, pitch black during the middle of the day, the oil fires had no effect on the “Hot as F*@#“ temperatures we experienced there during that time.

At the beginning of the 21st century, it was all of the above, but now we had a “fall guy” in the form of the Y2K coding problem and how it was going to affect our ability to monitor and predict weather phenomena on any scale (especially global). Never mind that we have the Old Farmer's Almanac (published continuously since 1792 and started by Robert B. Thomas).

To calculate the Almanac's weather predictions, Thomas studied solar activity, astronomical cycles and weather patterns, and used his research to develop a secret forecasting formula, which is still in use today. This has proven to be, in most cases, more accurate than today’s modern weather-related technology.

During the 2010s (and currently) it went from global warming to global cooling, and now we have “climate change,” just to appear to know what the hell is going on. I’ll stick to the Almanac, available (accurate & reliable) research, and observation, as opposed to waiting for what the Weather Channel is going to come up with next.