
How the world temperature “record” was manipulated through dropping of stations

By sophiaalbertina

I have written extensively on this blog about the tweaking, “adjustment” and manipulation of the historic and present temperature “record” that is presented in the official figures.

There is the poor placement of stations (91% of the stations are rated CRN 3 to 5, i.e. bad to very poor), the way the urban heat island effect has been purposely taken away, the use of a huge smoothing radius, the historical “adjustment and tweaking” to cool the past, etc.

Not to mention the great slaughter of GHCN stations around 1990 – roughly 63% of all climate measuring stations were “dropped”. Oddly enough, many of them were in cold places – Hmmm? The number of GHCN stations is now back at the same level as in 1890.

(See for example my posts:

Rewriting Temperature History – Time and Time Again!,

More on the Blunder with NASA's GISS Temperature data and the mess they have,

The Big dropout of weather stations since 1989 – A 66% reduction in 11 years,

The Big Difference Between GISS and UAH Temperature Data.

Minus 60 C or not?

The world has never seen such freezing heat OR the Blunder with NASA's GISS Temperature data)

Just one example of this historical “adjustment and tweaking” they are doing:

On average, 20% of the historical record was modified in each of 16 revisions between 2006 and the beginning of 2008. The largest single jump was 0.27 C. This occurred between the Oct 13, 2006 and Jan 15, 2007 records, when Aug 2006 changed from an anomaly of +0.43 C to +0.70 C – a relative change of nearly 63%.
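
To make the arithmetic behind that percentage explicit, here is a minimal sketch (plain Python; the two anomaly values come from the revision figures above, everything else is just illustration):

```python
# Aug 2006 anomaly as published in two successive revisions (deg C)
old_anomaly = 0.43  # Oct 13, 2006 record
new_anomaly = 0.70  # Jan 15, 2007 record

jump = new_anomaly - old_anomaly       # 0.27 C absolute change
relative = 100 * jump / old_anomaly    # ~63% relative change
print(f"jump = {jump:.2f} C, relative change = {relative:.0f}%")
```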

And what a “coincidence” that the data is always “modified” in only one direction – guess which one.

Also remember that the US stations now make up nearly a third of all GHCN world stations.

And as I said in the beginning, always remember that these figures are based on the official data that has been tweaked, “adjusted” and manipulated to fit their agenda (cool the past, ignore UHI and land use change factors, a huge smoothing radius of 1200 km, etc.).

Just a couple of weeks ago a new paper was published by Patrick Frank showing that sensor measurement uncertainty has NEVER been fully considered for ALL the weather stations used for the “Global” temperature “record”, and that “the systematic error from uncontrolled variables has been invariably neglected”.

UNCERTAINTY IN THE GLOBAL AVERAGE SURFACE AIR TEMPERATURE INDEX: A REPRESENTATIVE LOWER LIMIT

Patrick Frank, Palo Alto, CA 94301-2436, USA. Energy and Environment, Volume 21, Number 8, December 2010. DOI: 10.1260/0958-305X.21.8.969

Abstract here:

http://multi-science.metapress.com/content/q557742n3221/?p=e174cd02151f44d7bd...

Abstract:

“Sensor measurement uncertainty has never been fully considered in prior appraisals of global average surface air temperature. The estimated average ±0.2 C station error has been incorrectly assessed as random, and the systematic error from uncontrolled variables has been invariably neglected. The systematic errors in measurements from three ideally sited and maintained temperature sensors are calculated herein. Combined with the ±0.2 C average station error, a representative lower-limit uncertainty of ±0.46 C was found for any global annual surface air temperature anomaly. This ±0.46 C reveals that the global surface air temperature anomaly trend from 1880 through 2000 is statistically indistinguishable from 0 C, and represents a lower limit of calibration uncertainty for climate models and for any prospective physically justifiable proxy reconstruction of paleo-temperature. The rate and magnitude of 20th century warming are thus unknowable, and suggestions of an unprecedented trend in 20th century global air temperature are unsustainable.”

 Summary and Conclusion:

“The assumption of global air temperature sensor noise stationarity is empirically untested and unverified. Estimated noise uncertainty propagates as $\sqrt{\frac{1}{N}\sum_{i=1}^{N}\sigma_i^{2}}$ rather than as $\bar{\sigma}/\sqrt{N}$.

Future noise uncertainty in monthly means would greatly diminish if the siting of surface stations is improved and the sensor noise variances become known, monitored, and empirically verified as stationary.

The ±0.46 C lower limit of uncertainty shows that between 1880 and 2000, the trend in averaged global surface air temperature anomalies is statistically indistinguishable from 0 C at the 1σ level. One cannot, therefore, avoid the conclusion that it is presently impossible to quantify the warming trend in global climate since 1880.”
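
To illustrate the propagation rule Frank is contrasting, here is a minimal sketch (plain Python with NumPy; the ±0.2 C per-station figure comes from the abstract above, the station count is the year-2000 GHCN count quoted further down, and the identical-sigma simplification is mine):

```python
import numpy as np

sigma = 0.2   # average per-station noise uncertainty (deg C), from the abstract
N = 5265      # GHCN station count in 2000, from the figures below

# Conventional assumption: station noise is random and independent,
# so the uncertainty of the N-station mean shrinks as sigma / sqrt(N).
standard_error = sigma / np.sqrt(N)

# Frank's argument: with noise stationarity untested, the uncertainty
# propagates as the root-mean-square of the station uncertainties,
# which does not shrink as N grows.
sigmas = np.full(N, sigma)            # identical sigmas, for simplicity
rms = np.sqrt(np.mean(sigmas ** 2))

print(f"sigma/sqrt(N): +/- {standard_error:.4f} C")  # about 0.003 C
print(f"rms:           +/- {rms:.2f} C")              # stays at 0.20 C
```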

See also the letter to the Editors (APS Physics) by Patrick Frank:

http://www.aps.org/units/nes/newsletters/fall09.cfm

See also

http://noconsensus.wordpress.com/2011/01/20/what-evidence-for-unprecedented-w...

http://wattsupwiththat.com/2011/01/20/surface-temperature-uncertainty-quantif...

So I thought I'd show you the drastic dropping of weather stations in 1989-1992. Others have shown this before and done a very good job presenting it. But it is worth repeating, because most people have no idea on what shaky grounds the temperature records rest.

And remember – this dropping of stations was done on purpose. And you can see on the graph what “happened” to the temperature after that: for some very “odd” reason it went up sharply.

Hmmm??

Can there be a connection???

And remember – Nearly ALL OF THESE STATIONS ARE STILL THERE AND GENERATING DATA.

In 3 years, from 1989 to 1992, 5218 stations were purposely “dropped”.

From 1993 to 2000, 1384 more stations were “dropped” – a total of 6602 stations.

And if we compare 1970 with 1992, 8445 stations had been “dropped”.

If we compare 1970 with the year 2000, 9829 stations had been “dropped”.
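
As a check on the arithmetic, a minimal sketch (plain Python; the 1970, 1990 and 2000 counts appear on the maps below, while the 1989 and 1992 counts are back-computed from the drop figures above):

```python
# GHCN station counts per year; 1989 and 1992 are implied by the
# quoted drops (5218 dropped 1989-1992, 1384 more through 2000).
counts = {1970: 15094, 1989: 11867, 1990: 9475, 1992: 6649, 2000: 5265}

def dropped(start, end):
    """Stations lost between two years."""
    return counts[start] - counts[end]

print(dropped(1989, 1992))  # 5218
print(dropped(1992, 2000))  # 1384
print(dropped(1970, 1992))  # 8445
print(dropped(1970, 2000))  # 9829
```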

This is the “logic and science” behind the Global warming Hysteria.

[Map: GHCN station coverage, 1970 – 15,094 stations]

[Map: GHCN station coverage, 1990 – 9,475 stations]

[Map: GHCN station coverage, 2000 – 5,265 stations]

Where did all the stations in China, India, Asia, Africa, Latin America, Middle East, Russia, Antarctica, and Australia go?????

AND WHY??????

Whole continents “just disappeared” and most of the landmass of Earth is now NOT COVERED.

And how do you compare the “average” Global temperature when they dropped 9829 stations between 1970 and 2000??????

9829 stations that were part of the “average” global temperature????

This is the “science” behind the Global warming Hysteria.

And it gets worse (which in itself is an “achievement”). Look at the map for 2010 – EVEN MORE landmass is “gone” on purpose, including large parts of the USA. See the huge contrast between 2000 and 2010 regarding the USA.

[Map: GHCN station coverage, 2010]

See also some of my previous post on this subject:

Climate Gate – All the manipulations and lies revealed 241

NASA “systematically eliminated 75% of the world’s stations with a clear bias toward removing higher-latitude, high-altitude and rural locations.”

Climate Gate – All the manipulations and lies revealed 211

How “they” (NASA) make Bolivia a VERY HOT PLACE EVEN THOUGH THERE ARE NO TEMPERATURE STATIONS OR DATA FROM THERE.

Another brilliant example of the trustworthiness of the Global Warming Hysteria.

And “their science”.

“One Small Problem. There has not been any thermometer data in GHCN since 1990.

None. Nada. Zilch. Nothing. Empty Set.

So just how can it be so Hot Hot Hot! in Bolivia if there is NO data?

Easy. GIStemp “makes it up” from “nearby” thermometers up to 1200 km away.

So what is within 1200 km of Bolivia? The beaches of Peru and the Amazon Jungle. Not exactly the same as snow-capped peaks, but hey, you gotta make do with what you have, you know? (The official excuse given is that the data acceptance window closes on one day of the month and Bolivia does not report until after that date. Oh, and they never ever would want to go back and add data into the past after a close date. Yet they are happy to fiddle with, adjust, modify, and wholesale change and delete old data as they change their adjustment methods…)”
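
To make concrete what “makes it up from ‘nearby’ thermometers” means, here is a minimal sketch of linear distance weighting out to a 1200 km radius – my reading of the GIStemp scheme, not its actual code – with invented coordinates and anomaly values for illustration:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in km."""
    R = 6371.0
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(dl / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def interpolated_anomaly(target, stations, radius_km=1200.0):
    """Distance-weighted mean anomaly; weight falls linearly to 0 at radius_km."""
    num = den = 0.0
    for lat, lon, anomaly in stations:
        d = haversine_km(target[0], target[1], lat, lon)
        if d < radius_km:
            w = 1.0 - d / radius_km
            num += w * anomaly
            den += w
    return num / den if den else None  # None: no station in range

# Hypothetical grid cell in the Bolivian Andes, filled from invented
# "nearby" stations; none of these values are real measurements.
andes_cell = (-16.5, -68.15)
stations = [
    (-3.1, -60.0, 1.1),     # deep Amazon, ~1700 km away -> excluded
    (-12.05, -77.05, 1.2),  # Pacific coast, ~1100 km -> small weight
    (-17.8, -63.2, 0.8),    # lowland site, ~550 km -> dominant weight
]
print(interpolated_anomaly(andes_cell, stations))
```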

Here are some more glaring examples of this “tweaking and adjustment” of the temperature “record”:

NEW ENGLAND’S TEMPERATURE HISTORY AND TRENDS (1911 – 2009)

http://scienceandpublicpolicy.org/images/stories/papers/originals/ne_temp_his...

WHY NOAA AND NASA PROCLAMATIONS SHOULD BE IGNORED

http://scienceandpublicpolicy.org/images/stories/papers/originals/noaa_2010_r...

“NASA/NOAA homogenization process has been shown to significantly alter the trends in many stations where the siting and rural nature suggest the data is reliable. In fact, adjustments account for virtually all the trend in the data. Unadjusted data for the best sites/rural shows cyclical multi-decadal variations but no net long term trend as former NASA scientist Dr. Ed Long showed here. He showed however that after adjustment, the rural data trend was made consistent with the urban data set with an artificial warming introduced.”

See also

http://climateaudit.org/2010/12/26/nasa-giss-adjusting-the-adjustments/

Just look at this “tweaking” done by NASA/NOAA in August 2007 to the temperature “record”. They just “happened” to LOWER the temperature for 1880-1900 by OVER 0.3 C, and then they just “happened” to RAISE the temperature for 1990-2007 by OVER 0.2 C. So “suddenly” you have a nice “warming trend” where there was NONE before. In fact, there was a cooling trend from the year 2000 which “suddenly” changed to a warming trend, with OVER 0.4 C difference.
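
To see how shifting the two ends of a series manufactures a trend, here is a minimal sketch (plain Python with NumPy; the series is synthetic and flat by construction, and only the 0.3 C and 0.2 C offsets come from the claim above):

```python
import numpy as np

years = np.arange(1880, 2008)
flat = np.zeros(years.size)     # a trendless anomaly series

adjusted = flat.copy()
adjusted[years <= 1900] -= 0.3  # cool the early record by 0.3 C
adjusted[years >= 1990] += 0.2  # warm the recent record by 0.2 C

# Least-squares slope, converted to C per century
slope_before = np.polyfit(years, flat, 1)[0] * 100
slope_after = np.polyfit(years, adjusted, 1)[0] * 100
print(f"before: {slope_before:.2f} C/century, after: {slope_after:.2f} C/century")
```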

This is the “science” behind the Global warming Hysteria.

US Agencies Still Fiddling Temperature Record, Reports SPPI

http://www.transworldnews.com/NewsStory.aspx?storyid=671981&ret=close

NASA and NOAA, which each receive close to half a billion dollars a year in taxpayer funding, have been systematically fiddling the worldwide temperature record for years, making “global warming” look worse than it is, according to a new paper by the Science and Public Policy Institute.  The findings are reported by Joe D’Aleo, a leading meteorologist.

Robert Ferguson, President of SPPI, said: “Despite billions spent on official claims about the supposed threat of catastrophic man-made ‘global warming’, opinion polls show the public are no longer fooled. A main reason why the voters don’t buy climate alarmism any more is that the tiny but well-connected, lavishly-funded Climategate clique keeps on being caught out bending the scientific evidence.”

The problem of data integrity has recently been commented on by MIT’s Dr. Richard Lindzen, “Inevitably in climate science, when data conflicts with models, a small coterie of scientists can be counted upon to modify the data…That the data should always need correcting to agree with models is totally implausible and indicative of a certain corruption within the climate science community.”

Mr. D’Aleo’s paper is a damning exposé of the inner workings of two agencies of the US Government –

• The global temperature data from surface stations is “seriously compromised: the data suffer significant contamination by urbanization and other local factors such as changes in land cover and land use”. Numerous peer-reviewed papers suggest contamination of 30%, 50% or more.

• The state of the temperature database, in the words of one of its operators, is “hopeless”, with “hundreds if not thousands of pairs of dummy and duplicate stations”.

• The NASA warming is achieved in part by inventing data in arctic areas where no stations exist.

• In the US, the warmest decade of the 20th century was the 1930s, and the warmest year was 1934, NASA’s chief climate scientist announced after the last super El Niño.

• NOAA tampered with temperature data in 2000, 2007 and 2009 to create an artificial increase of 0.3 °F in the warming trend since the 1930s.

• NASA admits even today on their website that there is no generally-accepted standard for surface air temperatures.

• Temperatures for the 1930s to 1950s have been readjusted downward to make the warming since then seem greater than it is.

• Temperatures for recent decades have been readjusted upward to make the warming of the 20th century seem greater than it is.

• Over time in the NASA database, the warming trend has been steadily increasing – not because the weather is getting warmer but because NASA keeps tampering with the data.

• The data tampering became more serious and more frequent in 2007, when a strong La Niña caused widespread and profound global cooling.

• Adjustments by NOAA and NASA, rather than real-world temperature changes, account for virtually all the apparent warming trend in the global data.

• NASA and NOAA have repeatedly resisted Freedom of Information Act requests for release of the unadjusted data and documentation of adjustments made, probably because they fear independent analysis will demonstrate that the adjustments are unwarranted and the warming insignificant.

• Global temperature databases are “seriously flawed” and “can no longer be trusted to assess climate trends or rankings or validate model forecasts”.

In a lengthy paper updated in August 2010, Surface Temperature Records: Policy Driven Deception?, Watts and D’Aleo catalogued numerous case studies of temperature data tampering around the world. This issue is of critical importance because these very data sets are used to justify formulating and implementing unprecedented policy decisions seeking radical transformations of our society and institutions.

Said Ferguson, “So blatantly obvious has the tampering become that Congress must mandate a thorough investigation of the temperature records, independent of the government scientists controlling them. A ‘B’ team of non-government and non-UN experts must be established with access to all the raw data, records, adjustments, fudges, bodges and computer codes currently being black-boxed by government scientists.”

SURFACE TEMPERATURE RECORDS: POLICY-DRIVEN DECEPTION?

http://scienceandpublicpolicy.org/images/stories/papers/originals/surface_tem...

SUMMARY FOR POLICY MAKERS

(by SPPI)

1. Instrumental temperature data for the pre-satellite era (1850-1980) have been so widely, systematically, and uni-directionally tampered with that it cannot be credibly asserted there has been any significant “global warming” in the 20th century.

2. All terrestrial surface-temperature databases exhibit signs of urban heat pollution and post-measurement adjustments that render them unreliable for determining accurate long-term temperature trends.

3. All of these problems have skewed the data so as to greatly overstate observed warming, both regionally and globally.

4. Global terrestrial temperature data are compromised because more than three-quarters of the 6,000 stations that once reported are no longer being used in data trend analyses.

5. There has been a significant increase in the number of missing months, with 40% of the GHCN stations reporting at least one missing month. This requires infilling, which adds to the uncertainty and possible error.

6. Contamination by urbanization, changes in land use, improper siting, and inadequately-calibrated instrument upgrades further increases uncertainty.

7. Numerous peer-reviewed papers in recent years have shown that the overstatement of observed longer-term warming is 30-50%, owing to heat-island and land-use-change contamination.

8. An increase in the percentage of compromised stations with interpolation to vacant data grids may make the warming bias greater than 50% of 20th-century warming.

9. In the oceans, data are missing and uncertainties are substantial. Changes in data sets introduced a step warming in 2009.

10. Satellite temperature monitoring has provided an alternative to terrestrial stations in compiling the global lower-troposphere temperature record. Its findings increasingly diverge from the station-based constructions, in a manner consistent with evidence of a warm bias in the surface temperature record.

11. Additional adjustments are made to the data which result in an increasing apparent trend. In many cases, adjustments do this by cooling off the early record.

12. Changes have been made to alter the historical record to mask cyclical changes that could be readily explained by natural factors like multi-decadal ocean and solar changes.

13. Due to the recently increasing tendency to eschew rural stations and favor urban airports as the primary temperature data sources, global terrestrial temperature databases are seriously flawed and can no longer be representative of both urban and rural environments. The resulting data are therefore problematic when used to assess climate trends or VALIDATE model forecasts.

14. An inclusive external assessment of the surface temperature record of CRU, GISS and NCDC is essential, “chaired and paneled by mutually agreed-to climate scientists who do not have a vested interest in the outcome of the evaluations.”

15. Reliance on the global data by both the UN IPCC and the US GCRP/CCSP should trigger a review of these documents, assessing the base uncertainty of forecasts and policy language.


 

Original post: http://uddebatt.wordpress.com/2011/01/23/how-the-world-temperature-%E2%80%9Cr...


 
