That dog don't hunt - Isaac - 30/11/2009 07:49:04 AM
Forgetting for the moment the various credible accusations, based on their emails, of distorting data, tampering with the peer-review process, conspiring to hide information from FOI, etc., there is the repeated excuse that they could not store all their data in the 80s.

This, of course, is utter BS. By 1974 IBM gave us nice little cylinders you could hold in your hand that stored 50 MB; we had 3.5" floppies storing a quarter of a MB by '82, and tape cartridges storing 200 MB with rapid access by '84 that were slightly smaller than a paperback - and not a thick fantasy novel.

Now how much data was there that they felt obliged to toss it away? We're spoiled by videos taking up gigabytes, but one megabyte is a lot of memory where data is concerned.

If they stored it in ASCII, at one byte per character, or in decimal at 4 bits per digit instead, with just the numbers 0-9 and maybe a break character or two (16 values to play with at 4 bits, after all), then one megabyte lets them store either one million characters or two million numerical digits.
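
To put the byte math in concrete terms, here's a minimal sketch of the 4-bit packing idea - my own illustration with a made-up sample reading, not anything from their archives:

```python
# Illustration only: two decimal digits packed per byte (4 bits each)
# versus one ASCII character per byte.

def pack_digits(digits: str) -> bytes:
    """Pack a string of decimal digits two per byte, high nibble first."""
    if len(digits) % 2:
        digits += "0"                      # pad to an even number of digits
    return bytes((int(digits[i]) << 4) | int(digits[i + 1])
                 for i in range(0, len(digits), 2))

ONE_MB = 1_000_000                         # bytes
print(f"{ONE_MB:,} ASCII characters per megabyte")
print(f"{ONE_MB * 2:,} packed decimal digits per megabyte")

reading = "17690281"                       # hypothetical high/low pair, 8 digits
print(len(reading), "digits ->", len(pack_digits(reading)), "bytes packed")
```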

What was that data? Was it just the high and low for the day, maybe a four-digit number in Celsius? Some abbreviated timestamp and four digits then, with a longer timestamp from time to time and a location identifier? You could record a hundred years of data on one megabyte from one monitoring station that way and still have room left over.
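
A rough check on that claim, using an assumed record layout of my own (packed digits, four digits each for the high, the low, and an abbreviated date stamp):

```python
# Back-of-envelope: a century of daily high/low readings from one station.
digits_per_day = 4 + 4 + 4            # high, low, abbreviated date (assumed)
bytes_per_day = digits_per_day / 2    # packed at 4 bits per digit
days = 100 * 365.25                   # one hundred years
total_mb = bytes_per_day * days / 1_000_000
print(f"{total_mb:.2f} MB for a century of daily highs and lows")  # ~0.22 MB
```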

But maybe they did more; maybe they took hourly readings of temperature, pressure, humidity, wind direction and speed, rainfall. You wouldn't record that in an easy-to-read format, you'd mash it, so maybe 176928161320145 means 17.69 Celsius, 28.1 pressure, 61% humidity, wind on a 320 bearing at 14.5 km/h. Toss in a timestamp like 840011445 - 84 for 1984, 001 for January 1st, 1445 for 2:45 PM - and what is that, 24 digits, 24 bytes, 25 with a break, or more likely 13 since you'd not waste memory on ASCII.

At hourly readings that's about a third of a kilobyte a day per station, a bit over 100 KB a year per station - hourly readings of everything I can think of, with no timestamp abridgement and probably more significant digits than were actually recorded, though maybe they did cloud cover percentage, dew point, etc. too. So let's say roughly 100 KB, keeping in mind that a 400k-word book like RJ writes contains millions of characters, so one station's hard-copy results for a year could be printed in less than a paperback - much less, even with nice formatting.

One of those 50 MB cylinders from 1974 should have been able to store a decade of data from 50 locations; one of those tape cartridges from 1984 could have done 200 and fit in less space than a paperback. How many monitoring stations did they have, ten thousand? All 10,000, for a decade, comes to roughly 10 GB, neatly fitting onto about 50 of those 3480 tapes or 200 of the old 3850 cartridges from 1974. And this whole time, as data is coming in, the clutter isn't happening, because more compact and cheaper storage devices are arriving faster than you can fill your old ones up.
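
And the same back-of-envelope arithmetic for the hourly case, using the 13-byte packed record above and the commonly cited capacities of roughly 200 MB for an IBM 3480 cartridge (1984) and roughly 50 MB for an IBM 3850 cartridge (1974); the unrounded ~114 KB per station-year pushes the counts a little above the rounded 10 GB estimate:

```python
import math

BYTES_PER_READING = 13        # ~24 digits packed at 4 bits, plus a break
READINGS_PER_DAY = 24         # hourly
STATIONS = 10_000
YEARS = 10

per_station_year = BYTES_PER_READING * READINGS_PER_DAY * 365
total_bytes = per_station_year * STATIONS * YEARS

print(f"{per_station_year / 1_000:.0f} KB per station per year")          # ~114 KB
print(f"{total_bytes / 1e9:.1f} GB for all stations over a decade")       # ~11.4 GB
print(f"{math.ceil(total_bytes / 200e6)} IBM 3480 tapes at 200 MB each")     # 57
print(f"{math.ceil(total_bytes / 50e6)} IBM 3850 cartridges at 50 MB each")  # 228
```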

So no, I don't buy the memory limitation stuff some people have been chucking around.

You don't get rid of your raw data, especially when it can't be easily replicated in a lab. Pretty hard to repeat weather on Earth. Sure, if you've got a neat experiment that can be redone quick and easy and shows no anomalies, you don't really need to keep the data around, but even then you do, because someone might challenge it, and some enterprising student might come by years later, notice an anomaly, and find a whole new field of science in it. "Interesting, your temps are .01 degrees too high in this region, even a bit higher at night; turns out the ground is warmer than it should be because you're sitting on a patch of uranium. Weird - holy heck, it's an asteroid impact! And we dug it up and found out that it dates to an extinction event!" Etc. Vast amounts of new science emerge from what often looks like, and essentially was, noise - totally irrelevant to the experiment being run but a vital clue for something else. So them losing their raw data is bad just for that.

It also makes all their results that rely on that data garbage - probably true anyway, but still garbage, because no one can look at the data to check. It's not just a bunch of anti-GW people pissed about this; it's a bunch of scientists, many of them firm advocates of AGW. And we knew about this without those emails - we've known for months, which is probably why they got hacked. Dr Pielke - climatologist and big advocate of AGW - asked for their data months ago and posted their BS answer along with some very nasty remarks.

Now as for the emails, yeah, it was an invasion of privacy, though we now know - out of their own mouths - that they were conspiring to avoid FOI, which is contemptible in a scientist. Just because something was obtained immorally doesn't mean we stick our heads in the sand; if someone peeping in people's bedroom windows sees some guy with a ten-year-old tied up on his bed, we don't go "Oh, well, you invaded his privacy."
The intuitive mind is a sacred gift and the rational mind is a faithful servant. We have created a society that honors the servant and has forgotten the gift.
- Albert Einstein

King of Cairhien 20-7-2
Chancellor of the Landsraad, Archduke of Is'Mod