After an altercation on WUWT about Watts not releasing data for his surface stations project, and then complaining that others had used some of his early material, I suggested that this was similar to CRU withholding data, and he suggested that I am hiding behind a pseudonym. My reply:
tfp (20:20:22) :
Anthony this is why I will not post under my real name:
http://www.guardian.co.uk/environment/georgemonbiot/2010/jan/27/james-delingpole-climate-change-denial
“Within a few minutes of the comments opening, they had published the man’s telephone number and email address, a photo of his house (“Note all the recycling going on in his front garden”), his age and occupation”
http://blogs.telegraph.co.uk/news/jamesdelingpole/100024152/monbiot-an-apology/ Damocles on Jan 28th, 2010 at 2:23:
“I’ve got to say that I saw the comments on that blog and I was rather shocked.
Bue then you pulled it and that was a redeeming act.”
I’m sure you get similar.
I do not want to expose my family to hate mail/death threats/abuse. My pseudonym is my firewall.
I have posted analysis on scientific threads/ I have posted comments like the above where I see unfair bias. Its your blog do what you will!
REPLY: OK Fine, final question then. Your electronics company there in the UK has a contract with the U.S. Navy for some avionics test systems. Somebody takes that design, reverse engineers it, and sells a product based on your work. Is that fair use?
That’s the case with me here. All my pages have a copyright notice on them. I did the work for over two years, and Menne et al took the work and made something from it without permission, against my protestations even. Unless you are prepared to say your company’s designs should be fair game for anyone to use and profit from, I suggest you kindly refrain from criticizing my project further. – Anthony
So the great man exposes the company I work for, after my explanation of my posting anonymity. Is this a threat, saying he knows my employer and will complain about my posting on company time (actually break times), or is he threatening to expose my home address (available from my IP address)?
Who knows!
Watts uses this trick whenever he is annoyed by posters. Very sad!
Update 2010/02/14
Good grief - it makes me so angry!
I posted reasonable stuff, and then this happens:
tfp (14:56:46) :
DeWitt Payne (10:14:57) :
My guess is that the x axis is in decimal years and so your slope is 2.58E-03 C/year not month.
Too true. My engineering caution deserted me. Apologies!
What is evident from my plot is that the period from 1985 to present does not (yet!) conform to the general linear trend. Adding a trend line from 1985 to present gives a warming of 4.4 °C/century (got it right this time, I think).
http://img20.imageshack.us/img20/7086/cet.png
The current trend in CET is negative, so there is a possibility that in a decade or so there will be a return to the 0.3 °C/century average. But can we wait to find out?
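A minimal sketch of the same trend check, assuming a three-column year/month/temperature file (the file name and layout are illustrative, not the actual spreadsheet used): fitting against decimal years gives the slope in °C/year rather than °C/month, which is exactly the mix-up corrected above.

import numpy as np

# Assumed layout: each row is "year month temperature" (illustrative file)
data = np.loadtxt("cet_monthly.txt")
years = data[:, 0] + (data[:, 1] - 1) / 12.0   # decimal years, as on the plot's x axis
temps = data[:, 2]

# Fitting against decimal years gives the slope in degC per year
slope = np.polyfit(years, temps, 1)[0]
print(f"full-record trend: {slope:.2e} degC/year = {slope * 100:.2f} degC/century")

# The 1985-to-present sub-trend discussed above
mask = years >= 1985.0
slope85 = np.polyfit(years[mask], temps[mask], 1)[0]
print(f"1985-present trend: {slope85 * 100:.2f} degC/century")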
Looking at satellite data:
http://img200.imageshack.us/img200/6361/amsu.png
data: http://discover.itsc.uah.edu/amsutemps/
the channel CHLT (no longer reported – too much of an incline?!) gives a temperature increase of 11 °C/century.
It would be interesting to know why this channel was dropped.
REPLY: Still can’t stay away, shiny new email address eh Bill? You never did respond to this after accusing me of improper conduct. Me thinks its time for you to be put into the troll bin, since changing email addresses and handles is a no no.
…
OK Fine, final question then. Your electronics company there in the UK has a contract with the U.S. Navy for some avionics test systems. Somebody takes that design, reverse engineers it, and sells a product based on your work. Is that fair use?
That’s the case with me here. All my pages have a copyright notice on them. I did the work for over two years, and Menne et al took the work and made something from it without permission, against my protestations even. Unless you are prepared to say your company’s designs should be fair game for anyone to use and profit from, I suggest you kindly refrain from criticizing my project further. – Anthony
I replied with:
I posted as "tfp formerly bill" for a time to alert people to the name change - there were too many bills.
The shiny new email is because I could not remember which address I used on this blog (and the cookie with the correct one disappeared) - I have many addresses, to trap blog operators who might sell them to spammers.
You never did respond to this after accusing me of improper conduct. Me thinks its time for you to be put into the troll bin, since changing email addresses and handles is a no no.
I responded with a long reply (long since forgotten) but it never appeared - I assumed I was banned. Perhaps it is in your spam bin?
OK Fine, final question then. Your electronics company there ...test systems.
I do not appreciate my employment being referenced online by yourself.
From my IP addresses you obviously can determine my home address, real name, and from that many other personal details. I trust these will NOT be exposed?
Reading my comments leading up to your original reply, I made NO accusation of improper conduct on your part.
(Re-reading, I did say that accessing CRU data that was not public fell under the Computer Misuse Act:
A person is guilty of an offence if
(a) he causes a computer to perform any function with intent to secure access to any program or data held in any computer;
(b) the access he intends to secure is unauthorised; and
(c) he knows at the time when he causes the computer to perform the function that that is the case.)
---
I would suggest that if you did not access the data then there is no problem. I was only pointing out a fact - the Computer Misuse Act.
And it is blocked! I thought it reasonable!
2010/01/31
2010/01/14
Spencer: Clouds Dominate CO2 as a Climate Driver Since 2000
13/01/2010
By Dr. Roy Spencer, PhD.
Last year I posted an analysis of satellite observations of the 2007-08 global cooling event, showing evidence that it was due to a natural increase in low cloud cover. Here I will look at the bigger picture of how the satellite-observed variations in Earth’s radiative budget compare to that expected from increasing carbon dioxide. Is there something that we can say about the relative roles of nature versus humanity based upon the evidence?
What we will find is evidence consistent with natural cloud variations being the dominant source of climate variability since 2000.
CERES Observations of Global Energy Budget Changes
The following graph shows the variations in the Earth’s global-average radiative energy balance as measured by the CERES instrument on NASA’s Terra satellite. These are variations in the imbalance between absorbed sunlight and emitted infrared radiation, the most fundamental quantity associated with global warming or global cooling. Also shown (in red) are theoretically calculated changes in radiative forcing from increasing carbon dioxide as measured at Mauna Loa.
Since there is some uncertainty in the absolute accuracy of the CERES measurements, where one puts the zero line is also somewhat uncertain. Therefore, it’s the variations since 2000 which are believed to be pretty accurate, and the exact dividing line between Earth gaining energy and Earth losing energy is uncertain. Significantly, all of the downward trend is in the reflected sunlight portion, not the infrared portion of the variations. We similarly can not reference where the zero line should be for the CO2 forcing, but the reasons for this are more complex and I will not address them here.
In order to compare the variations in the CO2 forcing (in red) to the satellite observations, we need to account for the fact that the satellite observes forcing and feedback intermingled together. So, let’s remove a couple of estimates of feedback from the satellite measurements to do a more direct comparison.
Inferred Forcing Assuming High Climate Sensitivity (IPCC View)
Conceptually, the variations in the Earth’s radiative imbalance are a mixture of forcing (e.g. increasing CO2; clouds causing temperature changes), and feedback (e.g. temperature changes causing cloud changes). We can estimate the forcing part by subtracting out the feedback part.
First, let’s assume that the IPCC is correct that climate sensitivity is pretty high. In the following chart I have subtracted out an estimate of the feedback portion of the CERES measurements, based upon the IPCC 20-model average feedback parameter of 1.4 W m^-2 K^-1 times the satellite AMSU-measured tropospheric temperature variations.
As can be seen, the long-term trend in the CERES measurements is much larger than can be accounted for by increasing carbon dioxide alone, which is presumably buried somewhere in the satellite-measured signal. In fact, the satellite observed trend is in the reflected sunlight portion, not the infrared as we would expect for increasing CO2 (not shown).
Inferred Forcing Assuming Low Climate Sensitivity (“Skeptical” View)
There has been some published evidence (our 2007 GRL paper, Lindzen & Choi’s 2009 paper) to suggest the climate system is quite insensitive. Based upon that evidence, if we assume a net feedback parameter of 6 W m^-2 K^-1 is operating during this period of time, then removing that feedback signal using AMSU channel 5 yields the following history of radiative forcing:
As can be seen, the relative size of the natural forcings become larger since more forcing is required to cause the same temperature changes when the feedback fighting it is strong. Remember, the NET feedback (including the direct increase in emitted IR) is always acting against the forcing…it is the restoring force for the climate system.
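What is described here amounts to assuming the measured imbalance N is forcing minus feedback, N = F - λΔT, so the inferred forcing is F = N + λΔT. A minimal sketch with synthetic stand-ins for the CERES and AMSU series (not the real data):

import numpy as np

rng = np.random.default_rng(0)
dT = 0.2 * rng.standard_normal(120)  # tropospheric temperature anomaly, K (synthetic)
N = 0.5 * rng.standard_normal(120)   # net radiative imbalance anomaly, W/m^2 (synthetic)

def inferred_forcing(N, dT, lam):
    # N = F - lam*dT  =>  F = N + lam*dT
    return N + lam * dT

F_high_sens = inferred_forcing(N, dT, lam=1.4)  # IPCC-style feedback, 1.4 W m^-2 K^-1
F_low_sens = inferred_forcing(N, dT, lam=6.0)   # insensitive-climate case, 6 W m^-2 K^-1

# The stronger the assumed feedback, the larger the inferred natural forcing variations:
print(F_high_sens.std(), F_low_sens.std())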
What this Might Mean for Global Warming
The main point I am making here is that, no matter whether you assume the climate system is sensitive or insensitive, our best satellite measurements suggest that the climate system is perfectly capable of causing internally-generated radiative forcing larger than the “external” forcing due to increasing atmospheric carbon dioxide concentrations. Low cloud variations are the most likely source of this internal radiative forcing. It should be remembered that the satellite data are actually measured, whereas the CO2 forcing (red lines in the above graphs) is so small that it can only be computed theoretically.
The satellite observed trend toward less energy loss (or, if you prefer, more energy gain) is interesting since there was no net warming observed during this time. How could this be? Well, the satellite observed trend must be due to forcing only since there was no warming or cooling trend during this period for feedback to act upon. And the lack of warming from this substantial trend in the forcing suggests an insensitive climate system.
If one additionally entertains the possibility that there is still considerable “warming still in the pipeline” left from increasing CO2, as NASA’s Jim Hansen claims, then the need for some natural cooling mechanism to offset and thus produce no net warming becomes even stronger. Either that, or the climate system is so insensitive to increasing CO2 that there is essentially no warming left in the pipeline to be realized. (The less sensitive the climate system, the faster it reaches equilibrium when forced with a radiative imbalance.)
Any way you look at it, the evidence for internally-forced climate change is pretty clear. Based upon this satellite evidence alone, I do not see how the IPCC can continue to ignore internally-forced variations in the climate system. The evidence for its existence is there for all to see, and in my opinion, the IPCC’s lack of diagnostic skill in this matter verges on scientific malpractice.
--------------------------------------------------------
Thank you Dr Spencer for this.
So proof at last that Kevin Trenberth is correct in his travesty email:
the energy in is greater than the energy out = warming (not seen!)
Kevin Trenberth:
" The fact is that we can't account for the lack of warming at the moment and it is a travesty that we can't. The CERES data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are surely wrong. Our observing system is inadequate."
....
" We are not close to balancing the energy budget. The fact that we can not account for what is happening in the climate system makes any consideration of geoengineering quite hopeless as we will never be able to tell if it is successful or not! It is a travesty!"
Michael Mann wrote:
" Kevin, that's an interesting point. As the plot from Gavin I sent shows, we can easily account for the observed surface cooling in terms of the natural variability seen in the CMIP3 ensemble (i.e. the observed cold dip falls well within it). So in that sense, we can "explain" it. But this raises the interesting question, is there something going on here w/ the energy & radiation budget which is inconsistent with the modes of internal variability that leads to similar temporary cooling periods within the models.
I'm not sure that this has been addressed--has it?"
Perhaps an apology to Trenberth is in order?
Spencer attributes the warming to clouds - this does not seem to agree with Svensmark, where increased clouds = cooling.
Also, clouds should be increasing - causing cooling - due to increased GCRs with a quiet sun.
Would it be fair to say it is a travesty?
..........
GCR-climate connection - GCRs increase low cloud cover via increased CCN (cloud condensation nuclei) production (via increased atmospheric ionization), which acts as a cooling effect on the climate.
IPCC
6.11.2.2 Cosmic rays and clouds
Svensmark and Friis-Christensen (1997) demonstrated a high degree of correlation between total cloud cover, from the ISCCP C2 data set, and cosmic ray flux between 1984 and 1991. Changes in the heliosphere arising from fluctuations in the Sun's magnetic field mean that galactic cosmic rays (GCRs) are less able to reach the Earth when the Sun is more active so the cosmic ray flux is inversely related to solar activity.
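The quoted claim is, at bottom, a correlation computation. A hypothetical sketch, with synthetic series standing in for the ISCCP cloud data and a GCR flux record (not the real 1984-1991 data):

import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
gcr = rng.standard_normal(96)                      # monthly GCR flux anomalies, 1984-1991 (synthetic)
cloud = 0.6 * gcr + 0.8 * rng.standard_normal(96)  # cloud-cover anomalies sharing part of the signal

r, p = pearsonr(gcr, cloud)
print(f"r = {r:.2f}, p = {p:.3f}")
# A positive r is the claimed GCR -> more low cloud link; the solar cycle
# enters because GCR flux is inversely related to solar activity.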
boballab (16:49:58) :
Last year I posted an analysis of satellite observations of the 2007-08 global cooling event, showing evidence that it was due to a natural increase in low cloud cover.
Low level Clouds=COOLING
Lack of low level Clouds=WARMING
Not
more low level clouds=Warming
Spencer agrees with Svensmark: more clouds, more cooling.
Low solar activity = high GCRs
High GCRs = more low cloud = cooling = Svensmark
As solar minimum was approached (= high GCRs = cooling = Svensmark), the plots above show Earth gaining excess energy.
I do not, I admit, understand how more energy = cooling.
The plots show the CO2 line doing the correct thing (more CO2 = Earth gaining excess energy), so the plots are named correctly.
To me Svensmark does not equal Spencer,
but Trenberth = honest:
“The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t. The CERES data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are surely wrong. Our observing system is inadequate.”
Svensmark says -ve
Spencer says +ve
Trenberth says our observing system is inadequate.
Who is being honest here?
Labels:
co2, clouds,
spencer
2009/12/17
Interesting
http://www.aip.org/history/powerpoints/GlobalWarming_Oreskes.ppt
Some sea temp stuff
arry Lu (20:11:48) :
” George E. Smith (18:08:33) :
Bob you are so charitable. LWIR warms the top few cm. I figure that atmospheric (tropospheric anyway) LWIR can hardly be significant below about 3-4 microns…; so lets be generous and say it might warm the top 10 microns. How much of that energy remains following the prompt evaporation from that hot skin.”
Have you not forgotten conduction? It operates in all directions!
So we have the top few cm heated by SW and LW, and a few tens of metres down heated by UV:
http://ies.jrc.ec.europa.eu/uploads/fileadmin/Documentation/Reports/Global_Vegetation_Monitoring/EUR_2006-2007/EUR_22217_EN.pdf
So the surface cm is absorbing a percentage of the SW (as does each cm of the deeper water, except the percentage is of a progressively smaller maximum), plus all the LW re-radiated from GHGs.
The surface is also receiving LW from the layer under the surface and radiating LW down to this lower layer. Because the surface is hotter, this will average out to an energy transfer downwards.
So the hotter the surface, the less of the lower water's energy will be radiated (lost) into the atmosphere. Less loss, with the same SW (TSI) heating the lower layers, will mean a hotter temperature.
Of course the surface is losing heat via conduction in all directions, radiation in all directions, and forced air convection upwards (sideways!).
However, the surface layer heating must affect the lower layer cooling, in my book.
According to your energy budget diagram:
http://wattsupwiththat.files.wordpress.com/2010/03/trenberth_mine_latest_big1.jpg
Only 169 W/m^2 of SW radiation gets absorbed (198 W/m^2 hits the ground).
The back-radiation from GHGs is 321 W/m^2, absorbed by the ground.
If 321 W/m^2 is absorbed in the top layer and 169 W/m^2 is absorbed over tens of metres, then the top layer will be much warmer than the lower layers.
So is it not true that this top layer must control the temperature of the lower layers?
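To put rough numbers on this layering argument, a toy sketch using the two figures quoted above (169 W/m^2 of SW spread over depth, 321 W/m^2 of LW absorbed at the skin). The SW e-folding depth is an illustrative assumption, not a measured optical property:

import numpy as np

SW_TOTAL = 169.0  # W/m^2, SW absorbed by the ocean, distributed over depth
LW_BACK = 321.0   # W/m^2, GHG back-radiation, absorbed in the skin layer
EFOLD = 5.0       # m, assumed SW e-folding depth (illustrative)

def sw_absorbed(depth_top, depth_bottom):
    # Beer-Lambert-style decay: SW absorbed between two depths
    return SW_TOTAL * (np.exp(-depth_top / EFOLD) - np.exp(-depth_bottom / EFOLD))

print(f"SW absorbed in top 1 cm:      {sw_absorbed(0.0, 0.01):6.2f} W/m^2")
print(f"SW absorbed from 1 cm to 50 m: {sw_absorbed(0.01, 50.0):6.2f} W/m^2")
print(f"LW absorbed in top layer:     {LW_BACK:6.1f} W/m^2")
# The skin layer takes essentially all the LW plus a sliver of the SW, which
# is the basis of the claim that it runs warmer than the layers below and so
# throttles their heat loss upwards.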
Labels:
interest
2009/12/12
oh dear (updated 2010/06/16)
Where it started:
wattsupwiththat (16:51:06) :
You know, this REALLY chaps my hide, especially since I’m doing a lot of green things myself, and have made major green contributions in the past.
A few examples:
A drop of one or two degrees would devastate vast areas of food production on Canada, Northern Europe and Russia. An increase will do (has done) wonders for production. I can’t speak with authority on warmer climes, but would hazard a guess that a shift of one or two degrees in a warm climate will have less of an impact than a shift
It’s clear that you just don’t get it, thefordprefect: a somewhat warmer, more balmy and pleasant climate is more desirable than what we have now; while colder temperatures will certainly kill people.
If AGW really is a crisis, their is close to nothing you can do about it, and nothing at all that wouldn’t involve massive reduction in energy use and quality of life decline to go along with it. I will feel proud when I look at my children ... that I fought to keep them free and living in a world were their quality of life is at least as good as mine was if not better because I stopped self righteous zealots like yourself from reversing the industrial revolution. Good day sir.
I personally believe that the burning of fossil fuels such as coal are one of the only real things that mankind has done to benefit the earth as a whole. The total amount of available carbon and biomass on the surface of the earth has shrunk dramatically sin
You’ve lumped everyone into your world view and hurled a faux pau of major proportions.
Possibly true! I should have targeted better.
So “thefordprefect” whoever the hell you are (just another anonymous coward making unsubstantiated claims)
No! I sent you a private email with my full name warning about allowing comments that could be considered defamatory by those attacked (the comments could affect their ability to earn, and it would be difficult for you to prove that they truly were trying to defraud). By remaining anonymous as thefordprefect, your contributors are welcome to defame me as much as they like! I am unknown!
This is the address section of the email:
To: info@surfacestations.org
Subject: FAO Anthony watts wattsupwiththat - be careful
From: M. xxxxxxx
Date: 01 March 2009 14:40:12
Also, most of the statements I have made have been backed up with data. My accusations are in response to insults hurled my way and at others.
let me make this clear: apologize for your broad generalization as “we don’t care about the planet” or get the hell off my blog.
This was a general impression I got from some of the responses. It was aimed at them. I had already read of your low energy home.
I’m not interested in debating the issue, I’m not interested in your excuses. I’m not interested if you are offended. Apologize contritely or leave, those are the terms.
Do what you will - my apology would only be for not targeting my comment. There has been no possibility of any enlightening debate from the responses I got (although this last comment seems to have drawn some sensible response).
I truly hope I (and others) are wrong about AGW. I truly fear that I am not.
Cheers,
Mike
Anthony Watts - ItWorks (*****@itworks.com)
Sent: Sat 3/14/09 6:20 AM
To: *******@hotmail.com
All you have to do is post an apology an move on. It's real easy. I'll even compose a sample for you.
"I'm sorry that I made a generalization that assumes all posters here don't care about our environment. I will be more careful with my words in the future."
Or something similar. If you don't wish to you are certainly not obligated, but you won't be posting anymore without such an apology.
Your previous multi-level reply won't fly. I have no record of any previous email from you, this email address is what you listed in your comment form.
Anthony Watts
------------------------------------
bill (12:00:53) : Your comment is awaiting moderation
bill (10:40:11) :
REPLY: Get your own blog then, but please don’t tell me how to run mine. I’ll post as many threads as I wish. And where’s your data citation link? Shifted and spliced data? Prove that’s valid. And if you really want to be taken seriously, drop the “galactic hero” meme and come clean with your full name. No need to hide. -Anthony
Anthony, you castigated me for posting the same message on the sticky “smoking gun” threads. I was trying to say that if you start similar threads with the same theme and no different data, then I was requesting to post the same messages on both.
If I recall correctly, someone (a “warmist”) on McIntyre's blog had their real name exposed, leading to an event (I missed what it was) that forced the whole thread to be deleted. If I post garbage or wisdom on a topic, it should not be made more or less acceptable because of my real status.
Looking at some of the comments made by CRU and other scientists about “nasty” emails they have received, I prefer the safe option. My real name would enable Google to provide my home address (= business), phone, and private email.
As for references
http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=431043600000&data_set=1&num_neighbors=1
Angmagssalik 65.6 N 37.6 W 431043600000 rural area 1895 – 2009
“Raw” data
The ice core data is from the reference in the header
I said it was a rough tack of the instrumental record onto the plot. Is there another way of doing this with so little data available?
To only show data to the 1850s and say there is no massive rise in temperature in the 21st century is disingenuous.
Labels:
censorship,
wuwt
Row per year to month per row
Here is some rough Excel VBA code. It works.
NOTE that the sheet must be saved as a .XLSM macro-enabled workbook, and macros will have to be enabled.
Excel 2007:
Click the Developer tab.
Type a new macro name, e.g. YearPerRowToClmn.
Assign it to the keyboard if you like.
Click [Create].
Then, between:
Sub YearPerRowToClmn()
End Sub
paste this (but not the “———” separator lines):
'------------------------------
' Insert 11 blank rows below the year row (January stays on the year row)
ActiveCell.Offset(1, 0).Range("A1:A11").Select
Application.CutCopyMode = False
Selection.EntireRow.Insert , CopyOrigin:=xlFormatFromLeftOrAbove
' Copy the remaining 11 monthly values from the year row
ActiveCell.Offset(-1, 2).Range("A1:K1").Select
Selection.Copy
' Paste them transposed, one month per row, under the January value
ActiveCell.Offset(1, -1).Range("A1").Select
Selection.PasteSpecial Paste:=xlPasteAll, Operation:=xlNone, SkipBlanks:= _
    False, Transpose:=True
' Move down to the next year row
ActiveCell.Offset(11, -1).Range("A1").Select

repeattranspose:
' Stop when the cell no longer holds a year (empty or too short)
If Len(ActiveCell.Text) < 2 Then GoTo stopp
' Same insert / copy / transpose-paste sequence for each remaining year
ActiveCell.Offset(1, 0).Range("A1:A11").Select
Application.CutCopyMode = False
Selection.EntireRow.Insert , CopyOrigin:=xlFormatFromLeftOrAbove
ActiveCell.Offset(-1, 2).Range("A1:K1").Select
Selection.Copy
ActiveCell.Offset(1, -1).Range("A1").Select
Selection.PasteSpecial Paste:=xlPasteAll, Operation:=xlNone, SkipBlanks:= _
    False, Transpose:=True
ActiveCell.Offset(11, -1).Range("A1").Select
GoTo repeattranspose
stopp:
'------------------------------
To use:
Get the data in text form, with one year of 12 months per row (plus other stuff).
Select the data: [ctrl]+a selects the lot.
Copy the data: [ctrl]+c.
Open a blank sheet in the workbook containing the macro.
Paste the data at cell A1.
You now have a single column of data, one year per row. If your Excel is set up differently it may convert the data to columns automatically; if not:
Select the first to last year:
click the first year, then scroll to the last year and click it whilst holding shift.
Select the [Data] tab.
Click Text to Columns.
Click Delimited if you KNOW that there is always a certain character (space, comma, etc.) between monthly values,
or click Fixed Width.
Click Next (select the delimiter character if necessary).
Check the columns are correctly selected – move, delete or add. If a station number is in the data it usually has the date attached without a space; if so, add a separator to split the date from the station.
If the station number is in the first column, click Next and set the station number column to Text, then click Finish;
else click Finish.
You should now have the data separated into columns.
Click the cell containing the first date (or the first cell to the left of January's temperature).
Save the workbook, as the next step is not undo-able.
Run the macro above (use the keyboard shortcut, or go to the Developer tab and double-click the macro name).
The macro should stop on the last line of data (it looks for an empty cell). If it does not, press [ctrl]+[break] a number of times and select End or Debug from the options, according to taste.
No guarantee is given with this software!
If you ever end up with a correctly formatted column of monthly data you will need to remove all the data error indicators:
Mark the data column.
On the [Data] tab select Filter.
On the first line of the column click the down arrow.
Deselect all (click the ticked Select All box).
Look through the offered numbers for error indicators: "-", "-9999.9", etc.
Click the boxes associated with the error indicators.
Press [OK].
Only the data in error is now shown.
Mark it all (be careful of the first box, as it is partially obscured by the arrow) and press [delete].
Turn off the filter by clicking Filter in the ribbon again.
The data is now clean.
If the data shows temp*10:
In the cell adjacent (to the right of) the first temperature, type
= [click the temperature to the left to enter the cell]/10
Mark the whole column, from this cell down to the last row containing temperature.
From the [Home] tab click Fill, then select Fill Down.
This column now contains correctly scaled temperatures, but the cell contents are actually formulae.
With the column marked, copy it: [ctrl]+c.
Right-click the first cell of the column and select Paste Special.
Select Values only, then OK it.
The column now contains actual values and can therefore be copied to another sheet, with dates in decimal format, i.e. year+(month-1)/12. Note that Excel does not like true dates before Jan 1st 1900.
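For anyone who would rather script the whole reshape than click through Excel, here is a hedged pandas equivalent of the steps above; the file name, column layout and the -9999.9 error marker are assumptions for illustration:

import pandas as pd

# Assumed layout: whitespace-separated, one row per year, 12 monthly values
cols = ["year"] + [f"m{i}" for i in range(1, 13)]
df = pd.read_csv("station.txt", sep=r"\s+", header=None, names=cols)

# Year-per-row -> one month per row
long = df.melt(id_vars="year", var_name="month", value_name="temp")
long["month"] = long["month"].str.lstrip("m").astype(int)
long = long.sort_values(["year", "month"]).reset_index(drop=True)

# Error indicators -> NaN, then undo the temp*10 scaling
long["temp"] = long["temp"].where(long["temp"] != -9999.9) / 10.0

# Decimal dates, matching year+(month-1)/12 above
long["decimal_year"] = long["year"] + (long["month"] - 1) / 12.0
print(long[["decimal_year", "temp"]].head())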
Labels:
excel
2009/12/10
Darwin
From wuwt
Willis, looking at the unadjusted plots leads me to suspect that there are 2 major changes in measurement methods/location.
These occur in January 1941 and June 1994 – the 1941 shift is well known (post office to airport move). I can find no connection for the 1994 shift.
These plots show the 2 periods, each giving a shift of 0.8 °C:
http://img33.imageshack.us/img33/2505/darwincorrectionpoints.png
The red line shows the effect of a suggested correction.
This plot compares the GHCN corrected curve (green) to that suggested by me (red).
The difference between the 2 is approx 1 °C, compared to the 2.5 °C you quote as the “cheat”:
http://img37.imageshack.us/img37/4617/ghcnsuggestedcorrection.png
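A minimal sketch of the suggested two-step correction (my illustration, with a made-up series; only the break dates and the 0.8 °C offsets come from the comment above):

import numpy as np

def apply_steps(decimal_year, temp, breaks):
    # Add a constant offset to every reading before each break date
    adjusted = temp.copy()
    for when, offset in breaks:
        adjusted[decimal_year < when] += offset
    return adjusted

years = np.arange(1900.0, 2009.0, 1.0 / 12.0)  # monthly axis (illustrative)
# A made-up series with 0.8 degC drops at Jan 1941 (PO -> airport) and June 1994
raw = np.where(years < 1941.0, 1.6, np.where(years < 1994.5, 0.8, 0.0))

adjusted = apply_steps(years, raw, [(1941.0, -0.8), (1994.5, -0.8)])
print(adjusted.min(), adjusted.max())  # both steps removed: the series is flat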
Labels:
darwin,
temperature shifts,
willis
2009/12/04
Defunct code found in Stolen data
Robert Greiner, you state (on wattsupwiththat):
Line 8
This is where the magic happens. Remember that array we have of valid temperature readings? And, remember that random array of numbers we have from line two? Well, in line 4, those two arrays are interpolated together.
The interpol() function will take each element in both arrays and “guess” at the points in between them to create a smoothing effect on the data. This technique is often used when dealing with natural data points, just not quite in this manner.
The main thing to realize here, is, that the interpol() function will cause the valid temperature readings (yrloc) to skew towards the valadj values.
Let's look at a bit more of that code:
; Apply a VERY ARTIFICAL correction for decline!!
;yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
yearlyadj=interpol(valadj,yrloc,timey)
;
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
;
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21
yearlyadj=interpol(valadj,yrloc,timey)
Does not this line give a yearly adjustment value interpolated from the 20 year points?
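As a check on that reading, here is the same call sketched with numpy.interp (note the argument order differs from IDL's interpol); it only builds the yearly adjustment series and applies nothing by itself:

import numpy as np

# yrloc = [1400, findgen(19)*5.+1904] -> 1400, then 1904..1994 in steps of 5
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904.0))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

timey = np.arange(1400.0, 1995.0)            # assumed yearly time axis
# IDL: yearlyadj = interpol(valadj, yrloc, timey)
yearlyadj = np.interp(timey, yrloc, valadj)  # numpy order: (x_new, x_known, y_known)
print(yearlyadj[-1])                         # 2.6 * 0.75 = 1.95 at the modern end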
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21
Do not these lines plot the data derived from yyy, the unadjusted series?
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
The smoking gun line!!!!
Does not this line plot data derived from yyy+yearlyadj, the FUDGED FIGURE?
BUT............
IT'S COMMENTED OUT!!
This is further backed up by the end of file:
plot,[0,1],/nodata,xstyle=4,ystyle=4
;legend,['Northern Hemisphere April-September instrumental temperature',$
; 'Northern Hemisphere MXD',$
; 'Northern Hemisphere MXD corrected for decline'],$
; colors=[22,21,20],thick=[3,3,3],margin=0.6,spacing=1.5
legend,['Northern Hemisphere April-September instrumental temperature',$
'Northern Hemisphere MXD'],$
colors=[22,21],thick=[3,3],margin=0.6,spacing=1.5
To me this looks as if 'Northern Hemisphere MXD corrected for decline' would have been printed in colour 20 - just the same as the smoking gun line. HOWEVER you will note that this section is commented out also.
This code was written in 1998. If it had been used in any published document, then there would have been no leaked emails about hiding the decline!
So in my view this is code left in after a quick look-see.
Remember, engineers and scientists are human; they play if bored and do not always tidy up.
have a look at:
http://micro.magnet.fsu.edu/creatures/index.html
Additional
From wuwt and woodfortrees
....
Here’s Gavin of RC on the subject (which was quoted by “Norman” in comments on your previous posting):
“It was an artificial correction to check some calibration statistics to see whether they would vary if the divergence was an artifact of some extra anthropogenic impact. It has never been used in a published paper (though something similar was explained in detail in this draft paper by Osborn). It has nothing to do with any reconstruction used in the IPCC reports.”
And indeed, in the same set of comments, “Morgan” pointed out that the Osborn et al. paper explicitly describes this step:
“To overcome these problems, the decline is artificially removed from the calibrated tree-ring density series, for the purpose of making a final calibration. The removal is only temporary, because the final calibration is then applied to the unadjusted data set (i.e., without the decline artificially removed). Though this is rather an ad hoc approach, it does allow us to test the sensitivity of the calibration to time scale, and it also yields a reconstruction whose mean level is much less sensitive to the choice of calibration period.”
I’m not sure which one of these your particular code snippet is doing, but either seem perfectly reasonable explanations to me – and both require the code to be added and them removed again. The lazy programmer’s way of doing this is by commenting and uncommenting.
If some hacker accessed some code illegally which contains commented-out sections:
1. You do not know the status of the code - development, an interim issue, or final issued. How can you criticise it?
2. The presence of commented-out code or a separate programme (which is what this thread is about) does not prove intent to commit fraud. As someone else commented, the presence of unwritten code written by the invisible pink unicorn that says invisibly "this code creates a hockey stick" will not stand up in a court of law. To use the argument here that "it could have been used so it must show intent to commit fraud" is disingenuous to say the least.
Mike
20/2/11
WUWT entry
briffa_sep98_e.pro
Line 4:
; Reads Harry’s regional timeseries and outputs the 1600-1992 portion
Line 10:
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
========================
Notice that phrase "fudge factor" doesn't sound like hiding to me!
========================
Lines 53-70
; APPLY ARTIFICIAL CORRECTION
I feel that briffa_sep98_e.pro is the encoding of a lie.
Did you notice this:
;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
Hiding????
Next file calibrate_nhrecon
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline that affects tree-ring density records)
next file recon_overpeck
; Specify period over which to compute the regressions (stop in 1940 to avoid
; the decline
(I think they mean 1960 !! to agree with the code that follows)
Next File recon_esper.pro
recon_mann.pro
recon2.pro
recon_jones.pro
recon_tornyamataim.pro
All the same comment added in the header
Hiding?? I do not think so
recon_tornyamataim.pro
Seems to be the later version of your files:
densadj=reform(rawdat(2:3,*))
ml=where(densadj eq -99.999,nmiss)
densadj(ml)=!values.f_nan
Note: no yearlyadj, no valadj.
So which programme was used to publish?
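For comparison, the missing-value handling in that later file, sketched in Python (the -99.999 marker is from the quoted code; the data values are illustrative):

import numpy as np

densadj = np.array([0.12, -99.999, 0.31, 0.28, -99.999])  # illustrative values
# IDL: ml = where(densadj eq -99.999, nmiss) ; densadj(ml) = !values.f_nan
densadj[densadj == -99.999] = np.nan
print(densadj)  # [ 0.12    nan   0.31   0.28    nan]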
Line 8
This is where the magic happens. Remember that array we have of valid temperature readings? And, remember that random array of numbers we have from line two? Well, in line 4, those two arrays are interpolated together.
The interpol() function will take each element in both arrays and “guess” at the points in between them to create a smoothing effect on the data. This technique is often used when dealing with natural data points, just not quite in this manner.
The main thing to realize here, is, that the interpol() function will cause the valid temperature readings (yrloc) to skew towards the valadj values.
Lets look at a bit more of that code:
; Apply a VERY ARTIFICAL correction for decline!!
;yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'
yearlyadj=interpol(valadj,yrloc,timey)
;
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
;
filter_cru,5.,/nan,tsin=yyy,tslow=tslow
oplot,timey,tslow,thick=5,color=21
yearlyadj=interpol(valadj,yrloc,timey)
Does not this line give a yearly adjustment value interpolated from the 20 year points?
filter_cru,5.,/nan,tsin=yyy,tslow=tslow oplot,timey,tslow,thick=5,color=21
Does not this line plot data derived from yyy
;filter_cru,5.,/nan,tsin=yyy+yearlyadj,tslow=tslow
;oplot,timey,tslow,thick=5,color=20
The smoking gun line!!!!
Does not his line plot data derived from yyy+yearlyadj The FUDGED FIGURE
BUT............
IT'S COMMENTED OUT!!
This is further backed up by the end of file:
plot,[0,1],/nodata,xstyle=4,ystyle=4
;legend,['Northern Hemisphere April-September instrumental temperature',$
; 'Northern Hemisphere MXD',$
; 'Northern Hemisphere MXD corrected for decline'],$
; colors=[22,21,20],thick=[3,3,3],margin=0.6,spacing=1.5
legend,['Northern Hemisphere April-September instrumental temperature',$
'Northern Hemisphere MXD'],$
colors=[22,21],thick=[3,3],margin=0.6,spacing=1.5
To me this looks as if 'Northern Hemisphere MXD corrected for decline' would have been printed in colour 20 - just the same as the smoking gun line. HOWEVER you will note that this section is commented out also.
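To make the point concrete, here is a sketch (Python/matplotlib, synthetic data – not the real MXD series, and the 5-year filter_cru smoothing is omitted) of the two code paths, with the 'corrected' plot and its legend entry commented out exactly as in the IDL:

import numpy as np
import matplotlib.pyplot as plt

yrloc = np.concatenate(([1400.0], 1904.0 + 5.0 * np.arange(19)))
valadj = 0.75 * np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                          0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6])
timey = np.arange(1400.0, 1995.0)
yyy = np.random.default_rng(0).normal(0.0, 0.2, timey.size)  # stand-in series
yearlyadj = np.interp(timey, yrloc, valadj)

# Active path -- what the program actually draws (IDL colour 21):
plt.plot(timey, yyy, label='Northern Hemisphere MXD')

# Commented-out path -- the corrected series (IDL colour 20) is never drawn,
# matching the commented-out legend entry at the end of the file:
# plt.plot(timey, yyy + yearlyadj,
#          label='Northern Hemisphere MXD corrected for decline')

plt.legend()
plt.show()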
This code was written in 1998. If it had been used in any published document then there would have been no leaked emails about hiding the decline!
So in my view this is code left in after a quick look-see.
Remember: engineers and scientists are human; they play when bored and do not always tidy up.
Have a look at:
http://micro.magnet.fsu.edu/creatures/index.html
Additional
From wuwt and woodfortrees
....
Here’s Gavin of RC on the subject (which was quoted by “Norman” in comments on your previous posting):
“It was an artificial correction to check some calibration statistics to see whether they would vary if the divergence was an artifact of some extra anthropogenic impact. It has never been used in a published paper (though something similar was explained in detail in this draft paper by Osborn). It has nothing to do with any reconstruction used in the IPCC reports.”
And indeed, in the same set of comments, “Morgan” pointed out that the Osborn et al. paper explicitly describes this step:
“To overcome these problems, the decline is artificially removed from the calibrated tree-ring density series, for the purpose of making a final calibration. The removal is only temporary, because the final calibration is then applied to the unadjusted data set (i.e., without the decline artificially removed). Though this is rather an ad hoc approach, it does allow us to test the sensitivity of the calibration to time scale, and it also yields a reconstruction whose mean level is much less sensitive to the choice of calibration period.”
I'm not sure which one of these your particular code snippet is doing, but either seems a perfectly reasonable explanation to me – and both require the code to be added and then removed again. The lazy programmer's way of doing this is by commenting and uncommenting.
If some hacker accessed some code illegally which contains commented-out sections:
1. You do not know the status of the code – development, an interim issue, or the final issued version. How can you criticise it?
2. The presence of commented-out code or a separate programme (which is what this thread is about) does not prove intent to commit fraud. As someone else commented, the presence of unwritten code, written by the invisible pink unicorn, that invisibly says "this code creates a hockey stick" will not stand up in a court of law. To use the argument here that "it could have been used so it must show intent to commit fraud" is disingenuous to say the least.
Mike
20/2/11
WUWT entry
briffa_sep98_e.pro
Line 4:
; Reads Harry’s regional timeseries and outputs the 1600-1992 portion
Line 10:
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
========================
Notice that phrase "fudge factor" – it doesn't sound like hiding to me!
========================
Lines 53-70
; APPLY ARTIFICIAL CORRECTION
I feel that briffa_sep98_e.pro is the encoding of a lie.
Did you notice this:
;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********
Hiding????
Next file calibrate_nhrecon
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline that affects tree-ring density records)
Next file recon_overpeck
; Specify period over which to compute the regressions (stop in 1940 to avoid
; the decline
(I think they mean 1960 !! to agree with the code that follows)
Next file recon_esper.pro
recon_mann.pro
recon2.pro
recon_jones.pro
recon_tornyamataim.pro
All have the same comment added in the header.
Hiding?? I do not think so
recon_tornyamataim.pro
Seems to be the later version of your files:
densadj=reform(rawdat(2:3,*))
ml=where(densadj eq -99.999,nmiss)
densadj(ml)=!values.f_nan
Note: no yearlyadj, no valadj.
So which programme was used to publish??
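For what it is worth, those three lines are just standard missing-value handling. A rough Python equivalent (the rawdat layout here is an illustrative stand-in, not the real file):

import numpy as np

# Illustrative stand-in for the file's data block (years plus density rows)
rawdat = np.array([[1400.0, 1401.0, 1402.0, 1403.0],
                   [0.10, 0.20, 0.15, 0.05],
                   [0.30, -99.999, 0.25, 0.40],
                   [0.20, 0.10, -99.999, 0.35]])

densadj = rawdat[2:4, :].copy()        # IDL rawdat(2:3,*) includes row 3
densadj[densadj == -99.999] = np.nan   # IDL: where(densadj eq -99.999) -> f_nan

print(np.nanmean(densadj))             # later statistics now skip missing points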
Labels: code, cru, idiots, smoking gun, wuwt
2009/11/13
2009/11/09
CO2 the stuff of life
Let's look at two gases:
a poison – Hydrogen Sulphide, H2S
and a nutrient beneficial to all life – CO2
H2S
http://www.drthrasher.org/toxicology_of_hydrogen_sulfide.html
10 ppm
Beginning of Eye Irritation
50-100 ppm
Slight conjunctivitis and respiratory tract irritation after one hour
100 ppm
Coughing, eye irritation, loss of sense of smell after 2-15 minutes. Altered respiration, pain in the eyes, and drowsiness after 15-30 minutes, followed by throat irritation after one hour. Several hours' exposure results in a gradual increase in severity of symptoms, and death may occur within the next 48 hours.
200-300 ppm
Marked conjunctivitis and respiratory tract irritation after one hour exposure.
500-700 ppm
Loss of consciousness and possibly death in 30 minutes to one hour of exposure.
700-1000 ppm
Rapid unconsciousness, cessation of respiration, and death
1000-2000 ppm
Unconsciousness at once, with early cessation of respiration and death in a few minutes. Death may occur if individual is removed to fresh air at once.
The most dangerous aspect of hydrogen sulfide results from olfactory accommodation and/or olfactory paralysis. This means that the individual can accommodate to the odor and is not able to detect the presence of the chemical after a short period of time. Olfactory paralysis occurs in workers who are exposed to 150 ppm or greater. This occurs rapidly, leaving the worker defenseless. Unconsciousness and death have been recorded following prolonged exposure at 50 ppm.
http://www.ncbi.nlm.nih.gov/pubmed/10998771
There were 80 fatalities from hydrogen sulfide in 57 incidents, with 19 fatalities and 36 injuries among coworkers attempting to rescue fallen workers.
CO2
Carbon dioxide is an asphyxiant. It initially stimulates respiration and then causes respiratory depression.
High concentrations result in narcosis. Symptoms in humans are as follows:
CONCENTRATION: EFFECT:
1% (10,000 ppm): Breathing rate increases slightly.
2%: Breathing rate increases to 50% above normal level. Prolonged exposure can cause headache, tiredness.
3%: Breathing increases to twice normal rate and becomes labored. Weak narcotic effect. Impaired hearing, headache, increased blood pressure and pulse rate.
4-5%: Breathing increases to approximately four times normal rate, symptoms of intoxication become evident, and slight choking may be felt.
5-10%: Characteristic sharp odor noticeable. Very labored breathing, headache, visual impairment, and ringing in the ears. Judgment may be impaired, followed within minutes by loss of consciousness.
10-100%: Unconsciousness occurs more rapidly above the 10% level. Prolonged exposure to high concentrations may eventually result in death from asphyxiation.
http://yarchive.net/med/co2_poisoning.html
All true, but the subjective distress is almost entirely caused by the high CO2. Humans don't have good hypoxia sensors, and people have walked into nitrogen-filled rooms and died before they even realized there was anything wrong. You can breathe into a closed circuit which takes out the CO2 until you pass out from hypoxia, without much discomfort at all. On the other hand, in a submarine or someplace where CO2 is building up but there's plenty of oxygen, it's intensely uncomfortable, and feels like dying. So does breathing that 5% CO2 / 95% O2 medical mix they treat CO victims with.
And when the CO2 hits about 7% to 10% of your ambient air, you DO die. Even if the rest is O2. It's CO2 narcosis, and it shuts you down. 5% CO2 is about 40 Torr, your normal blood level. So if you breathe that, you go up to 80 Torr, enough to black you out unless you hyperventilate. Double your minute volume and you can get down to 60 Torr, but you feel crummy. At 10% there's no way to keep below about 90 Torr, and (unless you're a chronic COPD patient who's used to high CO2s and has a high bicarb and other compensatory mechanisms) you black out. Then quit hyperventilating. Then quit breathing entirely.
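The Torr figures in that quote are straightforward partial-pressure arithmetic (sea-level pressure taken as 760 Torr; a quick sketch, ignoring water vapour):

ATM_TORR = 760.0  # assumed sea-level pressure

for fraction in (0.05, 0.07, 0.10):
    print(f"{fraction:.0%} CO2 -> {fraction * ATM_TORR:.0f} Torr inspired")

# 5%  -> 38 Torr: roughly the "about 40 Torr" of normal arterial blood
# 10% -> 76 Torr inspired; arterial pCO2 then climbs higher still (~90 Torr)
#        because the body keeps adding metabolic CO2 on top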
http://www.therhondda.co.uk/gases/carbon_dioxide.html
Included to show that the combined effects of carbon dioxide and a shortage of oxygen are much more intense than either condition alone.
So firstly it is not benign above 50,000 ppm (5%).
Secondly it is not a poison, but it kills:
Deaths:
Look up "choke damp" in mines.
Look up Lake Nyos (around 1,700 deaths) and Lake Monoun (37 deaths).
So please cut the stuff about how CO2 is the stuff of life.
Labels: co2, co2 toxicity
2009/11/08
2009/10/18
noise tree rings and stuff
But surely random sequences added together are just that: random. Because they are random there will be sequences that conform to any required curve, but outside the region of conformance the sequence will fall back to randomness – an average of zero.
Surely what is being proposed is that tree growth is controlled by many factors: not randomness, just noise plus a combination of factors.
Trees will not grow at -40C; trees will not grow at +100C.
Trees do grow well at temperatures in between (all else being satisfactory).
Choosing trees that grow in tune with the temperature means that if they extend beyond the temperature record then there is a greater possibility that they will continue to grow in tune with the temperature. If they grow to a different tune then they are invalid responders.
A long time ago I posted a sequence of pictures showing what can be obtained by adding and averaging a sequence of aligned photos - the only visible data was the church and sky glow. I added 128 of these images together and obtained this photo:
[image: average of 128 stacked frames – church, sky and back garden visible]
Note that it also shows the imperfections in the digital sensor (the window frame effect)
ImageShack did have a single image with the gamma turned up to reveal the only visible content (church + sky glow), but they've lost it!
The picture was taken in near dark conditions.
A flash photo of the same:
[image: flash photo of the same scene]
By removing all invalid data (pictures of the wife, the kids, flowers etc) that do not have the church and sky, a reasonable picture of the back garden appears from the noise.
Of course I may have included a few dark pictures with two streetlights in those locations, but with enough copies of the correct image these will have a diminishing effect.
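The trick is plain averaging: the shared scene adds coherently while the random noise cancels as 1/sqrt(N). A minimal sketch with synthetic frames (not the original photos):

import numpy as np

rng = np.random.default_rng(0)

signal = np.zeros((64, 64))
signal[20:40, 25:35] = 0.05                  # a faint "church", barely above black

# 128 aligned frames, each swamped by sensor noise
frames = [signal + rng.normal(0.0, 0.5, signal.shape) for _ in range(128)]
stacked = np.mean(frames, axis=0)            # noise std falls by sqrt(128) ~ 11x

print(np.std(frames[0] - signal))            # ~0.50: single frame, signal invisible
print(np.std(stacked - signal))              # ~0.044: stack, signal now stands out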
--------------
This cap shape must have a dependence on temperature. It may not be linear but it must be there.
Somewhere between 15C and 100C the growth must start declining. Did trees pass the optimum in the 60s?
Uncontrolled emissions in the 60s, 70s and 80s were known to cause acid rain (to the extent that some countries were forced to add lime to lakes to prevent damage); there was plenty of evidence that trees were being damaged too.
Is it not true to say damaged trees = slow growth?
There are many factors that can slow tree growth, but apart from over-temperature these effects would have been smaller while industrialisation was limited (before 1900?).
Trees are rubbish thermometers, but in all the noise there MUST be a temperature signal. A large local sample will lower the noise from sickness, or damage. A large global sample will lower the noise from changes in soil fertility, etc.
Nothing will remove the noise from CO2 fertilisation, or other global events.
Some trees growing at the limit of their water needs may be negatively affected by rises in temperature above their minimum growing value – growing in heat requires more water. These will always show a negative growth response to temperature. But if averaged with enough positive responders they will be insignificant.
But the signal that remains must, when averaged, contain a temperature signal (not necessarily linear).
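That argument is easy to test in miniature. A sketch (synthetic trees, with a capped growth curve of my own choosing) showing that averaging many noisy responders recovers the shared temperature signal:

import numpy as np

rng = np.random.default_rng(1)

years = np.arange(1900, 2000)
temp = 0.01 * (years - 1900) + 0.3 * np.sin(years / 8.0)   # toy temperature series

def growth(t, optimum=1.5, width=3.0):
    # capped (inverted-U) response: peaks at the optimum, declines either side
    return np.exp(-((t - optimum) / width) ** 2)

# 500 trees: same temperature response, large individual noise
rings = np.array([growth(temp) + rng.normal(0.0, 1.0, temp.size)
                  for _ in range(500)])

mean_rings = rings.mean(axis=0)              # noise shrinks by sqrt(500) ~ 22x
print(np.corrcoef(mean_rings, temp)[0, 1])   # a clear positive correlation emerges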
wiki:
"Overall, the Program's cap and trade program has been successful in achieving its goals. Since the 1990s, SO2 emissions have dropped 40%, and according to the Pacific Research Institute, acid rain levels have dropped 65% since 1976.[16][17] However, this was significantly less successful than conventional regulation in the European Union, which saw a decrease of over 70% in SO2 emissions during the same time period.[18]
In 2007, total SO2 emissions were 8.9 million tons, achieving the program's long term goal ahead of the 2010 statutory deadline.[19]
The EPA estimates that by 2010, the overall costs of complying with the program for businesses and consumers will be $1 billion to $2 billion a year, only one fourth of what was originally predicted.[16]"
"However, the issue of acid rain first came to the attention of the international community in the late 1960s, having been identified in certain areas of southern Scandinavia, where it was damaging forests. The matter quickly became an international issue when it was discovered that the acid deposits in these areas were a result of heavy pollution in the UK and other parts of northern Europe.
http://www.politics.co.uk/briefings-guides/issue-briefs/environment-and-rural-affairs/acid-rain-$366677.htm
Acid rain and air pollution emerged from the industrial boom of the early 1900s onwards and the increasing levels of chemical production associated with these processes. The building of taller industrial chimneys from the 1960s onwards was largely held to be responsible for pollutants generated in the UK blowing as far as Scandinavia. "
Labels: confirmation bias, tree rings
CO2 and IR absorption
TomVonk:
October 16th, 2009 at 4:03 am
Re: thefordprefect (#186),
"From what I have seen the logarithmic effect is usually explained by the absoption bands getting full - ie. no more radiation can be absorbed. Radiation is then absorbed by smaller absorptions bands and and by the under used width of the CO2 bands.
And the CO2 GH effect is vastly lessened by many of the bands falling within the H20 bands."
.
I don't konw where you have seen that but this explanation is not even wrong .
It is absolutely and totally forbidden that a band , any band gets "full" or "saturated" .
The population that is in an excited state is a CONSTANT for a given temperature (f.ex the CO2 15µ excited state represents 5 % of the total CO2 population at room temperatures) . This is prescribed by the MB distribution of the quantum states .
It doesn't depend on the number of molecules , the intensity of radiation or the age of the captain . Only temperature .
So whatever amount of IR you throw at a CO2 population , they will absorb it all and REEMIT .
They can't do anything else because they must do whatever it takes to keep the percentage of excited states constant .
.
Imagine a dam (CO2 molecules) and a lake whose level (percentage of excited states) is exactly at the top of the dam .
If you increase the flow in the lake (absorbed radiation) all that will happen is that the flow over the top of the dam (emitted radiation) will increase by exactly the same amount .
If you increase the height of the dam (temperature) , the level of the lake will go in a transient untill it gets to the new top and then it's again exactly like described above .
There is no "saturation" .
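TomVonk's 5% figure can be sanity-checked with the Boltzmann factor for the CO2 bending mode at 667 cm^-1 (the ~15µ band). A quick sketch – degeneracy and the full partition function are glossed over, so treat it as order-of-magnitude only:

import numpy as np

h = 6.626e-34   # Planck constant, J s
c = 2.998e10    # speed of light in cm/s, so wavenumbers in cm^-1 work directly
k = 1.381e-23   # Boltzmann constant, J/K

nu = 667.0      # cm^-1, CO2 bending mode (~15 micron band)
T = 300.0       # K, room temperature

print(np.exp(-h * c * nu / (k * T)))   # ~0.04; the mode's double degeneracy
                                       # roughly doubles it, near the quoted 5%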
Labels: co2, co2 saturation
2009/10/12
Oceans as temperature controllers
Re: tallbloke (#41),
Are you also posting as Stephen Wilde on the wuwt blog? You seem to be pushing the same ideas!
SW radiation penetrates sea water further than LW radiation. However, this does not mean that SW radiation penetrates 20 m of water and suddenly transfers all its energy at that depth. It is progressively absorbed on the way down until, at depth, there is no SW radiation left. So IR heats the surface only; UV heats the surface mainly. Air in contact with the sea surface is rapidly heated by the water, and the water cools fractionally ONLY if the air temperature is less than the water temperature. If the water temperature is less than the air temperature (as it usually is during daylight hours) then the air will be cooled and the water warmed very fractionally.
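That progressive absorption is just Beer-Lambert decay. A sketch with illustrative absorption coefficients for clear water (the exact numbers depend on wavelength and turbidity – these are assumed round values, not measurements):

import numpy as np

# assumed illustrative absorption coefficients, 1/m, for clear water
bands = {"near-IR (~1000 nm)": 30.0,
         "red (~650 nm)": 0.35,
         "blue (~450 nm)": 0.02}

for depth in (0.1, 1.0, 10.0, 50.0):            # metres
    left = {name: np.exp(-kabs * depth) for name, kabs in bands.items()}
    print(f"{depth:>5} m:", {n: f"{f:.1%}" for n, f in left.items()})

# near-IR is gone within centimetres (surface heating only); blue-green
# survives to tens of metres, depositing its energy through the mixed layer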
The water temperature varies on a yearly basis round the UK (I assume it does round the rest of the globe?). There is no year-long lag in temperature fluctuation as the seasons change (perhaps only a month??).
My question to you is the same as it has been to Mr. Wilde - how is the ocean going to store this heat over many years as you suggest and then release it to the atmosphere?
Deep water below 900 m is at 4C, 700 m averages 12C, and the surface is at 22C – at an air temperature of ????
http://www.windows.ucar.edu/tour/link=/earth/Water/temp.html&edu=high
If the heat is stored in the upper layers then it is continuously losing that "heat" to COOLER air.
If it is in layers below 900 m, then how is 4C water going to up-well and release heat stored at 4C to air at 5C (for example)?
Assuming it were possible to get heat energy stored at 4C to transfer to air at 12C, how do you prevent these heat-storage layers mixing as the sea slops around for 5 to 10 years?
I would agree that the oceans act as a big temperature-smoothing "capacitor", reducing the yearly variations. Much more than this I need a better physical explanation for, please.
A further point: AMO is often implicated in controlling air temperatures. This was posted on wuwt:
Comparing AMO with Hadcrut3V and Hadcrut3NH there is a wonderful correlation – not so good with CET:
[chart: AMO vs Hadcrut3V / Hadcrut3NH / CET]
Apart from the increased trend caused by ?something?, all the slow humps and dips appear in the right places, and even the rapid changes appear aligned (to the eye!).
So if we zoom in and look at the signals through a much longer moving average the dips again align.
[chart: the same series through a much longer moving average]
The dips in HADCRUT seem to occur a few months ahead of AMO, and the peaks are a bit off. Not sure why CET shows little correlation, but hey, there must be a connection.
If Air Temp is driving AMO then one would expect the air temp changes to occur before AMO
and
Vice Versa.
So now let's look at the same date range through shorter moving averages.
[chart: the same series through shorter moving averages]
Now it becomes interesting: sometimes the air temp leads AMO and sometimes AMO leads air temp.
If amo drives temp then there is no way that amo can lag air temperature.
and
vice versa
To me this says that there is an external driver, or the data is faulty.
Any thoughts?
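One way to put numbers on the lead/lag question is a sliding cross-correlation. A sketch with synthetic series (I don't have the monthly AMO/HadCRUT data to hand) in which both series merely follow a common driver:

import numpy as np

rng = np.random.default_rng(2)

n = 600                                          # months
driver = np.cumsum(rng.normal(0.0, 0.1, n))      # a hypothetical common driver
amo = np.roll(driver, 3) + rng.normal(0.0, 0.05, n)    # follows driver by 3 months
temp = np.roll(driver, 1) + rng.normal(0.0, 0.05, n)   # follows driver by 1 month

def lag_corr(a, b, max_lag=12):
    # correlation of a shifted back by `lag` against b; positive lag => a leads b
    n = a.size
    return {lag: np.corrcoef(a[max_lag - lag:n - max_lag - lag],
                             b[max_lag:n - max_lag])[0, 1]
            for lag in range(-max_lag, max_lag + 1)}

corrs = lag_corr(temp, amo)
print(max(corrs, key=corrs.get))   # ~ +2: "temp leads AMO" -- yet neither
                                   # drives the other; both follow the driver

So an apparent lead in either direction is quite compatible with an external driver, which is exactly the point above.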
Stuff:
Some interesting stuff but not too useful:
http://www.terrapub.co.jp/journals/JO/JOSJ/pdf/2601/26010052.pdf
http://science.jrank.org/pages/4836/Ocean-Zones-Water-depth-vs-light-penetration.html
http://spg.ucsd.edu/People/Mati/2003_Vasilkov_et_al_UV_radiation_SPIE.pdf
interesting book (full)
http://oceanworld.tamu.edu/resources/ocng_textbook/PDF_files/book.pdf
This is the one for wavelength and penetration depth in ocean:
http://www.terrapub.co.jp/journals/JO/JOSJ/pdf/2906/29060257.pdf
IR whacks the water molecules into motion, UV less so – check the absorption bands of water vapour.
I believe this is due to the long solar minimum. When the sunspot count is above 40 or so, the oceans are net gainers of solar heat energy. When the sun is quiet for a while, that energy makes its way back to the surface and is released. The last five solar minima have been followed within 12 months by an El Niño.
2009/10/06
McIntyre refuses offer to do real science
from dot earth
October 5, 2009, 2:41 pm Climate Auditor Challenged to Do Climate Science
By Andrew C. Revkin
Bloggers skeptical of global warming’s causes* and commentators fighting restrictions on greenhouse gases have made much in recent days of a string of posts on Climateaudit.org, one of the most popular Web sites aiming to challenge the deep consensus among climatologists that humans are setting the stage for generations of disrupted climate and rising seas. In the posts, Stephen McIntyre questions sets of tree-ring data used in, or excluded from, prominent studies concluding that recent warming is unusual even when compared with past warm periods in the last several millenniums (including the recent Kaufman et al. paper discussed here).
Mr. McIntyre has gained fame or notoriety, depending on whom you consult, for seeking weaknesses in NASA temperature data and efforts to assemble a climate record from indirect evidence like variations in tree rings. Last week the scientists who run Realclimate.org, several of whom are authors of papers dissected by Mr. McIntyre, fired back. The Capital Weather Gang blog has just posted its analysis of the fight. One author of an underlying analysis of tree rings, Keith Briffa, responded on his Web site and on Climateaudit.org.
What is novel about all of this is how the blog discussions have sidestepped the traditional process of peer review and publication, then review and publication of critiques, and counter-critiques, by which science normally does that herky-jerky thing called knowledge building. The result is quick fodder for those using the Instanet to reinforce intellectual silos of one kind or another.
I explored this shift in the discourse in some e-mail exchanges with Mr. McIntyre and some of his critics, including Thomas Crowley, a University of Edinburgh specialist in unraveling past climate patterns. Dr. Crowley and Mr. McIntyre went toe to toe from 2003 through 2005 over data and interpretations. I then forwarded to Mr. McIntyre what amounted to a challenge from Dr. Crowley:
Thomas Crowley (now in Edinburgh) has sent me a note essentially challenging you to develop your own time series [of past climate patterns] (kind of a “put up or shut up” challenge). Why not do some climate science and get it published in the literature rather than poking at studies online, having the blogosphere amplify or distort your findings in a kind of short circuit that may not help push forward understanding?
As [Dr. Crowley] puts it: “McIntyre is really tiresome - notice he never publishes an alternate reconstruction that he thinks is better, oh no, because that involves taking a risk of him being criticized. He just nitpicks others. I don’t know of anyone else in science who actually does such things but fails to do something constructive himself.”
Here’s Mr. McIntyre’s reply (to follow references to publications you’ll need to refer to the linked papers). In essence, he says he sees no use in trying his own temperature reconstruction given the questions about the various data sets one would need to utilize:
The idea that I’m afraid of “taking a risk” or “taking a risk of being criticized” is a very strange characterization of what I do. Merely venturing into this field by confronting the most prominent authors at my age and stage of life was a far riskier enterprise than Crowley gives credit for. And as for “taking a risk of being criticized”? Can you honestly think of anyone in this field who is subjected to more criticism than I am? Or someone who has more eyes on their work looking for some fatal error?
The underlying problem with trying to make reconstructions with finite confidence intervals from the present roster of proxies is the inconsistency of the “proxies,” a point noted in McIntyre and McKitrick (PNAS 2009) in connection with Mann et al 2008 (but applies to other studies as well) as follows:
Paleoclimate reconstructions are an application of multivariate calibration, which provides a theoretical basis for confidence interval calculation (e.g., refs. 2 and 3). Inconsistency among proxies sharply inflates confidence intervals (3). Applying the inconsistency test of ref. 3 to Mann et al. A.D. 1000 proxy data shows that finite confidence intervals cannot be defined before ~1800.
Until this problem is resolved, I don’t see what purpose is served by proposing another reconstruction.
Crowley interprets the inconsistency as evidence of past “regional” climate, but offers no support for this interpretation other than the inconsistency itself – which could equally be due to the “proxies” not being temperature proxies. There are fundamental inconsistencies at the regional level as well, including key locations of California (bristlecones) and Siberia (Yamal), where other evidence is contradictory to Mann-Briffa approaches (e.g. Millar et al 2006 re California; Naurzbaev et al 2004 and Polar Urals re Siberia). These were noted in the N.A.S. panel report, but Briffa refused to include the references in I.P.C.C. AR4. Without such detailed regional reconciliations, it cannot be concluded that inconsistency is evidence of “regional” climate as opposed to inherent defects in the “proxies” themselves.
The fundamental requirement in this field is not the need for a fancier multivariate method to extract a “faint signal” from noise – such efforts are all too often plagued with unawareness of data mining and data snooping. These problems are all too common in this field (e.g. the repetitive use of the bristlecones and Yamal series). I think that I’ve made climate scientists far more aware of these and other statistical problems than previously, whether they are willing to acknowledge this in public or not, and that this is highly “constructive” for the field.
As I mentioned to you, at least some prominent scientists in the field accept (though not for public attribution) the validity of our criticisms of the Mann-Briffa style reconstruction and now view such efforts as a dead end until better quality data is developed. If this view is correct, and I believe it is, then criticizing oversold reconstructions is surely “constructive” as it forces people to face up to the need for such better data.
Estimates provided to me (again without the scientists being prepared to do so in public) were that the development of such data may take 10-20 years and may involve totally different proxies than the ones presently in use. If I were to speculate on what sort of proxies had a chance of succeeding, it would be ones that were based on isotope fractionation or other physical processes with a known monotonic relationship to temperature and away from things like tree ring widths and varve thicknesses. In “deep time,” ice core O18 and foraminifera Mg/Ca in ocean sediments are examples of proxies that provide consistent or at least relatively consistent information. The prominent oceanographer Lowell Stott asked to meet with me at AGU 2007 to discuss long tree ring chronologies for O18 sampling. I sent all the Almagre cores to Lowell Stott’s lab, where Max Berkelhammer is analyzing delO18 values.
Underlying my articles and commentary is the effort to frame reconstructions in a broader statistical framework (multivariate calibration) where there is available theory, a project that seems to be ignored both by applied statisticians and climate scientists. At a 2007 conference of the American Statistical Association to which Caspar Ammann (but not me) was invited, it was concluded:
While there is undoubtedly scope for statisticians to play a larger role in paleoclimate research, the large investment of time needed to become familiar with the scientific background is likely to deter most statisticians from entering this field. http://www.climateaudit.org/?p=2280
I’ve been working on this from time to time over the past few years and this too seems “highly constructive” to me and far more relevant to my interests and skills than adding to the population of poorly constrained “reconstructions,” as Crowley proposes.
In the meantime, studies using recycled proxies and problematic statistical methods continue to be widely publicized. Given my present familiarity with the methods and proxies used in the field, I believe that there is a useful role for timely analysis of the type that I do at Climate Audit. It would be even more constructive if the authors rose to the challenge of defending their studies.
Given the importance of climate change as an issue, it remains disappointing that prompt archiving of data remains an issue with many authors and that funding agencies and journals are not more effective in enforcing existing policies or establishing such policies if existing policies are insufficient. It would be desirable as well if journals publishing statistical paleoclimate articles followed econometric journal practices by requiring the archiving of working code as a condition of review. While progress has been slow, I think that my efforts on these fronts, both data and code, have been constructive. It is disappointing that Crowley likens the archiving of data to doing a tax return. It’s not that hard. Even in blog posts (e.g. the Briffa post in question), I frequently provide turnkey code enabling readers to download all relevant data from original sources and to see all statistical calculations and figures for themselves. This is the way that things are going to go – not Crowley’s way.
So should this all play out within the journals, or is there merit to arguments of those contending that the process of peer review is too often biased to favor the status quo and, when involving matters of statistics, sometimes not involving the right reviewers?
Another scientist at the heart of the temperature-reconstruction effort, Michael Mann of Pennsylvania State University, said that if Mr. McIntyre wants to be taken seriously he has to move more from blogging to publishing in the refereed literature.
“Skepticism is essential for the functioning of science,” Dr. Mann said. “It yields an erratic path towards eventual truth. But legitimate scientific skepticism is exercised through formal scientific circles, in particular the peer review process.” He added: “Those such as McIntyre who operate almost entirely outside of this system are not to be trusted.”
2009/10/03
Grape harvest
Nothing seems to give a useful proxy for temperature. Some of the better ones are grape harvest and budburst dates, but these only go back to about the 1300s.
[charts: grape harvest date and budburst date series]
Note that the grape harvest series has not been converted to temperature, so high temperature = early harvest!
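Converting would be a simple calibration regression over the period where harvest dates and thermometer records overlap. A hypothetical sketch with synthetic data (the -8 days per degC slope is invented for illustration):

import numpy as np

rng = np.random.default_rng(3)

# synthetic overlap period: warmer summers -> earlier harvest
temp = 16.0 + rng.normal(0.0, 1.0, 200)                   # summer mean, degC
harvest_day = 280.0 - 8.0 * (temp - 16.0) + rng.normal(0.0, 4.0, 200)

slope, intercept = np.polyfit(harvest_day, temp, 1)
print(slope)                      # negative, ~ -0.1 degC per day of later harvest

# reconstruct temperature for a year with only a harvest date recorded:
print(slope * 265.0 + intercept)  # an early harvest (day 265) -> a warm summer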
Labels: grape harvest, proxy, temperature records