thefordprefect Posted Mar 31, 2011 at 8:40 PM
Some time ago I did an experiment using a digital camera at night. (The camera tries to compensate for the lack of light by making the sensor more sensitive; this allows random thermal noise to produce the typical digital noise on such photos, which can only be reduced by operating the sensor at ultra-low temperatures.)
It is a good technique for producing pictures in near-impossible conditions: take a power-of-two number of photos, combine them in pairs using the "add" function in Paint Shop Pro, then add each summed photo to another summed photo. Continue combining only two photos at a time until the required result is obtained.
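For anyone without Paint Shop Pro, the pairwise-stacking idea can be sketched in a few lines of Python (numpy assumed; synthetic 1-D "frames" stand in for the photos, and the noise level here is my own illustrative choice, not a camera measurement):

```python
import numpy as np

rng = np.random.default_rng(0)

# A fixed "scene" (the signal) plus heavy independent per-frame sensor noise.
signal = np.sin(np.linspace(0.0, 2.0 * np.pi, 256))
frames = [signal + rng.normal(0.0, 2.0, signal.size) for _ in range(64)]

def stack_pairs(frames):
    """Average frames two at a time until one remains (needs 2**n frames)."""
    while len(frames) > 1:
        frames = [(a + b) / 2.0 for a, b in zip(frames[::2], frames[1::2])]
    return frames[0]

stacked = stack_pairs(frames)

# Averaging N independent frames cuts the noise by about sqrt(N);
# with 64 frames the residual noise should drop by roughly a factor of 8.
noise_single = np.std(frames[0] - signal)
noise_stacked = np.std(stacked - signal)
print(noise_single / noise_stacked)  # roughly 8
```

Because the pairwise tree averages equal numbers of frames on each side, the final result is just the mean of all 64 frames: the common signal stays put while the independent noise averages down.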
What was the purpose:
To show that a signal buried in random noise can be extracted by averaging over many data sources.
I.e. take enough trees and average the ring data, and any common factors in the data may become visible: local effects (fertilisation, lack of nutrients, too much water, too little water, etc.) get reduced, but temperature and CO2 fertilisation are not locally different, so these or similar effects should become dominant in the averaged data. By junking obvious non-responders (invalid photos of the kids, etc.) the common signal is obtained more quickly. We know what the temperature has done over the last few hundred years; is it therefore wrong to dump trees that do not conform? I knew that my photos contained no ships, so why should I average my ship photos into the photo of the back garden?
Does anyone suggest that a proxy record is an exact representation of past temperatures? I have not seen such words used. All these proxies are simply work in progress (and done over a decade ago!). Reports generated a decade ago are not necessarily set in stone; more recent ideas/data can displace such ancient documents. Why are they constantly paraded before us?
The photo experiment can be seen here:
I see no magenta-style plot in these papers; can you direct me to the correct one, please?
Seeing the Wood from the Trees
Keith R. Briffa and Timothy J. Osborn*
High-resolution palaeoclimatic records for the last millennium: interpretation, integration and comparison with General Circulation Model control-run temperatures
P. D. Jones, K. R. Briffa, T. P. Barnett and S. F. B. Tett The Holocene 1998
Low-frequency temperature variations from a northern tree ring density network
Briffa et al 2001
from the blog:
thefordprefect Posted Mar 26, 2011 at 7:52 AM
I’ll reply here but it will probably get pulled.
Watts is observing the current state of temperature proxies (LiG thermometers, Pt resistance thermometers, thermistors, etc.) in various enclosures. These all respond in a certain way to temperature, not always linearly (LiG has a boiling point where it becomes decidedly non-linear). They are all placed over different surfaces; snow, rain, grass growth, new tarmac, etc. will all influence the air temperature measured.
Watts then manually removes any he considers do not CURRENTLY (and have not in the past?) meet the standards he is applying (cherry picking). This leaves the "good responder" proxies.
All thermometers require calibration against known standards.
Briffa does not have this luxury. His proxies are dead trees; there is no possibility of determining which were good proxies during their lifetimes. Rivers may change course, affecting the water table. All trees have an inverted-cup-shaped growth response to temperature: there is an optimum, below and above which growth rates will be lower. This optimum will depend on available nutrients, surrounding competition, etc., all of which will change over the life of the tree.
Trees need calibrating against known standards: the instrumental data.
McIntyre’s blog has already castigated Briffa for throwing away trees that are not good proxies (cherry picking). This leaves the good proxies. Briffa is now being called “names” for removing bad data that does not give a good proxy for temperature, even though it comes from trees that for some of the period are good responders. This sounds very much like what Watts is doing!
The plot: Hodrick-Prescott filtered data compared with a 50-year average, taken from data on the second page of a spreadsheet residing in the basement repository with "beware of tigers" on the door. No info, just "Briffa et al" as a column heading. Was this published? Was this an intermediary file? Who knows? McIntyre does NOT. But McIntyre assumes much! Also, choose your smoothing and end padding to get the results you require.
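For what it's worth, the Hodrick-Prescott trend mentioned above is just a penalised least-squares fit, and can be sketched in a few lines (my own dense-matrix illustration, numpy assumed; this is not McIntyre's or Briffa's actual code, and the lambda value is arbitrary):

```python
import numpy as np

def hp_filter(y, lam=100.0):
    """Hodrick-Prescott trend: minimise sum((y - trend)**2)
    plus lam times the sum of squared second differences of the trend."""
    y = np.asarray(y, dtype=float)
    n = y.size
    # Second-difference operator D, shape (n-2, n):
    # (D @ trend)[t] = trend[t] - 2*trend[t+1] + trend[t+2]
    D = np.zeros((n - 2, n))
    for t in range(n - 2):
        D[t, t:t + 3] = [1.0, -2.0, 1.0]
    # Closed-form solution of the penalised least squares problem:
    # (I + lam * D'D) @ trend = y
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

# A purely linear series has zero curvature, so the HP trend reproduces it.
t = np.arange(30, dtype=float)
print(np.allclose(hp_filter(2.0 * t + 1.0), 2.0 * t + 1.0))  # True
```

Note that near the endpoints the trend is pinned only by the fit term, not by curvature on both sides, which is exactly why the choice of smoothing and end padding matters so much.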
Adding in the data suggested by a commenter, with more severe filtering to untangle the spaghetti a bit.
A link to AR4 where the failing data is discussed:
to WUWT 11-05-03
Richard S Courtney says: May 2, 2011 at 5:00 pm
Wind farms are expensive, polluting, environmentally damaging bird swatters that produce no useful electricity at any time: they merely displace power stations onto standby mode (when the power stations continue to consume their fuel and to produce their emissions) during the periods when the wind is strong enough but not too strong for the wind turbines to generate electricity.
What do you not understand about conservation of resources? A power station running without producing power (spinning reserve, warm start) consumes very little energy compared to when it is fully loaded. This surely is obvious? Otherwise, where does the excess fuel energy go?
The RSPB considers a correctly placed wind turbine to be OK.
How many birds do the windows on your house wipe out? (We get perhaps 4 deaths/year despite stickers on the panes.)
How many birds/animals does your traveling in road vehicles wipe out?
What is the "bird slicers" to vehicles/homes ratio?
What evidence do you have that wind turbines are polluting? According to Vestas, 80% of a turbine can be recycled.
From Vestas web site:
For example, a V90-3.0 MW offshore wind turbine will pay for itself more than 35 times during its lifetime, producing 284,600 MWh over the course of 20 years.
The complete life cycle analysis of a wind turbine:
Neodymium is not always used:
ENERCON news ENERCON WECs produce clean energy without neodymium
ENERCON wind energy converters (WECs) generate electricity in an environmentally friendly way without the use of the controversial element, neodymium. The gearless WEC design on which all WEC types – from the E-33/330 kW to the E-126/7.5 MW – are based includes a separately excited annular generator. The magnetic fields required by the generator to produce electricity are created electrically. By design, and unlike the majority of competing products, ENERCON WECs do without permanent magnets whose production requires neodymium.
No one thinks that a 1 kW generator will produce economic electricity for the grid. But connect up a 3+ MW generator and, for the 28% of the time it produces power, it is saving an equivalent amount of fossil fuel that future generations can use. Is this a bad thing?
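Treating that 28% figure as a capacity factor (my assumption; the comment does not say so explicitly), the back-of-envelope arithmetic is simple:

```python
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_output_mwh(rated_mw, capacity_factor):
    """Average energy delivered per year by a turbine of the given
    rated power running at the given capacity factor."""
    return rated_mw * capacity_factor * HOURS_PER_YEAR

# A 3 MW turbine at a 28% capacity factor:
print(annual_output_mwh(3.0, 0.28))  # 7358.4 MWh/year
```

Every MWh of that output is a MWh of fossil fuel not burned while the turbine is generating, which is the displacement argument being made here.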
No one expects a few hundred turbines to REPLACE fossil/nuclear generators. All know that there are times of no wind. BUT they do defer the use of convenient fossil energy to the future. And they do reduce pollution overall.
All those YouTube videos of burning and disintegrating turbines are good propaganda, but one has to compare the permanent exclusion zone round a failed turbine to the exclusion zone round a failed reactor.
Posted May 24, 2011 at 7:03 AM
Your comment is awaiting moderation.
Good grief, this is getting ridiculous!
If I wrote a paper on the JET fusion processes and why I need to use beryllium/tungsten walls, who would review it? Geologists? Research chemists? Joe Bloggs, the blogging king? Or would it be other researchers in the field of fusion reactions, and materials scientists?
Would I know these other reviewers? I would be in email, telephonic, and even social contact with them. Some would even be friends! "Pals", to you.
For you to complain about my reviewers, you would have to call them dishonest. Would you be prepared to do that?
You ARE prepared to do that to climate scientist reviewers.