The data record used is short.

The data is noisy.

Subtracting noisy signals does not improve accuracy!

[**UPDATE** This data has now changed: I have nulled out the day-of-year changes and the long-term variation (whole record), which significantly changes the results. The updated results will be posted at a later date.]

Basically, if CO2 is low then "back radiation" (downward long-wave infrared radiation, DLWIR) should be lower than when CO2 is high.

There is an annual cycle where CO2 dips in late spring and rises in autumn - see other posts.

So if you remove all factors changing downward long-wave infrared radiation other than CO2, then what should be left is the yearly change in CO2 plus the long-term increase.

The nulled data is inspected, a simple curve fit is applied, and limits are chosen that provide the best null for that factor.

Returned data that meet the criteria are averaged using a TRIMMEAN function to remove spurious high/low values.
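The TRIMMEAN averaging step can be sketched in Python (a hedged sketch: the spreadsheet's actual trim percentage is not stated, so 30% total is used here purely for illustration):

```python
def trimmean(values, trim_fraction):
    """Average after discarding extreme values, mimicking Excel's
    TRIMMEAN(array, percent): trim_fraction is the TOTAL fraction
    removed, split between the high and low tails."""
    n = len(values)
    k = int(n * trim_fraction / 2)      # values dropped from each end
    trimmed = sorted(values)[k:n - k]
    return sum(trimmed) / len(trimmed)

# One low and one high outlier among plausible DLWIR readings (W/m^2)
readings = [310.2, 311.0, 309.8, 355.0, 310.5, 250.0, 310.9]
print(round(trimmean(readings, 0.3), 2))   # -> 310.48 (outliers removed)
```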

If the data is treated as a repeated annual set, then the long-term trend is averaged out and only the annual effect remains.

In the plots below the Nulled measurements are shown and CO2 at La Jolla is plotted for comparison.

The hourly measurement data is used

The analysis has been run many times; each time there is a dip starting at about day 190 (some ~60 days after the CO2 starts reducing).

Accuracy is nonsensical if fewer than 3 valid data points are returned. This unfortunately eliminates December, January, and February!

However here are the final plots:

The raw data (all points returning under 3 samples ignored) compared to La Jolla CO2

The smoothed data (all points returning under 3 samples ignored) compared to La Jolla CO2

The precipitation limit is set to eliminate any reading taken during "precipitation".

Cloud can only be measured during daylight

Only opaque cloud is considered

Humidity % is not used directly; it is converted to absolute water vapour.
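A standard way to do this conversion is via a saturation vapour pressure approximation and the ideal gas law (a sketch only: the spreadsheet's exact formula is not shown, and the Magnus coefficients below are one common choice):

```python
import math

def absolute_humidity(rh_percent, temp_c):
    """Convert relative humidity (%) and air temperature (degC)
    to absolute humidity in g/m^3."""
    # Saturation vapour pressure in hPa (Magnus approximation)
    svp = 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))
    vp = svp * rh_percent / 100.0              # actual vapour pressure, hPa
    # Ideal gas law: AH = p[Pa] * M_water / (R * T)  ->  g/m^3
    return vp * 100 * 18.016 / (8.314 * (temp_c + 273.15))

print(round(absolute_humidity(50, 20), 2))     # -> 8.62 g/m^3
```

For comparison, the limits table below windows absolute humidity to roughly 2 to 10.5, consistent with values this formula produces in dry-to-moderate conditions.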

### The Nulling Process

Each variable is nulled by plotting DLWIR against that variable, fitting a polynomial (order 1 to 6) to the result, and then setting limits on deviation from the polynomial. The polynomial correction is then applied to the extracted data. Each variable is treated this way, and the process is repeated until little change occurs. This produces the following limits.
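The fit-and-subtract step for one variable can be sketched as follows (a simplified illustration: the synthetic data, variable names, and the use of numpy's `polyfit` in place of the spreadsheet's curve fit are all assumptions):

```python
import numpy as np

def null_variable(dlwir, var, order):
    """Fit a polynomial of the given order to DLWIR vs one driver
    variable, then subtract the fitted trend, leaving the residual."""
    coeffs = np.polyfit(var, dlwir, order)
    return dlwir - np.polyval(coeffs, var)

# Synthetic example: DLWIR dominated by a temperature trend plus noise
rng = np.random.default_rng(0)
temp = rng.uniform(12.4, 29.4, 500)
dlwir = 200.0 + 4.0 * temp + rng.normal(0, 2, 500)

residual = null_variable(dlwir, temp, order=2)
print(abs(residual.mean()) < 1e-6)   # -> True: temperature trend removed
```

In the full process this would be repeated for each driver variable (temperature, opaque cloud, absolute humidity, ULWIR, hour, pressure) and iterated until the fits stop changing.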

| Limit | Value |
|-------|-------|
| Start month | 1 |
| End month | 12 |
| Hour min | 11 |
| Hour max | 15 |
| Temp min | 12.4 |
| Temp max | 29.4 |
| Humidity min | 0 |
| Humidity max | 1000 |
| Opaque cloud cover % min | 2.8 |
| Opaque cloud cover % max | 30.9 |
| Cloud cover min | -999999 |
| Cloud cover max | 1000 |
| Abs humid min | 2.12 |
| Abs humid max | 10.5 |
| DLWIR min | 0 |
| DLWIR max | 1000 |
| ULWIR min | 445 |
| ULWIR max | 595 |
| DLWIR as % of ULWIR min | 0 |
| DLWIR as % of ULWIR max | 100 |
| Start day | 1 |
| End day | 19.2499 |
| Pressure min | 809 |
| Pressure max | 825 |
| Precipitation min | -1 |
| Precipitation max | 0.00001 |
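Applying these windows amounts to a simple per-record filter, which can be sketched as (field names here are assumptions for illustration, not the spreadsheet's actual column headers; only a subset of the limits is shown):

```python
# Hypothetical record filter applying a subset of the limits above
LIMITS = {
    "hour": (11, 15),
    "temp_c": (12.4, 29.4),
    "opaque_cloud_pc": (2.8, 30.9),
    "abs_humid": (2.12, 10.5),
    "ulwir": (445, 595),
    "pressure": (809, 825),
    "precip": (-1, 0.00001),
}

def passes_limits(record):
    """True only if every field lies within its [min, max] window."""
    return all(lo <= record[k] <= hi for k, (lo, hi) in LIMITS.items())

sample = {"hour": 12, "temp_c": 20.0, "opaque_cloud_pc": 10.0,
          "abs_humid": 5.0, "ulwir": 500, "pressure": 815, "precip": 0}
print(passes_limits(sample))   # -> True
```

Records failing any one window are excluded before the TRIMMEAN averaging.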

These are the corrections applied:

| Term | Temperature | Opaque cloud | Abs humidity | ULWIR | Hour | Station pressure |
|------|-------------|--------------|--------------|-------|------|------------------|
| x^6 | -2.925607E-06 | 0.00E+00 | 0 | 0 | 0 | 0 |
| x^5 | 4.16E-04 | -1.30E-05 | 0 | -1.73074E-09 | 0.01161075 | 0 |
| x^4 | -2.35E-02 | 1.05E-03 | -0.01922243 | 4.37603E-06 | -0.794129 | 0 |
| x^3 | 6.78E-01 | -3.04E-02 | 0.5449762 | -0.004404108 | 21.56523 | -0.001826181 |
| x^2 | -1.05E+01 | 3.83E-01 | -5.604792 | 2.20558 | -290.4116 | 4.45165 |
| x | 8.40E+01 | -1.07E+00 | 29.91783 | -549.6395 | 1938.646 | -3617.085 |
| c | 3.22E+01 | 2.81E+02 | 108.4733 | 5.40E+04 | -5133.338 | 979613.7 |
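Each column is a polynomial in one driver variable. Evaluating and removing one of them (the temperature column) can be sketched as follows (a sketch only: whether the spreadsheet subtracts the fitted value, as assumed here, or applies it some other way is not fully specified):

```python
# Temperature-column coefficients from the table above,
# ordered highest power (x^6) down to the constant (c)
TEMP_COEFFS = [-2.925607e-06, 4.16e-04, -2.35e-02, 6.78e-01,
               -1.05e+01, 8.40e+01, 3.22e+01]

def poly_eval(coeffs, x):
    """Evaluate a polynomial by Horner's method; coefficients run
    from the highest power down to the constant term."""
    result = 0.0
    for c in coeffs:
        result = result * x + c
    return result

def corrected_dlwir(dlwir, temp_c):
    """Remove the fitted temperature dependence from a DLWIR reading
    (subtraction assumed)."""
    return dlwir - poly_eval(TEMP_COEFFS, temp_c)
```

The same evaluation applies to the other columns, each against its own driver variable (opaque cloud %, absolute humidity, ULWIR, hour, station pressure).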

The nulling plots (not prettied up!)

Red plots are the results after nulling.

Blue lines are before nulling.

Excel sheet is available (large)

Data is from (hourly):

http://www.nrel.gov/midc/srrl_bms/

Currently ~ 80,000 lines are analysed
