Context/Earth



Thursday, October 13, 2011

Vostok Ice Cores

As the partial pressure of CO2 in sea water goes like
$$ c = c_0 \, e^{-E/kT} $$
and the climate sensitivity is
$$ T = \alpha \ln(c/c_1) $$
where c is the mean concentration of CO2, then it seems that one could estimate a quasi-equilibrium for the planet’s temperature. Even though they look nasty, these two equations actually reduce to a quadratic in T, and one real non-negative value of T will drop out if the coefficients are reasonable. Substituting the first expression into the second gives
$$ T = \alpha \ln(c_0/c_1) - \frac{\alpha E}{kT} $$
For CO2 in fresh water, the activation energy is about 0.23 electron volts. From "Global sea–air CO2 flux based on climatological surface ocean pCO2, and seasonal biological and temperature effects" by Taro Takahashi,
The pCO2 in surface ocean waters doubles for every 16°C temperature increase
(d ln pCO2/dT = 0.0423 per °C).
Converted through the same Arrhenius form, this slope corresponds to an activation energy of about 0.354 eV.
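As a quick check on that conversion, here is a minimal sketch (the reference temperature is an assumed parameter, and the implied value is sensitive to it); since c goes as e^{-E/kT}, the slope is d ln pCO2/dT = E/kT²:

```python
# Sketch: activation energy implied by d ln(pCO2)/dT = E/(k*T^2).
# The reference temperature is an assumption; the result is sensitive to it.
k_eV = 8.617e-5      # Boltzmann constant, eV/K
slope = 0.0423       # d ln(pCO2)/dT per deg C (Takahashi)

for T in (280.0, 290.0, 300.0, 310.0):   # assumed reference temperatures, K
    E = slope * k_eV * T**2
    print(f"T = {T:.0f} K  ->  E ~ {E:.3f} eV")
# Runs from roughly 0.29 eV to 0.35 eV over this range, bracketing the
# 0.354 eV figure used here.
```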

We don’t know what c_0 or c_1 are, but we can use estimates of climate sensitivity for α (between 1.5/ln(2) and 4.5/ln(2)).

When solving the quadratic the two exponential coefficients can be combined as
$$ \ln(w) = \ln(c_0/c_1) $$
then the quasi-equilibrium temperature is approximated by this expansion of the quadratic equation.
$$ T = \alpha \ln(w) - \frac{E}{k \ln(w)} $$
The term w represents the ratio of CO2 in bulk to the amount that can act on climate sensitivity as a GHG.
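A minimal numerical sketch of the quadratic solution (the values of α and ln(w) below are illustrative assumptions, not fits to data) compares the exact root against the expansion above:

```python
import math

# Sketch: exact quadratic root vs. the expansion T ~ alpha*ln(w) - E/(k*ln(w)).
# alpha and ln_w are illustrative assumptions; E is the 0.354 eV value above.
k = 8.617e-5                 # Boltzmann constant, eV/K
E = 0.354                    # activation energy, eV
alpha = 3.0 / math.log(2)    # mid-range climate sensitivity coefficient
ln_w = 100.0                 # assumed ln(c0/c1)

# The coupled relations reduce to T^2 - alpha*ln(w)*T + alpha*E/k = 0.
b = alpha * ln_w
c = alpha * E / k
T_exact = (b + math.sqrt(b * b - 4.0 * c)) / 2.0   # larger, physically relevant root
T_approx = alpha * ln_w - E / (k * ln_w)           # first-order expansion

print(f"exact root   : {T_exact:.1f}")
print(f"approximation: {T_approx:.1f}")
```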

As a graphical solution of the quadratic, consider the following figure. The positive feedback of warming-driven outgassing is given by the shallow-sloped violet curve, while the climate sensitivity is given by the strongly increasing exponential curve. Where the two curves intersect, not enough outgassed CO2 is being produced for the asymptotically saturating GHG response to act on. The positive feedback essentially "hits a rail" due to the diminishing returns of GHG heat retention.


We can use the Vostok ice core data to map out the rail-to-rail variations. The red curves are the rails for the temperature response of CO2 outgassing, given +/- 5% of a nominal coefficient and the activation energy of 0.354 eV. The green curves are the corresponding rails for the climate sensitivity, for small variations in α.



This may be an interesting way to look at the problem in the absence of CO2 forcing. The points outside of the slanted parallelogram box are possibly hysteresis terms caused by latencies in either CO2 sequestering or heat retention. On the upper rail, the concentration drops below the expected value, while as the temperature drops to the lower rail, the concentration remains high for a while.





The cross-correlation of Vostok CO2 with Temperature:

Temperature : ftp://ftp.ncdc.noaa.gov/pub/data/paleo/icecore/antarctica/vostok/deutnat.txt
CO2 core : ftp://ftp.ncdc.noaa.gov/pub/data/paleo/icecore/antarctica/vostok/co2nat.txt

The CO2 data comes in approximately 1500-year intervals, while the temperature data is sampled more finely. The ordering of the data runs backwards from the current date, so the small lead that CO2 shows in the above graph is actually a small lag when the direction of time is considered.
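A sketch of how such a cross-correlation might be computed (the column indices below are assumptions about the layout of the NOAA files, and any header lines would need to be stripped first):

```python
import numpy as np

# Sketch: lagged cross-correlation of the Vostok CO2 and temperature records.
# Column choices are assumptions about the file layouts (deutnat.txt: age and
# temperature anomaly; co2nat.txt: age and CO2); headers must be stripped.
t_age, dT  = np.loadtxt("deutnat.txt", usecols=(1, 3), unpack=True)
c_age, co2 = np.loadtxt("co2nat.txt",  usecols=(0, 1), unpack=True)

# Interpolate onto a common 1500-year grid, then flip so time runs forward.
grid = np.arange(max(t_age.min(), c_age.min()), min(t_age.max(), c_age.max()), 1500.0)
dT_g  = np.interp(grid, np.sort(t_age), dT[np.argsort(t_age)])[::-1]
co2_g = np.interp(grid, np.sort(c_age), co2[np.argsort(c_age)])[::-1]

def xcorr(x, y, max_lag):
    """Normalized correlation of x(t+k) with y(t) for lags k in [-max_lag, max_lag]."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    return {k: float(np.mean(x[max(k, 0):n + min(k, 0)] * y[max(-k, 0):n - max(k, 0)]))
            for k in range(-max_lag, max_lag + 1)}

# With time running forward, a peak at positive lag means temperature leads CO2.
print(xcorr(dT_g, co2_g, max_lag=5))
```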

  

The top chart shows the direction of the CO2:Temperature movements. There is a lot of noise, but a lagged chart shows hints of Lissajous figures, which are somewhat noticeable as CCW rotations for a lag. On a temperature increase, more of the CO2 is low than high, as you can see it occupying the bottom half of the top chart.

The middle chart shows where both CO2 and T are heading in the same direction. The lower half is more sparsely populated because temperature shoots up more sharply than it cools down.

The bottom chart shows where the CO2 and Temperature are out of phase. Again, T leads CO2, based on the number of points you see on the high edge versus the low edge. The Lissajous CCW rotations are more obvious as well.

The bottom line is that Temperature will likely lead CO2, because I can’t think of any paleo events that would spontaneously create 10 to 100 PPM of CO2 quickly, yet temperature forcings likely occur. Once set in motion, the huge adjustment time of CO2 and the positive feedback of outgassing from the oceans will allow it to hit the climate sensitivity rail on the top.

So what is the big deal? We don’t have a historical forcing of CO2 to compare with, yet we have one today that is 100 PPM.

That, people, is a significant event, and whether it turns out to be important or not, we can rely on the models to help.




This is what the changes in temperature look like over different intervals.



The changes follow the MaxEnt estimator of a double-sided damped exponential. A 0.2 degree C change per decade (2 degrees C per century) is very rare, as you can see from the cumulative.

The curve that runs through the cumulative distribution function (CDF) data is a maximum entropy estimate. The following constraint generated the double-sided exponential or Laplace probability density function (PDF) shown below the cumulative:
$$\int_{I}{|x| p(x)\ dx}=w$$
which when variationally optimized gives
$$p(x)={\beta\over 2}e^{-\beta|x|},\ x\ \in I=(-\infty,\infty)$$
where I fit it to:
$$\beta = 1/0.27$$
which gives a half-width of about +/- 0.27 degrees C.
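Here is a minimal sketch of that MaxEnt fit, with synthetic Laplace-distributed changes standing in for the binned Vostok differences:

```python
import numpy as np

# Sketch: MaxEnt (Laplace) fit to a set of temperature changes.  The constraint
# on E[|x|] gives beta = 1/mean(|dT|).  Synthetic data stands in for the record.
def laplace_cdf(x, beta):
    x = np.asarray(x, dtype=float)
    return np.where(x < 0.0, 0.5 * np.exp(beta * x), 1.0 - 0.5 * np.exp(-beta * x))

rng = np.random.default_rng(0)
dT = rng.laplace(scale=0.27, size=2000)      # stand-in; scale 1/beta = 0.27 deg C

beta = 1.0 / np.mean(np.abs(dT))             # MaxEnt estimate from the |x| constraint
print(f"beta ~ {beta:.2f}, half-width ~ {1.0 / beta:.2f} deg C")

# Tail probability of a change beyond an illustrative threshold:
x0 = 1.0
print(f"P(|dT| > {x0}) ~ {np.exp(-beta * x0):.4f}")
```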

The Berkeley Earth temperature study shows this kind of dispersion in the spatially separated stations.






Another way to look at the Vostok data is as a random up-and-down walk of temperature changes. These will occasionally reach high and low excursions corresponding to the interglacial extremes. The following is a Monte Carlo simulation with steps corresponding to a diffusion coefficient of 0.0004 deg^2/year.
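A sketch of how such a simulation might be set up (Gaussian steps with variance D·dt per step is an assumed convention for applying the quoted coefficient):

```python
import numpy as np

# Sketch: random-walk Monte Carlo of temperature excursions.
# D = 0.0004 deg^2/yr is the value quoted above; Gaussian steps with
# variance D*dt per step is the assumed convention.
rng = np.random.default_rng(42)
D = 0.0004          # deg^2 per year
dt = 100.0          # years per step
n_steps = 4000      # ~400,000 years, roughly the span of the Vostok record

for i in range(5):
    walk = np.cumsum(rng.normal(0.0, np.sqrt(D * dt), size=n_steps))
    print(f"walk {i}: min {walk.min():+5.1f}  max {walk.max():+5.1f} deg C")
# Excursions scale as sqrt(D*t) ~ sqrt(0.0004 * 400000) ~ 13 deg C over the record.
```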




The trend goes as:
$$ \Delta T \sim \sqrt{Dt}$$
Under maximum entropy this retains its shape:


This can be mapped out with the actual data via a Detrended Fluctuation Analysis.
$$ F(L) = \left[ \frac{1}{L} \sum_{j=1}^{L} ( Y_j - aj - b)^2 \right]^{1/2} $$
There is no trend in this data, so the a coefficient was set to 0. This essentially takes all the pairs of points, similar to an autocorrelation function, but it shows the Fickian spread in the random-walk excursions as opposed to a probability of maintaining the same value.
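One way to implement that fluctuation function with a = 0 (assuming the temperature series has already been resampled onto a uniform century grid) is sketched below:

```python
import numpy as np

# Sketch: fluctuation function F(L) with the trend coefficient a set to 0,
# i.e. each window of length L is only mean-detrended.
def fluctuation(y, L):
    y = np.asarray(y, dtype=float)
    n_win = len(y) // L
    f2 = [np.mean((y[w * L:(w + 1) * L] - y[w * L:(w + 1) * L].mean()) ** 2)
          for w in range(n_win)]
    return float(np.sqrt(np.mean(f2)))

# Check against a pure random walk, where F(L) should grow roughly as sqrt(L):
rng = np.random.default_rng(1)
walk = np.cumsum(rng.normal(size=4000))
for L in (4, 16, 64, 256):
    print(L, round(fluctuation(walk, L), 2))
```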


The intervals are a century apart. It clearly shows random-walk behavior, as the square-root fit goes through the data until it hits the long-range correlations.

Monday, October 10, 2011

Sea temperature correlation

This is a continuation from the Thermal Diffusion and Missing Heat post.

If one takes several months' worth of data from a location, the sea surface temperature (SST) and subsurface time series look like this over a few days.



After a cross-correlation function is applied between the surface and 75 meters down, one sees an obvious yet small lag from that level as it mixes with the lower layers.



The lag is longer the farther the reach extends downward. If one assumes a typical James Hansen diffusion coefficient of 1.5 cm^2/s, the diffusion is actually very slow over long distances: a Fickian diffusion would only spread about 100 meters after 3 years at that rate. So the effect is fairly subtle, and detecting it requires some sophisticated data processing.
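For scale, a minimal sketch of that Fickian estimate (whether one writes the spread as sqrt(Dt) or sqrt(2Dt) is a convention choice):

```python
# Sketch: Fickian spread for a Hansen-style eddy diffusivity of 1.5 cm^2/s.
D = 1.5e-4                    # m^2/s  (1.5 cm^2/s)
seconds_per_year = 3.156e7

for years in (1, 3, 10, 30):
    x = (D * years * seconds_per_year) ** 0.5    # sqrt(D*t) displacement scale
    print(f"{years:3d} yr  ->  ~{x:.0f} m")
```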

The bottom line is that the surface water is heated and does mix with the deeper waters. Otherwise, one would not see the obvious thermal mixing obtained from the TAO sites.


Some other correlations using R show longer-range correlations. Note the strong correlation oscillations, which indicate that temperature changes happen in unison and in a coherent fashion, given a lag that is less apparent at this scale. Note also that the lag is in the opposite direction due to the convention R uses for defining the cross-correlation (x(t+k)*y(t) instead of x(t)*y(t+k)).
 

Thermal mixing and an effective diffusivity are occurring. The only premise required is that an energy imbalance is occurring and that it will continue to occur as long as CO2 remains above the historical atmospheric average. This imbalance shows up to a large extent in the ocean waters and is reflected as a temporal lag in global warming.




Notes: 
Have to be careful about missing data in certain sites. Any time you see a discontinuity in a correlation function, bad data (typically flagged with values such as -9.999) is usually responsible.

Wednesday, October 5, 2011

Temperature Induced CO2 Release Adds to the Problem

As a variable amount of CO2 gets released by decadal global temperature changes, it makes sense that any excess amount would have to follow the same behavior as excess CO2 due to fossil fuel emissions.

From a previous post (Sensitivity of Global Temperature), I was able to detect the differential CO2 sensitivity to global temperature variations. The correlation of temperature anomaly against d[CO2] is very strong with zero lag and a ratio of about 1 PPM change in CO2 per degree temperature change detected per month.

Now, this does not seem like much of a problem, as naively a 1 degree change over a long time span should only add one PPM during the interval. However, two special considerations are involved here. First, the measure being detected is a differential rate of CO2 production, and we all know that sustained rates can accumulate into a significant quantity of a substance over time. Secondly, atmospheric CO2 has a significant adjustment time, and the excess isn't immediately reincorporated into sequestering sites. To check this, consider that a slow linear rate of 0.01 degrees of change per year, when accumulated over 100 years, will lead to a 50 PPM accumulation if the excess CO2 is not removed from the system. This is a simple integration, where f(T(t)) is the integrated outgassing response:
$$ [CO2] = f_{co_2}(T(t)) = \int_0^{100} 0.01\, t\, dt = \frac{1}{2}\, 0.01 \cdot 100^2 = 50 $$
The sanity check on this is to consider that a temperature anomaly of 1 degree held over 100 years would release 100 PPM into the atmosphere. This is simply a result of Henry's Law applied to the ocean. The ocean has a large heat capacity and so will continue outgassing CO2 at a constant, partial-pressure-driven rate as long as the temperature has not reached the new thermal equilibrium. (The CO2 doesn't want to stay in an opened Coke can, and it really doesn't want to stay there when it gets warmed up.)
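A quick numerical version of both sanity checks, using the 1 PPM per year per degree outgassing rate implied by the integral above:

```python
import numpy as np

# Sketch: no-removal accumulation for the two cases worked out above,
# using the 1 PPM/yr per degree rate implied by the integral.
t = np.linspace(0.0, 100.0, 1001)      # years

ramp = 0.01 * t                        # slow warming ramp, deg C
print("ramp case:", round(np.trapz(1.0 * ramp, t), 1), "PPM")    # ~50

step = np.ones_like(t)                 # 1 deg C held for the full century
print("step case:", round(np.trapz(1.0 * step, t), 1), "PPM")    # ~100
```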

So, if we apply the impulse response we derived earlier (Derivation of MaxEnt Diffusion) to this problem, with a characteristic time that matches the IPCC Bern CC/TAR (standard) model:


As another sanity check, the convolution of this with a slow 1 degree change over the course of 100 years will lead to at least a 23 PPM CO2 increase.


Again, this occurs because we are far from any kind of equilibrium, with the ocean releasing the CO2 and the atmosphere retaining what has been released. The slow diffusion into the deep sequestering stores is just too gradual while the biotic carbon cycle is doing just that, cycling the carbon back and forth.

So now we are ready to redo the model of CO2 response to fossil-fuel emissions (Fat-Tail Impulse Response of CO2) with the extra positive-feedback term due to temperature changes. This is not too hard, as we just need temperature data that goes back far enough (the HADCRUT3 series goes back to 1850). So when we do the full combined convolution, we add in the integrated CO2 rate term f(T), which supplies the correction as the earth warms.

$$ [CO2] = FF(t) \otimes R(t) + f_{co_2}(T(t)) \otimes R(t) $$
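A sketch of this combined convolution is below; the impulse response is a generic fat-tailed placeholder standing in for the MaxEnt-diffusion kernel from the earlier post, and the emissions and temperature inputs are likewise illustrative stand-ins rather than the actual data series:

```python
import numpy as np

# Sketch: combined convolution [CO2] = FF(t) (x) R(t) + f_T(t) (x) R(t).
# R(t), FF(t) and T(t) below are illustrative placeholders, not the actual
# MaxEnt-diffusion kernel or the emissions/HADCRUT3 series.
years = np.arange(1850, 2011)
t = years - years[0]

R = 1.0 / (1.0 + np.sqrt(t / 10.0))        # placeholder fat-tail impulse response
FF = 0.1 * np.exp(0.03 * t)                # placeholder fossil-fuel emissions, PPM/yr
T = 0.005 * t                              # placeholder warming, deg C
f_T = 1.0 * T                              # temperature-driven outgassing, PPM/yr

co2 = 290.0 + np.convolve(FF, R)[:len(t)] + np.convolve(f_T, R)[:len(t)]
print(f"modeled CO2 in {years[-1]}: ~{co2[-1]:.0f} PPM")
```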

When we compute the full convolution, the result looks like the following curve (baseline 290 PPM):


The extra CO2 addition is almost 20 PPM, just as we had predicted from the sanity check. The other interesting feature is that it nearly recreates the cusp around the year 1940. The previous response curve did not pick that up, because the cusp is entirely caused by the positive-feedback warming during that time period. The effect is not strong but discernible.


We will continue to watch how this plays out. What is worth looking into is the catastrophic increase of CO2 that will occur as long as the temperature stays elevated and the oceans haven't equilibrated yet.