GISTEMP: 2009 tied for second warmest year on record


bascule


NASA GISS has published their GISTEMP analysis for 2009:

 

http://www.giss.nasa.gov/research/news/20100121/

 

2009 was tied for the second warmest year in the modern record, a new NASA analysis of global surface temperature shows. The analysis, conducted by the Goddard Institute for Space Studies (GISS) in New York City, also shows that in the Southern Hemisphere, 2009 was the warmest year since modern records began in 1880.

 

Although 2008 was the coolest year of the decade, due to strong cooling of the tropical Pacific Ocean, 2009 saw a return to near-record global temperatures. The past year was only a fraction of a degree cooler than 2005, the warmest year on record, and tied with a cluster of other years — 1998, 2002, 2003, 2006 and 2007 — as the second warmest year since recordkeeping began.

 

So yes, contrary to various anecdotal reports of unusually cold winters, claims that "global warming has stalled", and other such nonsense, 2009 was actually unusually warm. Warming is continuing, and the earth has not gone into a "cooling phase".

 

NASA also notes that the previous decade was the warmest decade on record.

 

Latest plot of global mean surface temperature over time available here



Given GISS and Hansen's recent record on "warmest ever" proclamations, I would give this a few months to stew before entering it into evidence.

 

Hansen and GISS glide through the climate debate with these proclamations when the dirty little secret is that GISTEMP records, and certainly the proclamations, are not subject to the same peer review processes that they are quick to invoke in disqualifying their detractors.

 

GISTEMP has numerous outstanding issues that can hopefully now be addressed in the open. Some of the past errors were corrected only after undue hardship (due to stonewalling on Hansen's part):

 

There was the Y2K flub, where they started measuring temps later in the day after the Y2K software update and failed to adjust for it. This was the source of the "8 of the 10 warmest years were in the last decade" claim, which had to be downgraded to 2 out of 10. Oops.

 

And the California problem, where they eliminated inland weather station data from their calculation of the global mean, leaving six coastal stations. This artificially inflated the warming across all of California and introduced a 0.6 °C bias across the U.S.

 

And then there was the failure to adjust ocean temperature records for the switch from bucket water collection to engine intake.

 

The kicker is that these errors have since been acknowledged after some resistance, but instead of casting a skeptical eye on GISTEMP and Hansen, those who get it right are the ones who take the heat from the climate community.

 

So, like I said, let this stew for a while. Hansen's track record is abysmal. Hell, there are more demonstrable problems with GISS than with CRUTEM and HadCRUT, and those were taken offline over concerns of data tampering.


 

This has already been discussed ad nauseam on these forums (most recently on this thread). Also, what you're saying is blatantly wrong.

 

Here are the end results of the Y2K bug:

 

http://data.giss.nasa.gov/gistemp/2007/

 

Data Flaw

 

Finally, we note that a minor data processing error found in the GISS temperature analysis in early 2007 does not affect the present analysis. The data processing flaw was failure to apply NOAA adjustments to United States Historical Climatology Network stations in 2000-2006, as the records for those years were taken from a different data base (Global Historical Climatology Network). This flaw affected only 1.6% of the Earth's surface (contiguous 48 states) and only the several years in the 21st century. As shown in Figure 4 and discussed elsewhere, the effect of this flaw was immeasurable globally (~0.003°C) and small even in its limited area. Contrary to reports in certain portions of the media, the data processing flaw did not alter the ordering of the warmest years on record. Obviously the global ranks were unaffected. In the contiguous 48 states the statistical tie among 1934, 1998 and 2005 as the warmest year(s) was unchanged. In the current analysis, in the flawed analysis, and in the published GISS analysis (Hansen et al. 2001), 1934 is the warmest year in the contiguous states (not globally) but by an amount (magnitude of the order of 0.01°C) that is an order of magnitude smaller than the uncertainty.

 

[Image: Fig4_correction.gif (GISS Figure 4: effect of the 2007 data correction)]

 

 

I don't know enough about this to comment, nor can I find any sort of verification of this analysis. It includes the typical FUD of "climate skeptics": he immediately jumps to the conclusion that "selection bias" is involved and apparently won't entertain any other explanation. This sounds like what jackson33 was referencing earlier on the other Global Warming thread, so perhaps he wishes to opine.

 

That said, just to keep this in perspective, the state of California comprises about 0.08% of the Earth's surface.
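To make the area weighting concrete, here's a back-of-envelope sketch in Python. This is not GISTEMP; it simply assumes the alleged bias is uniform within the affected region and zero everywhere else, which is the most generous reading of the claim:

def global_effect(regional_bias_c, area_fraction):
    # Global-mean impact of a bias confined to one region of the globe.
    return regional_bias_c * area_fraction

# The alleged 0.6 C bias, scaled by area fraction:
print(global_effect(0.6, 0.0008))  # California alone (~0.08%): ~0.0005 C globally
print(global_effect(0.6, 0.016))   # entire contiguous U.S. (~1.6%): ~0.01 C globally

Even read as a bias across the whole contiguous U.S., the global effect is on the order of a hundredth of a degree. That's the same arithmetic behind the ~0.003 °C global figure GISS quotes above for the Y2K flaw.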

 

 

And the anticipated effect on the analysis is?

 

So, like I said, let this stew for a while. Hansen's track record is abysmal. Hell, there are more demonstrable problems with GISS than with CRUTEM and HadCRUT, and those were taken offline over concerns of data tampering.

 

FUD FUD FUD. "Hansen's track record is abysmal" because of three errors, only one of which, going by your descriptions, can actually be quantified (because I pasted the relevant information)?

 

jryan, you are posting blatant falsehoods (see swansont's post) and slandering Dr. Hansen. I have a suggestion. Take the devil's advocate position. Try to think about these things through the lens of someone who isn't actively trying to disparage climate science. Think critically about the issue instead of just copying and pasting whatever drivel you find on "climate skeptic" blogs.

 

Let's take a look at Clear Climate Code, an open source attempt to reimplement GISTEMP in Python and improve the clarity of their codebase so it is easier for climate skeptics and laymen to understand:

 

http://clearclimatecode.org/gistemp/

 

We have now converted all of the GISS code to Python. Naturally we have found (minor) bugs while doing this, but nothing else. We are currently “catching up” so that the code in ccc-gistemp reflects the changes that GISS have made to GISTEMP (such as using the USHCN version 2 dataset; see issue 7).

 

It is our opinion that the GISTEMP code performs substantially as documented in Hansen, J.E., and S. Lebedeff, 1987: Global trends of measured surface air temperature. J. Geophys. Res., 92, 13345-13372., the GISTEMP documentation, and other papers describing updates to the procedure.

 

Yes, GISTEMP has bugs and data errors. But that's no reason to speak in ridiculous hyperbole like "Hansen's track record is abysmal." What matters is how much these bugs and errors actually affect the final analysis. So far, the answer is: not much.


I wouldn't have thought a < 0.01 °C change could have done that to the global temperature averages.

 

Oh, wait. It didn't.

http://en.wikipedia.org/wiki/Temperature_record_since_1880

http://www.sciencedaily.com/releases/2007/12/071213101419.htm

 

Here is a good example of why you should be skeptical about what you read.

 

That second article, published a full 3 months after NASA admitted that they got it wrong, ignores the correction.

 

The first wiki article is comparing temperatures over the last 20 years, not the last 100, so excuse me if I'm not bowled over by it.


Here is a good example of why you should be skeptical about what you read.

 

That second article, published a full 3 months after NASA admitted that they got it wrong, ignores the correction.

 

the re-ranking did not affect global records

Do you contend this or not?

 

There is a major difference between what these scientists do and what the denialists do — the scientists corrected their analysis. You might object to the time scale over which that occurred, but consider the number of zombie arguments made from the denialist camp that have NEVER been changed, even though they are continually debunked.

 

 

The first wiki article is comparing temperatures over the last 20 years, not the last 100, so excuse me if I'm not bowled over by it.

 

Uh, no, that's not what the page says. It's just that all of the highest temperatures on record are recent.

 

The website[1] of the National Oceanic and Atmospheric Administration contains detailed data of the annual land and ocean temperature since 1880.[2]

(emphasis added)

 

 

1983 is listed, which trivially falsifies your claim.


This has already been discussed ad nauseam on these forums (most recently on this thread). Also, what you're saying is blatantly wrong.

 

Here are the end results of the Y2K bug:

 

http://data.giss.nasa.gov/gistemp/2007/

 

Data Flaw

 

Finally, we note that a minor data processing error found in the GISS temperature analysis in early 2007 does not affect the present analysis. The data processing flaw was failure to apply NOAA adjustments to United States Historical Climatology Network stations in 2000-2006, as the records for those years were taken from a different data base (Global Historical Climatology Network). This flaw affected only 1.6% of the Earth's surface (contiguous 48 states) and only the several years in the 21st century. As shown in Figure 4 and discussed elsewhere, the effect of this flaw was immeasurable globally (~0.003°C) and small even in its limited area. Contrary to reports in certain portions of the media, the data processing flaw did not alter the ordering of the warmest years on record. Obviously the global ranks were unaffected. In the contiguous 48 states the statistical tie among 1934, 1998 and 2005 as the warmest year(s) was unchanged. In the current analysis, in the flawed analysis, and in the published GISS analysis (Hansen et al. 2001), 1934 is the warmest year in the contiguous states (not globally) but by an amount (magnitude of the order of 0.01°C) that is an order of magnitude smaller than the uncertainty.

 

[Image: Fig4_correction.gif (GISS Figure 4: effect of the 2007 data correction)]

 

 

 

I don't know enough about this to comment, nor can I find any sort of verification of this analysis. It includes the typical FUD of "climate skeptics": he immediately jumps to the conclusion that "selection bias" is involved and apparently won't entertain any other explanation. This sounds like what jackson33 was referencing earlier on the other Global Warming thread, so perhaps he wishes to opine.

 

That said, just to keep this in perspective, the state of California comprises about 0.08% of the Earth's surface.

 

And keep in mind that Hansen is making such decisions on data covering 100% of the Earth's surface. Or, if you like, consider that the bucket/intake/unknown adjustment error covers the oceans, which are 71% of the Earth's surface.

 

 

 

And the anticipated effect on the analysis is?

 

It's in the article. It is a 0.3 °C bias in sea temps following the move to engine intake. But the bigger answer is "who knows?" There is a definite bias between the two measurement types (0.3 °C), but the trouble is that, as you can tell from the graph provided in the first CA article I posted, those trying to quantify these varying measurement types can't really be certain of large amounts of data as late as 1985, when nearly a quarter of all measurements were from unknown methods and therefore can't be adjusted for accuracy.

 

So the bigger answer is: who really knows the effect? Do you not care that historic global ocean surface temp data is that muddled?

 

FUD FUD FUD. "Hansen's track record is abysmal" because of three errors, only one of which, going by your descriptions, can actually be quantified (because I pasted the relevant information)?

 

Yes, they are quantified, and his "warmest ever" claims are particularly abysmal once anyone bothers to check the math.

 

jryan, you are posting blatant falsehoods (see swansont's post) and slandering Dr. Hansen. I have a suggestion. Take the devil's advocate position. Try to think about these things through the lens of someone who isn't actively trying to disparage climate science. Think critically about the issue instead of just copying and pasting whatever drivel you find on "climate skeptic" blogs.

 

 

Retracted. The U.S. claim was countered by a global claim.

 

Let's take a look at Clear Climate Code, an open source attempt to reimplement GISTEMP in Python and improve the clarity of their codebase so it is easier for climate skeptics and laymen to understand:

 

http://clearclimatecode.org/gistemp/

 

We have now converted all of the GISS code to Python. Naturally we have found (minor) bugs while doing this, but nothing else. We are currently “catching up” so that the code in ccc-gistemp reflects the changes that GISS have made to GISTEMP (such as using the USHCN version 2 dataset; see issue 7).

 

It is our opinion that the GISTEMP code performs substantially as documented in Hansen, J.E., and S. Lebedeff, 1987: Global trends of measured surface air temperature. J. Geophys. Res., 92, 13345-13372., the GISTEMP documentation, and other papers describing updates to the procedure.

 

Yes, GISTEMP has bugs and data errors. But that's no reason to speak in ridiculous hyperbole like "Hansen's track record is abysmal." What matters is how much these bugs and errors actually affect the final analysis. So far, the answer is: not much.

 

This is a bizarre defense of GISTEMP, Bascule, and I would expect you to know why.

 

Converting computer code for GISTEMP is not the same as auditing the underlying assumptions that are being plugged into the program (UHI, bucket adjustments, Siberian data transposition, and so on). I could write a program in COBOL that calculates bullet trajectory on Earth, but if I use a gravity constant of 8.91 m/s² instead of 9.81, taking that code and converting it to Python will not make my calculations any more accurate, because my underlying data was incorrect.
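To make that concrete, here's a toy version of what I mean (hypothetical numbers from my analogy, nothing to do with the actual GISTEMP code):

import math

G_WRONG = 8.91  # m/s^2; transposed digits, should be 9.81

def projectile_range(speed_ms, angle_deg):
    # Flat-ground, no-drag range: v^2 * sin(2 * theta) / g
    return speed_ms ** 2 * math.sin(math.radians(2 * angle_deg)) / G_WRONG

# A perfectly faithful Python port of a COBOL original that used the same
# constant gives the same wrong answer:
print(projectile_range(100.0, 45.0))  # ~1122 m; with g = 9.81 it would be ~1019 m

The port can be audited line by line and found flawless, and the output is still wrong, because the constant fed into it was wrong.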

 

This Python conversion makes the GISS program easier to follow, but these guys weren't looking at the data, or the UHI, or the validity of the data point selections, etc. etc. etc.

 

The Clear Climate Code project is a handy tool, but it is not a valuable audit of GISTEMP data as a final product.


And keep in mind that Hansen is making such decisions on data covering 100% of the Earth's surface. Or, if you like, consider that the bucket/intake/unknown adjustment error covers the oceans, which are 71% of the Earth's surface.

 

Remote sensing of SST provides excellent accuracy. That doesn't help historical data, but going forward it's some of the best data that analyses like GISTEMP have to work with.

 

It's in the article. It is a 0.3 °C bias in sea temps following the move to engine intake. But the bigger answer is "who knows?"

 

Anyone who plugs the relevant data into GISTEMP or CCC and performs their own analysis knows. Yet none of these "skeptics" seem to do that for some reason...

 

So the bigger answer is: who really knows the effect? Do you not care that historic global ocean surface temp data is that muddled?

 

FUD FUD FUD... honestly I'm getting tired of it, jryan. I certainly sympathize with the scientists who withheld their data, because they have to deal with people like you all day.

 

It would be great if you started with what we actually know and worked forward, instead of starting with the foregone conclusion that climate science is a big conspiracy and working backward.

 

Yes, they are quantified, and his "warmest ever" claims are particularly abysmal once anyone bothers to check the math.

 

No? To reiterate what I said earlier: why aren't these skeptics plugging their alleged inaccuracies into GISTEMP and checking the maths themselves?

 

Anyone can do that. The source code is freely available. The data are freely available. CCC is written in Python and can be easily run on any modern computer with a Python interpreter installed.

 

This will demonstrate the effect these alleged inaccuracies have on the final GMST assessment. If it's big, then it's a cause for concern. If it's within the expected error bars, then, well, GISS has already admitted there is that much room for error and it's to be expected.
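Here's a toy sketch of what such a test looks like, just to show the shape of it. This is not ccc-gistemp: the series is fake, and the ~1941 bucket-to-intake switch date is an assumption for illustration; the 0.3 °C bias and 71% ocean fraction are the figures quoted above:

def trend_per_century(years, temps):
    # Ordinary least-squares slope, in degrees per 100 years.
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    num = sum((y - my) * (t - mt) for y, t in zip(years, temps))
    den = sum((y - my) ** 2 for y in years)
    return 100 * num / den

years = list(range(1880, 2010))
uncorrected = [0.005 * (y - 1880) for y in years]  # fake series warming 0.5 C/century

# If pre-switch buckets read 0.3 C too cold over the 71% of the globe that
# is ocean, the uncorrected global series runs about 0.3 * 0.71 = 0.213 C
# low before the switch. Apply the correction and compare trends:
corrected = [t + (0.213 if y < 1941 else 0.0) for y, t in zip(years, uncorrected)]

print(trend_per_century(years, uncorrected))  # 0.50
print(trend_per_century(years, corrected))    # ~0.26

Whether the resulting change is alarming or negligible is then a number you can hold up against the published error bars, instead of a "who knows?".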

 

This is a bizarre defense of GISTEMP, Bascule, and I would expect you to know why.

 

Converting computer code for GISTEMP is not the same as auditing the underlying assumptions that are being plugged into the program

 

They aren't auditing the data but they are auditing the GISTEMP program itself. Both are important. They are attempting to understand the underlying maths as described in Hansen, J.E., and S. Lebedeff, 1987: Global trends of measured surface air temperature. J. Geophys. Res., 92, 13345-13372. In the process they have discovered bugs in GISTEMP. Much like when Steve McIntyre discovered problems with the GISTEMP data, GISS corrected the reported bugs. Allegations that GISS is unwilling to correct problems with their code or data (allegations you made earlier) are completely unsubstantiated.

 

The Clear Climate Code project is a handy tool, but it is not a valuable audit of GISTEMP data as a final product.

 

To reiterate: they're not auditing the data, they're auditing the code.

 

GISS does not produce the data; sources like NOAA/NCDC do. There is certainly a legitimate question of whether GISS is using the NOAA/NCDC data properly. The Y2K bug was an incident where they were not applying the appropriate corrections. However, this is not a problem with the data itself. It is a problem with how GISTEMP was using the input data.

 

So again: it's extremely important to audit the code. The bugs CCC has found, and the bug Steve McIntyre found, were both bugs in the GISTEMP code, not problems with the source data. GISS is not responsible for the source data.


the re-ranking did not affect global records

Do you contend this or not?

 

There is a major difference between what these scientists do and what the denialists do — the scientists corrected their analysis. You might object to the time scale over which that occurred, but consider the number of zombie arguments made from the denialist camp that have NEVER been changed, even though they are continually debunked.

 

You countered the Hansen screw-up with U.S. records by citing a global climate record dating to 1850. Sorry I missed your switcheroo.

 

This changes the point back to global climate trends rather than Hansen's numerous failures in regional claims.

 

But as for the "warmest on record" claims in general, those are really not terribly informative, since recorded temperature is a 150-year span bounded at 1850 by the end of the Little Ice Age. (To use your California argument, 150 years is only 0.0006% of the time since the beginning of the current Earth epoch, the end of the Tertiary.)

 

So claiming that 2009 is the warmest since 1850 is pointless, unless I were arguing that the Earth hasn't warmed since 1850, which I haven't been.

 

 

Uh, no, that's not what the page says. It's just that all of the highest temperatures on record are recent.

 

Again, you countered the Y2K argument with a different claim. My bad for not catching it, but your claim does nothing to defend Hansen's veracity.

 

The website[1] of the National Oceanic and Atmospheric Administration contains detailed data of the annual land and ocean temperature since 1880.[2]

(emphasis added)

 

 

1983 is listed, which trivially falsifies your claim.

 

Apologies. It doesn't change the triviality of the claim, however. For the sake of argument I will assume that Hansen's numbers are correct, and we can look instead at the real importance of the last 150 years of warming.

 

According to the climate reconstruction models, even if the last 150 years are the "warmest on record", that would be because, going by the chart of reconstructions, 1940 onward (or so) is the first extended period in 1000 years where the global climate has been above the mean.

 

[Image: 1000_Year_Temperature_Comparison.png (comparison of 1000-year temperature reconstructions)]

 

Should we expect an endless streak of negative anomalies (assuming, for the sake of argument, there is value to these reconstructions)?

 

The shorter 150-year graph you provided shows that only about 1979 onward (30 of the last 150 years, plus a short period in the 1940s) has even been above Earth's mean temperature:

 

[Image: Instrumental_Temperature_Record.png (instrumental temperature record since 1880)]

 

Care to explain to me why being above the mean for 30 years is more important than the 120 (or 1000) years before that when we weren't?

 

The headline could just as easily read "Last 30 years is the first time in 1000 years that the Earth's temperature was above average: Breaks 1000-year cold streak".

 

I guess that wouldn't have the same political bite, however.


Care to explain to me why being above the mean for 30 years is more important than the 120 (or 1000) years before that when we weren't?

 

The headline could just as easily read "Last 30 years is the first time in 1000 years that the Earth's temperature was above average: Breaks 1000-year cold streak".

 

I guess that wouldn't have the same political bite, however.

 

because you don't understand how averages work? if you lop off the last century, the average temperature over the course of a century is on the flattish bit of the graph.

 

note that the scale on the y axis doesn't represent above/below averages. it represents the difference between that year and a reference year.

 

use the graphs right. if you don't, it just takes the wind out of your argument and gives the impression you don't really know what you're talking about.


Remote sensing of SST provides excellent accuracy. That doesn't help historical data, but going forward it's some of the best data that analyses like GISTEMP have to work with.

 

 

 

Anyone who plugs the relevant data into GISTEMP or CCC and performs their own analysis knows. Yet none of these "skeptics" seem to do that for some reason...

 

 

 

FUD FUD FUD... honestly I'm getting tired of it, jryan. I certainly sympathize with the scientists who withheld their data, because they have to deal with people like you all day.

 

It would be great if you started with what we actually know and worked forward, instead of starting with the foregone conclusion that climate science is a big conspiracy and working backward.

 

 

 

No? To reiterate what I said earlier: why aren't these skeptics plugging their alleged inaccuracies into GISTEMP and checking the maths themselves?

 

Anyone can do that. The source code is freely available. The data are freely available. CCC is written in Python and can be easily run on any modern computer with a Python interpreter installed.

 

This will demonstrate the effect these alleged inaccuracies have on the final GMST assessment. If it's big, then it's a cause for concern. If it's within the expected error bars, then, well, GISS has already admitted there is that much room for error and it's to be expected.

 

 

 

They aren't auditing the data but they are auditing the GISTEMP program itself. Both are important. They are attempting to understand the underlying maths as described in Hansen, J.E., and S. Lebedeff, 1987: Global trends of measured surface air temperature. J. Geophys. Res., 92, 13345-13372. In the process they have discovered bugs in GISTEMP. Much like when Steve McIntyre discovered problems with the GISTEMP data, GISS corrected the reported bugs. Allegations that GISS is unwilling to correct problems with their code or data (allegations you made earlier) are completely unsubstantiated.

 

They are completely substantiated. Again, that is why there was a need for an FOIA filing in the first place.

 

I'm now wondering if Nicholas Barnes is the same "Nicholas" who has been working with Steve McIntyre to clean up Hansen's code since 2008.

 

The timing seems to match between the two.

 

Anyway, it took a full year for McIntyre to get Hansen's code following his initial discovery of Hansen's Y2K error.

 

 

To reiterate: they're not auditing the data, they're auditing the code.

 

GISS does not produce the data; sources like NOAA/NCDC do. There is certainly a legitimate question of whether GISS is using the NOAA/NCDC data properly. The Y2K bug was an incident where they were not applying the appropriate corrections. However, this is not a problem with the data itself. It is a problem with how GISTEMP was using the input data.

 

So again: it's extremely important to audit the code. The bugs CCC has found, and the bug Steve McIntyre found, were both bugs in the GISTEMP code, not problems with the source data. GISS is not responsible for the source data.

 

 

We seem to be having trouble communicating here. When I say there is a problem with GISS data, I am talking about the value-added product that comes out the other end of the code. But Hansen is certainly as responsible for the data he feeds into his programs as he is for the data that comes out the other end.

 

He is also responsible for the metadata that he includes in the code, which smooths the data within the code itself.

 

I'm sure we can keep passing the buck all the way down to the bosun's mate who was dragging the buckets on board ship if you want, but it isn't particularly helpful. McIntyre is/was an outsider with no access to the programs themselves, and he was able to find problems with Hansen's code that Hansen apparently couldn't. Why should I expect less diligence from Hansen than from McIntyre?


jryan, I'll wait for you to clean up the formatting of your previous post, but...

 

Again, that is why there was a need for an FOIA filing in the first place.

 

You keep claiming the GISTEMP code was released under an FOIA request. Can you substantiate that at all?

 

According to the Wayback Machine, the GISTEMP source code has been available from the GISS web site since September 2007:

 

http://web.archive.org/web/20070911181959/http://data.giss.nasa.gov/gistemp/sources/

 

This is one month after Steve McIntyre's original request.

 

I don't think a FOIA request was ever involved. Why are you jumping to the conclusion that it was? Do you have some information I don't? Or are you just alleging malice...

 

 

A full year? When are you alleging Steve McIntyre made his original request? The page you link shows he made the request in Aug 2007, and Hansen delivered in Sept 2007. Did you instead mean "a full month"?


because you don't understand how averages work? if you lop off the last century, the average temperature over the course of a century is on the flattish bit of the graph.

 

note that the scale on the y axis doesn't represent above/below averages. it represents the difference between that year and a reference year.

 

use the graphs right. if you don't, it just takes the wind out of your argument and gives the impression you don't really know what you're talking about.

 

IA, he does bring up a good point that deserves a better answer. I can't tell for certain without actual data, but just from looking at the past 1000 years, there is very clearly a much larger area between the deviation lines and the average on the low-temperature side than on the high-temperature side. If the zero line really were the average of the whole series, those two areas would be equal.

 

Even the 1880 to 2010 graph appears to show a larger area under the curve on the low temperature side than on the high temperature side.

 

If the average is correctly chosen, shouldn't the areas between the deviation lines above and below the average be equal?

 

Take off the last 70 years or so and everything is below the average. This is clearly not possible since the influence of the past 70 years isn't enough to bring the average that high.

 

I suspect you are correct and the global "average" may be taken from a much longer timeframe...but I don't have time right now to look into it further.

 

Jryan, your observation appears correct to me; however, I'd like to point out that it indicates the "warm-up" is actually more severe than is being claimed... again, I don't have time now to look into it further, maybe over the weekend...


IA, he does bring up a good point that deserves a better answer. I can't tell for certain without actual data, but just from looking at the past 1000 years, there is very clearly a much larger area between the deviation lines and the average on the low-temperature side than on the high-temperature side. If the zero line really were the average of the whole series, those two areas would be equal.

 

Even the 1880 to 2010 graph appears to show a larger area under the curve on the low temperature side than on the high temperature side.

 

you are making the same mistake. the anomaly is measured from a fixed reference point, not from the average of the series shown. the average on the graph is a running five-year average and is represented by the red line.

 

If the average is correctly chosen, shouldn't the areas between the deviation lines above and below the average be equal?

 

if an average had actually been chosen...

 

Take off the last 70 years or so and everything is below the average. This is clearly not possible since the influence of the past 70 years isn't enough to bring the average that high.

 

if you take off the last 70 years you need to calculate the new average. i thought this was self-evident. obviously not.

 

I suspect you are correct and the global "average" may be taken from a much longer timeframe...but I don't have time right now to look into it further.

 

again, it isn't an average. it's based on a reference point.
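here's the whole idea in a few lines of python, with made-up numbers (the usual reference window is 1961-1990, give or take the 1991 mentioned above):

def anomalies(series, ref_start, ref_end):
    # departures from the mean of a fixed reference period.
    ref = [t for y, t in series if ref_start <= y <= ref_end]
    baseline = sum(ref) / len(ref)
    return [(y, t - baseline) for y, t in series]

def running_mean(series, window=5):
    # centered running mean, like the smooth line on the plots.
    half = window // 2
    return [(series[i][0], sum(t for _, t in series[i - half:i + half + 1]) / window)
            for i in range(half, len(series) - half)]

temps = [(y, 14.0 + 0.005 * (y - 1880)) for y in range(1880, 2010)]  # fake absolute temps
anoms = anomalies(temps, 1961, 1990)   # the zero is the 1961-1990 mean
smooth = running_mean(anoms, window=5)

note that nothing here requires the whole plotted series to average out to zero; years outside the reference window just sit above or below a fixed zero.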


because you don't understand how averages work? if you lop off the last century, the average temperature over the course of a century is on the flattish bit of the graph.

 

note that the scale on the y axis doesn't represent above/below averages. it represents the difference between that year and a reference year.

 

use the graphs right. if you don't, it just takes the wind out of your argument and gives the impression you don't really know what you're talking about.

 

 

That's not the point of an anomaly graph. The fact that the 0 in the 150-year graph falls roughly in the middle is an artifact of range selection... but there is no rule that an anomaly graph must be zeroed on the mean of the entire data set presented.

 

And your claim is certainly not the case for the 1100-year reconstruction graph, or the mean for the 1100 years would be well below the given zero. If the zero were a self-contained mean, the anomalies would average out to zero across the timeline. Clearly they don't.

 

Furthermore, the 1100-year graph and the 120-year graph both show the 2000 anomaly as +0.4, so that 0 line appears to represent the same, or very close to the same, mean temperature for both graphs.


That's not the point of an anomaly graph. The fact that the 0 in the 150-year graph falls roughly in the middle is an artifact of range selection... but there is no rule that an anomaly graph must be zeroed on the mean of the entire data set presented.

 

the zero point on that graph is taken as the average temperature from 1961 to 1991. if you are going to talk about the areas with positive values and negative values, then you can only consider the points within that time frame; outside it, the analysis is invalid. this is merely a reference point for the temperature anomaly; they needed a zero value, and that was a handy timeframe where accurate data was available.

 

do you understand now?

 

And your claim is certainly not the case for the 1000-year reconstruction graph, and the mean for the 1100 years is well below... otherwise the anomalies would average out to zero across the timeline with a self-contained mean. Clearly they don't.

 

i said 100 years. not 1000.

 

Furthermore, the 1100-year graph and the 120-year graph both show the 2000 anomaly as +0.4, so that 0 line appears to represent the same, or very close to the same, mean temperature for both graphs.

 

this is what happens when two graphs representing the same thing (temperature with time) use the same reference point: they agree with each other. funny that.

 

EDIT: i just saw that sherlock got my point, so i obviously wasn't being too cryptic there.


the zero point on that graph is taken as the average temperature from 1961 to 1991. if you are going to talk about the areas with positive values and negative values, then you can only consider the points within that time frame; outside it, the analysis is invalid. this is merely a reference point for the temperature anomaly; they needed a zero value, and that was a handy timeframe where accurate data was available.

 

do you understand now?

 

i said 100 years. not 1000.

 

this is what happens when two graphs representing the same thing (temperature with time) use the same reference point: they agree with each other. funny that.

 

EDIT: i just saw that sherlock got my point, so i obviously wasn't being too cryptic there.

 

Yeah, I see that. But why choose 1961-1991 as the zero, and why 1100 years? Is it more than just arbitrary?


1961-1991 had good temperature records. so they averaged them and set that as a baseline. the reference point doesn't really matter in particular, they could use absolute zero if they wanted to.

 

and where are you getting 1100 years from? the other graph covers a span of 1000 years (with 2004 tacked on the end).

 

i have to think that you're not really reading what you yourself are posting.


Here is what I am trying to get at with the anomaly line in these graphs:

 

What if, arbitrarily, we chose to zero the graphs on, say, 1650-1680?

 

If we are to assume that historic reconstructions are robust and of value, there is no reason not to place the anomaly 0 at, say, 1675 (if that is indeed the mean temp of that time scale).

 

It would certainly change the way we view the climate trends, with positive anomalies starting well before the industrial age on many of the reconstructions.


You countered the Hansen screw-up with U.S. records by citing a global climate record dating to 1850. Sorry I missed your switcheroo.

 

You never mentioned you were talking about US temperatures in your post. Since the discussion is about global warming, I think that this objection is disingenuous at best.

 

 

According to the climate reconstruction models, even if the last 150 years are the "warmest on record", that would be because, going by the chart of reconstructions, 1940 onward (or so) is the first extended period in 1000 years where the global climate has been above the mean.

 

 

The headline could just as easily read "Last 30 years is the first time in 1000 years that the Earth's temperature was above average: Breaks 1000-year cold streak".

 

 

 

Wait, what? Do you understand what a mean (average) is? A graph that's 1000 years long, and all but the last 150 (or 30) years is below the mean, and this isn't a cause for concern? Lop off the most recent data, and the past is not a "cold streak" anymore — it's business as usual. You can only call it a cold streak in light of recent, dramatic warming. That's spin, pure and simple.


Here is what I am trying to get at with the anomaly line in these graphs:

 

What if, arbitrarily, we chose to zero the graphs on, say, 1650-1680?

 

If we are to assume that historic reconstructions are robust and of value, there is no reason not to place the anomaly 0 at, say, 1675 (if that is indeed the mean temp of that time scale).

 

It would certainly change the way we view the climate trends, with positive anomalies starting well before the industrial age on many of the reconstructions.

 

you still don't get it: it doesn't matter where the zero point is. what happened was we had a long period of stable temperatures and they are now rising significantly. you are taking relative measurements as absolute measurements. this is just wrong.

 

to give you an example of how wrong this is: is 300 K warmer than 27 °C?

 

they use the same relative stepping (a step of 1 K is a step of 1 °C), but different zeros. they are essentially the same temperature (300 K is 26.85 °C).

 

changing the zero points does not change the conclusions drawn from the climate data; to say otherwise is either foolish or demonstrates a poor understanding of data analysis.

 

EDIT: i just had a brainwave that may explain your behaviour. Do you think that 'anomaly' means the difference in temperature between one year and the year before it?

 

if so then that is wrong.

 

the anomaly value means that the global average temperature was hotter or colder than the reference value by that amount. a year with an anomaly of -0.2 °C was 0.2 °C colder than the zero point, whether it was in 1850 or 1050.
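and to put the zero-point business in code (made-up numbers: four anomalies on a 1961-1990 zero, plus an assumed 0.45 °C offset between that baseline and jryan's hypothetical 1650-1680 one):

# anomalies relative to a 1961-1990 zero point (made-up values):
anoms_1961_1990 = {1850: -0.35, 1900: -0.20, 1950: -0.05, 2000: 0.40}

# re-zero onto a colder 1650-1680 baseline, assumed to sit 0.45 C lower:
offset = 0.45
anoms_1650_1680 = {y: a + offset for y, a in anoms_1961_1990.items()}

print(anoms_1650_1680)  # every value shifts up by the same 0.45

# the warming between any two years is identical on either baseline:
print(anoms_1961_1990[2000] - anoms_1961_1990[1850])  # 0.75
print(anoms_1650_1680[2000] - anoms_1650_1680[1850])  # 0.75 (modulo float rounding)

the zero moves, the numbers all move with it, and every difference and every trend stays exactly the same.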


Here is what I am trying to get at with the anomaly line in these graphs:

 

What if, arbitrarily, we chose to zero the graphs on, say, 1650-1680?

 

If we are to assume that historic reconstructions are robust and of value, there is no reason not to place the anomaly 0 at, say, 1675 (if that is indeed the mean temp of that time scale).

 

It would certainly change the way we view the climate trends, with positive anomalies starting well before the industrial age on many of the reconstructions.

 

It would also push the current anomaly higher. Politically, being able to say that the current anomaly is a whole degree (or more) rather than 0.4 or 0.5 would probably shake people up more, but scientifically, using better data is the way to go. And the scientists chose to use the better data.


1961-1991 had good temperature records. so they averaged them and set that as a baseline. the reference point doesn't really matter in particular, they could use absolute zero if they wanted to.

 

and where are you getting 1100 years from? the other graph covers a span of 1000 years (with 2004 tacked on the end).

 

i have to think that you're not really reading what you yourself are posting.

 

But it DOES matter, IA. As I said before, many of the reconstructions show warming trends well before the start of the industrial age. The appearance of the graph, as it is, makes it look like there was only real warming after WWII... and you will often see versions of the graph even worse than the one I presented, where bars replace lines and the negative anomalies are blue and the positive anomalies are red, again giving the impression that it was "cool" before the mean and "hot" after.

 

If 1961-1991 is "good data" then what is 1880 to 1960?

 

This goes back to the article I posted elsewhere, where the UK climate adviser is calling for greater honesty and clarity in communicating unknowns (see here). It is OK to say that 1961-1991 was chosen because we are uncertain about climate data prior to 1880 (and actually later than that).

 

That would tone down the AGW narrative by admitting that historical climate is still too uncertain... but it would mean we also have to stop using "warmest in 10,000 years" and such nonsense, because when we do so we are comparing a largely unknown quantity to a very specific one.

 

Anyway, for this reason I take any claim of "warmest in a century" with a grain of salt, given that the period in question directly follows a period that was "coldest in a millennium".

 

Without a clear grasp on the history we can't translate "+0.74 °C ± 0.16" into anything meaningful on a historic climate scale. It's like claiming "Friday noon is the warmest hour of the week... but I have no definitive data before Thursday at 9:00 pm".


It would also push the current anomaly higher. Politically, being able to say that the current anomaly is a whole degree (or more) rather than 0.4 or 0.5 would probably shake people up more, but scientifically, using better data is the way to go. And the scientists chose to use the better data.

 

 

Maybe, but you would have a hard time pinning it on CO2 since anthropogenic CO2 wasn't a factor until centuries later.

