global warming: salvaging fact from heaps of BS



npts2020, very good point. By that argument, would you also agree that a warmer TOA would radiate heat into space faster? Especially at night, when the differential is higher?

 

However scalbers said: "though much of it would continue to be reradiated downward to the surface as IR." Which I can't interpret any way other than heat flowing from the cold area to the warm one. Heat blocked in the upper atmosphere should preferentially radiate into space, not down to the surface.


However scalbers said: "though much of it would continue to be reradiated downward to the surface as IR." Which I can't interpret any way other than heat flowing from the cold area to the warm one. Heat blocked in the upper atmosphere should preferentially radiate into space, not down to the surface.

 

Why? How would the atmosphere know that it is warmer below so that it would radiate the heat upward? (hint: it doesn't)


JohnB: Yes, the upper atmosphere will also radiate into space faster, but not as much faster as the surface does (remember, the temperature of space does not significantly change, so the difference will be smaller). The main effect, however, is caused by greenhouse gases like carbon dioxide and methane, which tend to congregate in the lower part of the atmosphere. Higher-energy photons pass through the atmosphere, are absorbed by the earth, and are re-emitted as lower-energy photons (usually infra-red). One of the properties of both methane and carbon dioxide is to allow high-energy photons to pass through but reflect infra-red ones, causing the temperature at the earth's surface to increase as the amounts of those gases increase, from more reflections back toward the source (the earth's surface in this case).


Are you really asking why heat doesn't tend to flow from a cold object to a hot one? :eek: (Treating the TOA and surface as two objects.)

 

(Hint: Thermodynamics, 2nd Law of)

 

No, I'm asking why you think an object won't radiate in one direction just because it is colder in that direction. As I implied, a cold object will indeed radiate toward a hotter object. This is not to say that heat will flow from the cold object to the hotter one, but rather that photons are going both ways. So the cold object radiates photons in all directions, even toward hotter objects, but the hotter objects do too, and will in fact radiate more photons toward the colder object than the colder object does toward the hotter object.
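The two-way exchange described above can be made concrete with the Stefan-Boltzmann law. A minimal sketch (the two temperatures are illustrative round values for the surface and upper troposphere, not measured ones):

```python
# Two bodies exchanging thermal radiation: each radiates toward the other,
# but the *net* flux always runs from hot to cold.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def flux(T):
    """Blackbody emission (W/m^2) from a body at temperature T kelvin."""
    return SIGMA * T ** 4

T_surface = 288.0  # rough global-mean surface temperature, K
T_upper = 220.0    # rough upper-troposphere temperature, K

down = flux(T_upper)    # photons the cold layer sends toward the surface
up = flux(T_surface)    # photons the surface sends toward the cold layer
net = up - down         # net heat flow: surface -> atmosphere

print(f"downward {down:.0f} W/m^2, upward {up:.0f} W/m^2, net {net:.0f} W/m^2 upward")
```

Both gross fluxes are nonzero, which is the whole point: the cold layer really does radiate downward, yet the net flow still obeys the second law.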


Confusion, I think.

 

My comment was originally in response to scalbers; "though much of it would continue to be reradiated downward to the surface as IR."

 

Technically, and in the first instance, this comment is true. However it becomes untrue very quickly.

 

At surface concentrations and pressures, IR is trapped in a matter of metres. It isn't "absorbed" by CO2; it is caught and reradiated. The same thing happens at TOA, but it travels further between "traps".

 

Any given photon blocked at TOA has a roughly 50% chance of being sent down or up. So about 1/2 will go each way. (Slightly more than 1/2 will escape to space, but for the sake of argument I'm ignoring height above ground and the curvature of the earth.) Those that are sent up are radiated into space and vanish from the equation. Those that go down will get trapped again, and again 1/2 will be reradiated up and 1/2 down. The process continues innumerable times. Hence "much" will not make it to the surface. Have I explained myself better?
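The repeated 50/50 splitting described above maps onto a textbook random walk with absorbing boundaries (gambler's ruin). A toy sketch, with an arbitrary layer count rather than any real optical depth:

```python
import random

def walk(start, top, rng):
    """One trapped photon random-walking between absorbing boundaries:
    position `top` means escape to space, position 0 means absorbed at
    the surface; each re-emission goes up or down with probability 1/2."""
    pos = start
    while 0 < pos < top:
        pos += rng.choice((-1, 1))
    return "space" if pos == top else "surface"

rng = random.Random(42)          # fixed seed so the run is repeatable
layers = 10                      # arbitrary number of trapping layers
trials = 20000
surface_hits = sum(walk(layers - 1, layers, rng) == "surface"
                   for _ in range(trials))
frac = surface_hits / trials
# Gambler's-ruin theory: starting one layer below the top, the chance of
# ever reaching the surface is only 1/layers.
print(f"fraction reaching surface: {frac:.3f} (theory {1 / layers:.3f})")
```

With ten layers only about a tenth of the photons trapped near the top ever reach the ground, which is the "much will not make it to the surface" claim in miniature.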

 

Roughly 2/3 of incoming radiation already falls into these traps, one way or another, so increasing the "trapping" gases can only drive this proportion up, thereby causing a net negative forcing.

 

Yes, the effect is very minor. However, the difference of 2.4 W/m² over the last 100 years or so represents a change (compared to TOA radiation) of 0.211%, an extremely small change. Hence an increase of 0.1% in the amount of incoming radiation blocked (an extra 1 in 1,000 photons blocked) would give a negative forcing of half the calculated positive forcing for the 20th century.

 

This is why I find writing things off as "slight" factors bothersome. We are already dealing with very slight changes, so any apparently minor factor may still be large when compared to the primary change.

 

For example, an increase of SI (at TOA) of only 1 W/m² is slight, some 0.08%. However, roughly 1/3 of that will reach the surface, giving 0.3 W/m², which represents 12.5% of the calculated increase over the 20th century; hardly a "slight" change.
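As a quick arithmetic check on the paragraph above (using the post's own round numbers: the "1300 odd" TOA figure, the 1/3 surface fraction, and the 2.4 W/m² 20th-century forcing are the post's values, not authoritative ones):

```python
# Checking the proportions quoted above; all inputs are the post's own
# round figures, used purely to verify the percentages.
toa_insolation = 1300.0      # the post's "1300 odd" W/m^2 hitting the TOA
delta_si = 1.0               # hypothetical 1 W/m^2 increase in solar input

percent_of_toa = 100 * delta_si / toa_insolation     # how "slight" is it?
at_surface = 0.3             # roughly 1/3 of delta_si reaches the surface
share_of_forcing = 100 * at_surface / 2.4            # vs the 20th-C forcing

print(f"{percent_of_toa:.2f}% of TOA input, yet {share_of_forcing:.1f}% "
      f"of the calculated 20th-century forcing")
```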

 

npts2020, IR is not reflected back by CO2. A radiated IR photon is trapped by the CO2 molecule and reradiated. Some go down, some go up and some then hit Water Vapour or other heat storing molecules. Basically the same process as I described for the TOA, but in the other direction.:D

 

In a nutshell, because of the short "intercept" distance for IR at surface concentrations and pressure a lot of the IR spends its time either heating WV or simply bouncing around from CO2 to CO2 molecule rather than radiating into space. This of course creates a livable earth.

 

This thread is about salvaging facts from BS and honestly, from where I sit, I see heaps of BS on both sides. I also see a lot of guesswork and what I consider unfounded assumptions.


JohnB: So you are saying that even though more energy "is trapped and reradiated" (quibbling over words imo) by greenhouse gases, more energy is not put into the lower atmosphere thus causing warming? Where exactly does all that energy go? If we could replace our atmosphere with methane the surface temperature would stay the same or not change significantly? Or are we only talking degree?


I_A. I don't doubt that it is factored into the GCMs; it's just that it isn't mentioned in the literature. I find that odd.

 

You mean the heat would flow from the cold upper atmosphere to the warm surface? This concept does not sit easily with me for some reason.:D

 

We do need to consider slight factors. Since we are talking about a change of circa 2 W/m² out of the 324 reaching the surface, and given that the 324 is barely a quarter of the 1,300-odd that hits the TOA, I would suggest that "slight factors" are exactly what we are dealing with.:D

 

I think heat is radiated even from the cold upper atmosphere to the surface, after all the cold upper atmosphere is still much warmer than the alternative of outer space. So if the upper atmosphere warms, the net downward IR radiation increases. One can notice that a night with high cirrus clouds tends to be warmer than a clear night. Therefore when you increase the emissivity in the upper troposphere (either from more CO2 or more cirrus clouds), the surface becomes warmer.

 

Yes, I think all the slight factors are illustrated in the radiation diagram I linked to previously: http://www.euronet.nl/users/e_wesker/gh.gif


That's what I thought.

 

It just doesn't seem to get into the calculations anywhere. It's like water vapour.

[Image: Solar_Spectrum.png]

It must block IR in both directions. Because we are dealing with extremely low changes in RF, under 0.7%, I doubt any factor can be ignored.

 

From the graph above, it's obvious that no matter the increase in water vapour, there will be no change at some wavelengths as the WV is already blocking the IR. However, an increase would progressively block the circa 900 and 1150 bands.
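That saturation argument is the Beer-Lambert law at work: once a band's optical depth is large, adding more absorber hardly changes the transmitted fraction at all. A sketch with invented optical depths, not real water-vapour line data:

```python
import math

def transmitted(tau):
    """Beer-Lambert law: fraction of radiation passing straight through
    an absorbing layer of optical depth tau."""
    return math.exp(-tau)

# Two hypothetical bands; the optical depths are made up for illustration.
bands = {"saturated band": 10.0, "weak band (e.g. near '900')": 0.5}

for name, tau in bands.items():
    before = transmitted(tau)
    after = transmitted(2 * tau)   # double the water vapour column
    print(f"{name}: {before:.3g} -> {after:.3g} transmitted")
# The saturated band barely changes (it was already blocking nearly
# everything); the weak band drops from ~0.61 to ~0.37 transmitted.
```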

 

Note that this blocking is happening at the top (sort of) of the atmosphere, where it will be radiated out into space. It strikes me that it should be factored in, but it doesn't seem to get a mention. It just seems odd.

 

But the curve you've shown is for the solar radiation (incoming) and not the terrestrial radiation (outgoing)

 

[Image: fig3.jpg]

 

This shows the absorption band at around 15 microns (note the lower axis is in wavenumbers, so the ordering is reversed, and you can't just reverse and overlay the curves)

 

 

The blocking happens through the atmosphere, wherever the water is, not just at the top. The opacity happens because of the thickness. Increasing the concentration will change the average radius where absorption is occurring.

 

Here's a smaller one that compares them directly

 

[Image: ocean33.gif]

 

CO2 absorption at 4 and 15 microns has only a small effect on incoming radiation, but a huge effect on the outgoing.

 

I think heat is radiated even from the cold upper atmosphere to the surface, after all the cold upper atmosphere is still much warmer than the alternative of outer space. So if the upper atmosphere warms, the net downward IR radiation increases. One can notice that a night with high cirrus clouds tends to be warmer than a clear night. Therefore when you increase the emissivity in the upper troposphere (either from more CO2 or more cirrus clouds), the surface becomes warmer.

 

Yes, I think all the slight factors are illustrated in the radiation diagram I linked to previously: http://www.euronet.nl/users/e_wesker/gh.gif

 

I think that's a good point: if the atmosphere is opaque at a given wavelength, the radiation reservoir is at the temperature of the atmosphere, not of the earth. And let us not forget that not all radiation outward will be directed into the 3 K reservoir of space; there is a 5800 K source there as well.


But the curve you've shown is for the solar radiation (incoming) and not the terrestrial radiation (outgoing)

That was my point. WV especially will increasingly block incoming radiation. Hence less radiation reaches the surface.

The blocking happens through the atmosphere, wherever the water is, not just at the top.

I guess I was being a bit lax in my terminology, I will improve.:D This doesn't change the fact that WV blocks the radiation in both directions.

One can notice that a night with high cirrus clouds tends to be warmer than a clear night.

Which is a perfect demonstration of how powerful WV is compared to CO2, wouldn't you say? Also, a cloudy day is cooler than a sunny day; would that not imply that radiation is being blocked from reaching the surface?

So you are saying that even though more energy "is trapped and reradiated" (quibbling over words imo) by greenhouse gases, more energy is not put into the lower atmosphere thus causing warming?

I might seem to quibble at times, but where I can I do try to be precise, that's all.:D I have no idea how you got the impression you did from what I said though.

 

I'm not trying to put values on this, just commenting that unless partially condensed as clouds, the blocking of incoming radiation by WV etc. gets little mention in the literature.


Possibly because water vapor reaches equilibrium for a given temperature? You may certainly have locally thicker clouds but for the entire earth the average water vapor remains fairly constant, only increasing as much as temperature increases.
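The "equilibrium for a given temperature" idea is the Clausius-Clapeyron relation: saturation vapour pressure, and hence the vapour the air can hold, rises quasi-exponentially with temperature. A sketch using the Magnus approximation (the coefficients are common textbook values, worth checking against a reference):

```python
import math

def e_sat(T_c):
    """Saturation vapour pressure (hPa) at T_c degrees Celsius,
    via the Magnus approximation."""
    return 6.112 * math.exp(17.62 * T_c / (243.12 + T_c))

# How much more vapour can saturated air hold per degree of warming?
for T in (0, 15, 30):
    growth = 100 * (e_sat(T + 1) / e_sat(T) - 1)
    print(f"{T:3d} C: {e_sat(T):6.1f} hPa saturation, +{growth:.1f}% per extra degree")
```

The roughly 6-7% extra capacity per degree near everyday temperatures is why water vapour tracks temperature and so behaves as a feedback rather than an independent forcing.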


Possible. This would mean that as temp rises, WV would increasingly block incoming radiation.

 

There appears to be some sort of limiting factor somewhere, because the records show the maximum "warmed" temp seems to be independent of CO2 concentration.


Possible. This would mean that as temp rises, WV would increasingly block incoming radiation.

 

 

True enough but the blocking does not occur until after the warming, then it reaches an equilibrium at some higher temperature.

 

 

There appears to be some sort of limiting factor somewhere, because the records show the maximum "warmed" temp seems to be independent of CO2 concentration.

 

 

I hadn't heard this. I have always thought (maybe wrongly) that warming followed CO2 concentrations pretty closely with the increasing temps lagging some time behind increasing CO2. Maybe somebody can be more definitive about this than I can.


To npts

 

If you restrict your observations to the current Ice Age - meaning the last million years approximately, we have about 10 glacials and 10 interglacial periods. Warming leads to interglacials, and cooling to glacial periods.

 

Over that time, the pattern of warmings has been for initial temperature increase to precede CO2 increases by about 800 years. Obviously the current (last 30 years) warming is different.

 

Interglacial peak temperatures have been slowly rising over that million years. The last one, 120,000 years ago, reached a maximum about 2 to 3 Celsius warmer than the temperatures we are currently experiencing. If our current interglacial follows the long-term trend, it will peak out at, or higher than, that.


npts, and others

 

Let's bear a few issues in mind: most of you are discussing the surface energy balance, not the top-of-atmosphere energy balance. Greenhouse warming does not rely on greenhouse gas increases being able to directly increase the downward IR. It's possible to increase CO2 and not get a big change in downward IR if the lower atmosphere is already emitting like a blackbody at its temperature, and any change in downward IR may not really warm the surface (depending on evaporation, sensible heat, etc.). Actually, the downward IR will increase more because the atmosphere is warmer than because of the direct contribution of more GHGs. And clearly energy does flow from a colder atmosphere to a warmer surface, but the net flux is always from the surface to the colder atmosphere, so there is no violation of thermodynamics.

 

More GHGs primarily affect what's going on higher up in the atmosphere, which warms the whole troposphere (which is meshed together by convection, so it tends to warm and cool as a unit). Adding GHGs creates a situation where the bulk of emission from the planet to space comes from higher altitudes, where it is colder. Because it is colder, it emits radiation more weakly than before, so the planet is now less efficient at losing heat while the incoming heat is held steady. Because the net energy flux into the planet is now greater than zero, the planet warms. From this angle, you can see that GHG mixing into higher altitudes, together with the fact that temperature drops with altitude, is a requirement for an enhanced greenhouse effect (there would actually be no greenhouse effect in an isothermal atmosphere). The atmospheric warming then warms the surface by increasing all the heat fluxes which couple the surface to the air, not just the radiative ones.
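The "emitting from higher, colder altitudes" argument can be put in one line of algebra: the surface sits above the effective emission temperature by roughly (lapse rate × emission height). A sketch with standard rough values (the 200 m lift is an illustrative assumption, not a computed CO2 response):

```python
# The planet must radiate at its effective temperature T_eff; with a mean
# lapse rate, the surface is warmer than the emission level by lapse * height.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1366.0         # solar constant, W/m^2
albedo = 0.30      # planetary albedo
lapse = 6.5        # mean tropospheric lapse rate, K/km

T_eff = (S * (1 - albedo) / (4 * SIGMA)) ** 0.25   # ~255 K

for z_emit in (5.0, 5.2):   # emission height, km; +200 m stands in for more GHGs
    T_surf = T_eff + lapse * z_emit
    print(f"emission from {z_emit:.1f} km -> surface {T_surf:.1f} K")
```

Lifting the emission level by 200 m warms the surface by about 1.3 K with no change at all in the incoming sunlight, which is exactly the mechanism described above.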

 

//" I have always thought (maybe wrongly) that warming followed CO2 concentrations pretty closely with the increasing temps lagging some time behind increasing CO2."//

 

Depends on when you're talking about. If you widen your scope to the geologic timescale as a whole there really is no "lagging" issue...we see higher temperatures when CO2 concentrations are higher for the most part. In fact most of geologic time had more CO2, higher temperatures, less ice, etc. During the PETM we see a big spike in CO2, and a big spike in temperature. You have a similar situation on Venus: the planet should be below freezing if it were not for CO2, but it is the hottest planet in the solar system.

 

During the last million years, when we see transitions between glacials and interglacials, CO2 doesn't seem to matter much for forcing the climate change; that's related to orbital changes. But there are carbon feedbacks related to the changing biosphere, changing ocean temperatures, etc., and the changes in CO2 acted as a positive feedback to amplify the orbitally-paced warming. Right now, the CO2 is going up too fast, and is far too high in concentration, to be a feedback.

 

C


Hi Chris, where you been hiding?:D And how the hell are you?

 

Your post made me think that I'm not being clear.

 

If we reduce to the basics. Incoming LW radiation hits the surface and is converted to IR radiation. The IR should then radiate out, but is blocked by the CO2, leading to warming. Warming also leads to an increase in water vapour in the air.

 

However, WV blocks LW radiation. Hence, with increased WV, there should be less LW radiation reaching the surface to be converted to IR. So WV must act as both a positive and negative forcing, depending on the direction and type of radiation. This is the factor I don't see discussed in the literature.

 

npts2020, as SL said, the lag over the last 400,000 years is around 800 years. Temps go up and then CO2 rises. If we compare these two graphs:

[Image: CO2_0-400k_yrs.gif]

[Image: Temp_0-400k_yrs.gif]

Time goes from right to left on these. We can clearly see that CO2 lags the temp rise. As Chris says, this demonstrates that CO2 acts as a positive feedback once another factor (presumably orbital changes) starts the process off. What it also shows is that CO2 was not a strong enough factor to keep the warming going once the initiating factors disappeared. The CO2 continues to rise even though temps are dropping.

 

You might also notice that previous interglacials were somewhat warmer than we are today. Also our current interglacial is unusually smooth.

You have a similar situation on Venus: the planet should be below freezing if it were not for CO2,

Bit of an unfair comparison, don't you think? If there were no CO2 in the atmosphere of Venus, there would be no atmosphere. 96.5% CO2, isn't it? Comparing the climate of a world with 96.5% CO2 with a world with a 0.0387% CO2 concentration would have to be apples and oranges.

Depends on when you're talking about. If you widen your scope to the geologic timescale as a whole there really is no "lagging" issue...we see higher temperatures when CO2 concentrations are higher for the most part. In fact most of geologic time had more CO2, higher temperatures, less ice, etc.

[Image: Carbon_Dioxide_Geological.jpg]

If you can see a correlation, you're doing better than I am.

 

We see from the graph that during the Cambrian, Silurian and Devonian periods temps and CO2 were much higher than today; temps around 8° higher, with a whopping CO2 concentration of 2,240 ppm. Yet we also see the Ordovician period with temps at or lower than today's without a drop in CO2 concentration. This is not a blip, but a period of millions of years with low temps and high CO2.

 

We also see that the Pennsylvanian through to the Jurassic had much lower CO2 (compared to earlier times), yet the temps were as high as in the earlier periods. The temp drop in the Mississippian is associated with a drop in CO2, but then the temp goes back up while CO2 stays low. Incidentally, that's why I said that there must be some sort of limiting factor. Regardless of anything else, CO2 or whatever, the temps always peak at around the same level, ca 8° above present. I wonder why?

 

The other thing I find interesting is that the whole thing goes to pot ca 65 million BP. I wonder if that asteroid did more than just kill off dinosaurs?


Hi John, I'm doing fine

 

first of all, just so we're all on the same page with shorthand

LW= longwave=IR

SW=incoming solar (shortwave)

 

 

The dominant effect of changed water vapor is indeed in its longwave trapping effect. There is a much smaller, but still significant effect on solar energy...actually there's some influence at higher latitudes where reflected solar radiation is sent back up and water vapor can absorb in SW areas as well. This is discussed in a recent paper here for instance

http://ams.allenpress.com/perlserv/?request=get-abstract&doi=10.1175%2F2007JCLI2110.1

 

The paper also states, "...the increased water vapor acts to increase the net incoming solar radiation (i.e., it results in increased solar absorption). The magnitudes are roughly a factor of 5-10 smaller than those computed for the longwave part of the spectrum, except at high latitudes." So actually it's more of a positive-positive feedback. The only substantial negative feedback is the Planck response (i.e., that a hotter planet emits more radiation). There is also a small but negative lapse-rate feedback, but that only cancels out a small portion of the WV feedback.
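The way a positive WV feedback and the negative Planck response combine can be sketched with the usual linear feedback sum, ΔT = F / (λ_Planck − Σλ_i). The parameter values below are round numbers of the right order of magnitude, not figures from the cited paper:

```python
# How a positive feedback (water vapour) and the negative Planck response
# combine. Parameters are illustrative round numbers, in W/m^2 per K.
forcing = 3.7          # canonical forcing for doubled CO2, W/m^2
planck = 3.2           # Planck response (the big negative feedback)
water_vapour = 1.8     # LW water-vapour feedback (positive)
lapse_rate = -0.8      # lapse-rate feedback (small, negative)

no_feedbacks = forcing / planck
with_feedbacks = forcing / (planck - water_vapour - lapse_rate)
print(f"Planck only: {no_feedbacks:.2f} K;  with WV + lapse rate: {with_feedbacks:.2f} K")
```

Note how the lapse-rate term, being negative, partially offsets the water-vapour term in the denominator, which is the "cancels out a small portion" point above.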

 

Yes, Venus has much more CO2 than Earth, but its temperature differential between the atmosphere and no-atmosphere cases is ridiculously large! Again, we're talking about something like below freezing to something like double the boiling point. For a planet that receives much less sunlight than Earth (due to its cloud cover; when I said below-freezing in the no-atmosphere case, I was still leaving the albedo the same), the role of CO2 highlights how much it matters in making an atmosphere more opaque to infrared. What's more, Earth is relatively sensitive to external changes because water vapor is present on a planet where it changes phase so easily, so for that reason doubling CO2 matters quite a bit, even though the absolute values are relatively small.

 

Concerning your graph: after the ice core record, especially when you go tens of millions of years back, there are substantial uncertainties in both temperature and CO2. If you look at the IPCC AR4 chart (in the paleoclimate chapter, in the section on deep-time climate) the CO2 uncertainty ranges can be more than 1000 ppmv, and you need to be careful when interpreting these temperatures: such deep-time proxies may reflect bottom-water conditions on the shelf rather than open oceanic SSTs, and the dataset has only very poor temporal resolution for many intervals. I've not looked at the papers that it credits, but I have a hard time believing a graph like that was published in a refereed article. The numbers are far too precise, and there are no error ranges, which makes such a graph essentially useless for this kind of comparison. I'd recommend a couple of papers:

https://wesfiles.wesleyan.edu/home/droyer/web/KurschnerCommentary(2008).pdf

https://wesfiles.wesleyan.edu/home/droyer/web/PhanCO2(GCA).pdf


first of all, just so we're all on the same page with shorthand

LW= longwave=IR

SW=incoming solar (shortwave)

:doh: Don't ask me why, but I nearly always get them round the wrong way.:doh:

 

I'll read the links and get back to you. The first is behind a paywall, but the abstract reads as extremely interesting.

 

I agree that the graph is problematic due to low temporal resolution and natural uncertainties. I've seen numerous versions showing similar data so am acting on the assumption that graphs that go that far back are operating on the "Best informed estimate we have" basis.

 

There is also the possibility that beyond ca 140 million years ago, our temp resolution is so poor that we can only tell whether it was an ice age or not.


Which is a perfect demonstration of how powerful WV is compared to CO2, wouldn't you say? Also, a cloudy day is cooler than a sunny day; would that not imply that radiation is being blocked from reaching the surface?

 

A cloud is different from water vapor, so it wouldn't directly bear on how strongly WV absorbs/emits IR. I'm showing an example of how downward radiation from the upper atmosphere can have a net warming impact on surface temperature. While cirrus clouds would cool the surface during the day, WV absorbs less solar radiation, thus reinforcing the net warming of WV overall.

 

Interestingly, I believe more cirrus clouds by themselves would actually cause a net warming of the earth; this is one of the uncertainties in climate models.

 

Also, as I implied earlier, I agree with Chris C that having absorption and emission occur at high altitudes helps to warm the surface both for radiative and convective (adiabatic warming) reasons.

 

Steve


  • 3 weeks later...

Rather than start a new thread, I thought I'd add this on here.

 

Previously I've posted this pic for Wellington and asked "Why?"

[Attachment: GHCN raw-vs-adjusted temperature plot for Wellington]

It's from GHCN data showing the adjustment made to the raw data. Note how the 1890 to 1920 period has been significantly shifted down, thereby giving warming without that warming being shown in the raw data.

 

In trying to get to the bottom of this I arrived at NOAA. The linked page shows the adjustment methods used for the USHCN data (presumably something similar is used for GHCN). Most of the steps seem quite reasonable, but number 3 has me flummoxed. No. 3 is the adjustment for the move from Stevenson Screens to the more modern MMTS system, which is presumably more accurate than the old screens.

 

Adjustments are shown here.

[Image: ts.ushcn_anom25_diffs_pg.gif]

 

The red line is the MMTS adjustment, which has gone from zero to ca 0.05. I don't get it. The old system needed no adjustment, but the new, improved one does? The paper used for this adjustment is here. Just above the "Conclusions" section they say:

Overall it appears that the MMTS may be the more precise instrument, and that older CRS data should be adjusted when high precision is required.

So why is NOAA doing the opposite?

 

The black line is also interesting as it is the TOBS adjustment, one that overshadows all the others. It would appear that between 1955 and 1990 the Time of Observation gradually changed across the US. According to the TOBS paper, it did. Odd really. TOBS paper is here.

 

However it's the total that has me really beat.

[Image: ts.ushcn_anom25_diffs_urb-raw_pg.gif]

 

From zero cumulative adjustment in 1955 to 0.6°F in the 1990s. I would have expected the need for adjustment to go down as tech improves, not up. It doesn't quite gel with me.

 

Some will look at the graph and go "Aha, half of the warming in the US is due to adjustments". (Especially given my sceptical leanings in this area.:D) However, while true, the "Aha" may not be relevant. If the methodology is correct, then the adjustments are correct and show the true state of affairs.

 

The purpose of this post is to ask others to have a read and offer comments concerning the methodology used, not its ir/relevance to the various GW debates.

 

Note that there are apparently two differing methodologies being used. The one for GHCN seems to move the distant past readings down, while the USHCN moves the recent readings up.
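To illustrate the mechanics of a step adjustment at an instrument changeover (all numbers below are invented; this is a toy, not the USHCN procedure described in the linked papers):

```python
# Toy homogenisation example: suppose an instrument change in 1985 is
# believed to have introduced a cool bias of 0.3, so every reading after
# the change is shifted up by 0.3. Values are invented for illustration.
years = list(range(1960, 2001))
actual = [10.0 + 0.005 * (y - 1960) for y in years]       # underlying climate
raw = [t - 0.3 if y >= 1985 else t
       for y, t in zip(years, actual)]                    # biased record
adjusted = [t + 0.3 if y >= 1985 else t
            for y, t in zip(years, raw)]                  # after adjustment

def trend_per_decade(ys, ts):
    """Ordinary least-squares slope, in degrees per decade."""
    n = len(ys)
    my, mt = sum(ys) / n, sum(ts) / n
    slope = (sum((y - my) * (t - mt) for y, t in zip(ys, ts))
             / sum((y - my) ** 2 for y in ys))
    return 10 * slope

print(f"raw trend:      {trend_per_decade(years, raw):+.3f} deg/decade")
print(f"adjusted trend: {trend_per_decade(years, adjusted):+.3f} deg/decade")
```

Here the unadjusted record actually shows a spurious cooling trend, and the adjustment restores the underlying warming; whether any real-world adjustment is justified rests entirely on the changepoint methodology, which is the question being put to the thread.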

 

Again, not arguing for a position here, just trying to understand.

 

Thoughts?


I post this because mistakes are hilarious.

 

A surreal scientific blunder last week raised a huge question mark about the temperature records that underpin the worldwide alarm over global warming. On Monday, Nasa's Goddard Institute for Space Studies (GISS), which is run by Al Gore's chief scientific ally, Dr James Hansen, and is one of four bodies responsible for monitoring global temperatures, announced that last month was the hottest October on record.

 

This was startling. Across the world there were reports of unseasonal snow and plummeting temperatures last month, from the American Great Plains to China, and from the Alps to New Zealand. China's official news agency reported that Tibet had suffered its "worst snowstorm ever". In the US, the National Oceanic and Atmospheric Administration registered 63 local snowfall records and 115 lowest-ever temperatures for the month, and ranked it as only the 70th-warmest October in 114 years.

 

So what explained the anomaly? GISS's computerised temperature maps seemed to show readings across a large part of Russia had been up to 10 degrees higher than normal. But when expert readers of the two leading warming-sceptic blogs, Watts Up With That and Climate Audit, began detailed analysis of the GISS data they made an astonishing discovery. The reason for the freak figures was that scores of temperature records from Russia and elsewhere were not based on October readings at all. Figures from the previous month had simply been carried over and repeated two months running.

 

http://www.telegraph.co.uk/opinion/main.jhtml?xml=/opinion/2008/11/16/do1610.xml


I see nothing there that undermines the notion that there has been a dramatic warming trend globally over the past couple of decades.

 

I've also never seen much talk by skeptics about how to avoid the risk of irreversible changes in the Earth's climate if we fail to act in the near future, due to inertia in the system.


It's hilarious because…

 

1) Sloppy work should always be ridiculed, and finding humor in people's mistakes is perhaps the mildest form of ridicule.

 

2) It's one thing to make a mistake; it's quite another to attract attention to it by declaring your mistake record-breaking. When declaring a record, there is an implication that the data received extra scrutiny. By not providing that scrutiny, the record is simply a record-breaking whopper. :doh:

 

3) Who found the mistake? Global warming skeptics. It would have been less embarrassing to Hansen and global warming enthusiasts if anyone else had discovered and reported the error. The fact that it was found and reported by skeptics reinforces the opinion of skeptics that enthusiasts will accept any report that supports their enthusiasm. Hansen again provided skeptics with reasons to doubt.


Skeptics do a good service by pointing out mistakes such as this. I think they also do a disservice by magnifying the doubt since the bottom line is that faster action should be taken based on the inertia considerations I've mentioned from time to time.

