Cthulhu

Members
  1. Yes, and weather models are compounded by the same problem, but more so, because they are trying to predict much finer-scale events than climate models. Trying to predict the rainfall in London on October 28th this year is near impossible. However, predicting the average rainfall across England in 2008 is a lot easier: I could just take the average rainfall of the last 5 years and be quite sure the result will lie close. I cannot, however, average the rainfall in London over the last five October 28ths and expect to have a good prediction of rainfall for this October 28th. Regional weather is finer-grained and suffers from wilder variations than climate. (I note that USA Today's 5-day weather prediction on the back page cannot even predict cloud patterns 5 days in advance; that is, their prediction for Friday changes over Monday, Tuesday, Wednesday and Thursday.) I know nothing about the state of regional atmospheric modelling, but I do know that predicting the global average temperature for 2008 will be a lot easier than predicting the temperature of London on September 18th this year. Perhaps you are talking about something else.
  2. Yet we can predict next February's global average temperature a lot better than we can predict the temperature in London in two weeks' time. Climate predictions are not simply weather predictions far off in the future; a climate prediction is an averaging of weather over some period of time. This averaging makes it less susceptible to the chaotic nature of weather. Sure, long-term climate predictions will be less accurate than short-term climate predictions, but long-term climate predictions will not necessarily be less accurate than short-term weather predictions.
  3. It's a simple consequence of the difference between weather and climate, analogous to the difference between the difficulty of predicting the result of a single coin toss and predicting the average result of 1000 coin tosses. The latter is a lot easier.
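The coin-toss analogy above is easy to check numerically. A minimal Python sketch (the trial counts and seed are my own arbitrary choices, not from the post): a single toss is a 50/50 guess, but the mean of 1000 tosses hardly moves between runs.

```python
import random

random.seed(0)

def toss():
    """One fair coin toss: 1 for heads, 0 for tails."""
    return random.randint(0, 1)

# A single toss is unpredictable, but the mean of 1000 tosses
# clusters tightly around 0.5 across repeated trials.
n = 1000
trials = 200
means = [sum(toss() for _ in range(n)) / n for _ in range(trials)]

spread = max(means) - min(means)
print(f"mean of means: {sum(means) / len(means):.3f}, spread: {spread:.3f}")
```

The spread of the per-trial means shrinks roughly like 1/sqrt(n), which is why "climate-style" averages are far more predictable than any single "weather-style" toss.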
  4. This one is pretty good, probably not 100% up to date though: http://www.mnh.si.edu/anthro/humanorigins/ha/a_tree.html [img]http://www.mnh.si.edu/anthro/humanorigins/ha/images/bigtree2.GIF[/img]
  5. I believe short-term regional weather is chaotic, but the average result of global weather over long periods of time is not. This means that while it is near impossible to predict today what the weather will likely be in New York on July 15th 2080, it is a lot easier to predict what the global climate will likely be in the year 2080.
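For illustration, the chaos-versus-averages point can be demonstrated with the logistic map, a standard toy chaotic system (my choice of example, not the post's): two almost identical starting states quickly produce completely different trajectories, yet their long-run averages agree closely.

```python
def step(x):
    """One step of the logistic map with r = 4, a classic chaotic system."""
    return 4.0 * x * (1.0 - x)

def trajectory(x0, n):
    """Iterate the map n times from x0 and collect the states."""
    xs, x = [], x0
    for _ in range(n):
        x = step(x)
        xs.append(x)
    return xs

# Two initial conditions differing by one part in a billion.
a = trajectory(0.3, 200000)
b = trajectory(0.3 + 1e-9, 200000)

# "Weather": individual states become unpredictable within ~100 steps.
divergence = max(abs(x - y) for x, y in zip(a[80:160], b[80:160]))

# "Climate": the long-run averages of both runs stay close to 0.5.
mean_a = sum(a) / len(a)
mean_b = sum(b) / len(b)
print(f"divergence: {divergence:.3f}, means: {mean_a:.3f} vs {mean_b:.3f}")
```

The pointwise predictions are worthless after a short horizon, but the time-averaged statistic is stable, which is the distinction the post is drawing between weather and climate.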
  6. Did this ad remind anyone else of the type of ads released by sinister corporations in many sci-fi movies? Personally I think whoever made this advert was either Greenpeace, or clueless about marketing.
  7. Realclimate.org came up with a great line in reference to this ad:
  8. Dude, you are going to get torn to pieces if you tell a chimp that it is a monkey.
  9. It will be due to a bug that someone was lucky enough to find.
  10. How do you know clouds are more important in determining the greenhouse effect if they are an unknown parameter? Clouds are the least understood part of modelling climate, but I guarantee that even when clouds are accurately understood and factored into the models, the skeptics will still not accept that the models have any accuracy. I don't believe that is true: models that include aerosol forcings do predict the cooling during 1940-1976. For example: http://www.gfdl.noaa.gov/~tk/early_20th_cent_warming.html Arguably we can. The models that best fit the 20th-century temperature trends are the ones that predict warming of something like 1-5C by 2100. There is no basis to say there will be no warming, or that there will be 10C of warming, by 2100. To the best of our knowledge, and that is far from guessing, it is constrained to the range 1C-5C depending on what the future trends in carbon emissions are. I'm sure when clouds are better understood this projection will be refined, but I doubt it will change dramatically.
  11. Computer models are wrongly dismissed by many skeptics. An often-heard claim from critics is that computer models are worthless because "you can make a computer say anything" (I know you didn't make this claim or any of the other claims I point out below; I'm just saying). This is an absurd argument. Computers are essentially just a device to enable faster and more powerful calculations. No skeptic would dream of arguing that climate models were useless because "you can put any numbers in an equation", yet this is essentially what the argument boils down to. Of course, in science the numbers have to be, and are, based on measurements of reality, not plucked from thin air. If I write a computer model that simulates a ball rolling down a mountainside, I cannot simply invent my own value for the gravitational constant; I have to use its measured range. That's a huge constraint on the results of the model. Models are also testable against reality. They can point out what isn't known and hint at which places to look. For example, if my ball falls down the hill faster in the model than it is observed to in real life, then I will have to start looking for some other factor to put in the model. I might look for something that would cause the ball to slow down, and that might lead to me discovering air friction. Add that new factor to the model and the model becomes closer to reality.
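A toy version of the ball example can be sketched in Python. The slope fraction, drag coefficient, and track length below are made-up numbers for illustration, not measurements; the one thing that is fixed by measurement is g. The frictionless model reaches the bottom faster than the model with a drag term, which is the kind of mismatch against observation that would prompt adding friction.

```python
# Measured constant: not a free knob the modeller may tune.
G = 9.81          # gravitational acceleration, m/s^2

# Hypothetical, made-up parameters for illustration only.
SLOPE = 0.2       # fraction of g acting along the slope
DRAG = 0.5        # linear air-drag coefficient, 1/s
LENGTH = 50.0     # length of the slope, metres

def time_to_bottom(drag):
    """Euler-integrate v' = G*SLOPE - drag*v and return the time to cover LENGTH."""
    dt, t, x, v = 0.001, 0.0, 0.0, 0.0
    while x < LENGTH:
        v += (G * SLOPE - drag * v) * dt
        x += v * dt
        t += dt
    return t

t_no_friction = time_to_bottom(0.0)
t_with_drag = time_to_bottom(DRAG)
print(f"no friction: {t_no_friction:.2f} s, with drag: {t_with_drag:.2f} s")
```

If the frictionless run beats the "observed" time, the drag term is the new factor that brings the model closer to reality, exactly as described above.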
  12. I can see from your post history that you know a lot about evolution and understand how many popular anti-evolutionist claims are flawed. You probably realise that many of them are not flawed in the sense of being false, but they are flawed as arguments nonetheless. For example, the claim "evolution is just a theory" is true, but it isn't a valid argument against evolution. Equally, while it is absolutely true that water vapour contributes about 75% of the greenhouse effect, this is not an argument against anthropogenic warming. There is a great post here http://www.realclimate.org/index.php?p=142 that describes the big difference between water vapour and greenhouse gases like CO2. Also, the phenomenon being explained by global warming theory is not what contributes to the greenhouse effect, but what has contributed to recent warming. Even if some greenhouse gas accounted for 99% of the greenhouse effect, it can only be a good explanation for recent warming if it has increased. I don't think global warming theory is even 10% as strong as the theory of evolution, but even so there are a hell of a lot of popular arguments and urban legends against global warming out there that are just as flawed as many anti-evolution arguments.
  13. The reason is that businesses have always really needed software engineers rather than computer scientists. Computer scientists have generally left university and found that the theoretical concepts they learnt are not what most companies are after; for many companies, existing products just need to be strung together. Now software engineering and computer science are really diverging, and I agree with you that this is largely because the business-specific tools are getting more advanced and higher-level and don't require good knowledge of computer science to use. Exactly. The specificity of .Net is the thing that annoys me about it the most. The other thing is how Microsoft butchered C++. This is understandable. Unfortunately, businesses are after general knowledge of concepts like XML, common databases and encryption, and especially their application to business. Businesses are generally not after knowledge of more theoretical and academic stuff like the different fields of AI, compiler theory, the real details of how databases work, etc. There are far more jobs on the software engineering side of things, and as computer science and software engineering diverge I can imagine more students going for software engineering if it offers more job prospects. Not that computer science will go extinct - there will always be a demand there, just not quite as much. I agree that teaching a specific programming language like C++ or Java shouldn't be part of a computer science course, but teaching generic programming concepts like OOP should. If I had my way I would make good knowledge of programming a prerequisite of CS courses, rather than the courses currently having to teach it from scratch. I don't know what percentage you would put it at, but I would say probably only about 10% of computer science graduates could get a job in programming and keep it.
Most were simply not interested enough in programming to use it in their spare time, which in my opinion is the key to being proficient at it. Perhaps you are using the phrase "managed code" differently from how I am accustomed to. I am using it in relation to .Net, where managed code is safe due to running on a virtual machine with a garbage collector. Unmanaged Win32 code can still be written, but Microsoft and industry are moving away from it (which is sad in my opinion). Then again, the embedded device industry is growing and is more reliant on unmanaged code (when they aren't using Java).