
The LHC Project


Just the Facts

Recommended Posts

The LHC Project.

 

I completely understand what they are attempting to find.

It will result in massive FAIL (it already has).

 

All the time and billions spent on building something that will never prove anything worth knowing.

I am certain that within the next ten years the whole thing will be disassembled and sold for scrap.

Then they will use the place for some expensive, fancy condominiums.

 

In the meantime... somebody's making a ton of money on it.

 

 


It's certainly debatable whether a lot of money should be spent on fundamental research with little or no obvious application. And the LHC is no doubt very expensive compared to the average university's backyard experiment. In my experience, it's not one person making a lot of money there, but a lot of people making some. What exactly do you think the LHC is trying to find?


Actually, the LHC is spitting out huge amounts of useful data right now.

As someone whose computing resources also serve as a Tier-2 node for LHC, I agree with "huge amounts" more than I do with "useful" :P

Once again you display a profound misconception of how science works. Even if the LHC did not find even one predicted particle, that in itself would be information worth knowing. An experiment that does not produce the expected result can push back the boundaries of knowledge just as much as, if not more than, one that does.

Link to comment
Share on other sites

I have a very realistic understanding of how science works.

 

That's why I know the LHC is a waste of time and money for the little dribble of information they might learn from that massive waste. Just who really benefits from this knowledge they claim will change the world of science? Not you or me.

Nothing tangible will come from it.

Follow the money.

 

It is a vain pursuit of recognition, awards, and money by a small group of individuals with way too much influence.

 

Just like Einstein was.

 

 

 

As someone whose computing resources also serve as a Tier-2 node for LHC, I agree with "huge amounts" more than I do with "useful" :P

 

 

KUDOS to you. Perfect.


Since practically everything that Just the Facts writes in this forum is unsupported by any experimental evidence, it is no surprise to me to see that he doesn't want to see anyone spending money on experiments that will generate real data.

He just doesn't seem to understand that you can't advance science without experiments.

He seems to think that the way to "prove" a theory is to write bits of it in CAPITAL letters and call all the people whose work shows him to be wrong nasty names.

Edited by John Cuthber

We should all be pleased with this post: JTF has finally made a testable, objective prediction!

 

Even if, after two or three years of analysis, the Higgs boson is still a mystery, this LHC program isn't a failure. Much new knowledge will come from such research. JTFs are constantly needed, if only to spur the best minds in the world to focus on bigger and better things. I actually wish him luck!

Edited by rigney

I just cannot digest the LHC project. They are working on theories that are heavily influenced by the Big Bang, and the Big Bang itself is an incomplete "if... then" story. How can science work from a basic false or defective assumption? It's like saying "assume 1 + 1 = 9, where the operators have their usual meanings". Please comment on whether I am wrong or not!

 

From a new member

Chinmay Shah

(ChinzFactory Scientifique)

:)


Why would that help over a regular supercomputer? It's not like we have gigabytes of weather data being generated every second.

 

Although...

 

Formerly named AWS Convergence Technologies and operator of the WeatherBug Web application, Earth Networks said today it will invest $25 million over five years to equip about 100 locations worldwide with sensors to measure the concentration of greenhouse gases in the atmosphere, including carbon dioxide and methane.

http://news.cnet.com/8301-11128_3-20028265-54.html


What exactly do you mean by "a system like it"?

 

A lot of CPU power? Climate science does use a lot of CPU power. I do not know exactly how it compares to the LHC's requirements, but if it's significantly less, that's likely either because of a lack of funding or because CPU power is not the limiting factor in climate science. Particle physics at colliders is much more accurate than climate physics (at least I think so), and in fact more accurate than almost any other branch of physics (I know this for at least some other fields).

 

A distributed system? I don't know about climate research, but collider data is remarkably well-suited for trivial parallelization. What you have is a huge number of proton collisions, each independent of the others. So in principle, you can send each one to a separate computer, analyze it there, and only send the result back to a central facility. That's not possible in systems where the events in one part of the system influence other parts; decentralized processing of those might require a huge amount of data transfer for communication between the different nodes.
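The "analyze each collision independently, send only the result back" pattern described above can be sketched in a few lines. This is a toy illustration, not actual LHC software: the event data, energies, and the per-event analysis are all invented for the example.

```python
from multiprocessing import Pool


def analyze_event(event):
    # Toy per-event analysis. Each collision event is independent of
    # the others, so any worker can process it without communicating
    # with other workers. Here we just sum the (made-up) particle
    # energies recorded for the event.
    return sum(event)


def main():
    # Hypothetical stand-in for collider data: each inner list holds
    # one collision event's particle energies (invented numbers).
    events = [[10.2, 5.1], [7.7], [1.0, 2.0, 3.0], [42.0]]

    with Pool(processes=4) as pool:
        # Events are farmed out to worker processes; only the small
        # per-event result travels back to the central process.
        results = pool.map(analyze_event, events)

    return results


if __name__ == "__main__":
    print(main())
```

The same shape scales from four local processes up to a grid of computing centers: because no event needs data from any other event, the only mandatory communication is distributing events out and collecting results back.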


Why would that help over a regular supercomputer? It's not like we have gigabytes of weather data being generated every second.

 

Although...

 

 

http://news.cnet.com/8301-11128_3-20028265-54.html

Because then you could have real-time temperature data (from thermal satellite scans) and CO2 data from everywhere on the planet.

Record the data every 10 minutes.

That would generate a lot of data in a short time. Would a normal supercomputer be able to handle THAT much data?

But this is a little off topic; perhaps a new thread is required.
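The question of whether a 10-minute global sampling cadence would really swamp a computer can be put in rough numbers. Every figure below is an assumption chosen for illustration (grid resolution, bytes per reading), not a real mission specification.

```python
# Back-of-envelope estimate of the data rate sketched above.
# All constants are illustrative assumptions, not real specs.

EARTH_SURFACE_KM2 = 510_000_000    # approximate surface area of Earth
CELL_KM2 = 1.0                     # assume a 1 km x 1 km sampling grid
BYTES_PER_READING = 4              # assume one float32 temperature per cell
SAMPLES_PER_DAY = 24 * 60 // 10    # one global scan every 10 minutes -> 144

cells = EARTH_SURFACE_KM2 / CELL_KM2
bytes_per_scan = cells * BYTES_PER_READING
gigabytes_per_day = bytes_per_scan * SAMPLES_PER_DAY / 1e9

# Under these assumptions: about 2 GB per scan, roughly 294 GB per day.
print(f"{gigabytes_per_day:.1f} GB/day")
```

Under these made-up assumptions the volume is large but not exotic; on numbers like these, the bottleneck in climate work is more likely the modeling (CPU) than the raw data volume.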

Edited by dragonstar57
