
Particle duality


Advaithi


No, not really. It has nothing to do with the question.

 

The question is about particle/wave duality. You wrote:

 

"Einstein was wrong.

It's all waves, but you don't notice the wave behavior for particles with large momenta."

 

The point I am trying to make is that it is not just "all waves"; MIT, for one, uses a particle interpretation. Einstein was not wrong, but, like everyone, he failed to realise the interaction between the graviton and other particles. The spin and wave actions of a free particle are determined by the variation in graviton density; it is no different from the wave action that astronomers know full well. Nature repeats the same simple cause and action on all scales.

 

 

 

I have speculated on the application of Loop Quantum Gravity to atomic structure, and on the similarity of the quantum-mechanical Compton radius to a particle radius calculated using G/2 as a constant; clearly indicating how gravity and gravitons fit into the overall picture. Loop Quantum Gravity shows that Einstein's insistence on a classical interpretation was justified.

 

Surely the LHC is looking for the Higgs particle in the form of mass, not in wavelength.

 

edit: I'm moving this post to start a new thread on volume and empty space in an atom.

 

I cannot find your new thread; has it been started?

Edited by elas

What can be confusing is that there are different approaches to quantum theory, and they give different interpretations as to what is happening. For example, QED represents electrons as particles. But these particles have a "phase", and the probability of a single electron arriving at a detector is calculated by considering the sum of all possible paths of that electron (and the phase for each path). Weird, huh!
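Here's a rough numerical sketch of that path-summing idea: not QED proper, just the two-path (double-slit) special case, with illustrative numbers I've made up. Each path contributes a unit-amplitude phase factor, and the squared magnitude of the sum gives a relative detection probability.

[code]
import numpy as np

# Toy "sum over paths": two paths from source to detector (a
# double-slit geometry).  Each path contributes exp(i*2*pi*L/lambda);
# real QED sums over *all* paths, with the action setting the phase.

wavelength = 500e-9   # 500 nm light (illustrative)
d = 10e-6             # slit separation, 10 micrometres (illustrative)
L = 1.0               # slit-to-screen distance, metres

for x in (0.0, 12.5e-3, 25.0e-3):          # detector positions on the screen
    path1 = np.hypot(L, x - d / 2)         # length of the path via slit 1
    path2 = np.hypot(L, x + d / 2)         # length of the path via slit 2
    amp = np.exp(2j * np.pi * path1 / wavelength) \
        + np.exp(2j * np.pi * path2 / wavelength)
    print(f"x = {x * 1e3:5.1f} mm   relative probability = {abs(amp) ** 2:.3f}")
[/code]

The three positions land roughly on a bright fringe, a half-way point, and a dark fringe (4, 2, 0 in relative units), even though each path alone would give 1 everywhere.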

The hard part of legitimating any statistical approach, for me, is that there is a degree of separation between the level of empirical events and that of analysis. It would be like having a model that predicts the time it takes a race car to go around a certain track by taking the average duration of all possible paths around the track under all conditions. While the average would represent something pertinent, it would not directly represent an actual empirical situation in which a particular car went around a particular track under specific conditions. So if I wanted to theorize about what factors influence the car's time around the track, it would be impossible, because my model works purely at the level of arriving at a mean and manipulating that number in other ways.

 

The question about the probability of Feynman being wrong was well answered, but it was also meant a little rhetorically, as a play on the fact that Feynman said that uncertainty is a fundamental characteristic of nature; so why shouldn't that apply to Feynman's claims about nature as well? If QP can say that Einstein was wrong about God not playing dice, then why can't anyone else say that using statistics to work probabilistically is like physicists playing dice with nature and just getting good enough with spreading their bets to always come out ahead without really caring how the game (might) work(s)?


lemur

The solution is to remove the averaging at its source; the averaging shown in the Particle Data Group tables cannot be justified. Einstein was (and still is) correct.

Edited by elas

You guys are confusing wave-particle duality with quantum mechanics. Wave-particle duality has been around since the time of Newton and Huygens, way before QM was a twinkle in Max Planck's eye.

The model we use to explain phenomena and make predictions about a system is tailored to particular aspects of the system, but not necessarily to all. As an example, try to explain the double-slit diffraction experiment with the particle (photon) model of light. Or try to explain the photoelectric effect with the EM wave model of light. You cannot do either.

However particulate photons easily explain the photoelectric effect and diffraction is easily handled by the wave model.
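To put numbers on the photoelectric case (the values are illustrative, and the sodium work function is only approximate):

[code]
# Einstein's photoelectric relation: K_max = h*f - phi.

h = 6.626e-34    # Planck's constant, J s
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # joules per electronvolt

wavelength = 400e-9     # violet light, metres (illustrative)
phi = 2.28              # work function of sodium, eV (approximate)

photon_energy = h * c / wavelength / eV   # single-photon energy, eV
k_max = photon_energy - phi               # max kinetic energy of the electron

print(f"photon energy = {photon_energy:.2f} eV, K_max = {k_max:.2f} eV")
# The wave model says enough *intensity* at any frequency should eject
# electrons; experimentally, below f = phi/h nothing comes out at all.
[/code]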

Use the model appropriate to the circumstances, since a photon, in reality, may be neither a wave nor a particle (it may be a certain harmonic of a vibration of a 10^-34 m segment of space-time, otherwise known as a string, another model).


If QP can say that Einstein was wrong about God not playing dice, then why can't anyone else say that using statistics to work probabilistically is like physicists playing dice with nature and just getting good enough with spreading their bets to always come out ahead without really caring how the game (might) work(s)?

 

I think physicists care deeply about "how the game might work". There are many interpretations of quantum mechanics which try to explain the deeper meaning of the mathematics. It is a very difficult task. No one has come up with a deeper explanation which all can agree on. Using the probabilistic (statistical) approach, quantum mechanics works superbly in making accurate predictions. It tells us that there is an inherent uncertainty in nature itself.

 

Physicists are not playing dice with nature. Nature itself is playing dice. For example, no one knows exactly what a single electron will do. We can only predict the probability of what it will do. But why nature behaves this way remains a mystery.


I think physicists care deeply about "how the game might work". There are many interpretations of quantum mechanics which try to explain the deeper meaning of the mathematics. It is a very difficult task. No one has come up with a deeper explanation which all can agree on. Using the probabilistic (statistical) approach, quantum mechanics works superbly in making accurate predictions. It tells us that there is an inherent uncertainty in nature itself.

 

Physicists are not playing dice with nature. Nature itself is playing dice. For example, no one knows exactly what a single electron will do. We can only predict the probability of what it will do. But why nature behaves this way remains a mystery.

I realize all this. Realize, too, that the source of my annoyance with statistics comes from social science, where researchers DO have direct access to human individuals and yet still choose to elevate the mathematical accuracy gained statistically above the direct knowledge that can be gained regarding interactions at the level of empirical observables. Physics is different because there is no direct access to atoms and electrons to see what they're doing while they're doing it. Still, I don't think that statistical problem of social science is limited to social science. It is a fundamental problem of epistemological collectivism. If you can't think of physical reality as consisting of individual interactions, you're obfuscating the most fundamental level, where events always have to occur insofar as the entities in question behave independently of others. For example, if each electron or photon is ultimately independent of all other particles, then I don't think it should be epistemologically modeled as something determined by statistical probabilities. Its behavior may be more or less predictable by such statistics, but it is a step too far, imo, to imply or suggest that there is nothing beyond probability and randomness to its behavior.

 

BTW, claiming that nature itself is playing dice implies that the basis for the probabilistic nature of these things lies in nature rather than in the approaches that have led to studying it in this way. This comes down to the question of whether there is such a thing as a natural theoretical/methodological approach, and I don't think you can say there is until you claim to know nature definitively, which I don't think is possible. I think there is an inherent, unbridgeable cleft between what is and the ability to model it in knowledge. In this sense, nature is not so much uncertain because of probabilism as because knowledge relies on concepts and representations that bring their own logical biases to the table. I don't think this ultimately matters, though, because I don't see the point of knowledge as being exhaustive. I just think it's supposed to keep progressing by identifying shortcomings, constructing various solutions to those, and then criticizing further. This may result in multiple divergent approaches/theories, etc., but at least you don't reach a point of saying that you have to accept the shortcomings of a particular model just because it's the one that works best.

 

 


I realize all this. Realize, too, that the source of my annoyance with statistics comes from social science, where researchers DO have direct access to human individuals and yet still choose to elevate the mathematical accuracy gained statistically above the direct knowledge that can be gained regarding interactions at the level of empirical observables. Physics is different because there is no direct access to atoms and electrons to see what they're doing while they're doing it. Still, I don't think that statistical problem of social science is limited to social science. It is a fundamental problem of epistemological collectivism. If you can't think of physical reality as consisting of individual interactions, you're obfuscating the most fundamental level, where events always have to occur insofar as the entities in question behave independently of others. For example, if each electron or photon is ultimately independent of all other particles, then I don't think it should be epistemologically modeled as something determined by statistical probabilities. Its behavior may be more or less predictable by such statistics, but it is a step too far, imo, to imply or suggest that there is nothing beyond probability and randomness to its behavior.

 

BTW, claiming that nature itself is playing dice implies that the basis for the probabilistic nature of these things lies in nature rather than in the approaches that have led to studying it in this way. This comes down to the question of whether there is such a thing as a natural theoretical/methodological approach, and I don't think you can say there is until you claim to know nature definitively, which I don't think is possible. I think there is an inherent, unbridgeable cleft between what is and the ability to model it in knowledge. In this sense, nature is not so much uncertain because of probabilism as because knowledge relies on concepts and representations that bring their own logical biases to the table. I don't think this ultimately matters, though, because I don't see the point of knowledge as being exhaustive. I just think it's supposed to keep progressing by identifying shortcomings, constructing various solutions to those, and then criticizing further. This may result in multiple divergent approaches/theories, etc., but at least you don't reach a point of saying that you have to accept the shortcomings of a particular model just because it's the one that works best.

 

 

 

Thanks for the reply. Maybe some day we'll have a new theory which gives us a deeper meaning to all this. But for now, what physicists understand is that this uncertainty is inherent in nature itself.


During QM's formative years there was some attempt at making it more commonsensical. I believe it was originally de Broglie's idea to introduce a pilot wave to particle motion, in an effort to explain the differences between the double-slit and single-slit diffraction of an electron, and how a single electron knows whether the second slit is open or closed. This was during the time of Einstein's famous 'challenges' to Bohr's stochastic interpretation of QM. The pilot wave was later re-introduced by Bohm in the 50s, but because of his history as a communist he was forced to leave the country to work in Brazil (yes, that sort of thing did happen in the 50s). The latest questioning of QM's interpretation was by Bell in the 80s, who obtained a probability of an event happening more often than always (nonsensical). So maybe there are some hidden variables which haven't been considered in our formulation of QM, and which may eventually move us away from a probabilistic interpretation towards a deterministic interpretation in which reality and causality are front and centre.
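For what it's worth, the quantitative heart of Bell's result can be stated in a few lines. This is a sketch of the CHSH form with the standard textbook angles: the quantum singlet-state correlation is E(a,b) = -cos(a - b), and the combination S below reaches 2*sqrt(2) in magnitude, while any local hidden-variable theory is stuck at |S| <= 2.

[code]
import numpy as np

# CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
# Local hidden variables force |S| <= 2; the quantum singlet
# correlation E(a,b) = -cos(a - b) pushes it to 2*sqrt(2).

def E(a, b):
    """Quantum spin-correlation for analyser angles a and b (radians)."""
    return -np.cos(a - b)

a, a2 = 0.0, np.pi / 2               # Alice's two settings
b, b2 = np.pi / 4, 3 * np.pi / 4     # Bob's two settings

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"|S| = {abs(S):.4f}   (classical bound 2, quantum max {2 * np.sqrt(2):.4f})")
[/code]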

 

Don't get me wrong: I've known QM for 30 years, and as a theory it has made thousands, if not millions, of accurate predictions to as many decimal places as required, so it certainly does work. If it turns out to be the equivalent of Newton's gravitation, i.e. incomplete as compared to Einstein's gravitation, don't forget that we can still get to the moon using Newton's theory, and it's all we need for most problems.


During QM's formative years there was some attempt at making it more commonsensical. I believe it was originally de Broglie's idea to introduce a pilot wave to particle motion, in an effort to explain the differences between the double-slit and single-slit diffraction of an electron, and how a single electron knows whether the second slit is open or closed. This was during the time of Einstein's famous 'challenges' to Bohr's stochastic interpretation of QM. The pilot wave was later re-introduced by Bohm in the 50s, but because of his history as a communist he was forced to leave the country to work in Brazil (yes, that sort of thing did happen in the 50s). The latest questioning of QM's interpretation was by Bell in the 80s, who obtained a probability of an event happening more often than always (nonsensical). So maybe there are some hidden variables which haven't been considered in our formulation of QM, and which may eventually move us away from a probabilistic interpretation towards a deterministic interpretation in which reality and causality are front and centre.

 

Full details in The Undivided Universe by D. Bohm and B. J. Hiley, ISBN 0-415-06588-7 (replace the final 7 with X for the paperback).

Edited by elas

Hi Advaithi,

Physicists say that, according to the rules of quantum physics, subatomic particles have a duality as "particles and waves". I don't understand this.

 

Then what is reality?

 

That's because you are referring to an incomplete theory (QM), and so you have an incomplete understanding of reality.

Reality comes if you consider as basic the quantum field, defined for each space-time point, and not the particle.

The particle is only the distillate of the field: its quantum.

When you interact with a matter field (i.e. you measure some of its microscopic properties), the particle aspect dominates.

When you don't interact with it, the reality is that of a "classical" field with a wave behaviour.

 

To be clearer: a laser physicist or a radio engineer is simply not interested in the number of photons in the e.m. field, but only in the phase of their wave.

The same concept holds in the case of matter, whether you are probing at the microscopic or the macroscopic scale.
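To make the laser example concrete, here is a sketch under the standard assumption that an ideal laser mode is roughly a coherent state: its photon number is then Poisson-distributed about a mean |alpha|^2, so the count fluctuates while the phase stays sharp. The numbers below are illustrative.

[code]
import math

# Photon-number statistics of a coherent state |alpha>:
#   P(n) = exp(-|alpha|^2) * |alpha|^(2n) / n!
# Sharp phase, fuzzy photon number, which is why the laser
# physicist above tracks phase rather than counting photons.

mean_n = 100.0                 # |alpha|^2, illustrative
spread = math.sqrt(mean_n)     # Poisson standard deviation
print(f"mean n = {mean_n:.0f}, spread = {spread:.0f} ({100 * spread / mean_n:.0f}%)")

for n in (90, 100, 110):       # a few counts around the mean
    log_p = -mean_n + n * math.log(mean_n) - math.lgamma(n + 1)
    print(f"P(n = {n}) = {math.exp(log_p):.4f}")
[/code]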

 

Is what we are seeing in everyday life just a pattern of waves?

 

It is a field: the quantum wave-field of matter (Dirac) and radiation (gauge) in interaction.


Edited by mgb2

  • 3 weeks later...
The basic behavior of anything on the scale of h/p (Planck's constant divided by momentum) is wave-like. Some properties are quantized, though, so they act like particles at times. As far as atoms go, they are mostly empty space, even if you look at the constituents as particles. Solids seem solid because of electromagnetic interactions that get very strong at short range.
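Putting illustrative numbers on h/p:

[code]
h = 6.626e-34   # Planck's constant, J s

# de Broglie wavelength lambda = h / p for two illustrative objects
objects = [
    ("electron", 9.109e-31, 1.0e6),   # mass (kg), speed (m/s)
    ("baseball", 0.145, 40.0),
]

for name, m, v in objects:
    print(f"{name}: lambda = {h / (m * v):.2e} m")
# electron: ~7e-10 m, atomic scale, so its wave behaviour shows;
# baseball: ~1e-34 m, absurdly small, so it looks like a particle.
[/code]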

 

An electron is a point particle. Yet, its physical presence, as measured by its Wave Function, is perpetually seeking to 'spread out' through space. So, said electron 'ghosts out', becoming 'phantasmal', partially present, in many spatial locations, at one time. Now, electrons exist, as such spatially extended 'phantasms', when they are freely propagating through space (e.g., through a double-slit apparatus) -- in which case their Wave Function 'phantasmal forms' simply spread outwards more & more. And, electrons also exist, as spatially extended 'phantasms', when they are electro-magnetically (EM) bound, to a positively charged atomic nucleus -- in which case the spatial spreading, of their 'phantasmal form' Wave Functions, is 'arrested', by EM 'inward pull', of the positively charged center of attraction. And yet, that spatially extended probability density distribution [math]| \Psi |^2[/math], of the electron's Wave Function, generates EM forces, as if it were a spatially extended charge density distribution [math]-e | \Psi |^2[/math]. And so, when two atoms (or molecules of atoms) EM interact, it is those spatially extended 'phantasmal charge density' distributions, which mutually repel. And so, again, the actual physical forces felt, are generated by 'ghosted out' electrons.
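A sketch for the one case where this is easy to make quantitative, the hydrogen 1s state [math]\psi = e^{-r/a_0}/\sqrt{\pi a_0^3}[/math]: integrating the charge cloud [math]-e | \Psi |^2[/math] out to a radius R gives the fraction of the electron "present" inside R. The closed form below follows from that integral; nothing here is measured data.

[code]
import math

a0 = 5.29e-11   # Bohr radius, metres

def charge_fraction(R):
    """Fraction of the 1s electron's charge cloud inside radius R:
    the integral of 4*pi*r^2*|psi_1s|^2 dr from 0 to R, in closed form."""
    x = R / a0
    return 1.0 - math.exp(-2 * x) * (1 + 2 * x + 2 * x * x)

for k in (1, 2, 5):
    print(f"R = {k} a0: {100 * charge_fraction(k * a0):.1f}% of the charge")
# ~32% inside one Bohr radius, ~76% inside two, ~99.7% inside five:
# a point particle acting, force-wise, like a smeared-out distribution.
[/code]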

 

Now, you could prepare an electron, with a Wave Function, having a bifurcated, double-peaked probability density distribution. And, you could 'ventriloquistically project' one part of that probability profile, to a remote location -- where physical forces would be felt (if at 'half strength'). Then, you could cause a Wave Function Collapse to occur, such that the 'ventriloquistically projected' part of the probability profile vanished 'into thin air'. By such means, you could exert fleeting-yet-physical forces, in remote locations, with partially present particles, leaving no traces, but these 'phantom effects':
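The "half strength" remark can be made precise, at least for the idealised case of two equal, non-overlapping, normalised peaks [math]\psi_1[/math] and [math]\psi_2[/math] (my idealisation, not a claim about any particular experiment):

[math]\Psi = \frac{1}{\sqrt{2}}\left(\psi_1 + \psi_2\right), \qquad |\Psi|^2 = \frac{1}{2}|\psi_1|^2 + \frac{1}{2}|\psi_2|^2[/math]

The cross term [math]\mathrm{Re}\,\psi_1^* \psi_2[/math] vanishes wherever the peaks do not overlap, so each peak carries exactly half the probability, and any force generated by the remote peak's [math]-e | \Psi |^2[/math] appears at half the strength of a fully present electron.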

 


Edited by Widdekind

The best explanation I have seen regarding photon wave-particle duality is:

 

Wave-Particle Duality

 

A statement from that page reads: "The photon-as-particle concept was (naively) brought in to explain how a wave front could terminate at a space point. What actually explains photon termination at a space point is the point conversion of photon potential mass to kinetic (released) mass"

 

The author's argument is that photon emission is the release of energy stored by a matter quantum [an atom], while photon absorption is the release of mass stored by a radiation quantum [photon]. Photons do store [relativistic] mass, hence the momentum they impart when they terminate on a target.
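Whatever one makes of the "potential mass" language, the momentum itself is the standard p = E/c = h/lambda. An illustrative number:

[code]
h, c = 6.626e-34, 2.998e8   # Planck's constant (J s), speed of light (m/s)

wavelength = 500e-9         # a green photon, illustrative
p = h / wavelength          # photon momentum, kg m/s

print(f"p = {p:.2e} kg m/s")
print(f"equivalent 'relativistic mass' E/c^2 = p/c = {p / c:.2e} kg")
[/code]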


Is there a question in there?

 

The wave behavior of the spatial distribution of the electron has to be considered in terms of behavior in the time-domain as well.

 

An electron is, fundamentally, a point-like particle. Yet, in atoms & molecules, the electro-chemistry is generated never by point charges, and always by 'ghosted-out-phantasmal-Wave-Functions-of-electrons-partially-present-in-many-places-at-one-time'. Thus, all (electro-)chemical phenomena are generated by 'partially present' electrons. As I now understand it, all that current human Quantum Physicists have done is 'bifurcate' those 'ghosted-out-phantasmal-Wave-Functions' of atoms, which were already in a 'phantasmal' form, into two spatially separated probability peaks, ~1 micron apart, under laser-controlled lab conditions. Were one to tolerate my choice of words, would they take exception to the underlying impressions?

 

 

 

 

The best explanation I have seen regarding photon wave-particle duality is:

 

Wave-Particle Duality

 

A statement from that page reads: "The photon-as-particle concept was (naively) brought in to explain how a wave front could terminate at a space point. What actually explains photon termination at a space point is the point conversion of photon potential mass to kinetic (released) mass"

 

The author's argument is that photon emission is the release of energy stored by a matter quantum [an atom], while photon absorption is the release of mass stored by a radiation quantum [photon]. Photons do store [relativistic] mass, hence the momentum they impart when they terminate on a target.

 

In my understanding, half-silvered mirrors actually split photon Wave Functions, so that half is transmitted, and half is reflected. The result, at least until some Wave-Function-collapse-causing QM observation, is two 'partially present' photon probability peaks -- "one guy goes through the door, two ghosts go down the hall beyond, in opposite directions".
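A sketch of the amplitude bookkeeping for that half-silvered mirror, using one common phase convention for a lossless 50/50 beam splitter (the factor of i on reflection is a choice that keeps the matrix unitary):

[code]
import numpy as np

# Single photon entering port 1 of a lossless 50/50 beam splitter.
BS = np.array([[1, 1j],
               [1j, 1]]) / np.sqrt(2)   # unitary: probabilities sum to 1

photon_in = np.array([1.0, 0.0])        # all amplitude in port 1
out = BS @ photon_in

print("output amplitudes:", out)
print("detection probabilities:", np.abs(out) ** 2)   # [0.5, 0.5]
# Until a detector fires (collapse), *both* output beams carry
# amplitude: the "two ghosts down the hall" described above.
[/code]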

Edited by Widdekind

An electron is, fundamentally, a point-like particle. Yet, in atoms & molecules, the electro-chemistry is generated never by point charges, and always by 'ghosted-out-phantasmal-Wave-Functions-of-electrons-partially-present-in-many-places-at-one-time'. Thus, all (electro-)chemical phenomena are generated by 'partially present' electrons. As I now understand it, all that current human Quantum Physicists have done is 'bifurcate' those 'ghosted-out-phantasmal-Wave-Functions' of atoms, which were already in a 'phantasmal' form, into two spatially separated probability peaks, ~1 micron apart, under laser-controlled lab conditions. Were one to tolerate my choice of words, would they take exception to the underlying impressions?

 

If you're asking about the effect of superposition of 2 states, I'm not sure the experiment has been done as you describe.

