
Occam's Razor VS Randomness


Mr Skeptic


Randomness is the new god. Well, I think so anyhow. To explain the result of a quantum measurement where you can predict the probability of getting various results, but cannot predict which of them you will get, it seems reasonable to say that it appears random. But if you go on to add that it is random - that is to say, that the reason you got a particular result is inherently unknowable (i.e., metaphysical, not part of physics) - that is an extra statement that does not explain anything new, and hence could be removed by Occam's Razor. Not only that, but that extra statement completely changes the universe: without randomness the universe was considered deterministic, with everything having a cause; with randomness, the universe is considered non-deterministic - some things have no cause and are unknowable.

 

How can giving up like that be justified? Just so we can say that we know the answer? Just as they used to say "God did it," shall we now say "It is random"? How can any scientist say, "We know the answer, and the answer is that there is no answer"? How can any scientist say that something is unknowable, since there will never be any proof of that?


Quantum randomness is the current paradigm. A recent New Scientist article suggested that it may change, when we understand more of what lies behind and beneath the quantum world. The article suggested that there may be a deterministic underlying truth, which we have yet to discover.

 

Don't ask me. I just write here.


How can any scientist say, "We know the answer, and the answer is that there is no answer"? How can any scientist say that something is unknowable, since there will never be any proof of that?

 

I think this is an interesting reflection.

 

IMO there are at least two distinct types of cases where "things" are unpredictable.

 

The uncertainty principles of position and momentum are of a type that I consider to be a kind of "logical entanglement", similar to the fact that A and "not A" simply can't be true at the same time. It's just that at first sight it might not be apparent that they are related. This relates to most normal QM.

 

IMO, the other kind of unpredictability is what I consider related to the constraints of limited information capacity. All our predictions and expectations are based on experience, and in particular on the compilation of retained experience that makes up our identity. This, I think, relates to inertia and the flow of time, as well as touching on the decoherence ideas of QM.

 

Predictions are based on expectations about our environment, and these expectations are based on the limited amount of information the observer is in possession of. This information has in turn been formed and _selected_ during the observer's history.

 

Moreover, this constraint limits the possible questions the observer asks - which I also see as the possible measurements or interactions the observer _expects_ to participate in. The predictions cannot possibly be anything but expectations. Sometimes the expectations are wrong, and feedback about this causes the observer to revise the expectations for future "predictions". But the revision of expectations is subject to inertia. One or two datapoints cannot possibly flip expectations backed up by billions of datapoints.

 

For example, I may examine myself and find 0 support for a particular event in my retained past. Then I have no reason to expect that it will happen, and I assign it "probability" 0, which technically is just an "expected probability"; I see no sensible way to calculate the probability of that expectation being correct - the limit is set by the information capacity. Then maybe this event still happens. But that could IMO be explained by the limited "resolution" of the observer. The effect is that things that are "sufficiently" unlikely (from the bird's view) are assigned probability 0 from the subjective view - they are not EXPECTED to happen. That, however, does not mean they won't happen. But it's obvious that if our expectations are consistently violated, then evidence accumulates and this will update the expectations.
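To make this concrete, here is a minimal toy sketch in Python (my own illustration, not anything from the literature; the class, capacity and outcome labels are invented for the example). An observer with a bounded memory assigns expected probability 0 to anything absent from its retained history; one or two surprises barely move that expectation, while a flood of new evidence eventually flips it.

from collections import Counter, deque

class BoundedObserver:
    """Toy observer whose 'identity' is a finite memory of retained outcomes."""

    def __init__(self, capacity):
        self.memory = deque(maxlen=capacity)  # limited information capacity

    def record(self, outcome):
        self.memory.append(outcome)

    def expected_probability(self, outcome):
        # Expectation induced purely from retained experience: an outcome
        # never seen in memory gets expected probability 0.
        if not self.memory:
            return 0.0
        return Counter(self.memory)[outcome] / len(self.memory)

# An observer whose entire retained history is the outcome "A".
observer = BoundedObserver(capacity=1000)
for _ in range(1000):
    observer.record("A")
print(observer.expected_probability("B"))  # 0.0 -- "B" is not expected at all

# "B" starts happening anyway. A couple of surprises barely move the
# expectation (inertia); only accumulating evidence revises it.
surprises = 0
for batch in (1, 1, 98, 900):
    for _ in range(batch):
        observer.record("B")
    surprises += batch
    print(surprises, observer.expected_probability("B"))

The numbers are arbitrary; the point is only the qualitative behaviour: a subjective probability of 0 does not forbid the event, and the revision is slow in proportion to the retained evidence.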

 

Similarly, what is sufficiently probable may end up as a "certain expectation" of probability 1 - as induced from our limited (incomplete) knowledge. Still, this may be violated as new information arrives. This way I imagine that the probabilities, and by extension (yet to be properly explained) the "wavefunction", are actually physically real, and that they make up the observer's identity. And to the limits of the observer's microstructure, the Born rule is IMO nothing but an emergent expectation.

 

Now if a set of interacting systems behaves this way, it means that they can reach a state where the probability of change is 0 - the expected probability. And if a local group of systems agrees on this, then stability is expected, in spite of the incompleteness. This is one possible way to picture quantum stabilisation intuitively.

 

This makes pretty decent sense to me, but current physics does not, as far as I know, give a satisfactory explanation of it. So there are many missing pieces which, when worked out, I think will make us understand QM much better - the mystery of the complex amplitudes, as well as inertia.

 

I think the posed questions are relevant. And I for one do not think the last word on QM is on the table yet. But those who look to restore classical realism are in for a dark journey IMO :)

 

/Fredrik


In my electron and photon studies I approach quanta with the sensibility of a plasma physicist. I allow the "vacuum" to respond to E&M disturbance just by stating "there exists a wave packet describable by a local divergence". This is the same game offered by Swanson's evocation of Feynman's picture of "virtual e-p pairs", really. I simply assumed, for the sake of modelling simplification, that there is a smooth Gaussian falloff of field intensity for the photon. If I deal with a packet of "many wavelengths" extent, this renders a linear mathematics to first order in that parameter, a/k. The results allow a beautiful interpretation clearly like that of a superconducting medium. I have played a related game in supposing an electron near-field. The relations to be seen in these models do not necessarily depend on the smooth exponentials used, so it does not so much matter if these first expositions are exact. I noted that I can use a different, truly Gaussian falloff in my electron field; I chose the first-power exponent as a first guess. The second point to make is that one is free to see the inhomogeneous systems I present as the states manifest in the polarizable vacuum medium, which I treat in a differential calculus because I can. On scales larger than the Planck length, I am quite comfortable with the QM picture of virtual inhomogeneity; the fundamental background may be, to me, chaotic on the Planck scale, and indeed I need this to be not stuck in a classical model.

These, then, are the fundamental polar modes of the vacuum field. If you think of a lab plasma, there are gas characteristics like pressure and acoustic waves, and also polar characteristics like Alfven waves and other instabilities. So I am able to say gravitation is the neutral mode of the same vacuum field. In essence this neutrality is illusory, because neutral matter must have intrinsic characteristics which are of the dipolar vacuum. We experience a dielectric-like metric change of spacetime because we are in and of this soup.


Here is more... the purpose is to convey what IMO is a potentially appealing intuitive understanding of some of the weirdness of modern physics.

 

The effect is that things that are "sufficiently" unlikely (from the bird's view) are assigned probability 0 from the subjective view - they are not EXPECTED to happen. That, however, does not mean they won't happen. But it's obvious that if our expectations are consistently violated, then evidence accumulates and this will update the expectations.

 

This is quite analogous to the idea of geometry and geodesics, straight lines and force.

 

The statement that

 

"we base our actions upon our expectations, and without evidence suggesting otherwise, we have no reason to revise our expectation"

 

pretty much says the same thing as

 

"an object follows a geodesic when no forces are acting upon it"

 

It's just that the statement can be more or less abstract, given a more or less general interpretation. It is also striking that the statement almost appears trivial and follows by definition.

 

So it seems the non-trivial part is when the expectations are updated beyond the expected self evolution.

 

When analysing that, an interesting probabilistic interpretation and an analogy with inertia are not far away. This basic idea is exploited in a few approaches trying to connect information geometry with general relativity. But I don't think anyone has actually succeeded yet, so it's an open challenge.
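For reference, the central object in information geometry is the Fisher information metric on a family of probability distributions p(x|\theta) (standard definition, nothing specific to the approaches hinted at above):

g_{ij}(\theta) = \int p(x|\theta) \, \partial_i \ln p(x|\theta) \, \partial_j \ln p(x|\theta) \, dx

and the open challenge referred to is to relate a metric of this kind to the spacetime metric of general relativity.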

 

Questions

 

1) Exactly what is "mass", and how does the mass of a system evolve?

 

2) Exactly how do we restore a consistent probabilistic interpretation of the "information" induced and retained from our history of interactions, so that the "self-evolution" we typically describe by, say, the Schrödinger equation (written out below for reference) can be seen as the expected self-evolution? (Of course we would expect corrections to the Schrödinger equation to account for information capacity limits / mass and inertia.)
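For concreteness, the self-evolution referred to in 2) is the ordinary time-dependent Schrödinger equation (standard form):

i\hbar \, \frac{\partial}{\partial t}\psi = \hat{H}\psi

and the suggestion above is that this would be only the *expected* evolution, with corrections tied to the information capacity limits.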

 

These are what I consider to be two key questions for physics.

 

"we base our actions upon our expectations, and without evidence suggesting otherwise, we have no reason to revise our expectation"

 

is generalised to

 

"A system bases it's actions upon it's expectations, and without new interactions suggesting otherwise, it is unlikely to revise it's expectations and thus actions"

 

I suspect this seems remote from the original topic to some, but I think it is close: seen abstractly, the principle can apply at an arbitrary scale.

 

/Fredrik


We've already proved things to be unknowable through the Uncertainty Principle and Bell's Inequality.

 

Didn't Bell's Inequality only rule out local hidden variables?
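For reference, the quantitative statement is the CHSH form of Bell's inequality (standard textbook result, added here for context): any local hidden variable theory obeys

S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2,

while quantum mechanics predicts, and experiment confirms, values up to 2\sqrt{2}. That is why only *local* hidden variables are excluded; non-local ones are untouched by the argument.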

 

I understand that the Uncertainty Principle is often stated to say that the position and momentum (or time and energy) of something are inherently uncertain, rather than just the measurements thereof. But I don't know why this is so.

 

In any case, I don't see how we can justify saying that some things are inherently unknowable or random, rather than just appearing to be unknowable or random. It seems like an extra but unprovable addition.


I shall attempt to voice an idea: consider a wave packet of strongly localized extent, but still, say, a Gaussian envelope. To understand its momentum you need to sample it at different times to know its net movement. The combination of these pieces of information can tell you the "ebb and flow" of the disturbance. On the other hand, to describe position you need a "snapshot", which tells you nothing of motion. I'm sort of asking: what are the rules for pin the tail on the wavepacket?
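To put a number on the snapshot-versus-ebb-and-flow trade-off: for a Gaussian envelope the position and momentum spreads are tied together by the Fourier transform (standard result, added for reference):

\psi(x) \propto e^{-x^2/4\sigma^2} \;\Rightarrow\; \sigma_x \, \sigma_p = \frac{\hbar}{2},

the minimum allowed by the uncertainty relation - narrowing the snapshot in x necessarily broadens the spread of momenta in the packet.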


Didn't Bell's Inequality only rule out local hidden variables?

In any case, I don't see how we can justify saying that some things are inherently unknowable or random, rather than just appearing to be unknowable or random. It seems like an extra but unprovable addition.

 

That's where I start to think of relativity not in the pure sense of Einstein (who of course was a pure genius), but in terms of the scale of things with regard to energy and whatever geometry it might take on, even if that is purely temporal. I mean, I don't know of any arrangement of matter or mass that is invulnerable to change, and at certain scales, such as with the dreaded singularity, our models fall apart, right? Yet that same singularity would still seem to imply vectors, or simply the regular passage of time, to some extent, whether at the scale of the universe or at least in a local sense. I know QM predicts particles popping in and out of existence, but I don't know about that happening with anything much bigger than the subatomic, and certainly not anything the size of a star yet.

 

Also, to me the uncertainty principle basically says that for interference or decoherence of quantum states to end, one would basically have to observe nothing?

 

Yet in plain old reality, as I experience it as a person, reality exists as a continuum. So maybe it really is just physics changing with scale, which is my current bid in this thread :D

 

My only other option is to think that we have simply misunderstood a great deal overall and happen to be worshipping the wrong way. Personally, I know we can't grow and study stars, but even on the scales we can probe with current equipment, in the face of concepts like relativity and QM, I am sure we can conduct the full range of proper experiments to which we can attach the math as a description.

 

The only problem I see with that is the fairly regular occurrence of phenomena in the universe, such as the existence of stars. I have no real reason to think this, as so much could be eluding me right now; one basic issue is that we don't know whether the existence of energy is infinite or finite. Conservation laws would have it as infinite, right?

 

So overall, for the sake of sanity, it's still just a matter of following the current Copenhagen interpretation and sticking to the empirical things we can measure.


Yet in plain old reality, as I experience it as a person, reality exists as a continuum. So maybe it really is just physics changing with scale, which is my current bid in this thread :D

So overall, for the sake of sanity, it's still just a matter of following the current Copenhagen interpretation and sticking to the empirical things we can measure.

 

Well, foodchain, my personal reality is no continuum. Physics changing with scale is, we mostly all agree, quite so. The Copenhagen interpretation, well...?!?!?!?!?


Well, foodchain, my personal reality is no continuum. Physics changing with scale is, we mostly all agree, quite so.

 

Right, but you don't spend one second at, say, your workplace and then, unknown to you, magically appear at the centre of the earth. I just don't know if that ever happens, period; though if it did, it would, I guess, have to go beyond what we can observe, which at that uncertain point would be QM, right? I mean, that falls along the same lines as black holes destroying information and so removing the ability to give a full objective description of the whole. It would also seem that if we could do that, humanity could know the future before it happens; yet we can only know the past, which is plainly obvious.

 

The whole range of ideas, from, say, the many-worlds hypothesis to the reality of the double-slit experiment and the inability to overturn the basic foundation of QM, which predicts the uncertain randomness of things, should I think change just about everyone's perspective on reality. Yet the study of all this seems, I feel, to lack the raw integrity of a group of curious professionals exploring such a mystery, as it simply falls by the wayside of flawed and temporal logical mechanisms.

 

I only say this because the classical view of the world would have to emerge from such a system, or states, or whatever it really is, simply because objects can interact in time and become recorded and understood - or else why don't planes fall out of the sky all the time? The implications of this stretch to anything you would want to consider, even organic evolution, which has many physical bounds, but which you can separate into classical phenomena (are they temporal?) down to quanta or quantum phenomena. I mean, in the formation and continued processes of the earth, did life start out in some band of energy resembling, more or less, quantum dots? It's the framework that governs chemical reality, right? The questions and ramifications of this are clearly astounding.


The classical view does emerge. Look at fine measurements of the magnetic flux in a superconducting loop. As you go down and down, the smooth curve becomes a staircase, and we are climbing a staircase to quantum heaven - or perhaps the converse... I'd rather choose a happy solstice meditation. Seriously, at the bottom end the last steps are, with increasing clarity, quantum steps. MAGNETIC FLUX IS QUANTIZED, at least in such a stable configuration.
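For the record, the steps of that staircase come in units of the flux quantum (standard value):

\Phi = n\,\Phi_0, \qquad \Phi_0 = \frac{h}{2e} \approx 2.07 \times 10^{-15}\ \mathrm{Wb},

where the factor of 2e reflects the paired electrons of the superconducting state.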


Mass is energy in a quasi-stable localization.

 

I was sloppy in my question. I consider mass and energy loosely related concepts, and so, it seems, do you; we share that much so far. But then the question is still: what is energy, and what do you mean by localization?

 

Classically I understand what you mean, but to merge this with a logic that also embraces quantum phenomena, we need to seek the common denominators that unite Einstein's thinking with the quantum world. This is what I tried to suggest.

 

With Localization I suspect you implicitly refer to space, but then what is space? How is the image of space formed from collected evidence?

 

Imagine that localization is emergent, in that things organize and the principle of locality can conversely define spatial distance. So IMO the generalization of the locality principle to information systems is simply to let experience build the information geometry and space. Events are far away in space because experience expects low correlation between them.

 

From that it is not a far jump to think that maybe the mass of a structure is the amount of data supporting that particular structure, as it has emerged - closely related to the confidence in the structure. A structure with high mass is dominant from an information-capacity point of view. This is the thing I'm trying to elaborate. It seems there are natural links here between encoding capacity and inertia, but it's still a question of exactly how to put it together in a consistent formalism.

 

I am imagining a relational idea, where even information quanta are relative, but everything is always formulated from a given starting point. This may also resolve the headache that, if nature is decomposed into discrete units, what determines their scale, and how are they determined? IMO this is relative, and the "discrete units" are not discrete blocks in the sense of classical realism. The units are rather relations between subsystems, and since the decomposition of systems into subsystems is not unique, neither, I think, are the discrete units. Objectivity is emergent through some self-organisation. Once the bits and pieces and how they interact are worked out, I would expect that interesting simulations could be made.

 

/Fredrik

 

Uncertainty Principle is often stated to say that the position and momentum (or time and energy) of something are inherently uncertain, rather than just the measurements thereof. But I don't know why this is so.

 

To analyse that, how about trying to analyse how you actually measure those things? You may then find that those measurements are not independent.

 

/Fredrik

 

In any case, I don't see how we can justify saying that some things are inherently unknowable or random, rather than just appearing to be unknowable or random. It seems like an extra but unprovable addition.

 

In a restricted sense I can agree with you here.

 

The fact that I don't know something, AND that all my current experience tells me it seems impossible to know, doesn't necessarily mean I can't learn and come to know it. But in such a hypothetical scenario the original conditions are not the same, so it's not possible to make a foolproof comparison.

 

IMO, the whole point of the subjective Bayesian interpretation of probability is that, by construction, we are trying to find the plausibility or "probability" of different possibilities that we, by the given conditions, can't discriminate between. Our current and incomplete knowledge can only induce a probability for the various options, which is conditional on what we know, including our lack of discrimination - not on what we could know or will come to know.

 

This is exactly why I think it makes no sense at all to imagine a probability conditional on hypothetical information, such as an infinite series of measurements. It disrespects the measurement ideals IMO, which I consider to be subjective, because two observers may not interface identically with the same measurement device. It's an unrealistic idealization IMO, which happens to make perfect practical sense in many experiments, which is why I think it has survived and the objections to it are often dismissed as irrelevant philosophy.

 

Still, I think the inherently unknown things refer to the "expectedly inherently unknown". Which means that unless new evidence appears, they seem to be inherently unknown.

 

But I suspect it also depends on how you construct the formalism - what you start with and what you try to induce. If you start by postulating the commutator of x and p, or "define" p to obey this, then the inherent uncertainty follows by definition.
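For reference, that last point is the Robertson form of the uncertainty relation (standard textbook result), in which the bound really does follow from the postulated commutator:

\sigma_A \, \sigma_B \ge \tfrac{1}{2}\left|\langle[\hat{A},\hat{B}]\rangle\right|, \qquad [\hat{x},\hat{p}] = i\hbar \;\Rightarrow\; \sigma_x \, \sigma_p \ge \frac{\hbar}{2}.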

 

/Fredrik

 

This relates to my ramblings above: a subjective conditional probability of 0 doesn't mean it won't happen. The point is that we never know what WILL happen until it has happened. All we've got are expectations.

 

This can also intuitively explain how symmetries can be created and broken. A symmetry is an expected pattern, or rule, as induced by the incompleteness of a subsystem, and as long as the system is decoupled from the environment, or in agreement with it, this symmetry will be stable and remain unbroken.

 

If you seek intuitive models, think of economic models and game theory. Consider the quoted value of a company on the stock market: one can say that it's not REAL. It's simply the collective expectation! If people THINK it's good, then the price rises and it becomes good. That's apparently how the world works.

 

I think insight into this will help in physics as well. Except, of course, that in economics there is still an underlying determinism we can imagine, but that is really beside the point IMO.

 

/Fredrik


I shall attempt to voice an idea: consider a wave packet of strongly localized extent, but still, say, a Gaussian envelope. To understand its momentum you need to sample it at different times to know its net movement. The combination of these pieces of information can tell you the "ebb and flow" of the disturbance. On the other hand, to describe position you need a "snapshot", which tells you nothing of motion. I'm sort of asking: what are the rules for pin the tail on the wavepacket?

 

I'm roughly with you here. That the change - or, as I like to think of it, the uncertainty - in one observable can be taken to define the emergence of a new observable. This will also be emergence on demand, because it emerges only when the uncertainty calls for it.

 

But I also want to understand how observables emerge. This is why I think in terms of uncertainty before I think of time. Because even without clocks, there may be a certain amount of uncertainty and we might simply have lost track of the timeline in the fluctuations. But observing the uncertainty closely might give rise to a distinguishable clock. But I haven't found out exactly how to do it yet.

 

To me, the information as usually treated is just something human; I am looking for the one-to-one map between information and physical reality, and thus to make some sense of the usual view that "the wavefunction is not real". As the formalism stands, and as we understand it, this view is correct, but it still doesn't smell right. If the wavefunction is redundant, then I suggest we remove that redundancy from the formalism. I've got a strong feeling that the continuum background structures of space contain A LOT of redundancy. And although this redundancy may be mathematical and not physical, it suggests that the model isn't minimal.

 

/Fredrik


I'll keep assuming we are talking about the same thing... IMO this is related to the information capacity bounds of interacting systems. This bounds the resolvable inhomogeneity, and it may seem that "continuous options" in between the observations are a natural a priori assumption or "minimal extension", but I disagree with this too. Either you decouple the model from reality, or you violate the information capacity bound by inflating it with a continuum.

 

/Fredrik


Mr. Skeptic -

 

recall that \Delta x \, \Delta p \ge \hbar/2 - the spread (standard deviation) of position times the spread of the conjugate momentum component is bounded below. This means that, mathematically, you can't know both with arbitrary precision simultaneously, yet they clearly are there. Further, you CAN know some of the components, just not all of them at once. It's DEFINITELY not "inherently" uncertain, as you say. They're there, but QM doesn't allow them to be observed due to commutation rules...

 

I seem to recall that Bell actually relied on "hidden variables" to justify his inequality. Yet, I sincerely believe he did not fully grasp the beautiful foundation of GR, such that a simple, non-local operator solves his so called "hidden variable" conundrum - particularly w/ his exp't which used photons - gauge bosons where the "|0> spin state" - which doesn't necessarily exist w/ a photon, could, in fact, be mathematically represented as the non-local transition b/t the |1> and -|1> states.

 

great question -

 

I offer an answer by analogy...in NMR, you place a compound in a large B field - you've effectively placed a Hamiltonian operator on your system (the Zeeman effect), since you've killed the degeneracies of your system and you're ready to start doing your exp'ts, since roughly 1/2 of all the spins in your system are now precessing around B(z). The fact that only roughly 1/2 of your spins are pointing up (slightly less than 1/2 are pointing down) is directly due to kT, and that is the slight inhomogeneity. Ultimately, we NEED that slight inhomogeneity to get our data...
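In case the numbers help, the "slightly less than 1/2 pointing down" is just the Boltzmann factor for the Zeeman splitting (standard expressions, for a spin-1/2 nucleus with positive gyromagnetic ratio \gamma):

\frac{N_\downarrow}{N_\uparrow} = e^{-\Delta E / kT}, \qquad \Delta E = \gamma \hbar B_0 = \hbar\omega_0,

which at room temperature and ordinary lab fields is extremely close to 1, so the usable polarization is tiny.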

 

Here's a 2nd example where inhomogeneity is BAD. - Same as above, except our material now has Iodine in it - a nucleus whose angular momentum is so strong that it fights w/ the Bfield of our magnet and DOESN'T point toward B(z) like all the other spins in your material - now, we have 2 B fields, the one created by the magnet AND the one created by the local iodine - now, we have 2 Hamiltonians in our system and each is going to have its own solutions to Schroedinger - this gives weird off diagonal terms in our matrices, since the spins now feel 2 B fields - Schroedinger breaks down - ugh!!!! This led to the field of NQR (nuclear quadrupole resonance) where you use the naturally occurring iodine (or other strong quadrupole) as its own B-field, then use that w/o a magnet and pulse (apply operators to get solutions) accordingly...


Hi solidspin - Welcome to the boards. It's nice to have new members sharing their thoughts. I am not on the staff here, but I'd like to make a request.

 

When you respond to posts, please press the "Quote" button on the bottom right of the post to which you're responding. That way, we can see the original post to which you're responding, as well as your reaction to it.

 

When you just press "Reply" or do a quick reply, we have to search the thread to decipher which post it is to which you're responding.

 

Thanks, and happy posting. :)


Personally, I can't see how you can apply Occam's razor to the disparity between QM and relativity...unless you want to use it to reduce some of the more radical interpretations of QM, but even then it doesn't work.

 

It reminds me of the problem Faraday faced, i.e. do bodies cause action at a distance? The idea of fields was introduced to address that; perhaps reconciling QM and relativity requires a similarly new, perhaps radical, way of viewing the universe.

 

Perhaps there's a parameter that becomes more random at smaller scales, and causal at larger scales, but I haven't come across anything like that in any of the math I've studied. It is, however, probably a good question to frame.

 

Even if gravity can be described at the quantum level, what would that say about the effect of gravity at larger scales ?


Personally, I can't see how you can apply Occam's razor to the disparity between QM and relativity...unless you want to use it to reduce some of the more radical interpretations of QM, but even then it doesn't work.

 

The association with Occam's razor that I make in this context is related to the information capacity bounds and the adaptivity.

 

If we believe an observer is only able to hold a limited amount of information, then anything that doesn't "fit" (can't be projected) within those bounds is forbidden, simply because it's *too complex* - and complex here means requiring too "massive" a representation structure - "Occam's razor".
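One standard way to make that reading of Occam's razor precise - my association, not necessarily what was intended above - is the minimum description length idea: prefer the hypothesis H that minimizes

L(H) + L(D \mid H),

the code length of the hypothesis plus the code length of the data given the hypothesis, so an over-complex ("too massive") representation loses even if it fits the data.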

 

The other idea is that I argue that evolution will favour compact representations.

 

More or less, I associate relative complexity with relative masses.

 

Perhaps there's a parameter that becomes more random at smaller scales, and causal at larger scales,

 

Yes, I think something like that. And the beautiful part is that once something becomes random, it does not contain very much information - and can be discarded/released without information loss! :) This is related, IMHO and personally speaking, to the _complexity scaling_ and _mass scales_.

 

but I haven't come across anything like that in any of the math I've studied. It is, however, probably a good question to frame.

 

It seems that many physicists look for mathematical models that can solve physics problems. I think that sometimes this is limiting.

 

I don't look at my tools and wonder what I can do with them. I try to decide, in terms of some internal representation, what I want to do, and then look for the tools. If the tools aren't around, then we'll have to invent them, as mankind always has in order to survive.

 

I do not rule out that we may need a completely new formalism. But if that is what it takes, then that is what we must do.

 

So the way I imagine this, the principle of occam's razor is given a more specific meaning that is built into the model.

 

/Fredrik

