Can quantum behavior really be explained through an underlying principle?

A paper I wrote, "Probabilistic Computational ToE" (attached), presents a framework that re-conceptualizes reality not as a set of static laws but as an emergent property of a dynamic, computational substrate. The shift from a deterministic computational model to a probabilistic, zero-sum system is a significant step forward, directly addressing the accumulation-error problem that has plagued similar theories.

The paper's core strength lies in its hierarchical approach to stability. The concept of Tiered Resilience—with particles, error correction structures, and meta-error correction—provides a plausible mechanism for how order can not only emerge but persist over vast cosmological timescales. The analogy of black holes as "bias waves" that reset local physics is a particularly novel and thought-provoking idea, offering a concrete model for universal cyclicity and the potential for a multiverse.

The model successfully addresses quantum-like behaviors such as wave-packet motion and bosonic cohabitation. But the question remains: can quantum behavior really be explained through an underlying principle?

There is a Python simulation of the model, which is attached.

Preview-Revised.pdf

GPUQuantized4D-RT-Org-Enh.py

Edited by waitaminute
Attached the Python simulation.

1 minute ago, waitaminute said:

Also, note that this post is not an advertisement for any product or service for sale.

For someone who purports to have developed a ToE, you sure have lousy reading comprehension. I even bolded the relevant parts of the rule you were violating, and “advertising” wasn’t it.

Post the material here, or don’t post about it. Those are your only two options.

I will critique later,
But you have put some hard work into it.

It's a speculative meta-theory — something that hopes to explain physics, but hasn’t yet reproduced it.

To support a speculation you need evidence and proper mathematics.

  • Author
34 minutes ago, Dhillon1724X said:

I will critique later,
But you have put some hard work into it.

It's a speculative meta-theory — something that hopes to explain physics, but hasn’t yet reproduced it.

To support a speculation you need evidence and proper mathematics.

The model does have mathematics to support its arguments. The ability to recreate particles from sets of bias fields, as the model suggests, takes time, since such organizational structures are products of random sets of states, or bias wave fronts, that happen to form. Such chaotic systems can self-organize through feedback loops or brute-force iteration. This is similar in concept to how life could form from the random interactions of atoms and molecules. The model also emphasizes that the laws of physics emerge through a natural selection process that favors organizational structures that reproduce through evolved error correction (EC) systems; yes, similar to cellular life!

Effectively, reproduction is a form of meta-error correction, but particles exhibit extraordinary longevities, so the model indicates that EC must take up the bulk of the finite computational resources that represent physical laws at the atomic level. The model posits that the hypercube units are very small, assuming a Planck length for their edges, so the computational horsepower available to each particle is astronomical, something like 10^20 units! The paper indicates that most of those computational resources are committed to EC and shows why that must be so.

By comparison, consider cellular life: creating an equivalent EC system to preserve a single cell of roughly 10^14 atoms runs into the fact that cells must allocate resources toward growth and reproduction, which creates a trade-off between growth and repair given their limited resources. For a cell to achieve a longevity of billions of years, its mass would have to be roughly a million times greater in order to host the redundant EC infrastructure! The notion is that particle reproduction transitioned into particle assimilation, where annihilation is actually a form of assimilation into a different form of particle(s).
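
To make the resource comparison concrete, here is a minimal back-of-envelope sketch in Python using only the figures quoted above (about 10^20 substrate units per particle and about 10^14 atoms per cell). Reading the "million times greater" remark as the ratio of those two figures is my interpretation for illustration, not a derivation from the paper.

    # Back-of-envelope comparison using the figures quoted in this post only.
    units_per_particle = 1e20   # hypercube units representing one particle, mostly devoted to EC
    atoms_per_cell = 1e14       # order-of-magnitude atom count of a single biological cell

    # One reading of the comparison: if a cell needed particle-grade EC redundancy,
    # its resource budget would have to scale by roughly this ratio.
    ec_overhead_ratio = units_per_particle / atoms_per_cell
    print(f"EC overhead ratio: {ec_overhead_ratio:.0e}")  # ~1e+06, i.e. 'a million times greater'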

So, when you argue that the model hasn't produced physics, remember that models of biological life haven't produced anything even close to DNA either.

Edited by waitaminute
Clarified EC for cells.

1 hour ago, waitaminute said:

I've revised the paper to elaborate on section 10.3, Creation Scenario and Multiverse Dynamics.

And you still ignore Swansont's suggestion.
Another misunderstanding ??

It's getting to the point that whenever ( much more often lately ) someone presents a groundbreaking fundamental new 'theory', and they refer to it as a 'framework', I immediately think it was a crackpot idea, developed with considerable ( but incorrect ) help from an AI, and I refuse to read it, much less follow any provided links.

Maybe I'm getting jaded, but I can't get excited or interested in the many new fundamental ideas presented in the last several months.

  • Author
1 hour ago, MigL said:

And you still ignore Swansont's suggestion.
Another misunderstanding ??

It's getting to the point that whenever ( much more often lately ) someone presents a groundbreaking fundamental new 'theory', and they refer to it as a 'framework', I immediately think it was a crackpot idea, developed with considerable ( but incorrect ) help from an AI, and I refuse to read it, much less follow any provided links.

Maybe I'm getting jaded, but I can't get excited or interested in the many new fundamental ideas presented in the last several months.

No, the section was related to the previous post I made, and I uploaded the revised paper; no links are in that post. So, perhaps you have a misunderstanding?...

  • Author

I'm going to summarize the paper, since the original post was a bit oversimplified. It is, though, a topic of ongoing theoretical research: developing a model in which QM emerges from underlying principles. This model is one such approach, and it emphasizes the idea that simple rules can give rise to complex systems by leveraging highly redundant units:

The paper hinges on a new definition of computation: it defines computation not as symbolic logic, but as the "integration of information that results in a state". This broad definition allows the "cause-and-effect systems" of any reality to be considered computational.

The substrate as a network: The hypercube units are not isolated; they form a "4D hypercubic lattice" in which each cell's state is updated via "weighted neighbor interaction". This creates a network where each cell's information influences its neighbors.
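
For anyone who would rather not open the attached GPUQuantized4D-RT-Org-Enh.py, here is a minimal NumPy sketch of that lattice idea as I read it: a small 4D array of cell states where each update mixes every cell with its eight axis-aligned neighbors under randomized weights. The lattice size, weight distribution, and mixing rule below are placeholder choices for illustration, not the paper's parameters.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 8                                  # tiny 4D lattice, 8 cells per axis (placeholder size)
    state = rng.random((N, N, N, N))       # one scalar state per hypercube cell

    # Randomized weights for the 8 nearest neighbors along the 4 axes (placeholder scheme).
    weights = rng.random(8)
    weights /= weights.sum()

    def weighted_neighbor_update(s):
        """One 'weighted neighbor interaction' pass: each cell becomes a weighted
        average of its 8 axis-aligned neighbors, with periodic boundaries."""
        neighbors = []
        for axis in range(4):
            neighbors.append(np.roll(s, +1, axis=axis))
            neighbors.append(np.roll(s, -1, axis=axis))
        return sum(w * n for w, n in zip(weights, neighbors))

    state = weighted_neighbor_update(state)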

Emergence from chaos: The model posits a "chaotic ground state" or "net-zero probabilistic chaos" as the default condition of the substrate. From this chaos, emergent structures and behaviors, like "wave-like quantum effects," arise not from symbolic computations, but from the ability to "bias areas". This is analogous to how a chaotic system can, under certain conditions, self-organize into stable patterns.
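
A "net-zero" chaotic ground state with a locally biased area can be sketched in a few lines. For brevity this uses a 2D slice rather than the full 4D lattice, and the zero-sum construction (subtracting the mean) and the Gaussian bias bump are my own illustrative choices, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 32
    noise = rng.standard_normal((N, N))
    noise -= noise.mean()                  # enforce the net-zero (zero-sum) condition

    # A localized bias "area", paid for by a slight deficit everywhere else so the total stays zero.
    x, y = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    bias = np.exp(-((x - N // 2) ** 2 + (y - N // 2) ** 2) / 20.0)
    bias -= bias.mean()

    field = noise + bias
    print(f"field total (should be ~0): {field.sum():.2e}")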

The simulation and its description in Appendix A further support this by showing how a "double-buffered update logic and randomized neighbor weighting" can lead to emergent "coherence zones" and "collapse timing". This directly connects the model's core principles—probabilistic integration and local interaction—to the observable behaviors it predicts.
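
"Double-buffered update logic" is a standard simulation pattern, so here is a minimal sketch of it, assuming nothing about the attached script beyond the two phrases quoted above: every cell is read from the current buffer and written into a second buffer, then the buffers are swapped, so all cells update from the same snapshot while the neighbor weights are re-randomized each step.

    import numpy as np

    rng = np.random.default_rng(2)
    N = 8
    current = rng.random((N, N, N, N))     # buffer being read this step
    nxt = np.empty_like(current)           # buffer being written this step

    def step(src, dst, rng):
        """Double-buffered step: read only from src, write only into dst."""
        w = rng.random(8)
        w /= w.sum()                       # randomized neighbor weighting, renormalized
        dst[...] = 0.0
        for axis in range(4):
            dst += w[2 * axis] * np.roll(src, +1, axis=axis)
            dst += w[2 * axis + 1] * np.roll(src, -1, axis=axis)

    for _ in range(10):
        step(current, nxt, rng)
        current, nxt = nxt, current        # swap buffers: the freshly written state becomes current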

The point isn't about immediate verifiability. The model's strength lies in its ability to offer a coherent, causal, and computational explanation for phenomena that are typically treated as fundamental axioms of physics. It proposes that the complex, seemingly non-intuitive behaviors of quantum mechanics, such as superposition, entanglement, and collapse, are simply the macroscopic manifestation of a vast, underlying computational process of localized error correction and probabilistic state shifts.

The Model's Core Philosophy

The model reframes our understanding of reality, suggesting that the stability of physical laws and particles is not a given but a "dynamic computational achievement". The existence of our universe for billions of years is attributed to the immense computational resources of the substrate being primarily dedicated to "a robust and redundant system of error correction". This perspective is a bold departure from traditional physics, where such stability is often assumed to be a fundamental property.

Demystifying Quantum Phenomena

The paper's approach attempts to make several key quantum concepts understandable through a computational lens:

  • Wave Collapse: Instead of being an instantaneous, mysterious event, the model reinterprets wave collapse as a "computational collapse". This is an emergent process where a measuring apparatus introduces a bias, causing the system's error correction (EC) mechanisms to rapidly stabilize a single configuration out of many possibilities. It is a "probabilistic convergence" rather than a non-local, acausal event (a toy numerical sketch of this convergence idea follows this list).

  • Superposition: In the model, superposition isn't a state of being in multiple places at once, but a "spatially distributed, probabilistically activated pattern" across the substrate. These patterns are regions of elevated probability amplitude that evolve under neighbor interaction and noise.

  • Entanglement: The paper maps entanglement to "mutual EC stabilization across non-local regions". This suggests that what we observe as entanglement could be the result of a coordinated, self-correcting process within the substrate that preserves coherence across a distance.
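
To illustrate the "probabilistic convergence" reading of collapse from the first bullet, here is a toy Python sketch: a probability vector over a handful of candidate configurations is repeatedly sharpened by a self-reinforcing update, and a small measurement bias decides which configuration wins. The specific update rule is my own illustration of the general idea, not the paper's EC mechanism.

    import numpy as np

    rng = np.random.default_rng(3)
    p = np.full(5, 0.2)                    # five candidate configurations, initially equal weight

    bias = np.zeros(5)
    bias[2] = 0.05                         # a measuring apparatus nudges configuration 2

    for _ in range(30):
        p = p * (1.0 + bias) + 0.001 * rng.random(5)   # biased reinforcement plus a little noise
        p = p ** 1.5                       # self-reinforcing sharpening ("rich get richer")
        p /= p.sum()                       # keep it a probability distribution

    print(np.argmax(p), p.round(3))        # nearly all weight ends up on the biased configuration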

By providing a plausible, causal mechanism for these phenomena, the paper moves them from the realm of "impossible to comprehend" to "a complex computational process to be understood." It shifts the focus from abstract mathematical principles to a physical, albeit currently unobservable, substrate with defined rules of interaction and correction. This aligns with an analogy to language processing, where a once-enigmatic cognitive process has been increasingly understood through computational models.

Attached is a revised version of the paper that will reinforce what's been posted here.

Preview-Revision-11.pdf

Edited by waitaminute
Update introduction
