
Looking at the Spacetime Uncertainty Relation as an Approach to Unify Gravity


Dubbelosix

Recommended Posts

Fair enough, it's been a while since I looked into these, so I will have to do some studying and get more familiar with them.

Looks like you're going to eventually need each type lol.

They each have slightly different purposes.

Schmidt defines the entropy, von Neumann tests how strongly correlated the entropy is, and the Shannon entropy applies orthogonality to gain compatibility for qubits.

Here, this gives us the essential distinctions.

https://www.google.ca/url?sa=t&source=web&rct=j&url=http://www.mpmueller.net/seminar/talk2.pdf&ved=0ahUKEwjZ9Nbgt9LWAhUGwGMKHQqEBOs4FBAWCCYwAw&usg=AOvVaw1fFu0JDQWIIg-HTfRdb5JO

 

On digging deeper, it looks like von Neumann applies the Stirling approximation, as per the Einstein solid and Boltzmann. Makes sense; still confirming.

I can definitely see how this will be useful to you in your modelling goals, as the above is applicable.

Edited by Mordred

Hello.

Just a small comment about the title of your thread:

It seems to me that trying to combine determinism with indeterminism is a dead end, even if both are exact in their own domain.

Starting from the Planck force (entirely deterministic) could, in my humble opinion, be more productive; for gravity, you could try to adapt my speculation on the vacuum catastrophe: http://www.scienceforums.net/topic/109635-the-end-of-the-quantum-vacuum-catastrophe/

in particular starting from the Planck force: https://en.wikipedia.org/wiki/Planck_force#Planck_force_as_a_tension_constant_of_the_space_time_fabric

Edited by stephaneww

On 02/10/2017 at 10:56 PM, stephaneww said:

...you could try to adapt my speculation on the vacuum catastrophe...

Uh, after some research, I do not see how it could be relevant to the subject of this topic...

Edited by stephaneww

On 02/10/2017 at 9:56 PM, stephaneww said:

Hello.

Just a small comment about the title of your thread :

It seems to me that trying to combine determinism with indeterminism is a dead end, even if both are exact in their own domain.

Starting from the Planck force (entirely deterministic) could, in my humble opinion, be more productive; for gravity, you could try to adapt my speculation on the vacuum catastrophe: http://www.scienceforums.net/topic/109635-the-end-of-the-quantum-vacuum-catastrophe/

in particular starting from the Planck force: https://en.wikipedia.org/wiki/Planck_force#Planck_force_as_a_tension_constant_of_the_space_time_fabric

It is true that people have tied uncertainty or indeterminism to a matter of intrinsic randomness - I do not share this view. I am an Einsteinian determinist; I believe the universe follows (completely) deterministic laws. I do not believe that the uncertainty principle is a measure of our inability to know the underlying structure that creates reality because of randomness, but rather a restriction on how much information we can obtain from a system.

I meant intrinsic randomness; I have corrected that.

Edited by Dubbelosix

39 minutes ago, Dubbelosix said:

It is true that people have tied uncertainty or indeterminism to a matter of intrinsic randomness - I do not share this view. I am an Einsteinian determinist; I believe the universe follows (completely) deterministic laws. I do not believe that the uncertainty principle is a measure of our inability to know the underlying structure that creates reality because of randomness, but rather a restriction on how much information we can obtain from a system.

I meant intrinsic randomness; I have corrected that.

This working document (long and too complex for me) could interest you:

https://hal.archives-ouvertes.fr/hal-01223516

The English version is quantiqueA.pdf under "fichiers annexes" (attached files).

 

Edited by stephaneww

With the help of Mordred's past posts, I believe I can sum this up as follows: von Neumann introduced

[math]S = -Tr(\rho \log_b \rho)[/math]

where [math]b[/math] is the base of the logarithm. The Shannon entropy has a relationship to this,

[math]H = -\sum_i (p_i \log_b p_i)[/math]

which works when we consider a mixture of orthogonal states. In this case, the density matrix does contain classical probabilities on the diagonal. Quantum mechanical density matrices in general, though, have off-diagonal terms, which, for pure states, reflect the quantum phases in superpositions. So... we know where we want to go with this.
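As a quick numerical sanity check of that point (just a sketch in Python/NumPy, with an illustrative state of my own choosing, not anything derived in the thread): for a pure superposition the off-diagonal terms make the von Neumann entropy vanish, while the Shannon entropy of the diagonal alone does not.

```python
import numpy as np

def von_neumann_entropy(rho, base=2):
    """S = -Tr(rho log_b rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]                # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)) / np.log(base))

def shannon_entropy(p, base=2):
    """H = -sum_i p_i log_b p_i for a classical distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)) / np.log(base))

# Pure superposition |+> = (|0> + |1>)/sqrt(2): off-diagonal terms are present.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(plus, plus)

# Fully dephased version: only the classical diagonal probabilities survive.
rho_mixed = np.diag(np.diag(rho_pure))

print(von_neumann_entropy(rho_pure))        # ~0.0 : pure state, zero entropy
print(shannon_entropy(np.diag(rho_pure)))   # 1.0  : the diagonal alone looks maximally random
print(von_neumann_entropy(rho_mixed))       # 1.0  : S and H agree once off-diagonals are gone
```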

A useful equation I came across in information theory was the Shannon entropy in terms of correlation:

[math]H(A) = H(A|B) + H(A:B)[/math]

How do you read this strange equation? Well, [math]H(A|B)[/math] is the entropy of [math]A[/math] after having measured the systems that become correlated in [math]B[/math], and [math]H(A:B)[/math] is the information gained about [math]A[/math] through measuring the system [math]B[/math]. (Apparently), as is well known, these two quantities complement each other such that [math]H(A)[/math] remains unchanged, so as to satisfy the conservation required by the second law.
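A small numerical check of that reading (a sketch only; the joint distribution below is an arbitrary choice of mine, and I write the mutual information H(A:B) in the usual way as H(A) + H(B) - H(A,B)):

```python
import numpy as np

def H(p):
    """Shannon entropy (base 2) of a distribution, ignoring zero entries."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# An arbitrary joint distribution p(a, b) for two two-valued systems.
p_ab = np.array([[0.30, 0.10],
                 [0.15, 0.45]])

p_a = p_ab.sum(axis=1)                       # marginal of A
p_b = p_ab.sum(axis=0)                       # marginal of B

H_a_given_b = H(p_ab) - H(p_b)               # H(A|B) = H(A,B) - H(B)
I_ab = H(p_a) + H(p_b) - H(p_ab)             # H(A:B), the mutual information

print(np.isclose(H(p_a), H_a_given_b + I_ab))   # True: H(A) = H(A|B) + H(A:B)
```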

While this was interesting, I realised that the context in which I wanted to construct this theory relies on an interpretation of information theory that can contain an uncertainty principle - since my investigations have been primarily about understanding a possible non-trivial spacetime relationship, this shouldn't be too surprising.

A paper by I. Białynicki-Birula and J. Mycielski, ''Uncertainty relations for information entropy in wave mechanics'' (Comm. Math. Phys., 1975), contains a derivation of an uncertainty principle based on an information entropy, and it features here as

[math]- \int |\psi(q)|^2 \ln[|\psi(q)|^2]\ dq - \int |\hat{\psi}(p)|^2 \ln[|\hat{\psi}(p)|^2]\ dp \geq 1 + \ln \pi[/math]

Where 

[math]|\psi(q)|^2 = \int W(q,p)\ dp[/math]

using generalized coordinates, just in case anyone wonders what [math]p[/math] and [math]q[/math] are. And,

[math]|\hat{\psi}(p)|^2 = \int W(q,p)\ dq[/math]

I have a bit of interest in this method; I generally get a feeling when I know I can do something with something else, like the above. It seems to have all the physics I need to try and piece this puzzle together.
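As an aside, the inequality is easy to check numerically for a Gaussian wave packet, which is the case that saturates it. The sketch below (Python/NumPy, my own grid and width parameters, hbar = 1) uses the known fact that the Fourier transform of a Gaussian of width sigma is a Gaussian of width 1/(2 sigma):

```python
import numpy as np

def differential_entropy(pdf, x):
    """-∫ f ln f dx, approximated by a Riemann sum on the grid x."""
    dx = x[1] - x[0]
    f = np.where(pdf > 1e-300, pdf, 1.0)     # avoid log(0); those terms contribute ~0
    return float(-np.sum(pdf * np.log(f)) * dx)

sigma = 0.7                                  # position-space width (arbitrary choice)
x = np.linspace(-30.0, 30.0, 600001)

# |psi(q)|^2 is N(0, sigma^2); its Fourier transform gives |psi_hat(p)|^2 = N(0, tau^2)
# with tau = 1/(2 sigma) when hbar = 1.
tau = 1.0 / (2.0 * sigma)
rho_q = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
rho_p = np.exp(-x**2 / (2 * tau**2)) / np.sqrt(2 * np.pi * tau**2)

S_q = differential_entropy(rho_q, x)
S_p = differential_entropy(rho_p, x)

print(S_q + S_p, 1.0 + np.log(np.pi))        # both ~2.1447: the Gaussian saturates the bound
```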

I do seem to be getting the impression that if the system is in equilibrium, it calculates as

[math]S = - \sum_i \frac{1}{N} \ln(\frac{1}{N})[/math]

[math]=-N \frac{1}{N} \ln(\frac{1}{N})[/math]

[math]= \ln(N)[/math]

Which just looks like the Boltzmann entropy. While this last bit wasn't pertinent to the work, it is interesting for educational purposes. 


ref:

http://www.cft.edu.pl/~birula/publ/Uncertainty.pdf

 

The paper by I. Białynicki-Birula and J. Mycielski shows that there are ways to describe my model with gravitational logarithmic nonlinear wave equations.


In particular, I want to see if these two equations can be merged into some unified definition, by making the physics of each make sense in the context of the other within my investigation.

[math]- \int |\psi(q)|^2 \ln[|\psi(q)|^2]\ dq - \int |\hat{\psi}(p)|^2 \ln[|\hat{\psi}(p)|^2]\ dp \geq 1 + \ln \pi[/math]

This is the equation we just featured, and an equation I derived earlier was:

[math]\Delta E = \frac{c^4}{8 \pi G} \int <\Delta R_{ij}>\ dV = \frac{c^4}{8 \pi G} \int <\psi|(R_{ij} - <\psi|R_{ij}|\psi>)|\psi>\ dV[/math]

[math]=\frac{c^4}{8 \pi G} \int (<\psi|R_{ij}\psi> - <\psi|R_{ij}|\psi>)\ dV[/math]

This was the difference in quantum geometries which, as was already established, is related to the uncertainty principle in the antisymmetric indices of the curvature tensor [math]R_{ij}[/math]. We never proved, by any means, that this is how an uncertainty principle should be interpreted with gravity - that is a hard thing to do without a full understanding of gravity as it is. There are disagreements on how to approach a quantum theory, right down to vital questions about whether gravity is even the same as the other fundamental forces in nature. If it lacks a graviton, then you can count on large portions of gauge theory becoming questionable.

 

Edited by Dubbelosix

Thanks, feel free to add anything if you have a brainstorm moment; I keep getting physics block lol.


There is such a thing as a conditional von Neumann entropy, where we consider the bipartite quantum system [math](A,B)[/math]. A quantum generalisation of this two-particle system from information theory gives

[math]S(A|B) = -Tr_{AB}[\rho_{AB}\log_b \rho_{A|B}][/math]

This is known as the von Neumann 'conditional entropy.' [math]\rho_{A|B}[/math] is a positive semi-definite Hermitian matrix in the joint Hilbert space.

A joint state still transforms as

[math]\rho_{AB} \rightarrow \sum_i (A_i \otimes B_i) \rho_{AB} (A^{*}_{i} \otimes B^{*}_{i})[/math]

and, again, the trace is preserved through

[math]\sum_i A^{*}_{i}A_i \otimes B^{*}_{i}B_i = \mathbf{I}[/math]
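A minimal numerical illustration of this kind of local (Kraus) operation and its trace preservation (a sketch only, assuming NumPy; the particular channels, amplitude damping on A and dephasing on B, are standard textbook examples and my own choice, not anything specific to the model):

```python
import numpy as np

# Local Kraus sets: amplitude damping on A, dephasing on B.
g, lam = 0.3, 0.2
A_ops = [np.array([[1.0, 0.0], [0.0, np.sqrt(1 - g)]]),
         np.array([[0.0, np.sqrt(g)], [0.0, 0.0]])]
B_ops = [np.sqrt(1 - lam) * np.eye(2),
         np.sqrt(lam) * np.diag([1.0, -1.0])]

# Completeness: (sum_i A_i* A_i) ⊗ (sum_j B_j* B_j) = I ⊗ I
comp = np.kron(sum(A.conj().T @ A for A in A_ops),
               sum(B.conj().T @ B for B in B_ops))
print(np.allclose(comp, np.eye(4)))                 # True

# A joint (entangled) state |Φ+><Φ+| as rho_AB.
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(phi, phi)

# rho_AB -> sum_{i,j} (A_i ⊗ B_j) rho_AB (A_i ⊗ B_j)*
rho_out = sum(np.kron(A, B) @ rho_ab @ np.kron(A, B).conj().T
              for A in A_ops for B in B_ops)
print(np.isclose(np.trace(rho_out), 1.0))           # trace preserved
```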

The entanglement is given as

[math]E(\rho) = min \sum_i p_i S(\rho^i_A)[/math]

with [math]S(\rho^i_A)[/math] the entropy of the reduced state (the von Neumann entropy, which reduces to a Shannon entropy over the Schmidt coefficients). The Shannon entropy, therefore, must also be understood in the bipartite ensemble to ensure that we restore a quantum definition. This prompted me to investigate whether a quantum Shannon entropy exists in the literature, and I have provided a reference to historical attempts to find a quantum information theory in terms of the Shannon entropy. It is somewhat related to the notation we have been using above, as you will notice when you read through it. I made it clear though, as prompted in hindsight by Mordred, that there would be complications to look out for; one was stated by me in the previous post:


''Quantum mechanical density matrices in general though, have off-diagonal terms, which for pure states, reflect the quantum phases in superpositions.''


The interesting thing is that, in our model, the density matrices governing the curvature tensor rely on off-diagonal components - specifically, all diagonal components are zero. That is just by definition, but an important one to remember for antisymmetric matrices. So is there a hint here that there could be unification between the two? Being a collapse model, we must also remain vigilant over the use of the Hermitian matrix - which would be a phenomenon posterior to the superpositioned phase and the use of those off-diagonal terms in the curvature tensor. It seems almost impossible to do. Remember, the kind of physical picture we have in mind is a superpositioned set of particles which become entangled through the collapse of the systems by a gravitational interaction between the two systems, such as a non-equilibrium arising in the binding energy between the two states.

 

This is by no means an easy task, but with my investigations I think I've narrowed the possibilities down to some good ones - at least in theory. It remains to be seen in practice; I'll try, no promises.
 

 

ref.

 

https://en.wikipedia.org/wiki/Quantum_relative_entropy

The joint state evolution will also follow a unitary operation, and it has an almost identical structure to the joint density state above; that's because each unitary operator acts on its own state, denoted by [math]A[/math] and [math]B[/math].


[math]E(\sigma) = E((U_A \otimes U_B) \sigma (U^{*}_{A} \otimes U^{*}_{B}))[/math]

 

And so, if this is imposed, it removes the question of whether gravity in my theory follows unitarity - it also imposes that the entropy of the system is always invariant. The symbol [math]\sigma[/math] just denotes a pure state.
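This invariance is also easy to verify numerically (a sketch under my own choice of state and local unitaries, assuming NumPy): the entanglement entropy of a Bell state is unchanged by any U_A ⊗ U_B.

```python
import numpy as np

def entanglement_entropy(rho_ab):
    """Von Neumann entropy (bits) of the reduced state rho_A = Tr_B(rho_AB)."""
    rho_a = np.einsum('abcb->ac', rho_ab.reshape(2, 2, 2, 2))   # partial trace over B
    ev = np.linalg.eigvalsh(rho_a)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # |Φ+>, a pure entangled state σ
sigma = np.outer(phi, phi)

# Local unitaries: a Hadamard on A and a phase rotation on B (any choice works).
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
P = np.diag([1.0, np.exp(1j * 0.73)])
U = np.kron(H, P)                                   # U_A ⊗ U_B

sigma_rot = U @ sigma @ U.conj().T                  # (U_A ⊗ U_B) σ (U_A ⊗ U_B)*

print(entanglement_entropy(sigma))                  # 1.0
print(entanglement_entropy(sigma_rot))              # 1.0 : E(σ) unchanged by local unitaries
```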

Edited by Dubbelosix

Somehow I had three typos in the Bures metric quite a few posts back, so I am just giving the careful version now.

 


[math]F(\sigma, \rho) = \min_{A^{*}_{i}A_i} \sum_i \sqrt{Tr(\sigma A^{*}_{i} A_i)}\sqrt{Tr(\rho A^{*}_{i}A_{i})}[/math]

As shown by the entropy Venn diagrams, it is possible to see a distinct difference between the classical and quantum entropies by noting that the inequality bounding the mutual entropy is weaker in the quantum case.

In classical theory the inequality is

[math]S(A:B) \leq min[S(A), S(B)][/math]

In quantum theory it is

[math] S(A:B) \leq 2 min[S(A), S(B)][/math]

You may interpret this form as meaning that the mutual entropy can reach twice the classical upper bound. The factor of 2 can also be thought of in terms of qubits. This is the same kind of situation as in the von Neumann entropy, where - as in information entropy - the logarithm is almost always calculated in base 2.
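A concrete numerical check of that factor of 2 (a sketch with NumPy; the Bell state and its dephased counterpart are my own illustrative choices): for a maximally entangled pair the mutual entropy S(A:B) reaches 2 bits, twice min[S(A), S(B)], while the fully dephased (classical) version only reaches 1 bit.

```python
import numpy as np

def S(rho):
    """Von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def partial_trace(rho_ab, keep):
    r = rho_ab.reshape(2, 2, 2, 2)                 # indices (a, b, a', b')
    return np.einsum('abcb->ac', r) if keep == 'A' else np.einsum('abac->bc', r)

phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # maximally entangled pair
rho_q = np.outer(phi, phi)
rho_cl = np.diag(np.diag(rho_q))                   # fully dephased, classically correlated

for rho in (rho_q, rho_cl):
    rho_a, rho_b = partial_trace(rho, 'A'), partial_trace(rho, 'B')
    mutual = S(rho_a) + S(rho_b) - S(rho)          # S(A:B)
    print(mutual, 2 * min(S(rho_a), S(rho_b)))     # quantum: 2.0 vs 2.0 ; classical: 1.0 vs 2.0
```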

You can also write this information theory in terms of energy (which may have implications for the difference of energies related to superpositioned geometry).

[math]E = k_BT \ln(2) N_h[/math]

(In conventional notation.) The ln(2) is the conversion factor from the base-2 logarithm of the Shannon entropy to the natural base e. [math]N_h[/math] is the amount of information, in bits, needed to describe the given system. Again, because we hypothesize that the gravitational superpositioning is not in equilibrium, the entropy will not be zero. I've imagined that, even in the absence of a second particle, a single particle state could collapse if it has a centre of mass that fluctuates around the absolute square of its wave function (causing an instability - a small but gradual one).
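For a sense of scale (a back-of-the-envelope sketch only, with an illustrative bit count of my own, reading the formula in the Landauer sense of k_B T ln 2 per bit):

```python
import numpy as np

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # temperature, K
N_h = 1e9                   # information content in bits (illustrative value)

E = k_B * T * np.log(2) * N_h
print(E)                    # ~2.9e-12 J for 10^9 bits at 300 K (~2.9e-21 J per bit)
```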
 

2 hours ago, Dubbelosix said:

 

The joint state evolution will also follow a unitary operation, and it has an almost identical structure to the joint density state above; that's because each unitary operator acts on its own state, denoted by A and B.


[math]E(\sigma) = E((U_A \otimes U_B) \sigma (U^{*}_{A} \otimes U^{*}_{B}))[/math]

 

And so, if this is imposed, it removes the question of whether gravity in my theory follows unitarity - it also imposes that the entropy of the system is always invariant. The symbol σ just denotes a pure state.

 

 

Keep in mind that the pure-state symbol is useful for describing whether the state is separable. That means that, for a mixed bipartite system described by a density matrix acting on [math]S_1 \otimes S_2[/math], [math]\rho[/math] is separable iff [math]\rho \geq 0[/math] (which also preserves the condition that entropy is never negative, in accordance with the thermodynamic laws), where I define [math](S_1,S_2)[/math] as quantum entropy phase spaces.
 

Edited by Dubbelosix

However, it should be noted that Nicolas Cerf and Chris Adami have shown that the conditional entropy can violate the thermodynamic entropy law, allowing the entropy to take a negative value. This used the conditional density operator [math]\rho_{A|B}[/math], which we have featured before. Though forbidden classically, what was shown is that the negative entropy provides a framework for quantum non-separability.


 

 

Ok, you have a huge body of various metrics to put together into one post if possible.

It is getting extremely tricky to correlate the numerous posts, as they are all closely related.

I'm hoping that you're arriving at something similar to

[latex]SU(2)\otimes U(1)[/latex]

A little research will show that numerous treatments commonly arrive at this from SO(1,3), including a huge series of papers often called 2+1 gravity.

Anyway, having it all under one post will make it much easier to identify any missing relations etc.

Edited by Mordred

[math]\sqrt{|<\nabla^2_i><\nabla^2_j>|} \geq \frac{1}{2}<\psi|[\nabla_i, \nabla_j] |\psi>\ d^3x = \frac{1}{2} <\psi| R_{ij}| \psi>\ d^3x[/math]

Let's swap it around

[math]<\psi|[\nabla_i, \nabla_j] |\psi>\ d^3x = <\psi| R_{ij}| \psi>\ d^3x \leq 2 \sqrt{|<\nabla^2_i><\nabla^2_j>|}[/math]

We may multiply through by that factor of 2, and what we end up doing is still viewing the interpretation as a mean deviation, but one that can reach twice the classical upper bound, once again. This obviously has a formal similarity to

[math]S(A:B) \leq 2 min[S(A),S(B)][/math]

We can understand, then, from previous equations that [math]S(A:B)[/math] is also [math]S(A) - S(A|B)[/math]. This is identifiable on page 64 of the following work:

https://arxiv.org/pdf/1106.1445.pdf
 

So tomorrow, and maybe even a few days after, I will be looking more into any possible relationship between the two inequalities, along with investigations into a quantum bipartite Shannon entropy, just as the von Neumann conditional entropy allows.

3 minutes ago, Mordred said:

 

 

Ok, you have a huge body of various metrics to put together into one post if possible.

It is getting extremely tricky to correlate the numerous posts, as they are all closely related.

I'm hoping that you're arriving at something similar to

[latex]SU(2)\otimes U(1)[/latex]

A little research will show that numerous treatments commonly arrive at this from SO(1,3), including a huge series of papers often called 2+1 gravity.

Anyway, having it all under one post will make it much easier to identify any missing relations etc.

I might leave the special unitary grouping to those more familiar with it :)


It is the U(1) gauge that gives the hint lol. Think about the Pati-Salam reference and the [latex]\mathbb{Z}[/latex] you find in LQG, for example. Under the Clifford algebra, what is the purpose of the [latex]\mathbb{Z}[/latex]?

Why does LQG require it?


Oh, I know what it means, but people make big work out of the unitary groups and it's not something I follow deeply.

I don't know where my gravity is heading, only that I want to see the final result and see where it all fits and whether it is truly feasible :)


That's the density function, which usually takes the form

 

[math]\rho = \sum_i n_i |i><i|[/math]

 

I've not really encountered it much within the theory.

Don't get me wrong, that density function is vital for the von Neumann or Shannon entropy.


I'm not worried about how Shannon handles it.

I am interested in how your model handles [latex]u_iv_j[/latex], which is the outer product under the Minkowski metric, with inner products uv = vu.

Edited by Mordred

Fair enough, let me put up the outer product details later on when I have more time. I will try to get the Shannon treatments applied, and their correlations to other metrics. (This will take some time; most papers tend to gloss over outer products and focus on the inner products.)


Night, it's also late here.

See section V on macroscopic gravity, correlated with Dirac and the U(1) gauge group application.

You should notice immediately the connection to particles/antiparticles: charge, helicity, parity.

I.e., as per Pati-Salam for left/right handedness, as applied to gravity.

https://www.google.ca/url?sa=t&source=web&rct=j&url=http://cds.cern.ch/record/274210/files/9412052.pdf&ved=0ahUKEwiMq6fjudjWAhUQHGMKHUgjA7UQFggdMAA&usg=AOvVaw3p6cz8xKi30t-IfqnSSBjO

Edited by Mordred

Ok, so hopefully I have collected a reasonable amount of information now. I am looking for a nice simple theory - we will look at the crucial expectation value of the theory and see how it fits into information theory. Assuming (from previous work) that we are working in a phase space with some observable system, the probability density [math]\rho[/math] is the one that maximizes the relative entropy

[math]S = -\int \rho \ln \frac{\rho}{\rho_0}\ dx[/math]

and is subject to the normalization constraint

[math]\int \rho\ dx = 1[/math]

Entropy can be measured by

[math]S = - \int \psi \bar{\psi}\ \ln \psi \bar{\psi}\ dx[/math]

where

[math]P = \int_U \psi \bar{\psi}\ dx[/math]

which is the Max Born probability of finding the particle in a domain [math]U[/math]. Obviously the probability density is [math]\rho = \psi \bar{\psi}[/math]. This uses different but equivalent notation to the squared value [math]\psi\psi^{\dagger}[/math].

Let's consider an argument and proof for a single-particle wave function collapse, with [math]q_i = \frac{1}{n}[/math]:

[math]-\sum^{n}_{i=1} p_i\ \log q_i = \sum^{n}_{i = 1} p_i\ \log n = \log n[/math]

which is known as the entropy of [math]q[/math]. The inequality [math]h(p) \leq h(q)[/math] holds, with equality iff [math]p[/math] is uniform (a uniform probability distribution). This is interesting, because you can argue that, even in the absence of other particle dynamics, a single wave function could be capable of collapsing under its own gravitational weight, by assuming there is an analogue of the centre of mass for a wave function, which can be interpreted as fluctuating around the absolute square value of its wave function.

The relative entropy is a distance measure between probability distributions. The entropy equations can be a little confusing, so it is best to write them out in full. The probability distributions, this time denoted [math]p[/math] and [math]q[/math], are given as (see references)

[math]D(p|q) = \sum_l p_l\ \log_2(\frac{p_l}{q_l}) = \sum_l p_l(\log_2p_l - \log_2 q_l)[/math]

which is basically the difference in information gain between two distributions (under observation) - this must translate exactly into probability distributions which satisfy [math]\rho = |\psi|^2[/math] (the Born rule). Just for future reference, the relative entropy is also known as the Kullback-Leibler divergence.
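A minimal sketch of the relative entropy itself (NumPy; the two distributions are arbitrary illustrative choices of mine, with q taken uniform):

```python
import numpy as np

def kl_divergence(p, q):
    """D(p|q) = sum_l p_l log2(p_l / q_l); requires q_l > 0 wherever p_l > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

p = np.array([0.5, 0.3, 0.2])       # "observed" Born-rule probabilities (illustrative)
q = np.array([1/3, 1/3, 1/3])       # reference (uniform) distribution

print(kl_divergence(p, q))          # > 0 : information gained relative to q
print(kl_divergence(p, p))          # 0.0 : a distribution has no divergence from itself
```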

In this next case, we treat the geometry like an observable and the expectation value can be related to the density matrix [math]\rho[/math] given by:

[math]Tr(R_{ij} \rho) = Tr(R_{ij} \sum_i p_i|\psi_i><\psi_i|) = \sum_i p_i Tr(R_{ij}|\psi_i><\psi_i|) = \sum_i p_i Tr(<\psi_i| R_{ij}| \psi_i>) = \sum_i p_i <\psi_i|R_{ij}|\psi_i>[/math]

This retrieves the expectation value of the geometry of the systems for any ensemble of states [math]\rho[/math], something we investigated before, but this time in the context of probability. This is the simplest and most direct way of investigating the expectation value in our theory (remember, we also briefly looked into how to vary the wave functions using a variational principle). This solution was specifically found within the last reference.
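The chain of equalities is straightforward to confirm numerically (a sketch with NumPy; here a random Hermitian matrix simply stands in for the observable, so nothing about the actual curvature operator is being computed):

```python
import numpy as np

rng = np.random.default_rng(1)

# A random Hermitian matrix standing in for the observable (purely illustrative).
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
R = (M + M.conj().T) / 2

# An ensemble of normalised states with weights p_i.
p = np.array([0.5, 0.3, 0.2])
psis = [v / np.linalg.norm(v)
        for v in rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))]

rho = sum(pi * np.outer(psi, psi.conj()) for pi, psi in zip(p, psis))

lhs = np.trace(R @ rho)
rhs = sum(pi * (psi.conj() @ R @ psi) for pi, psi in zip(p, psis))
print(np.isclose(lhs, rhs))    # True: Tr(R rho) = sum_i p_i <psi_i|R|psi_i>
```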

It cannot be stressed enough, though, that if we want entanglement in this theory, we must note that, unlike the classical conditional entropy

[math]S(a|b) = S(a,b) - S(b)[/math]

... which always remains positive, the quantum mechanical equivalent 

[math]S(\rho_A|\rho_B) = S(\rho_{AB}) - S(\rho_B)[/math]

is not. The state is entangled if [math]S(\rho_A|\rho_B) < 0[/math]. Very simple and nice, whereas the subject of entanglement can often get quite complicated. I take it as a matter of principle to find theories that are, at the core, as simple as they can be. Some important things to note here:

[math]S(\rho_A \otimes \rho_B) = S(\rho_A) + S(\rho_B)[/math]

and that 

[math]\rho_{AB} = \sum_i p_i\ \rho^{i}_{A} \otimes \rho^{i}_B[/math]

are always separable. So if 

[math]S(\rho_A|\rho_B) = S(\rho_A)[/math]

it is a separable state with no correlations. If instead we have

[math]0 < S(\rho_A|\rho_B) < S(\rho_A)[/math]

it is then said to have ''classical correlations.'' And if

[math]S(\rho_A|\rho_B) < 0[/math]

Then the correlation is quantum. Again, please check references to see this in literature. 
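The three cases are easy to see side by side in a small numerical sketch (NumPy; the three example states below are my own standard choices - a product state, a noisy classically correlated mixture, and a Bell state):

```python
import numpy as np

def S(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def conditional_entropy(rho_ab):
    """S(rho_A|rho_B) = S(rho_AB) - S(rho_B)."""
    rho_b = np.einsum('abac->bc', rho_ab.reshape(2, 2, 2, 2))   # trace out A
    return S(rho_ab) - S(rho_b)

phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

rho_product   = np.kron(np.eye(2) / 2, np.diag([1.0, 0.0]))  # uncorrelated product state
rho_classical = np.diag([0.4, 0.1, 0.1, 0.4])                # noisy classical correlation
rho_entangled = np.outer(phi, phi)                           # Bell state

print(conditional_entropy(rho_product))    #  1.0  = S(A): separable, no correlations
print(conditional_entropy(rho_classical))  #  ~0.72: 0 < S(A|B) < S(A), classical correlations
print(conditional_entropy(rho_entangled))  # -1.0  < 0: quantum correlations (entanglement)
```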

To finally sum up, it seems we have jigsaw pieces again to try and fit together. We identify the previous relationship of the probability to the geometry as

[math]Tr(R_{ij} \rho) = Tr(R_{ij} \sum_i p_i|\psi_i><\psi_i|) = \sum_i p_i Tr(R_{ij}|\psi_i><\psi_i|) = \sum_i p_i Tr(<\psi_i| R_{ij}| \psi_i>) = \sum_i p_i <\psi_i|R_{ij}|\psi_i>[/math]

We also have, from previous formulations, that twice the deviation of the mean of the curvature, as applied to a spacetime uncertainty, was related to the expectation value as

[math]<\psi|[\nabla_i, \nabla_j]|\psi>\ d^3x = <\psi|R_{ij}|\psi>\ d^3x \leq 2 \sqrt{|<\nabla^2_i> <\nabla^2_j>|}[/math]

Again, identifying the spacetime subscripts [math](i,j)[/math] as systems related to their own entropy, the previous equation mirrors the quantum upper bound on the mutual entropy (twice the classical bound),

[math]S(A:B) \leq 2min[S(A),S(B)][/math]

The difference in information gain will also be related to the difference of geometries which we derived a while ago. 

 

ref

http://www.math.uconn.edu/~kconrad/blurbs/analysis/entropypost.pdf

http://www.tcm.phy.cam.ac.uk/~sea31/tiqit_complete_notes.pdf


 

Edited by Dubbelosix

That's much better; now you have your outer products and the complex conjugates to satisfy the Born rule.

It's not entirely accurate to state 'probability amplitude squared'. It's more accurate to describe the Born rule as the amplitude times its own complex conjugate, as we are involving density matrices. The outer product gives the tensor product of two tensors, which gives us a means to the Kronecker delta connection.

The inner product of two vectors returns a scalar quantity.

This was why I wanted greater detail on the outer products. You now have that above.

I will look over the above in more detail; I will probably have time tomorrow.

You have the tools for the Kronecker delta, but you will need Levi-Civita connections for curved spacetime.

[latex]|\psi_i><\psi_i|=P_m[/latex] - you have the projection operator (more complete above); now use this to get your identity. The sum over all projectors of a space is your identity. Once you have the identity under any basis, you have the completeness equation

[latex]\sum_i |v_i><v_i|=\mathbb{I}[/latex] (closure relation)

More correctly, under QM, the resolution of the identity is

[latex]\sum_i |i><i|=\mathbb{I}[/latex]
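A quick numerical illustration of the closure relation (a sketch only, assuming NumPy; the orthonormal basis is generated at random via a QR decomposition, which is just my own convenient choice):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
basis, _ = np.linalg.qr(M)                  # columns form an orthonormal basis {|v_i>}

projectors = [np.outer(basis[:, i], basis[:, i].conj()) for i in range(4)]

print(np.allclose(sum(projectors), np.eye(4)))                    # sum_i |v_i><v_i| = I
print(np.allclose(projectors[0] @ projectors[0], projectors[0]))  # P_m^2 = P_m (a projector)
```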

Edited by Mordred

Ok, I would like you to read this article. Apply the last section to the details in the article. Pay particular attention to how the above relates to the eigenstates and eigenvalues of the linear wave equations with regard to the Schrödinger particle in a box.

Pay particular attention to coherent states with regard to that box, and to the phenomena of reflectivity at the potential barrier and tunnelling.

There is also excellent coverage of the time-dependent and time-independent treatments of the Schrödinger equation.

" Quantum mechanics made Simple " lecture notes by Weng Cho CHEW

 

https://www.google.ca/url?sa=t&source=web&rct=j&url=http://wcchew.ece.illinois.edu/chew/course/QMAll20130923.pdf&ved=0ahUKEwiK19D2iOTWAhXoyFQKHQhXCRQQFgg6MAQ&usg=AOvVaw0o4yyrtoIw1Sge245aqO0i

 

PS: this article will help any other readers better understand what is going on here as well - chapter 3 onward, as operators are matrices.

Other readers, please review matrix algebra; I know the OP understands it. It will be required to understand my last two posts.

 

 

Edited by Mordred
