How to generate A and A^2 at the same time?


Hello, I hope someone can help me with this. I am writing a piece of code that generates a square matrix A:

 

! build the n-by-n matrix A element by element
Do j=1,n
  Do i=1,n
    A(i,j)=A(i,j)+const
    ...
  end do
end do

 

Now I have to calculate A^2. If I don't want to call a subroutine to do that, but instead want to compute it at the same time as I am generating the matrix A, is there a way to do this? Thank you very much.



You would have to generate the entire matrix A first. Let B = A^2; then [math]B_{ij} = \sum_{k} A_{ik}A_{kj}[/math], so to know, say, the entry in the first row and first column of B, i.e. [math]B_{11}[/math], you need the entire first row and first column of A. Clearly, if you only know the [math]A_{11}[/math] entry of A you cannot compute [math]B_{11}[/math].

 

Unless you are working with HUGE matrices, and I mean n > 1000, generating A first and then computing B won't take long. I've run Mathematica routines that perform a coordinate transformation on a rank-3 tensor in 12 dimensions (so there are [math]12^{3} = 1728[/math] entries in the tensor, and each one is the result of summing twelve products from the matrix multiplication), and on a 2.4 GHz machine it takes seconds. So for any decently sized matrix, just run the routine again with [math]B_{ij} = \sum_{k} A_{ik}A_{kj}[/math] inside the two Do loops.
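
For what it's worth, here is a minimal Fortran sketch of that two-pass approach; the fill rule for A is just a placeholder standing in for whatever the original "..." does, and modern Fortran's matmul intrinsic would do the second pass in one line:

program square_a
  implicit none
  integer, parameter :: n = 4
  double precision, parameter :: const = 1.0d0
  double precision :: A(n,n), B(n,n)
  integer :: i, j, k

  ! First pass: build A (placeholder fill rule)
  A = 0.0d0
  do j = 1, n
     do i = 1, n
        A(i,j) = A(i,j) + const
     end do
  end do

  ! Second pass: B = A^2, i.e. B_ij = sum_k A_ik * A_kj
  B = 0.0d0
  do j = 1, n
     do k = 1, n
        do i = 1, n
           B(i,j) = B(i,j) + A(i,k) * A(k,j)
        end do
     end do
  end do

  ! Equivalently, the second pass is simply: B = matmul(A, A)
end program square_a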



Assuming your matrix G is not singular, there are in general many matrices S such that S*S = G; the square root of a matrix is not unique.

 

If G is symmetric positive definite it can be decomposed by singular value decomposition as G = UVU*, where the matrix U is unitary and V is diagonal. The diagonal elements of V are positive when G is symmetric positive definite; these are the singular values of G. The square root of the matrix V is easily computed: take the square root of each diagonal element. The matrix S = UV^{1/2}U* is symmetric and is one common way to define the square root of a matrix.
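
As a rough illustration of that construction (not the poster's code), here is a Fortran sketch that builds the symmetric square root via LAPACK's dsyev eigendecomposition, which for a real symmetric positive definite matrix coincides with the decomposition described above; the 3x3 test matrix is purely illustrative and LAPACK is assumed to be linked:

program spd_sqrt
  implicit none
  integer, parameter :: n = 3
  double precision :: G(n,n), U(n,n), S(n,n), w(n), work(3*n)
  integer :: info, k

  ! An illustrative symmetric positive definite matrix
  G = reshape([4d0, 1d0, 0d0,  &
               1d0, 3d0, 1d0,  &
               0d0, 1d0, 2d0], [n, n])

  ! Eigendecomposition G = U V U^T; dsyev overwrites its input with the eigenvectors U
  U = G
  call dsyev('V', 'U', n, U, n, w, work, size(work), info)
  if (info /= 0) stop 'dsyev failed'

  ! S = U * V^{1/2} * U^T, taking the square root of each eigenvalue
  do k = 1, n
     S(:,k) = U(:,k) * sqrt(w(k))
  end do
  S = matmul(S, transpose(U))

  ! matmul(S, S) should now reproduce G
end program spd_sqrt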

 

Another commonly used technique is the Cholesky decomposition.
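
And a corresponding sketch of the Cholesky route with LAPACK's dpotrf, where G = L*L^T and the lower-triangular factor L plays the role of a (non-symmetric) "square root"; again the test matrix is just an example:

program chol_sqrt
  implicit none
  integer, parameter :: n = 3
  double precision :: G(n,n), L(n,n)
  integer :: info, i, j

  G = reshape([4d0, 1d0, 0d0,  &
               1d0, 3d0, 1d0,  &
               0d0, 1d0, 2d0], [n, n])

  ! dpotrf overwrites the chosen triangle of its input with the Cholesky factor
  L = G
  call dpotrf('L', n, L, n, info)
  if (info /= 0) stop 'matrix is not positive definite'

  ! Zero the untouched upper triangle so that matmul(L, transpose(L)) = G
  do j = 2, n
     do i = 1, j - 1
        L(i,j) = 0d0
     end do
  end do
end program chol_sqrt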



Indeed the expression [math] ds^2 = g_{ab} dx^a dx^b [/math] produces negative values if we consider spacelike intervals. Thus a complex field is clearly implied. I am investigating [math] ds^a={T^a}_b dx^b.[/math] I can show that this tensor, [math]T_{ab}[/math], is Hermitian. <<Who was Herman, or Hermit?>>



If anyone is aware of current work in this area I'd appreciate hearing about it. I am feeling my way in constructing this linear form, as opposed to GR, which usually speaks only of [math]ds^2[/math]. Right now I am using real Cartesian coordinates, [math]dx^a[/math], and allowing a complex [math]T_{ab}[/math], and thus also ds as a complex 4-vector.



In addition to being Hermitian, the matrix T must be unitary. Is this like saying it is a rotation in a complex vector space?



Come to think of it, my coordinates may or may not be Cartesian, but they are those of an external, essentially flat Minkowski space.



Since I allow a complex vector ds, we now have [math]ds^2=g_{ab}ds^{*a} ds^b[/math] (I hope).



It is not a simple square root; I can say that [math] g_{ab}= T_{ac} {T^c}_b [/math].



It is not clear to me that this last statement adds information, since we must use the metric tensor to raise the index of the second T.
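
Just to spell out what that relation looks like in matrix form (my reading of it, with the second index raised by the metric as described): [math]g_{ab} = T_{ac}\,g^{cd}\,T_{db}[/math], i.e. [math]g = T\,g^{-1}\,T[/math] as matrices. This is consistent with the componentwise system that appears further down the thread.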



Whoa, I read of mathematician Charles Hermite. Cool.


Over in the Physics threads I have added to my 'Reissner-Nordstrom...' thread about the necessary complexification of variables introduced by angular momentum terms in GR. http://www.scienceforums.net/forum/showthread.php?p=484892#post484892 Again I thank D H for gifts which I will come to know (soon).


Norman -

 

since the metric g_ab is, effectively, acting as a Kronecker delta, doesn't the metric need to be modified? You will now (at least with ds, rather than ds^2) have a third option, i, in addition to the 0 and 1 of the Kronecker-delta view.


Yowsa, that is a far-reaching question, and I have stopped short of it for now. In the Kerr metric we get twisting of spacetime but real-valued metric terms. To reach for an answer: the constraint on the tensor T, where I posit the physics, is that it is unitary and also Hermitian. This makes me look for an expression of the polar theoretics we need for particles and electrodynamics.

There must be a relation to the 3-space unit vector defined as part of the degenerate metric [math]g_{ab}= \eta_{ab} + 2mk_a k_b [/math], where [math]\eta[/math] is the Lorentz flat-space metric <1, -1, -1, -1> and k is a null 4-vector such that [math] k_a = k_0 <1, u_1, u_2, u_3> [/math]. For now I figure that the metric tensor is real-valued, since gravitation is a neutral field level, clear to us when large aggregates of neutral assemblies of charged particles gather. This still gets us into plenty of existential hot water, as with my interpretation of the sideways null speed inside a black hole having an imaginary value. Yes, I am deeply, and hopefully productively, confused and working.



solidspin, as usual some of what you said is now sinking in, a week later. I am succeeding in working out the form of the proposed tensor T in the Schwarzschild case. It seems to me now that I was not quite understanding unitarity, and I welcome advice. The Kronecker delta is the form of the metric tensor with mixed indices, so you lower the upper index and get the statement I am working with: [math] g_{ab}= T_{ac}{T^c}_b [/math]. It seems I'm working toward a paper on this; stay tuned. I tried starting with the Eddington-transformed coordinates and the algebra got nasty. I got a good result in the Schwarzschild coordinates, and it is not difficult to run that tensor through the coordinate transform. It is quite cool that, starting with a diagonal Schwarzschild metric tensor, I get <0,1> off-diagonals in T, without any implication of the Eddington time transform. All this without angular momentum, seemingly. There is a strong hint when we allow the time coordinate to shift with radius in the Eddington transform. This is the mathematical step we take on the way to the degenerate metric and the Kerr angular-momentum solution. This seems to be a two-step polka.



This morning as I got to work it seemed that the exciting result of the off-diagonal terms from the Schwarzschild form alone had evaporated. My, my, that was disappointing, but hey, the job is to find truth. After another long day's work I can now see that there is a degeneracy involving the <0,0> and <1,1> components. If we define [math]S=1-2m/r[/math], the ratio must satisfy [math]T_{00}/T_{11}=S^2[/math], but the components are not otherwise determined. First observe that the signs of the two terms are always the same here. If you choose the diagonal to be <S, 1/S>, then, in this particular case, the off-diagonal term [math] T_{01}[/math] vanishes. There is nothing new here compared with the metric tensor itself, except the signature of the diagonals. However, we can choose, say, [math] <S^2, 1>[/math], and get a much more entertaining result with off-diagonals. This is not so exciting in the mass-only Schwarzschild form. Once this level is analytically clear, it will be easier to move on to expressions of angular momentum from the Kerr metric.
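
A quick check of the <S, 1/S> choice, assuming the 2x2 time-radial block of the Schwarzschild metric is the usual [math]g_{ab}=\mathrm{diag}(S,\,-1/S)[/math] so that [math]g^{ab}=\mathrm{diag}(1/S,\,-S)[/math]:

[math]T_{00}\,g^{00}\,T_{00}=S\cdot\tfrac{1}{S}\cdot S=S, \qquad T_{11}\,g^{11}\,T_{11}=\tfrac{1}{S}\cdot(-S)\cdot\tfrac{1}{S}=-\tfrac{1}{S},[/math]

with the off-diagonal terms vanishing. This reproduces [math]g_{ab}[/math], differs from the metric itself only in the sign of the <1,1> entry, and gives [math]T_{00}/T_{11}=S^2[/math] as stated.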


I am working in the Eddington off-diagonal form of the Schwarzschild solution and trying to generate the tensor T. I actually have the answer, but I cannot yet generate it raw, from scratch: I got it by tensor coordinate transform of my answer in the original Schwarzschild coordinates. I should be able to generate it by just hardballing and solving three equations in three variables. We need look only at the 2x2 matrix of the time and radial variables, so say T_00 = A, T_11 = B, and T_01 = C. I invert the Eddington metric matrix to raise an index, and then form the tensor product I mentioned above; this should equal the metric tensor of the system. The problem is that I generate three equations which are quadratic in the three variables <A,B,C>. What is the analytic nature of such a system? I don't think I've looked at one before. The solution from the first approach does indeed satisfy these equation constraints, but there is perhaps more to understand.


Here is the system of equations I'm looking at. <A,B,C> are the <T_00, T_11, T_01> of the 2x2 essence of the Schwarzschild solution, taken through the Eddington coordinate transform. I have defined [math]S=1-2m/r[/math] and wish to solve for <A,B,C> in terms of S:

<0,0>: [math]-A^2(S-2)~+~2AC(S-1)~-~C^2S~=~S[/math]

<0,1>: [math]AB(S-1)~-~AC(S-2)~+~C^2(S-1)~-~SBC~=~S-1[/math]

<1,1>: [math]-C^2(S-2)~+~2BC(S-1)~-~SB^2~=~S-2[/math]

I would appreciate analytic clues.



There is another equation generated by knowing the determinant of the system. I know I can start with Cartesian coordinates of the Schwarzschild solution and generate my T matrix; then I can apply the Eddington coordinate transform to get the answer needed. The determinant of T stays at +1 for these (as opposed to the determinant of either metric). Thus I may say that [math]AB~-~C^2~=~1[/math]. It seems this does not add new information, though, since I can substitute it into one of the first three equations and produce one of the others. It seems to me this is a rank-3 system; three matrices are needed. When I solved the simpler Cartesian case there did appear a degeneracy: the individual values of A and B float, but their ratio is constrained.
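
A small numerical sketch of the kind of cross-check being discussed (my own construction with trial values I chose, not necessarily the poster's transformed answer): take C = S-1, the off-diagonal entry of the Eddington metric itself, take A = S as a trial, let the determinant relation AB - C^2 = 1 fix B, and evaluate the residuals of the three equations at some radius.

program t_residuals
  implicit none
  double precision :: S, A, B, C, r0, r1, r2, d

  S = 1d0 - 2d0*0.2d0          ! S = 1 - 2m/r, with an illustrative m/r = 0.2
  A = S                        ! trial value (my assumption)
  C = S - 1d0                  ! trial value (my assumption)
  B = (C*C + 1d0) / A          ! fixed by the determinant relation AB - C^2 = 1

  ! Residuals of the <0,0>, <0,1>, <1,1> equations and the determinant relation
  r0 = -A*A*(S-2d0) + 2d0*A*C*(S-1d0) - C*C*S           - S
  r1 =  A*B*(S-1d0) - A*C*(S-2d0) + C*C*(S-1d0) - S*B*C - (S-1d0)
  r2 = -C*C*(S-2d0) + 2d0*B*C*(S-1d0) - S*B*B           - (S-2d0)
  d  =  A*B - C*C - 1d0

  print *, 'residuals:', r0, r1, r2, '  det check:', d
end program t_residuals

With these trial values the residuals come out zero, so this particular <A,B,C> at least satisfies the system as posted; whether it coincides with the answer obtained by the coordinate transform I leave to the poster.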


After a week of feeling outfoxed I am starting to psych out the nature of this system. It seems you have to go to fourth order to solve things, and this is messy and consumes much paper. My better approach is to start with an intelligent guess for one of the components <A,B,C>. We suspect maybe C can be S-1, as in the Eddington metric itself. One of the three equations is quadratic in <A,C>, and another in <B,C>. It would seem we have two possible choices of sign in the quadratic solutions, but one is removed by the determinant constraint [math]AB-C^2=1[/math], so that there is only one A for each possible B. Entertainingly, if I enter with a guess of C=0 then both A and B become imaginary. D H, thanks! Now I hear you truly.



I can no longer be part of this forum.

