
Can systems of equations of diagonal quadratic forms be solved by Gaussian Elimination?


kyal


Can the following system of equations be solved using Gaussian Elimination?

 

[latex]\begin{bmatrix}s_{00} & s_{01} & s_{02} & s_{03}\\s_{10} & s_{11} & s_{12} & s_{13}\\ s_{20} & s_{21} & s_{22} & s_{23}\\ s_{30} & s_{31} & s_{32} & s_{33}\\\end{bmatrix}\begin{bmatrix}x^2_0 \\x^2_1 \\x^2_2 \\x^2_3 \end{bmatrix} = \begin{bmatrix} 1 \\ 1\\ 1\\ 1\end{bmatrix}[/latex]


If one were to let

 

[latex]w_i = x^2_i, 0 \leq i \leq 3,[/latex]


then the above system is (trivially) transformed to

 

[latex]\begin{bmatrix}s_{00} & s_{01} & s_{02} & s_{03}\\s_{10} & s_{11} & s_{12} & s_{13}\\ s_{20} & s_{21} & s_{22} & s_{23}\\ s_{30} & s_{31} & s_{32} & s_{33}\\\end{bmatrix}\begin{bmatrix}w_0 \\w_1 \\w_2 \\w_3 \end{bmatrix} = \begin{bmatrix} 1 \\ 1\\ 1\\ 1\end{bmatrix}[/latex]

 

Now assuming that

[latex]\begin{bmatrix}w'_0 \\w'_1 \\w'_2 \\w'_3 \end{bmatrix} [/latex]

is the unique solution to the above system, does that mean that

 

[latex]x_i = \pm\sqrt{w'_i}[/latex]

 

is the zero-dimensional (finite) solution set of the original system? Clearly this is only the case when, for every i,

 

[latex]w'_i \geq 0[/latex]
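
In case it helps to see it spelled out, here is a minimal NumPy sketch of what I have in mind; the coefficient matrix is just a made-up invertible example:

[code]
import itertools
import numpy as np

# Made-up invertible coefficient matrix [s_ij] and all-ones right-hand side.
S = np.array([[4.0, 1.0, 0.0, 0.0],
              [1.0, 3.0, 1.0, 0.0],
              [0.0, 1.0, 3.0, 1.0],
              [0.0, 0.0, 1.0, 2.0]])
b = np.ones(4)

# Step 1: solve the linear system S w = b for w_i = x_i^2.
# np.linalg.solve uses an LU factorisation, i.e. Gaussian elimination.
w = np.linalg.solve(S, b)

# Step 2: real solutions for x exist only if every w_i is non-negative.
if np.all(w >= 0):
    roots = np.sqrt(w)
    # Every combination of signs gives one point of the finite solution set.
    solutions = [np.array(signs) * roots
                 for signs in itertools.product([1.0, -1.0], repeat=4)]
    print(f"{len(solutions)} real solutions, e.g. x = {solutions[0]}")
else:
    print("some w_i is negative: no real solution for x")
[/code]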

 

I did try to find articles dealing with systems of quadratic forms that have a diagonal matrix, but I could not find anything relevant.

If the above approach is wrong can someone point me in the right direction?

Thank you in advance

 


The approach is correct: if certain values of w solve the renamed system, then the same values, read as x², solve the original one (you don't even need to rename them to w for this to be true). And if the solving x² are all non-negative, then there are indeed real values of x whose squares equal them.

 

The reason you don't find any articles on systems of squares is that, for the purpose of solving the system of equations, it is completely irrelevant whether one of the variables is a square. As you found out yourself, you can just rename the x² to something that looks less scary and solve for that value. And the second part, interpreting the x²-solution in terms of the underlying x, is not a matter of matrix calculations.



Good answer, +1


Great! Thank you very much, so that settles it then. It seemed logical to me as well, but I was still a bit unsure, because the general case of solving systems of equations involving quadratic (degree-2) polynomials appears to be NP-complete (see e.g. http://mathoverflow.net/questions/153436/can-you-efficiently-solve-a-system-of-quadratic-multivariate-polynomials),

 

so there must be something different about my case. Just out of curiosity (now that my main issue is solved): it must be the mixed terms, which don't exist in my case, that make the difference. Is it that the interdependence they create between the variables cannot be expressed in a plain linear system?

 

Regards


In principle, if you have an equation like x + x*y + y = 0 among your equations, nothing stops you from calling x*y a variable P (giving x + P + y = 0) and imposing the constraint that this variable must equal x*y, which you can do in a second step once you already have a set of candidate solutions (the same goes for x + x² + y = 0, by the way). Mathematically, it is valid. What is less clear is whether you gain anything by doing so. The constraint cannot be formulated in matrix form, so you end up with a mixture of (easy) matrix expressions and (potentially less easy) non-matrix constraints. And whether you have a square matrix depends on the number of parameters and equations (as always, but in this case they are less likely to be equal). If you are interested in this, I suggest you just play around with a few very simple examples; that often gives the best insight.
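
For example, here is a throwaway SymPy sketch of that two-step idea; the second equation x - y = 1 is just made up so that the toy system has something to solve:

[code]
import sympy as sp

x, y, P = sp.symbols('x y P', real=True)

# Toy system: x + x*y + y = 0, plus a made-up second equation x - y = 1.
# Step 1: rename the product x*y to P, making the system linear in (x, y, P).
lin_sol = sp.solve([sp.Eq(x + P + y, 0), sp.Eq(x - y, 1)],
                   [x, y, P], dict=True)[0]

# Two equations, three unknowns: one variable stays free in lin_sol.
# Step 2: impose the deferred non-linear constraint P = x*y on the
# one-parameter family of candidate solutions.
constraint = sp.Eq(lin_sol.get(P, P),
                   lin_sol.get(x, x) * lin_sol.get(y, y))
print(sp.solve(constraint))  # the surviving candidates
[/code]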


I agree with you completely; in fact I came to a similar conclusion. I think there is an implicit (or perhaps explicit somewhere, and I overlooked it) assumption when solving linear systems, namely that the unknowns are somehow "independent" in the sense that picking the value of one does not automatically pick the value of another. In the case of general quadratic systems involving mixed terms (not the simple diagonal case I asked about initially), it is possible in principle to replace each degree-2 mixed term with a new variable, adding n(n - 1)/2 new variables to the n squares.

However, whereas in my simple original case the domain of possible solutions is R^n, in this new system the set of values the variables can take on, independent of the coefficient matrix [s_ij] and the right-hand side of the system (call it B), is only some (possibly disconnected?) subset of the enlarged space, because of the interdependence of the variables. So finding the unique solution with respect to [s_ij] and B means searching through a set with a rather complex structure, which certainly cannot be done by Gaussian Elimination alone.
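
To make that concrete (if I have this right): for n = 2 the degree-2 monomials can be renamed

[latex]w_0 = x_0^2,\quad w_1 = x_1^2,\quad p_{01} = x_0 x_1,[/latex]

so the system becomes linear in [latex](w_0, w_1, p_{01})[/latex], but the new variables stay tied together by

[latex]p_{01}^2 = w_0 w_1,[/latex]

which is exactly the kind of constraint that Gaussian Elimination cannot see.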

 

But that is very loose reasoning I am giving, and I am not even sure it is correct. In any case I will follow up on your suggestion and play around a bit ;)


 

"assumption when solving linear systems, namely that the unknowns are somehow 'independent' in the sense that picking the value of one does not automatically pick the value of another"

 

Yes, that's a correct description of linear independence, which is the underlying requirement behind systems of linear equations.

 

You have a vector of 'basis' variables premultiplied by a matrix of coefficients (constants), equaling a vector of results.

You need to be able to invert the matrix to solve the system.

If the matrix is not invertible, its determinant is zero and the system has no unique solution: it is either inconsistent or has infinitely many solutions.

 

The basis functions must be linearly independent, that is, none of them may depend on another; so not only are cross terms excluded, but also linear combinations such as x = 3y.
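
As a quick illustration of that point (a made-up example, not from any book): a solver refuses exactly when the determinant vanishes:

[code]
import numpy as np

# Singular matrix: the second row is 3 times the first, the same kind
# of dependence as the "x = 3y" example above.
A = np.array([[1.0, 2.0],
              [3.0, 6.0]])
b = np.ones(2)

print(np.linalg.det(A))  # 0.0 (up to rounding): A is not invertible

try:
    np.linalg.solve(A, b)
except np.linalg.LinAlgError as err:
    # No unique solution: the system is inconsistent or has
    # infinitely many solutions.
    print("solve failed:", err)
[/code]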

 

Working all this out for yourself is laudable, but it is doing it the hard way.

 

Try a good book, like Howard Anton's Elementary Linear Algebra.

