
boolean matrices


premjan


Imagine an m x n matrix which has only 0 and 1 as its elements. I couldn't think of a better term for it, so I'm calling it a boolean matrix.

 

1) Suppose m = n. I want to calculate the inverse of a given boolean matrix. Is this faster than computing it for a regular matrix (one with elements other than just 0 and 1)? Is there an algorithm (analogous to, say, Gaussian elimination) which preserves the boolean property at each step? Basically, if you can maintain the boolean property, you can do everything using bit operations, so it ought to be more efficient (besides there not being any numerical instability problems). A rough sketch of what I mean appears below, after question 3.

 

2) Suppose m != n. I would like to compute the left and right inverses of a given boolean matrix. Is it likely that the left inverse would be equal to the right inverse? Is it easy to tell when this would be the case? Could we force it to be the case somehow?

 

3) Is it easy to tell which boolean matrices are nonsingular?
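For concreteness, here is a minimal sketch of what I have in mind for (1), assuming "boolean" means arithmetic mod 2 (i.e. over Z_2): Gauss-Jordan elimination with each row packed into a Python integer, so that every row operation is a single XOR. It also bears on (3), since the matrix is nonsingular over Z_2 exactly when a pivot is found in every column.

```python
def gf2_inverse(rows, n):
    """Invert an n x n matrix over Z_2.

    `rows` is a list of n Python ints; bit j of rows[i] is the entry (i, j).
    Returns the inverse in the same packed format, or None if singular.
    Every row operation is a single XOR, so the 0/1 property is preserved
    at every step and there are no numerical-instability issues.
    """
    aug = [(rows[i] << n) | (1 << i) for i in range(n)]   # augmented [A | I]
    for col in range(n):
        # Find a pivot row with a 1 in column `col` of A.
        pivot = next((r for r in range(col, n) if (aug[r] >> (col + n)) & 1), None)
        if pivot is None:
            return None                                   # singular over Z_2
        aug[col], aug[pivot] = aug[pivot], aug[col]
        for r in range(n):
            if r != col and (aug[r] >> (col + n)) & 1:
                aug[r] ^= aug[col]                        # one XOR per row operation
    return [row & ((1 << n) - 1) for row in aug]          # right half is the inverse


# Example: [[1, 1], [0, 1]] is its own inverse over Z_2.
print(gf2_inverse([0b11, 0b10], 2))  # [3, 2], i.e. [[1, 1], [0, 1]] again
```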


1) There is no guarantee that (over the reals, say) a matrix with entries 0 or 1 has an inverse whose entries are integers at all, never mind binary digits. Unless you are doing operations mod 2, which you ought to be if you are properly going to call them boolean.
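For instance (my own small example, using numpy only for convenience), the 0/1 matrix below has determinant 2 over the reals, so its real inverse is made of halves; reduced mod 2, the same matrix is singular.

```python
import numpy as np

# A 0/1 matrix whose real inverse is not a 0/1 (or even integer) matrix:
# det(A) = 2 over the reals, so A^{-1} has entries of +/- 1/2.
A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])
print(np.linalg.det(A))   # approximately 2.0
print(np.linalg.inv(A))   # entries are +/- 0.5

# Over Z_2 the determinant is 2 mod 2 = 0, so there the matrix is singular.
```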

 

If we are operating over Z_2, then there are several speed-up techniques for operations, such as "sliding". This is useful for dealing with matrix representations of very large groups such as the Monster, where a single matrix fills up a floppy disk and there are as many matrices as there are bytes on that disk.

 

2) That cannot possibly hold, by considering the linear spans of the columns/rows. In particular, given an n x m matrix A with n > m, there will never be more than m linearly independent vectors in the image, and the best you can do is find an m x n matrix B such that AB has a block diagonal decomposition as an m x m identity matrix and an (n-m) x (n-m) null matrix.
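A small illustration of that asymmetry (my own example, working mod 2 with numpy doing the bookkeeping): a 3 x 2 matrix A with a left inverse B, where AB has rank 2 and so can never be the 3 x 3 identity.

```python
import numpy as np

# A is 3 x 2 (more rows than columns); all arithmetic is taken mod 2.
A = np.array([[1, 0],
              [0, 1],
              [1, 1]])
# B is a left inverse of A: BA = I_2.
B = np.array([[1, 0, 0],
              [0, 1, 0]])

print((B @ A) % 2)  # the 2 x 2 identity
print((A @ B) % 2)  # a 3 x 3 matrix of rank 2, so it cannot equal I_3
```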


Matrix inversion injective? Z_2 is a field; look at the usual results about matrices over a field and you'll see that they are independent of the field. The matrices, over any field, with non-zero determinant form a group under multiplication.

 

Sorry, I've no references for sliding, but it is a simple observation: since there are only 2^n possible vectors of length n, once you've worked out how to multiply a given part of the matrix by some of the columns you can do the rest by inspection.
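One way that observation could be realised in code (a sketch of my own, essentially a table-lookup trick; I am not claiming this is exactly what "sliding" refers to): split the rows of B into blocks of k, precompute the 2^k XOR-combinations of each block once, and then every row of A is handled by lookups rather than repeated row additions.

```python
def gf2_matmul(A_rows, B_rows, k=8):
    """Multiply two Z_2 matrices given as lists of bit-packed integer rows.

    Bit j of A_rows[i] is A[i][j]; product row i is the XOR of the rows
    of B selected by the 1-bits of A_rows[i]. Since a block of k rows has
    only 2^k possible XOR-combinations, build that table once per block.
    """
    n = len(B_rows)
    result = [0] * len(A_rows)
    for start in range(0, n, k):
        block = B_rows[start:start + k]
        # All 2^len(block) XOR-combinations of this block's rows.
        table = [0] * (1 << len(block))
        for mask in range(1, 1 << len(block)):
            low = mask & -mask                              # lowest set bit
            table[mask] = table[mask ^ low] ^ block[low.bit_length() - 1]
        for i, a in enumerate(A_rows):
            result[i] ^= table[(a >> start) & ((1 << len(block)) - 1)]
    return result


# Example: [[1, 1], [0, 1]] times itself over Z_2 gives the identity.
print(gf2_matmul([0b11, 0b10], [0b11, 0b10]))  # [1, 2], i.e. [[1, 0], [0, 1]]
```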


Is there some way to eliminate "singularity" in the Z_2 matrix inverse while retaining its other properties (possibly by redefining + or *)? Basically I was wondering whether it is possible to make the matrix inverse work for all matrices, including the 0 matrix (groupify the inverse operation, I guess)? (Probably not, I suspect.) Is there some way to gain intuition for why this is not possible?

 

Is it possible to achieve something like a field that is closed under inverse? I guess the multiplicative property of 0 would have to be sacrificed. Could this be done by introducing an explicit "max" or "infinity" element?

 

Maybe if I use ^ (XOR) and !^ (XNOR) for * and + respectively?
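(As a quick aside of my own: a brute-force check shows the distributive law already fails with XOR as * and XNOR as +, so that pairing would not even give a ring.)

```python
from itertools import product

# The proposal above: XOR as multiplication, XNOR as addition, over {0, 1}.
mul = lambda a, b: a ^ b          # XOR
add = lambda a, b: 1 ^ (a ^ b)    # XNOR

# Check the distributive law a*(b + c) == a*b + a*c.
for a, b, c in product((0, 1), repeat=3):
    lhs = mul(a, add(b, c))
    rhs = add(mul(a, b), mul(a, c))
    if lhs != rhs:
        print(f"fails at a={a}, b={b}, c={c}: {lhs} != {rhs}")
```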


Eh? No. It's a ring with zero divisors. You cannot divide by zero without introducing a whole shed load of inconsistencies and special cases.
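A concrete zero divisor, for instance (my own example): over Z_2 the nonzero all-ones 2 x 2 matrix squares to the zero matrix, and any candidate inverse M would force N = (NN)M = 0, a contradiction.

```python
import numpy as np

# Over Z_2 the nonzero all-ones matrix N is a zero divisor: N*N = 0 mod 2.
# If N had an inverse M, then N = N(NM) = (NN)M = 0, contradicting N != 0.
N = np.array([[1, 1],
              [1, 1]])
print((N @ N) % 2)  # [[0 0], [0 0]]
```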

 

Something that possesses multiplicative inverses and still has any hope of being a ring is either a field or a division ring (division algebra). In no case may you have zero divisors, and you can never invert zero. You never invert 0 inside the field of reals or complexes (and still have it remain a field).

 

Your question is exactly the same as asking to do so with R or C or Q instead of Z_2.


wow I like the way you delicately said "shed" instead of s***.

 

I get it. The whole thing doesn't make as much sense if constructed without the usual behavior of 0 and 1. If we changed it, it would complicate all our formulas and equation solving.


"Shed load" is just a bog-standard English phrase.

 

I hope what I got across was that there is nothing special about Z_2, or the fact that these are matrices, really. It's just the standard situation: defining 1/0 takes a whole lot of effort and is of dubious worth in this algebraic setting (as opposed to the analytic setting, where 1/0 = infinity is a useful thing for the one-point compactification of the complex plane).


Now doesn't that translate into the analytic setting somehow, though? I bet if we introduced an explicit infinity element it would make the whole shebang work, with closed division and matrix inverses. What is "bog standard"? If we had an infinity element, we would have 0*infinity = 1 and 1*infinity = infinity. Then it would be Z_2 augmented with infinity: 0+oo = oo and 1*oo = oo. Then singular matrices would have inverses, would they not?


Yeah, oo (and 0) are kind of multiplicative information sinks.

 

oo * 1 = oo

oo * 0 = ?? (shall we say this is 0 or undefined?)

0*a = 0

 

I was just thinking that we might somehow be able to guarantee that the matrix inverse is defined for every matrix, but oo * 0 is still a problem.

 

Why not just say that oo * 0 = 1 (by definition)?

 

Here are possible addition and multiplication tables for {Z_2, oo}:

 

 +  |  0   1   oo
----+-------------
 0  |  0   1   oo
 1  |  1   0   oo
 oo |  oo  oo  oo

 *  |  0   1   oo
----+-------------
 0  |  0   0   1
 1  |  0   1   oo
 oo |  1   oo  oo

 

Basically multiplication by infinity is the problem. It is not invertible, as oo*oo = oo*1. Also 0*0 = 0*1.
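(A tiny brute-force check of the multiplication table above makes the same point: only multiplication by 1 acts as a bijection on {0, 1, oo}.)

```python
# The proposed multiplication table on {0, 1, oo} from the tables above.
mul = {
    ('0', '0'): '0',   ('0', '1'): '0',   ('0', 'oo'): '1',
    ('1', '0'): '0',   ('1', '1'): '1',   ('1', 'oo'): 'oo',
    ('oo', '0'): '1',  ('oo', '1'): 'oo', ('oo', 'oo'): 'oo',
}
elems = ['0', '1', 'oo']
for a in elems:
    images = [mul[(a, x)] for x in elems]
    # Multiplication by `a` can be undone only if this map is a bijection.
    print(a, images, 'bijective' if len(set(images)) == 3 else 'not bijective')
```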

 

I'm guessing there is no way to eliminate these problems?

 

Perhaps we could put oo * oo = 0 (a kind of return-to-home, do-not-pass-Go situation) whereas oo * 1 = oo.

 

And 0*0 could plausibly be put as oo (it even looks like two zeros already). This would at least remove the information-sink properties of 0 and oo. Whether this would help in any non-Z_2 setting is the question. But I guess it would make matrix inversion over Z_2 (only) invertible, in some sense.


Probably for no good reason. Somehow the idea of matrix singularity bugged me, but while we can take some reasonable steps to eliminate it, 0*0=0 seems too much like an obvious thing to kludge. I guess some matrices (and numbers) had best just remain singular (or uninvertible).

