
Machine Learning joint entropy


Luke0292


Hi everybody, I have a question about machine learning: I'm not sure how joint entropy and mutual information work.

Since:

https://imgur.com/mZU383m

For the first equation, we have that:

https://imgur.com/C1zIHFT

H(x,y) seems to be 'everything that is not in common between x and y'. But for the second:

https://imgur.com/a/3iWIyps

In this case H(x,y) cannot be 'everything that is not in common between x and y'; otherwise the result would not be the mutual information I(x,y). So how should I read H(x,y)?
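One way to check the Venn-diagram reading numerically: in the standard identities, H(X,Y) is the *union* (the total uncertainty of the pair), and the mutual information I(X;Y) = H(X) + H(Y) − H(X,Y) is the overlap. Here is a minimal sketch with a made-up joint distribution over two binary variables:

```python
import math

# Toy joint distribution p(x, y) over two binary variables (made-up numbers)
p_xy = {
    (0, 0): 0.4,
    (0, 1): 0.1,
    (1, 0): 0.1,
    (1, 1): 0.4,
}

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginals p(x) and p(y), obtained by summing the joint over the other variable
p_x = {x: sum(p for (xi, _), p in p_xy.items() if xi == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yi), p in p_xy.items() if yi == y) for y in (0, 1)}

H_x  = entropy(p_x)   # uncertainty of X alone (one circle)
H_y  = entropy(p_y)   # uncertainty of Y alone (the other circle)
H_xy = entropy(p_xy)  # joint entropy H(X,Y): the whole union, NOT the symmetric difference

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y)
I_xy = H_x + H_y - H_xy

print(f"H(X) = {H_x:.4f}, H(Y) = {H_y:.4f}, H(X,Y) = {H_xy:.4f}, I(X;Y) = {I_xy:.4f}")
```

Note that H(X,Y) ≤ H(X) + H(Y) always holds, with the gap being exactly I(X;Y): if H(X,Y) were only "what is not in common", subtracting it from H(X) + H(Y) would not leave the overlap.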


It looks very similar to bitwise operations, e.g. AND, OR, XOR, NAND, NOT, etc.

https://en.wikipedia.org/wiki/Bitwise_operation

If you do the operations:

1 | 2 = 3 (binary %01 | %10 = %11)

3 & 2 = 2 (binary %11 & %10 = %10)

etc. etc.
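The same operations in Python look like this (a minimal sketch; the % prefix above is binary notation, written 0b in Python):

```python
a = 0b01  # %01 = 1
b = 0b10  # %10 = 2

or_result = a | b      # OR:  %01 | %10 = %11  (1 | 2 = 3)
and_result = 0b11 & b  # AND: %11 & %10 = %10  (3 & 2 = 2)

print(bin(or_result))   # 0b11
print(bin(and_result))  # 0b10
```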

Bitwise operations can be used not only on single bits, but also on vectors, lists, arrays, images, sounds, etc. (which are all just bunches of bits).

 

The XOR operator:

%110 ^ %011 = %101
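The same idea extends to whole sequences of bits, as mentioned above. A quick sketch applying XOR elementwise to two bit lists (a toy stand-in for arrays, images, or sounds):

```python
# Elementwise XOR over two bit lists
x = [1, 1, 0]  # %110
y = [0, 1, 1]  # %011

result = [a ^ b for a, b in zip(x, y)]
print(result)  # [1, 0, 1]  ->  %101
```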

 

Edited by Sensei
