
Machine Learning joint entropy


Hi everybody, I have a question about machine learning. I'm not sure how joint entropy and mutual information work.

Since:

https://imgur.com/mZU383m

For the first equation, we have that:

https://imgur.com/C1zIHFT

H(x,y) seems to be 'everything that is not in common between x and y'. But for the second:

https://imgur.com/a/3iWIyps

In this case H(x,y) cannot be 'everything that is not in common between x and y'; otherwise the result would not be the mutual information I(x,y). So how should I read H(x,y)?
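To make the quantities concrete, here is a minimal sketch that computes H(x), H(y), the joint entropy H(x,y), and the mutual information I(x,y) = H(x) + H(y) − H(x,y) for a small joint distribution. The distribution values are made up for illustration; they are not from the thread.

```python
import math

# Hypothetical joint distribution p(x, y) over two binary variables
# (illustrative values only).
p_xy = {
    (0, 0): 0.4,
    (0, 1): 0.1,
    (1, 0): 0.1,
    (1, 1): 0.4,
}

def entropy(probs):
    """Shannon entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginals p(x) and p(y), obtained by summing out the other variable
p_x = {x: sum(p for (xx, y), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (x, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

H_x = entropy(p_x.values())
H_y = entropy(p_y.values())
H_xy = entropy(p_xy.values())   # joint entropy: uncertainty of the pair (x, y)
I_xy = H_x + H_y - H_xy         # mutual information: the shared part

print(H_x, H_y, H_xy, I_xy)
```

Running this, H(x,y) comes out smaller than H(x) + H(y) exactly when x and y share information, and the difference is I(x,y). That suggests reading H(x,y) as the total uncertainty of the pair (the whole "union" in a Venn-diagram picture), not as the part that is not in common.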

It looks very similar to bitwise operations, e.g. AND, OR, XOR, NAND, NOT, etc.

https://en.wikipedia.org/wiki/Bitwise_operation

If you do the operations:

1 | 2 = 3 (binary %01 | %10 = %11)

3 & 2 = 2 (binary %11 & %10 = %10)

etc. etc.

Bitwise operations can be used not only on single bits, but also on vectors, lists, arrays, images, sounds, etc. (which are also just bunches of bits).

 

XOR operator.

%110 ^ %011 = %101
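The same operations can be checked in Python, where the `0b` prefix is the binary-literal notation corresponding to `%` above:

```python
# OR combines set bits:  %01 | %10 = %11
print(bin(1 | 2))        # 1 | 2 == 3

# AND keeps only bits present in both:  %11 & %10 = %10
print(bin(3 & 2))        # 3 & 2 == 2

# XOR keeps bits that differ:  %110 ^ %011 = %101
print(bin(0b110 ^ 0b011))
```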

 

Edited by Sensei
