
Measuring Information


Hi,

 

Recently, I've been looking into the subject of information. After seeing a video on fractals, I figured that a fractal is, in some sense, a dimension with missing pieces. The missing pieces can then be interpolated, if one likes, though without adding any new information.

 

The example I've been looking at is the function f:R->R, f(x)=sin(x).

It seems obvious that f, being periodic, contains less information than some other function, say sinc(x) (with its value at zero defined by the limit, sinc(0)=1).

It also seems logical that the constant function f(x)=0 contains even less information than either of them.

 

You could say that whenever there is less information spread over an entire dimension (like the x-axis), the limited information needs a rule/symmetry that tells us how to fill in the gaps. In the sine's case we say that sin(x)=sin(x+2pi). Of course, one somehow has to count the information carried by the rule itself as well.
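To make the "rule plus a little data" idea concrete, here's a small Python sketch (a toy illustration of my own; the 628-sample period is an arbitrary choice). One stored period together with the periodicity rule reproduces the sine at any index, whereas noise offers no such shortcut:

```python
import math

# One stored period plus the rule sin(x) = sin(x + 2*pi) reproduces the
# function at any sample index, so the total description is roughly one
# period of data plus the few symbols needed to state the rule itself.
PERIOD = 628  # arbitrary number of samples per period for this toy example
one_period = [math.sin(2 * math.pi * i / PERIOD) for i in range(PERIOD)]

def sine_anywhere(i):
    """Reconstruct sample i using only the stored period and the rule."""
    return one_period[i % PERIOD]

# The reconstruction matches the directly computed value far outside the
# stored range; for noise there is no rule that would let us do this.
i = 1_000_000
assert abs(sine_anywhere(i) - math.sin(2 * math.pi * i / PERIOD)) < 1e-9
```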

 

All this led me to the conclusion that the more unpredictable a function is, the more information it contains, so noise actually has the most information one can get. :confused:
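Shannon's notions of surprisal and entropy pin down "unpredictable means more information" quantitatively. A minimal Python sketch (my own numbers, purely for illustration): a fair coin, the most unpredictable two-outcome source, carries the most bits per symbol, and a certain outcome carries none.

```python
import math

def surprisal(p):
    """Shannon's 'surprise' of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(dist):
    """Average surprisal of a distribution given as a list of probabilities."""
    return sum(p * surprisal(p) for p in dist if p > 0)

print(entropy([0.5, 0.5]))    # fair coin, fully unpredictable: 1.0 bit per toss
print(entropy([0.99, 0.01]))  # heavily biased coin: ~0.08 bits per toss
print(entropy([1.0]))         # certain outcome, like f(x)=0: 0 bits
```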

 

Maybe physical laws and symmetries are just a cover for the lack of information in the universe, or, to put it in an equivalent way, maybe the universe's dimensions are not full, but are actually fractals?

 

I'd like to hear any thoughts and comments you people have about this... :)

so noise actually has the most information one can get.
Yup.

 

For a simple thing like a string of characters, it contains more information if it takes more data to describe it. That's an okay introduction to the idea.

 

Maybe physical laws and symmetries are just a cover for the lack of information in the universe
Kinda. There's an idea in biology that a lack of information causes a default towards symmetry.
All this led me to the conclusion that the more unpredictable a function is, the more information it contains, so noise actually has the most information one can get. :confused:

 

Correct. It takes more information to accurately describe noise. Whether the noise conveys anything or not is a different question. Incidentally, compression results in a string of data that looks almost like noise.
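To see the "more data to describe it" point numerically, a general-purpose compressor can serve as a rough stand-in for description length. A small Python sketch (my own toy signals; zlib is just a convenient off-the-shelf compressor, not a true measure of information content):

```python
import math
import os
import zlib

n = 100_000
# Three signals quantized to one byte per sample so the comparison is fair.
zeros = bytes(n)                                                    # f(x) = 0
sine = bytes(int(127.5 * (1 + math.sin(0.01 * i))) for i in range(n))
noise = os.urandom(n)                                               # pure noise

for name, data in [("zeros", zeros), ("sine", sine), ("noise", noise)]:
    print(f"{name:6s} {len(zlib.compress(data, 9)):7d} bytes after compression")
# Expect the zeros to shrink to almost nothing, the sine to compress well,
# and the noise to stay at roughly its original 100,000 bytes.
```

The absolute numbers depend on the compressor; the ordering is the point.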

  • Author

How do you measure the "surprise" in a message/function? My first guess was that predictability has something to do with auto-correlation, but that doesn't seem enough. There is correlation and there is independence. Independence implies zero correlation, but zero correlation doesn't necessarily imply independence. I feel I need to measure the auto-dependence of a function to measure its predictability.

 

But how do I do that? How do I measure the auto-dependence (the "surprise") of a function?
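One standard answer (not from this thread, just common information-theoretic practice) is to estimate the mutual information between the signal and a lagged copy of itself: unlike correlation, it captures nonlinear dependence and vanishes only under independence. A rough Python sketch using a crude histogram estimator, where the bin count and lag are arbitrary choices:

```python
import math
import random
from collections import Counter

def quantize(values, bins):
    """Map real values onto integer bins 0..bins-1."""
    lo, hi = min(values), max(values)
    return [min(bins - 1, int(bins * (v - lo) / (hi - lo + 1e-12))) for v in values]

def lagged_mutual_information(x, lag, bins=8):
    """Crude plug-in estimate of I(x_t ; x_{t+lag}) in bits via 2-D histograms."""
    a = quantize(x[:-lag], bins)
    b = quantize(x[lag:], bins)
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    return sum((c / n) * math.log2((c / n) / ((pa[i] / n) * (pb[j] / n)))
               for (i, j), c in pab.items())

random.seed(0)
n, lag = 50_000, 25
sine = [math.sin(0.05 * i) for i in range(n)]     # perfectly self-predictable
noise = [random.gauss(0, 1) for _ in range(n)]    # independent samples

print("sine :", lagged_mutual_information(sine, lag))   # large: strong auto-dependence
print("noise:", lagged_mutual_information(noise, lag))  # near zero
```

Plug-in histogram estimates are biased slightly upward, so "near zero" rather than exactly zero is what to expect for the noise.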
