
Sigma-field and information


JS


Could anyone explain in what sense the sigma-field in a probability space is related to the information that we have?

(Any example?)

 

 

(I have heard that a sigma-field with more sets can be interpreted as our having more information, but I don't understand the idea yet.)

 

 

P.S.: Obviously my English is very bad -_-


What little I know of information theory is this. When a new piece of data is added to a set of data that we are using for a hypothesis test, the amount of information contained in that new piece of data is defined (by some) as the net change in the certainty of our hypothesis. Thus if we get a smaller p-value, we gained information against the hypothesis (e.g. p < .01 is commonly taken as grounds to reject the hypothesis). So the information content of new data is not absolute; it is related statistically to the relationship between the real-world object of study and the existing data.

This makes sense in real-world applications, because some information is not useful if it comes at the wrong time or is predicated on subtle differences in the underlying criteria. In real-world information systems, new information is usually judged both by its quality with respect to the object of study and by its quality with respect to current needs. There is a tendency in established information networks to insulate current knowledge against small amounts of information that imply major paradigm shifts. This can be both good and bad.

I found the definition above in a rather old text (by non-mathematical standards) on information theory, and would certainly like to learn more if anyone feels like talking.
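A toy illustration of that definition (my own sketch, not from the post above; the data, the sample size, and the 0.01 threshold are invented, and it uses SciPy's one-sample t-test): the "information" carried by a new observation is read off as the change in the p-value against the null hypothesis.

```python
# Illustrative sketch only: treats the "information" in a new data point as the
# change in evidence (p-value) against a null hypothesis, per the definition above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
null_mean = 0.0                       # H0: the population mean is 0
data = rng.normal(0.5, 1.0, size=20)  # existing sample (true mean 0.5, so H0 is false)

_, p_before = stats.ttest_1samp(data, popmean=null_mean)

new_point = rng.normal(0.5, 1.0)      # one new observation arrives
_, p_after = stats.ttest_1samp(np.append(data, new_point), popmean=null_mean)

# "Information gained against H0" in the sense described above:
# how much the new point shifted our certainty about the hypothesis.
print(f"p-value before: {p_before:.4f}")
print(f"p-value after:  {p_after:.4f}")
print(f"change in evidence against H0: {p_before - p_after:+.4f}")
print("reject H0 at 0.01?", p_after < 0.01)
```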


I think you need to reword your question, as it makes little sense. The word 'information' appears out of place.

 

In probability (and measure theory), given some set W, a sigma-field S on W is a collection of subsets of W (the events we can observe when sampling from W) to which we can assign probabilities.
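As an illustration of that definition on a small finite W (my own sketch, not from the post; the set W, the generating subsets, and the weights are made up): the code builds the sigma-field generated by two subsets by closing them under complements and unions (on a finite set this closure already is the sigma-field), then assigns a probability to every event.

```python
# Sketch: the sigma-field on a finite set W generated by some subsets,
# built by closing under complements and pairwise unions until nothing new appears.
from itertools import combinations

W = frozenset({1, 2, 3, 4})
generators = [frozenset({1}), frozenset({2, 3})]   # example subsets of W

def generated_sigma_field(W, sets):
    field = {frozenset(), frozenset(W)} | set(sets)
    while True:
        new = {W - A for A in field}                       # complements
        new |= {A | B for A, B in combinations(field, 2)}  # pairwise unions
        if new <= field:
            return field
        field |= new

S = generated_sigma_field(W, generators)

# Assign probabilities: on a finite W it is enough to weight the points and
# sum the weights over each event in S.
weights = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}
P = {A: sum(weights[w] for w in A) for A in S}

for A in sorted(S, key=lambda s: (len(s), sorted(s))):
    print(set(A) or "{}", "->", round(P[A], 2))
```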


That's true... my question is about the interpretation of having a "bigger" sigma-field.

 

The word 'information' makes sense because the conditional expectation is written

E(X|F) (where F is the sigma-field)

(and in some problems that F is not "constant"...)
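One way to see F as "the information available" in E(X|F) is a finite example (my own sketch; the die, the values of X, and the partition are invented for illustration): when F is generated by a partition of W, E(X|F) replaces X by its average over whichever cell of the partition the outcome falls in.

```python
# Sketch on a finite probability space: E(X|F) when F is the sigma-field
# generated by a partition of W. Knowing F means knowing which cell of the
# partition occurred; E(X|F) replaces X by its average over that cell.
W = [1, 2, 3, 4, 5, 6]                    # outcomes of a fair die
P = {w: 1/6 for w in W}                   # uniform probability
X = {w: float(w) for w in W}              # X = the number rolled

# F generated by the partition {odd, even}: we only learn the parity.
partition = [[1, 3, 5], [2, 4, 6]]

def cond_expectation(X, P, partition):
    """E(X|F) as a function of the outcome w, constant on each cell."""
    E = {}
    for cell in partition:
        p_cell = sum(P[w] for w in cell)
        avg = sum(X[w] * P[w] for w in cell) / p_cell
        for w in cell:
            E[w] = avg
    return E

EXF = cond_expectation(X, P, partition)
print(EXF)   # {1: 3.0, 3: 3.0, 5: 3.0, 2: 4.0, 4: 4.0, 6: 4.0}
```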


Extending the sigma-field to a larger one simply means there are more events to which we can assign probabilities. In your case it would seem the expectation depends on the sigma-field, since the sigma-field determines which events we can assign probabilities to.
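Continuing the same kind of toy die example (again my own sketch, with invented, nested partitions): enlarging the sigma-field, i.e. refining the partition, makes E(X|F) pin X down more closely, which is one reading of "a bigger sigma-field means more information".

```python
# Sketch: E(X|F) for a chain of increasingly fine partitions of the die outcomes.
# Each partition refines the previous one, so the generated sigma-fields grow.
W = [1, 2, 3, 4, 5, 6]
P = {w: 1/6 for w in W}
X = {w: float(w) for w in W}

def cond_expectation(X, P, partition):
    E = {}
    for cell in partition:
        avg = sum(X[w] * P[w] for w in cell) / sum(P[w] for w in cell)
        E.update({w: avg for w in cell})
    return E

coarse = [[1, 2, 3], [4, 5, 6]]            # we only learn "low" or "high"
fine   = [[1], [2, 3], [4, 5], [6]]        # refines the coarse partition
finest = [[w] for w in W]                  # every outcome is an event: full information

for name, part in [("coarse", coarse), ("fine", fine), ("finest", finest)]:
    print(name, cond_expectation(X, P, part))
# With the finest sigma-field, E(X|F) = X itself: "more sets" = "more information".
```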


Well, when you put it like that, I think it fits the definition I mentioned earlier. The sigma-field is a collection of possible events from sampling on W. Does W have a standard topology? Does S have a true field structure, or is it just a name? At any rate, if you now extend the collection to include more possible events, you have added information; it's that simple. Or is S meant to include all possible events already? The definition I gave earlier came from a good graduate-level math text; I think it is correct.

