
Probability (mental Block)


YT2095


there are 40 balls in a bag, each numbered individually 1 through to 40.

 

what are the odds of picking say 1,2,3, and 4?

 

I know it's 1 in 10 of picking at least ONE ball correctly; I'm stuck on what happens after that though :(

 

any help?


here's my take on this.

The odds of the first one is 4 out of 40.

The odds of the second one is 3 out of 39.

The odds of the third one is 2 out of 38.

The odds of the fourth one is 1 out of 37.

 

The probability of pulling out 1, 2, 3, and 4 in the first four picks is then calculated by multiplying all four probabilities together:

 

(4/40)*(3/39)*(2/38)*(1/37) = 1/91,390

 

Odds are not good!!
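
For anyone who wants to check that figure, here is a minimal sketch (Python assumed; draws are without replacement and order is ignored) that computes the exact value and also estimates it by simulation:

[code]
# Exact and simulated probability of pulling a pre-chosen set of 4 balls from 40,
# in any order, without replacement.
import random
from math import comb

exact = 1 / comb(40, 4)                      # one favourable 4-ball set out of C(40,4) = 91,390
product = (4/40) * (3/39) * (2/38) * (1/37)  # Dan's step-by-step product
print(exact, product)                        # the same number, about 1.09e-05 = 1/91,390

trials = 1_000_000
hits = sum(set(random.sample(range(1, 41), 4)) == {1, 2, 3, 4} for _ in range(trials))
print(hits / trials)   # a very rare event, so this estimate is noisy, but it sits near 1/91,390
[/code]

Both routes agree: the product of the four step-by-step odds is exactly 1 over the number of ways to choose 4 balls from 40.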


Isn't that the odds of picking out the balls in random order?

 

I thought YT2095 wanted the odds of picking them in the order of 1,2,3,4.

 

Which should be (1/40)*(1/39)*(1/38)*(1/37) = 1/2,193,360

 

Almost impossible odds.
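
A small sketch of the ordered case (Python assumed, using exact fractions); multiplying by the 4! = 24 possible orders recovers Dan's any-order figure:

[code]
# Probability that the four picks are exactly 1, 2, 3, 4 in that order.
from fractions import Fraction

p_ordered = Fraction(1, 40) * Fraction(1, 39) * Fraction(1, 38) * Fraction(1, 37)
print(p_ordered)        # 1/2193360
print(p_ordered * 24)   # 1/91390 -- allowing all 4! = 24 orders gives the any-order answer
[/code]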


I think I'd agree with Spyman's interpretation of the question, but dan's method for his particular interpretation appears to be correct.

 

You're also assuming the balls aren't being put back in the bag ;)


any order at all! :)

 

it need not be 1, 2, 3 and 4; just any 4 balls pre-chosen before you pick from the 40.

no balls, once chosen, are put back in the bag.

my apologies for not stating this the first time around.


Quoting YT2095:

"there are 40 balls in a bag, each numbered individually 1 through to 40.

what are the odds of picking say 1, 2, 3, and 4?

I know it's 1 in 10 of picking at least ONE ball correctly; I'm stuck on what happens after that though :(

any help?"

 

I am just going to start a logical analysis, and see where it leads...

 

First define a sample space.

 

Use the roster method, since there are only 40 elements in the set.

 

[math] S \equiv \{ B_1, B_2, B_3, \ldots, B_{40} \} [/math]

 

Translation: The sample space S is defined to be the set containing ball 1, ball 2, ball 3, etcetera, up to ball 40.

 

We are going to define an experiment, in which you reach into the bag, pull out one ball, and then look at it, to determine which of the forty numbered balls was removed.

 

This experiment is repeatable, in the following sense...

 

It is repeatable, in the sense that you can replace the ball back into the bag, and then reach in, and take a single ball out again.

 

Suppose that you are interested in the probability that in the first trial of this experiment, you remove either the ball numbered 1, or the ball numbered 2, or the ball numbered 3, or the ball numbered 4. If any of these balls is removed on your first pick, then you win one million dollars. If not, someone else gets to try.

 

So success is defined as follows:

 

The following statement must be true in the future:

 

I removed ball 1 XOR I removed ball 2 XOR I removed ball 3 XOR I removed ball 4.

 

Now, either the truth value of the statement above can vary in time, or not.

If there is only one future, then the truth value of the statement cannot vary in time. So that if it is true in the future, then it is true now. If it is false in the future, then it is false now. Let us consider it as inevitable, that you will draw a ball out of the bag in the future, because you want a chance at winning one million dollars.

 

Now we come to the concept of a random variable.

 

Here is a link to the definition of 'random variable', from Mathworld, at wolfram:

 

Wolfram, on Random Variable

 

I have the book on probability theory written by Papoulis (from memory it had a blue cover); I'm not familiar with Doob.

 

Let us presume that if there is only one possible future, then there are no random variables in play, in the outcome of the experiment. Let us also assume that if there is more than one possible future, that there are random variables involved in the outcome of the experiment.

 

If, when I draw the ball out of the bag, it is not an element of the set {1,2,3,4}, then I don't ever win the million dollars, because I am not allowed a second chance at it; someone else needs to be given a chance.

 

Let us presume that I don't know that there is only one possible future, and that I am going to entertain the thought, that the truth value of the statement "I draw ball 1 XOR I draw ball 2 XOR I draw ball 3 XOR I draw ball 4" can vary in time. Therefore, it might be false now, but there is something I can do which can change it to true.

 

Here is what Dan said in post#2, of this thread:

 

"here's my take on this.

The odds of the first one is 4 out of 40.

The odds of the second one is 3 out of 39.

The odds of the third one is 2 out of 38.

The odds of the fourth one is 1 out of 37.

The probability of pulling out 1, 2, 3, and 4 in the first four picks is then calculated by multiplying all four probabilities together:

(4/40)*(3/39)*(2/38)*(1/37) = 1/91,390

Odds are not good!!"

 

Notice that he said that, "the odds of the first one is 4 out of 40."

 

In other words:

 

[math] P(success) = \frac{4}{40} [/math]

 

How did Dan arrive at that?

 

Well, he didn't say, but it is a good guess that it was by treating the probability of drawing any one ball as equal to the probability of drawing any other.
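
If it helps, here is a quick simulation (Python assumed) of that single-draw experiment under the equal-likelihood assumption; the win frequency should settle near 4/40 = 0.1:

[code]
# Single draw from the bag: "success" means the ball drawn is numbered 1, 2, 3 or 4.
import random

trials = 100_000
wins = sum(random.randint(1, 40) <= 4 for _ in range(trials))
print(wins / trials)   # close to 0.1, i.e. P(success) is about 4/40
[/code]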

 

It would help to make sense out of this, by appealing to Kolmogorov's axioms of the mathematical theory of probability. Here they are:

 

 

Axiom I:

 

[math] \forall E[ 0 \leq P(E) \leq 1 ] [/math]

 

 

Definition: The sample space is the set consisting of all possible outcomes of a given experiment, and only possible outcomes of that experiment. Thus, if an element isn't in the sample space, then it is impossible for it to occur as the result of an experiment whose sample space has been correctly expressed.

 

As an introductory explanation of the definition above, consider the roll of an ordinary six sided die.

 

There are many things we could focus on in the roll of a die, for example the time it takes to come to rest, or whether it spins and rotates when it hits the table, but in the experiment designed here all we are interested in is the face of the die that shows when it finally comes to rest on the table.

 

Now, if we sit there and repeat this experiment forever, we will (almost surely) generate the entire sample space, because there are only six elements in it.

 

Let S1 denote the event that a one lands, and let S2 denote the event that a two lands, and so on. Here is how to write the definition of the sample space for this experiment:

 

[math] S = \{ S_1, S_2, S_3, S_4, S_5, S_6 \} [/math]

 

The sample space above dictates that it is impossible for the die to land on one of its corners.

 

So there are only six possible outcomes in one throw of a real six sided die. Now, it is possible to put a tiny piece of lead in a die and make it more likely for the number one to come out than any other number. If we do that, then:

 

[math] P(S_1) > P(S_2) [/math]

[math] P(S_1) > P(S_3) [/math]

[math] P(S_1) > P(S_4) [/math]

[math] P(S_1) > P(S_5) [/math]

[math] P(S_1) > P(S_6) [/math]

 

Translation:

 

[math] P(S_1) > P(S_2) [/math]

 

is translated as

 

"The probability that event S1 occurs on a specific throw of this loaded die, is greater than the probability that the face shows two, upon a specific roll of this loaded six sided die."

 

 

Now, we can experimentally approximate the value of P(S1) as follows:

 

Suppose we have thrown the die an enormous number of times, N. We tabulated the data, counted the number of times S1 occurred, and found that it occurred exactly one million times. We can now write the approximate value for P(S1) as:

 

[math] P(S_1) \approx \frac{1 \times 10^6}{N} [/math]

 

Notice that the symbol for "approximately equal to" was used, rather than the symbol for equality, because we can never actually measure the true value for the probability of rolling a one, because that requires that we roll the die an infinite number of times, which is utterly impossible. Also, notice something much deeper, which is that the probability of rolling a one varies in time, since the mass of the die changes continually. A piece could chip off, on one of the rolls. Or the piece of lead could slip out.

 

Just so that we don't have any hidden assumptions, we have stipulated that whatever the real value of the probability of rolling a one with this die is, that value is constant in time. That is a built-in assumption here.

 

Now it is impossible for N to be less than one million.

 

Let it be the case that N is three million.

 

Thus, this experiment was performed three million times, and the face of the loaded die showed the number one, exactly one million of those times. We can now compute the approximate probability of rolling a one, for this specific die:

 

[math] P(S_1) \approx \frac{1 \times 10^6}{3 \times 10^6} = \frac{1}{3}[/math]

 

If the value above happens to really be correct, even though we don't currently know it, what will happen is that as we continue performing the experiment, the value of the computation

 

[math] \frac{n_1}{N} [/math]

 

will fluctuate about the number 1/3, settling ever closer to it as the number of rolls grows (this is the law of large numbers at work), where the values of both n1 and N in the above formula vary in time. N is the total number of times this loaded die was rolled, and n1 is the total number of times the face showed one.
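
A sketch of that behaviour (Python assumed; the loading weights below are made up purely for illustration, chosen so that the true P(S1) is 1/3):

[code]
# Running relative frequency n1/N for a hypothetical loaded die with P(one) = 1/3.
import random

faces = [1, 2, 3, 4, 5, 6]
weights = [1/3, 2/15, 2/15, 2/15, 2/15, 2/15]   # made-up loading; weights sum to 1

n1 = 0
for N in range(1, 300_001):
    if random.choices(faces, weights=weights)[0] == 1:
        n1 += 1
    if N % 50_000 == 0:
        print(N, n1 / N)   # fluctuates about, and drifts toward, 1/3 as N grows
[/code]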

 

Now, suppose that we also counted the number of times that 2 came out, 3 came out, etc, and that we have the following data:

 

[math] \left( \begin{array}{c} n_1 \\ n_2 \\ n_3 \\ n_4 \\ n_5 \\ n_6 \end{array} \right) [/math]

 

Here I have just chosen to organize the data using a column vector. The first entry is the number of times a one was rolled, the second entry the number of times a two was rolled, and so on. Since there are no possible outcomes other than these six, the total number of times the die was rolled must equal the sum of the entries in the column vector above, at all moments in time. Thus, the following statement is true:

 

[math] N = n_1+n_2+n_3+n_4+n_5+n_6 [/math]

 

So the value of each symbol above varies in time as we repeatedly perform this experiment, but the statement itself is always true, even before we roll the die for the first time, because before we begin the experiment N=0, n1=0, n2=0, and so on.

 

The point I am trying to get to is this: the sum of the probabilities is one. The reason I am trying to get there is that this is Kolmogorov's second axiom.

 

So after we have performed the experiment N times, we can approximate the 'true' probabilities as:

 

[math] P(S_1) \approx \frac{n_1}{N} [/math]

 

[math] P(S_2) \approx \frac{n_2}{N} [/math]

 

[math] P(S_3) \approx \frac{n_3}{N} [/math]

 

[math] P(S_4) \approx \frac{n_4}{N} [/math]

 

[math] P(S_5) \approx \frac{n_5}{N} [/math]

 

[math] P(S_6) \approx \frac{n_6}{N} [/math]

 

When I say "true probabilities", keep in mind that this approach to probability is based on the assumption that the probability of each event in the sample space is constant in time. Strictly speaking that is false here, since the real probability is a function of the mass distribution of the die, which varies in time.

 

But of course it is called the theory of probability.

 

But our only present goal, is to justify Kolmogorov's second axiom, with a single well chosen example problem.

 

At all moments in time, the following statement is true:

 

[math] N = n_1 + n_2 + n_3 + n_4 + n_5 + n_6 [/math]

 

Now, if N is zero, we cannot divide both sides of the formula above by N. So, up until we throw the die once, we aren't even justified in defining probability. Nevertheless, let us focus on the case where

 

Case: [math] N \geq 1 [/math]

 

As long as N is greater than or equal to one, we can divide both sides of the equation above by N, to obtain this formula here:

 

[math] 1 = \frac{n_1}{N} + \frac{n_2}{N} + \frac{n_3}{N} + \frac{n_4}{N} + \frac{n_5}{N} + \frac{n_6}{N} [/math]

 

Under the assumption that the probability that any given face shows is not a function of time, let us represent these probabilities, in order, as a, b, c, d, e, f.

 

Thus, we stipulate that:

 

[math] P(S_1) = a [/math]

[math] P(S_2) = b [/math]

[math] P(S_3) = c [/math]

[math] P(S_4) = d [/math]

[math] P(S_5) = e [/math]

[math] P(S_6) = f [/math]

 

Thus, the derivative with respect to time of any of the above six quantities is assumed to be zero.

-----------------------------------------------------------

Picking up where I left off,

 

We are working towards Kolmogorov's second axiom.

 

As you can see, the approximate probabilities are all fractions greater than or equal to zero and at most one.

 

But the idea of Kolmogorov's second axiom is that the sum of them all is equal to one.

 

Thus, it is Kolmogorov's principle that:

 

[math] a+b+c+d+e+f=1 [/math]

 

For the simple die roll, being considered here.

 

We can also express this as follows, using summation notation:

 

[math] \sum_{n=1}^{6} P(S_n) = 1 [/math]

 

This notation will prove useful, when we seek to generalize things later.

 

Now, this notion that the sum of all probabilities is unity, isn't out of the blue. It is a consequence of the fact that, after the experiment is performed the first time, some element of the sample space must have occurred.

 

So, if we think using set theory, we can understand why this is so.

 

Let us define the following 'event' set theoretically:

 

[math] \Omega \equiv S_1 \cup S_2 \cup S_3 \cup \cdots \cup S_n [/math]

 

Now, the elements in the sample space S, were defined using the roster method of set denotation, so that each element of the sample space is a distinct member of the set.

 

This is worth explaining briefly...

 

The ordered pair (x,y) is different from the ordered pair (y,x), but this is not so for an arbitrary set of two elements. A set is just a collection of things, and the order the elements are listed just doesn't matter.

 

In terms of the roster method of set denotation, that means this:

 

{x,y} = {y,x}

 

Now, in the case where y=x, we have:

 

{x,x} = {x}

 

So by our saying that there are n elements in the sample space, we have stipulated that each element is a distinct entity.

 

Therefore, the intersection of any two arbitrarily selected elements of the sample space is the empty set.

 

For example, consider our die roll. Choose two elements of the sample space at random, say S1 and S4. S1 is the event where a one is rolled, and S4 is the event that a four is rolled.

The intersection of these two events is

 

[math] S_1 \cap S_4 = \emptyset [/math]

 

And this makes sense, in terms of this particular example, because the event above would be one in which we throw a single six sided die, and the face shows one AND the face shows 4. This is impossible.

 

But now, consider the event which is the union of all elements in the sample space.

 

In words, that event is this:

 

One is rolled or two is rolled or three is rolled or four is rolled or five is rolled or six is rolled.

 

The moment we toss the die, the statement above will certainly be true after the die comes to rest, even if it was false while the die was soaring through the air.

 

And this leads us to the expression of Kolmogorov's second axiom, using either set theory, or summation notation. Whichever you prefer.

 

Kolmogorov's second axiom

 

Let N denote the number of elements in the sample space.

 

[math] \sum_{n=1}^{N} P(S_n) = 1 [/math]

 

or

 

[math] (\Omega \equiv S_1 \cup S_2 \cup S_3 \cup \cdots \cup S_N) \Rightarrow P(\Omega) = 1 [/math]

 

Someone might gripe, that the elements of the sample space aren't sets, but that requires just a slight change in notation.
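
As a quick sanity check of the frequency-counting picture behind both axioms, here is a small sketch (Python assumed; a fair die is used only for simplicity) showing that the empirical frequencies n_k/N each lie between 0 and 1 and sum to 1:

[code]
# Empirical frequencies for N rolls of a die: each lies in [0, 1] and they sum to 1.
import random
from collections import Counter

N = 10_000
counts = Counter(random.randint(1, 6) for _ in range(N))

freqs = {face: counts[face] / N for face in range(1, 7)}
print(freqs)
print(sum(freqs.values()))                          # 1.0 (up to floating-point rounding)
assert all(0 <= p <= 1 for p in freqs.values())     # axiom I holds for every face
[/code]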


so the fact that we got 1st, 2nd, 3rd and 4th places in the Grand National wasn't too bad at all then!?

 

it gets a bit complicated when I say we had 5 horses each (10 in total), so I guess that would be like picking 10 balls out of the bag (I think?).


I've found that a good way to really look into probability is looking at the ranking of hands in poker, and also playing the game. Every hand, you have to think about the probability that your hand will win and how likely it is that you are right. If you don't think you have the hand won, you then have to think about the probability of getting the card(s) you need compared to the amount of money you could possibly win (pot odds). It's actually more mathematical than one would think.

 

But spending the time to go through the different hands and then rank them based on their probability of happening is pretty interesting.
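
As a small illustration of that kind of counting (Python assumed; "at least one pair" is just one example hand class, and suits are ignored since they don't matter for pairing), here is a sketch that deals many five-card hands and estimates how often a hand pairs up or better:

[code]
# Estimate the chance that a random five-card hand contains at least one pair.
import random
from collections import Counter

ranks = list(range(13)) * 4          # a 52-card deck, tracking ranks only

trials = 200_000
paired = 0
for _ in range(trials):
    hand = random.sample(ranks, 5)   # deal 5 cards without replacement
    if max(Counter(hand).values()) >= 2:
        paired += 1
print(paired / trials)               # roughly 0.49: about half of all hands pair up or better
[/code]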


Quoting YT2095:

"so (1/40)*(1/39)*(1/38)=X

when I find X, do I divide that by 10?"

No dividing of X!

 

Probability is (probability for first coincidence) multiplied by (probability for second coincidence) and so on...

 

For specific order of "pick up":

 

First coincidence is number of balls allowed to pick up divided by total number of balls in the bag = 1/40

(In this case only one ball is allowed.)

 

Second coincidence is number of balls allowed to pick up divided by total number of balls in the bag = 1/39

(You have already removed one ball.)

 

For random order of "pick up":

 

First coincidence is number of balls allowed to pick up divided by total number of balls in the bag = 4/40

(In this case four different balls are allowed.)

 

Second coincidence is number of balls allowed to pick up divided by total number of balls in the bag = 3/39

(You have already removed one ball.)

 

This explanation is for your first question; for the setup with the horse race one needs to know how many horses in each race you bet on and the total number of horses in each race.

 

Also, I don't consider a horse race to be random in the way these probability rules assume.
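
A sketch (Python assumed) of the two recipes above, written as a general rule for picking k pre-chosen balls out of n without replacement:

[code]
# Probability of getting all k pre-chosen balls in k picks from n, with and
# without caring about the order of pick-up.
from fractions import Fraction

def p_specific_order(n, k):
    """All k picks correct AND in one stated order."""
    p = Fraction(1)
    for i in range(k):
        p *= Fraction(1, n - i)
    return p

def p_any_order(n, k):
    """All k picks correct, order irrelevant."""
    p = Fraction(1)
    for i in range(k):
        p *= Fraction(k - i, n - i)
    return p

print(p_specific_order(40, 4))   # 1/2193360
print(p_any_order(40, 4))        # 1/91390
[/code]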


there were 40 horses running.

10 of those 40 were picked at random.

the betting office pays out on the first 4 positions.

after the race we had positions 1, 3 and 4.

 

what are the chances of that?


Quoting YT2095:

"so (1/40)*(1/39)*(1/38)=X

when I find X, do I divide that by 10?"

 

 

You first had 10 horses picked out of forty. Of those ten, you got three that came in 1, 3 and 4.

 

So, for your first-placed horse, the odds of you having him as one of your ten are 10/40 = 1/4.

 

Now, knowing that none of your horses came in second, you have nine horses left among the 38 still in contention, so the odds of your third-placed horse are 9/38.

 

The odds of your fourth placed horse would then be 8/37

 

Total odds would then be: (1/4)*(9/38)*(8/37) = 9/703 ≈ 1/78.1

 

That's my take on it anyway
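
Assuming every finishing order is equally likely (which, as Spyman notes, a real horse race is not), here is a Monte Carlo sketch (Python assumed) of the full event as it happened: your horses take 1st, 3rd and 4th and someone else's takes 2nd. Because it also charges for 2nd place going elsewhere, which the product above treats as given, the estimate comes out a little longer than 1/78, nearer 1 in 100:

[code]
# 10 of the 40 runners are yours; event of interest: 1st, 3rd, 4th yours, 2nd not.
import random

yours = set(range(10))                        # your 10 picks, horses labelled 0..39
trials = 500_000
hits = 0
for _ in range(trials):
    top_four = random.sample(range(40), 4)    # the first four past the post
    if (top_four[0] in yours and top_four[1] not in yours
            and top_four[2] in yours and top_four[3] in yours):
        hits += 1
print(hits / trials)                          # close to 0.0098, i.e. roughly 1 in 100
[/code]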


it did happen: the wife and I had £5 each and put a bet on 5 horses, 50p each way (that means positions 1, 2, 3 or 4 get paid out regardless).

we paid our £10 and got £16.88 back :)

not mega bucks, but loads of fun. here's the REAL head f**k though!... I've been playing every year since I was 15, I'm 38 this May, and I've NEVER once lost a penny to them! I've always walked away with more than I put in, without fail. what are the odds of that!?


  • 2 years later...

well well well... I've done it once more :)

came in 1st, 2nd and 4th place in the Grand National yesterday; total cash IN was 7 quid, cash OUT was 11.38 :D

that's 25 years on a run now!

and yeah, in the years between that post above and this one I came out on top also; never major bucks, but enough to be fun ;)

