# Approaching 1/2 Probability

## Recommended Posts

13 minutes ago, Ghideon said:

did you see the rightmost column in my result?

That's what I would have expected, because a random number generator only simulates true randomness.  There is no such thing as true randomness that can be coded into a computer, as far as we know.  Possibly, a quantum computer could generate true randomness; I don't know if one has been programmed to do that yet.  That has more to do with how a quantum computer obtains its bits.

I guess I should restate my problem.  My difficulty with accepting probability theory is based on this:

The law of large numbers has not been proven based on the probabilities of all the possible outcomes.

I state this because, in the example of a die roll, it doesn't use the probabilities of getting a certain die roll, which would be 1/6 for each side.  Then it says X_1=X_2=...

I don't think X_n can stand for a probability.  If it can, I don't see how it could or how someone would show any example where it does use probabilities to obtain the expected value from an increasing number of outcomes.

##### Share on other sites
28 minutes ago, Conjurer said:

The law of large numbers has not been proven

It has been proven. The law of large numbers is a theorem.

28 minutes ago, Conjurer said:

based on the probabilities of all the possible outcomes.

I have no idea what you mean by this.

28 minutes ago, Conjurer said:

If it can, I don't see how it could or how someone would show any example where it does use probabilities to obtain the expected value from an increasing number of outcomes.

"I don't see how" doesn't mean it can't be done.
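For reference, the theorem uncool refers to is usually stated as follows (this is the weak form, as found in standard probability texts):

```latex
\text{If } X_1, X_2, \ldots \text{ are i.i.d. random variables with } \mathbb{E}[X_i] = \mu \text{, then for every } \varepsilon > 0:
\qquad \lim_{n \to \infty} P\!\left( \left| \tfrac{1}{n}\left( X_1 + \cdots + X_n \right) - \mu \right| > \varepsilon \right) = 0.
```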

##### Share on other sites
24 minutes ago, Conjurer said:

That's what I would have expected, because a random number generator simulates true randomness.

Ok. And the middle column? Do you claim that it does not agree with the law of large numbers?

25 minutes ago, Conjurer said:

Then there is no such thing as true randomness that can be coded into a computer, as far as we know.

Does that make the simulation invalid?

26 minutes ago, Conjurer said:

I guess I should restate my problem.  My difficulty with accepting probability theory is based on this:

The law of large numbers has not been proven based on the probabilities of all the possible outcomes.

I state this because, in the example of a die roll, it doesn't use the probabilities of getting a certain die roll, which would be 1/6 for each side.  Then it says X_1=X_2=...

Can you explain the problem in detail so a simulation can be performed or a proof provided? Which theorem(s) of probability theory do you distrust?

29 minutes ago, Conjurer said:

Possibly, a quantum computer could generate true randomness.  I don't know if they have programmed one to do that yet.  That has more to do with the basis in how a quantum computer gains it's bits.

How is the above applicable to the discussion in this thread?

##### Share on other sites
2 minutes ago, uncool said:

It has been proven. The law of large numbers is a theorem.

I have no idea what you mean by this.

"I don't see how" doesn't mean it can't be done.

It has been proven experimentally, but not mathematically.  I am sure you would have no idea about a basic math concept.  I don't understand why you are trying to help me with math, when you cannot even take a basic average to see that the numbers do not add up.

##### Share on other sites
3 minutes ago, Conjurer said:

It has been proven experimentally, but not mathematically.

The law of large numbers has been proven mathematically. I could provide a basic proof, if you want.

3 minutes ago, Conjurer said:

I don't understand why you are trying to help me with math, when you cannot even take a basic average to see that the numbers do not add up.

Edited by uncool
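The basic proof uncool offers is standard.  For the weak law, under the extra assumption of finite variance \(\sigma^2\), it follows from Chebyshev's inequality applied to the sample mean \(\bar{X}_n\):

```latex
P\!\left( \left| \bar{X}_n - \mu \right| \ge \varepsilon \right)
\;\le\; \frac{\operatorname{Var}\!\left( \bar{X}_n \right)}{\varepsilon^2}
\;=\; \frac{\sigma^2}{n\,\varepsilon^2}
\;\longrightarrow\; 0 \quad \text{as } n \to \infty.
```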

##### Share on other sites
15 minutes ago, Ghideon said:

Ok. And the middle column? Do you claim that it does not agree with the law of large numbers?

I don't think it or the third one really does.  I thought it was a 1 with a decimal point in front of it, but it was actually a comma.  The average of heads/tails = 1.

15 minutes ago, Ghideon said:

Does that make the simulation invalid?

It means the simulation only proves what the programmer was capable of coding to simulate randomness where it didn't actually exist.  If he or she went to these forums and asked for a public opinion on how the program should act, then assuredly, yes, it would be invalid.

15 minutes ago, Ghideon said:

Can you explain the problem in detail so a simulation can be performed or a proof provided? Which theorem(s) of probability theory do you distrust?

I don't think it is possible to add probabilities and take an average to get the expected value.  In the example, they just used the variation of the data to calculate something using the law of large numbers.  It is just an average of the data values, not the actual probabilities themselves, that is involved in the equation.

15 minutes ago, Ghideon said:

How is the above applicable to the discussion in this thread?

You appear to be under the impression that what I am saying is impossible, because you have a computer spitting out possible outcomes.

Edited by Conjurer

##### Share on other sites
6 minutes ago, Conjurer said:

I don't think it is possible to add probabilities and take an average to get the expected value.

That (or at least, some reasonable version of that: take the sum of the products of the probabilities and the values) is the definition of the expected value of a random variable.

Edited by uncool
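uncool's definition can be illustrated with Conjurer's own earlier example of a die roll, where each side has probability 1/6.  A minimal sketch:

```python
# Expected value = sum over all outcomes of (probability * value).
values = [1, 2, 3, 4, 5, 6]
probabilities = [1 / 6] * 6  # fair die: each face has probability 1/6

expected_value = sum(p * x for p, x in zip(probabilities, values))
print(expected_value)  # ≈ 3.5
```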

##### Share on other sites
11 minutes ago, uncool said:

That (or at least, some reasonable version of that: take the sum of the products of the probabilities and the values) is the definition of the expected value of a random variable.

I think we should start over.

The law of large numbers is just showing how someone would get the expected value from an experiment.

They would add all the outcomes and take the average to determine the expected value.

I am saying that it is not possible to use the probabilities of a series of heads or tails to obtain the expected value, or the answer you would get from the law of large numbers.

Say I was going to use the law of large numbers,

I would assign 0 to heads and 1 to tails.

I got 100 heads and 101 tails after a total of 201 flips.

I add all of the X_n to get 101, since all the heads add up to zero.

I then divide that by 201.

Then I get approximately 1/2

Then by that experiment, I proved that the probability of the coin is 1/2 and it is a fair coin.

Now I want to calculate the probability of me getting 100 heads and 101 tails.  It comes out to some ridiculously low chance that it will ever happen...

Edited by Conjurer
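Conjurer's arithmetic here can be reproduced directly (coding heads as 0 and tails as 1, as described).  A minimal sketch:

```python
# 100 heads (coded 0) and 101 tails (coded 1) in 201 total flips.
outcomes = [0] * 100 + [1] * 101

# The heads contribute nothing to the sum, so the sum is 101.
sample_mean = sum(outcomes) / len(outcomes)
print(sample_mean)  # 101/201 ≈ 0.5025
```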

##### Share on other sites
22 minutes ago, Conjurer said:

Now I want to calculate the probability of me getting 100 heads and 101 tails.  It comes out to some ridiculously low chance that it will ever happen...

But you would also accept 101 heads and 100 tails, right?

And if you flipped 2001 times, you would probably accept anywhere between 991 and 1010 heads as evidence of a fair coin.

And if you flipped 20001 times, you would probably accept anywhere between 9901 and 10100 heads. And so on.

So the question isn't about the probability of a specific number of heads, but a range that grows with the number of flips. And the theorem says that the total probability of landing within that range limits to 1.

Edited by uncool
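uncool's claim, that the total probability of landing inside a range that grows with the number of flips tends to 1, can be checked numerically.  A sketch (the function name and the ±0.5% window are illustrative choices), using binomial probabilities computed in log space to avoid overflow:

```python
import math

def prob_within(n, eps=0.005):
    """P(|heads/n - 1/2| <= eps) for n flips of a fair coin,
    summed from binomial probabilities via log-factorials."""
    log_half_n = n * math.log(0.5)
    lo = math.ceil(n * (0.5 - eps))
    hi = math.floor(n * (0.5 + eps))
    total = 0.0
    for k in range(lo, hi + 1):
        # log P(k heads) = log C(n, k) + n log(1/2)
        log_p = (math.lgamma(n + 1) - math.lgamma(k + 1)
                 - math.lgamma(n - k + 1) + log_half_n)
        total += math.exp(log_p)
    return total

# The window is a fixed *proportion* of n, so it widens in absolute
# terms as n grows, and the total probability climbs toward 1.
for n in (2000, 20000, 200000):
    print(n, prob_within(n))
```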

##### Share on other sites
4 minutes ago, uncool said:

So the question isn't about the probability of a specific number of heads, but a range that grows with the number of flips. And the theorem says that the total probability of landing within that range limits to 1.

No, it approaches the expected value which is 1/2

You would have to assign values to heads or tails, so that the number of flips is twice the number of possible outcomes for a single flip.

##### Share on other sites
8 minutes ago, Conjurer said:

No, it approaches the expected value which is 1/2

What does "it" refer to here? "the total probability of landing within that range"?

##### Share on other sites
1 minute ago, uncool said:

What does "it" refer to here? "the total probability of landing within that range"?

The law of large numbers

By definition, a larger number of flips should approach the expected value or probability of getting heads or tails, which is 1/2

##### Share on other sites
6 minutes ago, Conjurer said:

The law of large numbers

"No, the law of large numbers approaches the expected value which is 1/2" makes no sense.

6 minutes ago, Conjurer said:

By definition, a larger number of flips should approach the expected value or probability of getting heads or tails, which is 1/2

That isn't a definition, it's a garbled version of the law of large numbers itself. And it doesn't address my point: as the number of flips increases, the range of "acceptable" outcomes also increases. Which means you don't only calculate the probability of the "perfect" outcome (i.e. the closest number of flips to 1/2), but of an entire range of numbers of heads.

Edited by uncool

##### Share on other sites
34 minutes ago, uncool said:

"No, the law of large numbers approaches the expected value which is 1/2" makes no sense.

For example, a fair coin toss is a Bernoulli trial. When a fair coin is flipped once, the theoretical probability that the outcome will be heads is equal to 1/2.  Therefore, according to the law of large numbers, the proportion of heads in a "large" number of coin flips "should be" roughly 1/2. In particular, the proportion of heads after n flips will almost surely converge to 1/2 as n approaches infinity.

34 minutes ago, uncool said:

That isn't a definition, it's a garbled version of the law of large numbers itself. And it doesn't address my point: as the number of flips increases, the range of "acceptable" outcomes also increases. Which means you don't only calculate the probability of the "perfect" outcome (i.e. the closest number of flips to 1/2), but of an entire range of numbers of heads.

Is the coin going to start landing on its side or something?  I don't see why you are so caught up in this.  When I did my calculation earlier I just used the variables that could represent any possible range of the number, including the ones that were not even close to the expected value.  The average of all the combinations per permutations was 1/r! for the entire range of possible outcomes of any replaceable event as it approached an infinite number of trials.

Edited by Conjurer
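The passage Conjurer quotes can be illustrated with a quick simulation.  As discussed earlier in the thread, Python's random module is only a pseudo-random approximation of true randomness; the seed is fixed so the run is reproducible:

```python
import random

random.seed(42)  # fixed seed for reproducibility

# 1 represents heads, 0 represents tails.
flips = [random.randint(0, 1) for _ in range(100_000)]

# The running proportion of heads drifts toward 1/2 as n grows.
for n in (100, 1_000, 10_000, 100_000):
    proportion = sum(flips[:n]) / n
    print(f"{n:>7} flips: proportion of heads = {proportion:.4f}")
```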

##### Share on other sites
27 minutes ago, Conjurer said:

In particular, the proportion of heads after n flips will almost surely converge to 1/2 as n approaches infinity.

Yes, and that doesn't contradict the statement I made: "And the theorem says that the total probability of landing within that range limits to 1." (In fact, it says exactly the same thing)

27 minutes ago, Conjurer said:

Is the coin going to start landing on its side or something?

No, and that's not even close to what I said.

27 minutes ago, Conjurer said:

I don't see why you are so caught up in this.

Caught up in what? The fact that the theorem deals with a range, and not only "perfect" outcomes? Because you keep on talking only about the "perfect" outcomes.

27 minutes ago, Conjurer said:

When I did my calculation earlier I just used the variables that could represent any possible range of the number,

You did not (or at least, did not do so correctly). If you had, you would have gotten the formula I stated.

Here's an exercise for you: write a formula for the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010.

Edited by uncool
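For the record, the exercise has a concrete numerical answer: the probability of between 990 and 1010 heads in 2000 fair flips is the sum of the binomial probabilities C(2000, k)/2^2000 for k = 990, ..., 1010.  A minimal sketch using exact integer arithmetic:

```python
from fractions import Fraction
from math import comb

n = 2000
# Count the flip sequences with between 990 and 1010 heads (inclusive).
favourable = sum(comb(n, k) for k in range(990, 1011))

# Divide by the 2^2000 equally likely sequences, exactly, then convert.
probability = float(Fraction(favourable, 2 ** n))
print(probability)  # roughly 0.36
```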

##### Share on other sites
19 minutes ago, uncool said:

Yes, and that doesn't contradict the statement I made: "And the theorem says that the total probability of landing within that range limits to 1." (In fact, it says exactly the same thing)

No, it almost surely converges to 1/2, which means that it is guaranteed to approach a 1/2 probability or it has a probability of 1 of approaching 1/2.

19 minutes ago, uncool said:

Caught up in what? The fact that the theorem talks about a range? Because you keep on talking only about the "perfect" outcomes.

You don't ever seem to be able to grasp anything while only dealing with perfect outcomes.  How could you be capable of understanding something more complex than that to set up?

19 minutes ago, uncool said:

Here's an exercise for you: write a formula for the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010.

n=2000

990<X_n<1010

X_n=(1/2000)(X_1+...+X_n)

Edited by Conjurer

##### Share on other sites
1 minute ago, Conjurer said:

No, it almost surely converges to 1/2

Once again: what does "it" refer to here?

11 minutes ago, Conjurer said:

You don't ever seem to be able to grasp anything while only dealing with perfect outcomes.

Hahaha whatever you say

6 minutes ago, Conjurer said:

n=2000

990<X_n<1010

X_n=(1/2000)(X_1+...+X_n)

...this is nowhere near the formula I asked for. To demonstrate: please show me how you would use this formula to calculate the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010.

##### Share on other sites
26 minutes ago, uncool said:

Once again: what does "it" refer to here?

the proportion of heads after n flips will almost surely converge to 1/2 as n approaches infinity.

28 minutes ago, uncool said:

...this is nowhere near the formula I asked for. To demonstrate: please show me how you would use this formula to calculate the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010.

990/2000=0.495

1010/2000=0.505

0.495<x<0.505

##### Share on other sites
4 minutes ago, Conjurer said:

the proportion of heads after n flips will almost surely converge to 1/2 as n approaches infinity.

Yet again, that not only doesn't contradict what I said, it means the precise same thing: "And the theorem says that the total probability of landing within that range limits to 1."

"The proportion of heads after n flips" is not "the total probability of landing within that range".

4 minutes ago, Conjurer said:

990/2000=0.495

1010/2000=0.505

0.495<x<0.505

And?

I still don't see an answer for the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010.

Edited by uncool

##### Share on other sites
2 hours ago, uncool said:

I still don't see an answer for the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010.

(n!/(n^r (r! (n - r)!)) - 1/r!) < P(H or T) < (n!/(n^r (r! (n - r)!)) + 1/r!)

##### Share on other sites

That's still not an answer, and worse, it is notationally nonsense. What do you mean by P(H or T)?

As I said before:

On 9/12/2019 at 8:03 PM, uncool said:

It looks to me like you are quasi-randomly putting formulae together without understanding what they are for

##### Share on other sites
7 hours ago, Conjurer said:

I don't think it or the third one really does.  I thought it was a 1 with a decimal point in front of it, but it was actually a comma.  The average of heads/tails = 1.

Can you provide some evidence?  "I don't think so" is not a strong argument.  You claim that mainstream math is incorrect and reject the current proofs in the links provided. Is it maybe the very definitions of the mathematics of probability that you question? That would make discussions about existing proofs not very fruitful.

Note: the decimal comma "," should be read as a decimal point ".".  It is the language setting of the computer running the simulation that causes the confusion; for instance, 0.997 is written 0,997 in that locale.
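For anyone reading such locale-formatted output programmatically, a minimal sketch (the function name is illustrative):

```python
def parse_locale_decimal(text: str) -> float:
    """Parse a number printed with a decimal comma, e.g. '0,997'."""
    return float(text.replace(",", "."))

print(parse_locale_decimal("0,997"))  # 0.997
```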

7 hours ago, Conjurer said:

I don't think it is possible to add probabilities and take an average to get the expected value.  In the example, they just used the variation of the data to calculate something using the law of large numbers.  It is just an average of the data values, not the actual probabilities themselves, that is involved in the equation.

You want a result based on every possible outcome, instead of a result based on a large number of randomly selected samples?

7 hours ago, Conjurer said:

It means the simulation only proves what the programmer was capable of coding to simulate randomness where it didn't actually exist.  If he or she went to these forums and asked for a public opinion on how the program should act, then assuredly, yes, it would be invalid.

A discussion about random number generation in computing is better suited for a separate thread. I deliberately used secure random numbers and specified so in my note to reduce these kinds of issues. csrc.nist.gov has information about certifications and research.

##### Share on other sites
13 hours ago, uncool said:

That's still not an answer, and worse, it is notationally nonsense. What do you mean by P(H or T)?

As I said before:

That is the proper notation of the probability of getting a heads or tails.  That should be one of the first things you learn about probability...

It shows where you are having this problem of it just looking like I am quasi-randomly putting formula together...

10 hours ago, Ghideon said:

Can you provide some evidence?  "I don't think so" is not a strong argument.  You claim that mainstream math is incorrect and reject the current proofs in the links provided. Is it maybe the very definitions of the mathematics of probability that you question? That would make discussions about existing proofs not very fruitful.

I am not looking for a fruitful discussion about it.  I am looking for answers.  I don't think the law of large numbers has been proven from the basis of considering the probabilities of events occurring in a row.

I am wondering if I have already answered my own question, but I don't understand why n^r doesn't seem to work with the coin flip problem.  It seems like the permutations with replacement should be r^n.  Then you could get 2^(#flips), which would give you the total possible number of outcomes.

If my math was correct, then 1/(r!) could be the average probability.  Then you could say that the average probability of getting heads or tails would just be 1/(2!)=1/2.  Then n^r didn't seem to work in this example...
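Conjurer's observation that r^n = 2^(#flips) counts the total number of outcome sequences is correct, and it connects to the binomial coefficients discussed in the thread: C(n, k) counts the sequences with exactly k heads, and summing over k recovers all 2^n sequences.  A quick check:

```python
from math import comb

n = 10  # number of coin flips
# comb(n, k) counts the flip sequences with exactly k heads.
counts = [comb(n, k) for k in range(n + 1)]

# Summing over all k accounts for every one of the 2^n sequences.
print(sum(counts), 2 ** n)  # 1024 1024
```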

##### Share on other sites
20 minutes ago, Conjurer said:

That is the proper notation of the probability of getting a heads or tails.  That should be one of the first things you learn about probability...

It shows where you are having this problem of it just looking like I am quasi-randomly putting formula together...

You are making my point. Why are you looking at the probability of getting a heads or tails (which will be 1), when what I asked you for was "the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010"?

##### Share on other sites
1 minute ago, uncool said:

You are making my point. Why are you looking at the probability of getting a heads or tails (which will be 1), when what I asked you for was "the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010"?

The probability of getting heads or tails is 1/2!

P(H or T) = 1/2

You have one out of two possible outcomes to get either a heads or a tails.  The numerator is the number of outcomes for a success, and the denominator is the number of possible outcomes.

What you are asking for doesn't seem to make sense.  I don't think the mathematical tools to describe what you are asking have even been discovered yet.  You are just asking me to discover this.  I don't know how you could incorporate a probability into the law of large numbers to show that.  I gave you my best guess, but maybe you would have to subtract the probability from an average probability instead...

##### Share on other sites
This topic is now closed to further replies.
