Approaching 1/2 Probability


Conjurer

2 minutes ago, Conjurer said:

To me, it is like you are just fantasizing

I'm not. I'm applying probability theory. 

3 minutes ago, Conjurer said:

It should be a matter of choice

It's not. 

4 minutes ago, Conjurer said:

I take it this means to imply

Don't. It means what it says, and implies exactly what I said it did. 


3 minutes ago, uncool said:

Don't. It means what it says, and implies exactly what I said it did. 

Then for some reason you just have some strong urge to voice your opinion about something you don't even know about, or know how to actually do yourself?


1 minute ago, Conjurer said:

Then for some reason you just have some strong urge to voice your opinion about something you don't even know about, or know how to actually do yourself?

I am demonstrating that the weak law of large numbers talks about an increasing range of "accepted" outcomes as n grows larger. Which is what you asked me about. 


13 minutes ago, uncool said:

I am demonstrating that the weak law of large numbers talks about an increasing range of "accepted" outcomes as n grows larger. Which is what you asked me about. 

No, the accepted range is defined by the law of large numbers.  It should have a probability of 1 of approaching an average probability of the event happening a single time. 

The problem is that, even if you did use summation, you couldn't sum up a series of probabilities to get the probability of it occurring once.

Like I said at the start of the thread, you should get approximately the same number of heads and tails, so the average number of either heads or tails is half of the total number. 

Then if you start summing up probabilities, you don't get anything close to that, and it actually diverges away from that value and approaches zero.  Whether you use summation or integrals, it isn't going to change that.

Edited by Conjurer

1 minute ago, Conjurer said:

No, the accepted range is defined by the law of large numbers. 

And as defined by the law of large numbers, it grows with n. Which is what I showed.

1 minute ago, Conjurer said:

The problem is that, even if you did use summation, you couldn't sum up a series of probabilities to get the probability of it occurring once.

It sounds like you disagree with the axiom that the probability of a union of disjoint events is the sum of the probability of each of those events. 

3 minutes ago, Conjurer said:

Like I said at the start of the thread, you should get approximately the same number of heads and tails, so the average number of either heads or tails is half of the total number.  Then if you start summing up probabilities, you don't get anything close to that, and it actually diverges away from that value and approaches zero.  Whether you use summation or integrals, it isn't going to change that.

Actually, it does change that.

If you flip a coin 200 times, the probability of (number of heads/number of flips) being between 0.495 and 0.505 is about 17%.

If you flip a coin 2000 times, the probability of (number of heads/number of flips) being between 0.495 and 0.505 is about 36%.

If you flip a coin 20000 times, the probability of (number of heads/number of flips) being between 0.495 and 0.505 is about 84%.

Because the law of large numbers is about that range around "perfect". 
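These percentages are plain binomial sums, and anyone can recheck them. A minimal Python sketch, assuming a fair coin with p = 1/2 (the script is my illustration, nothing more):

[code]
# Sum binomial probabilities for all head-counts k with k/n in [0.495, 0.505]
from math import comb

for n in (200, 2000, 20000):
    lo, hi = round(0.495 * n), round(0.505 * n)
    prob = sum(comb(n, k) for k in range(lo, hi + 1)) / 2**n
    print(n, round(prob, 3))
# prints roughly 0.17, 0.36 and 0.84, matching the percentages above
[/code]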


7 minutes ago, uncool said:

Actually, it does change that.

If you flip a coin 200 times, the probability of (number of heads/number of flips) being between 0.495 and 0.505 is about 17%.

If you flip a coin 2000 times, the probability of (number of heads/number of flips) being between 0.495 and 0.505 is about 36%.

If you flip a coin 20000 times, the probability of (number of heads/number of flips) being between 0.495 and 0.505 is about 84%.

Because the law of large numbers is about that range around "perfect". 

So you pulled a rabbit out of a hat.  This should warrant some kind of round of applause or something?


5 minutes ago, Conjurer said:

So you pulled a rabbit out of a hat.  This should warrant some kind of round of applause or something?

I've demonstrated an example of the law of large numbers (ish; technically, all I've done is point out some increasing values, but careful investigation will continue to show that the point I make with those values is correct), in direct contradiction to one of your statements. It warrants you retracting that statement, as a start.

Edited by uncool

1 minute ago, uncool said:

I've demonstrated an example of the law of large numbers (ish; technically, all I've done is point out some increasing values, but careful investigation will continue to show that I am right), in direct contradiction to one of your statements. It warrants you retracting that statement, as a start.

That is close to what I would expect to happen, but if you calculate the probability of getting the same number of heads and tails you end up getting a smaller and smaller chance to get the same number of them.  Then it is contradictory.   


4 minutes ago, Conjurer said:

That is close to what I would expect to happen, but if you calculate the probability of getting the same number of heads and tails you end up getting a smaller and smaller chance to get the same number of them.  Then it is contradictory.   

It is not contradictory, because the range is growing. For 200 flips, the range is between 99 and 101 heads - 3 outcomes. For 2000 flips, it's between 990 and 1010 - 21 outcomes. For 20000 flips, it's between 9900 and 10100 heads - 201 outcomes. 

For 200 flips, the probability of exactly 100 heads is about 5.6%.

For 2000 flips, the probability of exactly 1000 heads is about 1.8%.

For 20000 flips, the probability of exactly 10000 heads is about 0.56%. 

The probability of getting exactly the "perfect" number of heads is decreasing, but the probability of hitting a range including 1/2 is increasing. 
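Those three figures are single binomial terms, C(n, n/2)/2^n; a short sketch under the same fair-coin assumption reproduces them:

[code]
# Probability of exactly n/2 heads in n fair flips
from math import comb

for n in (200, 2000, 20000):
    print(n, comb(n, n // 2) / 2**n)
# prints roughly 0.056, 0.018 and 0.0056: the point probability shrinks
# even while the probability of the widening range grows
[/code]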

Edited by uncool

6 minutes ago, uncool said:

It is not contradictory, because the range is growing. For 200 flips, the range is between 99 and 101 heads - 3 outcomes. For 2000 flips, it's between 990 and 1010 - 21 outcomes. For 20000 flips, it's between 9900 and 10100 heads - 201 outcomes. 

I really don't know why I would even bother to ask, at this point.


13 hours ago, Conjurer said:

I am not looking for a fruitful discussion about it.  I am looking for answers. 

Ok. You have been given plenty of answers based on mainstream science. You keep rejecting these answers for some reason. I wanted to know what to base answers on, some kind of common base that would be acceptable.

I'll read through the posts again and see if I find some alternate way of explaining. 

 


18 hours ago, Conjurer said:

I am not looking for a fruitful discussion about it.  I am looking for answers.

 

 

I gave you  an answer which included

The incorrect assumption you made

The correct mathematical formula for that assumption

Some trial examples to demonstrate my point directly

and this is the only answer you can offer, which I consider quite facetious.

15 hours ago, Conjurer said:
15 hours ago, studiot said:

 

 

Yes but he is wrong to quote this, just as he is wrong to introduce integration and 'the calculus' in general.

https://www.khanacademy.org/math/ap-calculus-ab/ab-integration-new/ab-6-1/v/introduction-to-integral-calculus

 


On 9/15/2019 at 6:38 AM, studiot said:

I gave you  an answer which included

The incorrect assumption you made

The correct mathematical formula for that assumption

Some trial examples to demonstrate my point directly

and this is the only answer you can offer, which I consider quite facetious.

 

I don't understand why you would think that an integral isn't the limit of a summation as n approaches infinity.  It seems like you have a different definition of an integral.  I don't know what that is.  Where do you think you should use an integral?

I still haven't seen anyone be able to sum probabilities to come anywhere close to the expected value or probability that would result from finding a probability after any number of events, let alone infinity.

On 9/15/2019 at 2:30 AM, Ghideon said:

Ok. You have been given plenty of answers based on mainstream science. You keep rejecting these answers for some reason. I wanted to know what to base answers on, some kind of common base that would be acceptable.

I'll read through the posts again and see if I find some alternate way of explaining. 

 

I still haven't been shown an answer where a summation of probabilities comes out to be what you should find for the probability of an event after it occurred any number of times.

How can I disagree with a proof that I haven't seen or does not exist?  I don't see any math here to disagree with.


26 minutes ago, Conjurer said:

I don't understand why you would think that an integral isn't the limit of a summation as n approaches infinity. 

Because an integral is a particular limit that is not being taken here. Further, an integral is a limit of summation, not the other way around.

26 minutes ago, Conjurer said:

How can I disagree with a proof that I haven't seen or does not exist?  I don't see any math here to disagree with.

I have offered to provide the proof of the law of large numbers in this thread multiple times. 

Edited by uncool

11 minutes ago, uncool said:

Because an integral is a particular limit that is not being taken here. Further, an integral is a limit of summation, not the other way around.

It is the total area under the curve of a function.  It is more precise, because it doesn't jump from one step to another like a summation does.  

 

12 minutes ago, uncool said:

I have offered to provide the proof of the law of large numbers in this thread multiple times. 

I haven't seen probabilities being summed to show the same result of the law of large numbers yet.  

I think my calculation actually was accurate, because the more times an event occurs in succession, the probability of ANY particular outcome becomes incredibly low.  Then the limit approaches zero, because the probabilities of all possible outcomes approach zero.  The area under the curve becomes zero.

I am saying that there is no known way to add probabilities with a certain method that prefers an outcome similar to what you get from the law of large numbers.

PROBABILITIES WERE NOT ADDED TO GET THE LAW OF LARGE NUMBERS!!! 

I DON'T KNOW HOW ELSE TO EXPLAIN THIS TO YOU, BESIDES USING BOLD AND ALL CAPS.

YOU STILL DON'T SEEM TO UNDERSTAND THAT THE LAW OF LARGE NUMBERS ISN'T THE FINAL ANSWER OF A PROOF OF SOMEONE ADDING TOGETHER PROBABILITIES!!!


9 minutes ago, Conjurer said:

It is the total area under the curve of a function.

And why should we want area under a curve for this specific application?

9 minutes ago, Conjurer said:

It is more precise, because it doesn't jump from one step to another like a summation does.  

Why is that relevant to this particular calculation?

The question isn't about whether one application is "better" than the other. The question is which one is correct. And the laws of probability explicitly say that summing is correct. 

9 minutes ago, Conjurer said:

YOU STILL DON'T SEEM TO UNDERSTAND THAT THE LAW OF LARGE NUMBERS ISN'T THE FINAL ANSWER OF A PROOF OF SOMEONE ADDING TOGETHER PROBABILITIES!!!

If you mean that someone who thinks "adding together probabilities" (I assume you mean the method I have been demonstrating) can't get the result of the law of large numbers, then you are wrong. 

Edited by uncool

7 minutes ago, uncool said:

And why should we want area under a curve for this specific application?

That is basically what you are doing if you take a summation or an integral.  Then an area can be finite even though the line is infinitely long.  Then you can get a finite answer.  

The line of the equation of the average of all probabilistic events approaches the x-axis as the values get closer to infinity.  That makes sense, because there would be an extremely low value of a probability of any specific outcome.  Then the average of all those would be close to zero.

Then the law of large numbers states that the average of the outcomes should be close to the expected value.  You should be able to find the probability of a single event from all of the random outcomes.  Then that shouldn't be close to zero.

13 minutes ago, uncool said:

Why is that relevant to this particular calculation?

It is because summations are an obsolete form of mathematics.  We might as well be hitting people over the head with clubs and living in caves or something.

 

14 minutes ago, uncool said:

The question isn't about whether one application is "better" than the other. The question is which one is correct. And the laws of probability explicitly say that summing is correct. 

They would both have their benefits and problems.  I already told you like 50 times that it didn't matter to me if it only applied to one specific example that can only use integers.  It would just be much easier to not even worry about that.  You are just making it more complicated than it really needs to be to find ANY answer or example where it could work.

17 minutes ago, uncool said:

If you mean that someone who thinks "adding together probabilities" (I assume you mean the method I have been demonstrating) can't get the result of the law of large numbers, then you are wrong. 

I still haven't seen anyone able to do it, and they would never be able to get away with it if they did.  They would have to incorporate some way to weight probabilities to get closer to a desired outcome.  Then they would not be able to, because then that would mean they committed the gambler's fallacy.

The more times an event occurs, any outcome just becomes less likely to occur.  You do not end up with an average of the base, starting probability.


I am writing the following in large font at the beginning and end of the post because it is an offer you have repeatedly ignored, and it is likely central to your confusion. 

I am offering to prove the weak law of large numbers for a binary variable (i.e. a biased or unbiased coin) using "summing probabilities", i.e. the method I have been using this entire thread. Do you accept?

47 minutes ago, Conjurer said:

That is basically what you are doing if you take a summation or an integral.  Then an area can be finite even though the line is infinitely long.  Then you can get a finite answer.  

Then why are you rejecting the summation?

48 minutes ago, Conjurer said:

Then the law of large numbers states that the average of the outcomes should be close to the expected value.  You should be able to find the probability of a single event from all of the random outcomes.  Then that shouldn't be close to zero.

Why not, when the law of large numbers isn't about the single possible outcome?

48 minutes ago, Conjurer said:

It is because summations are an obsolete form of mathematics.  We might as well be hitting people over the head with clubs and living in caves or something.

I don't know where you are pulling this bullshit from, but it is bullshit. 

49 minutes ago, Conjurer said:

They would both have their benefits and problems.

That is exactly the point I am saying is irrelevant. This isn't about "benefits and problems". It's about which method is correct. And I guarantee that summation is correct, by the laws of probability.

50 minutes ago, Conjurer said:

I still haven't seen anyone able to do it

I am offering to prove the weak law of large numbers for a binary variable (i.e. a biased or unbiased coin) using "summing probabilities", i.e. the method I have been using this entire thread. Do you accept?


22 minutes ago, uncool said:

I am offering to prove the weak law of large numbers for a binary variable (i.e. a biased or unbiased coin) using "summing probabilities", i.e. the method I have been using this entire thread. Do you accept?

Sure! Why not?  You waiting for a drum-roll?  


As with any proof, we should start with the statement we are trying to prove, and then start the proof proper.

The statement of the weak law of large numbers for a binary variable is the following:

Let X_1, X_2, ... be independent, identically distributed binary variables (in layman's terms: they're coinflips that don't affect each other and have the same probability p). Define Y_n = (X_1 + X_2 + ... + X_n)/n. Then for any epsilon > 0, lim_{n -> infinity} Pr(|Y_n - p| > epsilon) = 0.

Writing out the limit: for any delta > 0 and any epsilon > 0, there is some N such that for any n>N, Pr(|Y_n - p| > epsilon) < delta
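The same statement in the forum's math markup, purely as typesetting:

[math]\forall \varepsilon > 0: \quad \lim_{n \rightarrow \infty} \Pr\left( \left| \frac{X_1 + X_2 + \cdots + X_n}{n} - p \right| > \varepsilon \right) = 0[/math]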

To prove it, we will need a few lemmas.

Definition: X and Y are independent if for any outcomes i, j, P(X = i,  Y = j) = P(X = i) * P(Y = j). 

Definition: For a discrete variable X, E(X) = sum_i i*P(X = i)

Note the summation in the above.

Lemma 1: For any two independent variables X and Y, E(XY) = E(X) E(Y). 

Proof: E(XY) = sum_{i, j} i*j*P(X = i, Y = j) = sum_{i, j} i*P(X = i) * j * P(Y = j) = (sum_i i*P(X = i)) (sum_j j*P(Y = j)) = E(X) E(Y)

Lemma 2: Assume X is a variable with all nonnegative outcomes. Then for any a > 0, P(X > a) <= E(X)/a.

Proof: E(X) = sum_i i*P(X = i) = sum_{i > a} i*P(X = i) + sum_{i <= a} i*P(X = i) >= sum_{i > a} a*P(X = i) + sum_{i <= a} 0*P(X = i) = a*sum_{i > a} P(X = i) = a*P(X > a), so P(X > a) <= E(X)/a.
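A tiny numeric check of Lemma 2, using X = number of heads in 4 fair flips as a stand-in (my example, not part of the proof):

[code]
# Markov's inequality: P(X > a) <= E(X)/a for a variable with nonnegative outcomes
from math import comb

pmf = {k: comb(4, k) / 16 for k in range(5)}  # X = heads in 4 fair flips
EX = sum(k * p for k, p in pmf.items())       # E(X) = 2
a = 3
print(sum(p for k, p in pmf.items() if k > a), "<=", EX / a)  # 0.0625 <= 0.666...
[/code]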

Lemma 3: If X and Y are independent, then X - a and Y - b are independent. Left to the reader. 

Lemma 4: If X is a binary variable with probability p, then E(X - p) = 0. Left to the reader. 

Lemma 5: If X is a binary variable with probability p, then E((X - p)^2) = p - p^2. Left to the reader.

Lemma 6: For any variables X and Y, E(X + Y) = E(X) + E(Y) (no assumption of independence needed). Left to the reader.

 

Now, as is usual for limit proofs, we work backwards from the statement we want to prove to the statements we can prove.

We want to prove that for any delta > 0 and any epsilon > 0, there is some N such that for any n>N, Pr(|Y_n - p| > epsilon) < delta

Equivalently, for any delta > 0 and any epsilon > 0, there is some N such that for any n>N, Pr(|(sum(X_i))/n - p| > epsilon) < delta

Equivalently, for any delta > 0 and any epsilon > 0, there is some N such that for any n>N, Pr(|sum(X_i) - p*n| > epsilon*n) < delta

I want to note here, once again, that this shows what I've been saying: that this is about a range of possibilities. In this case, that range is epsilon*n around the "perfect" outcome. 

Equivalently, for any delta > 0 and any epsilon > 0, there is some N such that for any n>N, Pr((sum(X_i) - p*n)^2 > epsilon^2*n^2) < delta

Equivalently, for any delta > 0 and any epsilon > 0, there is some N such that for any n>N, Pr((sum(X_i - p))^2 > epsilon^2*n^2) < delta

Applying lemma 2 (since squares are always nonnegative), we know this is true as long as E((sum(X_i - p))^2) < delta*epsilon^2*n^2, because then Pr((sum(X_i - p))^2 > epsilon^2*n^2) <= E((sum(X_i - p))^2)/(epsilon^2 * n^2) < delta. 

(sum(X_i - p))^2 = sum_{i, j} (X_i - p)(X_j - p) = sum_i (X_i - p)^2 + sum_{i =/= j} (X_i - p)(X_j - p), so E((sum(X_i - p))^2) = E(sum_i (X_i - p)^2 + sum_{i =/= j} (X_i - p)(X_j - p))

By lemma 6, we can split this sum up into individual terms. The first term is sum_i E((X_i - p)^2) = sum_i (p - p^2) = n*(p - p^2) by lemma 5. The second term is sum_{i =/= j} E((X_i - p)(X_j - p)) = sum_{i =/= j} E(X_i - p) E(X_j - p) by lemmas 1 and 3, which = 0 by lemma 4. So the condition we want is n*(p - p^2) < n^2*delta*epsilon^2, or n > (p - p^2)/(delta*epsilon^2). Which means choose N = ceil((p - p^2)/(delta*epsilon^2)), and the statement follows.

This proof generalizes quite easily; all that's necessary is to replace p by E(X_i) and (p - p^2) by E((X_i - E(X_i))^2). 
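To make the conclusion concrete: with p = 1/2, epsilon = 0.005 and delta = 0.05, the formula gives N = ceil(0.25/(0.05 * 0.005^2)) = 200000, so past 200000 flips the probability of the average landing outside [0.495, 0.505] is guaranteed below 5% (the bound is deliberately loose; the true probability is far smaller). A minimal Monte Carlo sketch of the bound, with n, epsilon and the trial count chosen arbitrarily by me:

[code]
# Compare empirical Pr(|Y_n - p| > eps) against the bound (p - p^2)/(n*eps^2)
import random

p, eps, n, trials = 0.5, 0.02, 2000, 5000
exceed = 0
for _ in range(trials):
    heads = bin(random.getrandbits(n)).count("1")  # n fair flips in one call
    if abs(heads / n - p) > eps:
        exceed += 1

print("empirical:", exceed / trials)            # roughly 0.07
print("bound:", (p - p * p) / (n * eps * eps))  # 0.3125
[/code]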

2 hours ago, Conjurer said:

Sure! Why not?  You waiting for a drum-roll?

I was waiting for you to show any interest in the actual proof, rather than insisting that it hadn't been proven. 

Edited by uncool

9 hours ago, Conjurer said:

How can I disagree with a proof that I haven't seen or does not exist? I don't see any math here to disagree with.

I posted [math] \lim_{n \rightarrow  \infty } |h-t|\rightarrow \infty[/math] meaning the more times you toss a fair coin the larger the probability that there will be a large difference between number of heads and tails. You replied:

On 9/8/2019 at 10:27 PM, Conjurer said:

That is demonstrably false.  You could tell a computer program to randomly pick a number 1 or 0 an increasing number of times.  It will pick 1 or 0 half of the time, if you make it run enough trials.

Your statement is incorrect; I posted the output of a computer program to highlight that. Do we need to defend mainstream math and science in the mainstream section? 

Anyway, here is another attempt. This time all combinations are used, not a random selection. As an example, let's use a case with a fair coin thrown 32 times. I use 32 as an initial example since it gives a reasonably large number of possible outcomes but not more than could be analysed in full on a regular computer if required. The list below shows how many combinations there exist for the different outcomes. For instance, there is only one way to get 32 heads in 32 throws, that is to throw 32 heads in a row. 31 heads and one tail could be thrown in 32 different ways; the single tail could be any one of the 32 throws.

heads		tails		combinations
32		0		        1
31		1		       32
30		2		      496
29		3		     4960
28		4		    35960
27		5		   201376
26		6		   906192
25		7		  3365856
24		8		 10518300
23		9		 28048800
22		10		 64512240
21		11		129024480
20		12		225792840
19		13		347373600
18		14		471435600
17		15		565722720
16		16		601080390
15		17		565722720
14		18		471435600
13		19		347373600
12		20		225792840
11		21		129024480
10		22		 64512240
9		23		 28048800
8		24		 10518300
7		25		  3365856
6		26		   906192
5		27		   201376
4		28		    35960
3		29		     4960
2		30		      496
1		31		       32
0		32		        1
				4294967296

We see that the most common possible outcome is 16 heads and 16 tails, that is 601080390 of 4294967296. As an example, let's take a closer look at the number of combinations surrounding 16 of each:

19        13        347373600
18        14        471435600
17        15        565722720

16        16        601080390
15        17        565722720
14        18        471435600

13        19        347373600

Each of the 4294967296 combinations has the same probability of being thrown. If we sum the possible outcomes of, for instance, 18 heads, 17 heads, 15 heads and 14 heads, we get 2074316640; that is more than the 601080390 possible outcomes that have exactly 16 heads and 16 tails.
Checking one of them in more detail: 17 heads 15 tails and 15 heads 17 tails have the same number of possible combinations, so the probability of throwing 17 heads 15 tails or 15 heads 17 tails is the same. That means that on average, when one of the equally probable outcomes 17 heads 15 tails or 15 heads 17 tails is thrown, the number of heads will be 16. But |heads-tails| (the absolute difference) will be |17-15|=2 or |15-17|=2.
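The counts in the table are binomial coefficients C(32, heads); a short sketch to reproduce the table and the sum used above:

[code]
# Reproduce the combination counts for 32 throws of a fair coin
from math import comb

for heads in range(32, -1, -1):
    print(heads, 32 - heads, comb(32, heads))
print("total:", 2**32)  # 4294967296
print("18+17+15+14 heads:", sum(comb(32, k) for k in (18, 17, 15, 14)))  # 2074316640
[/code]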

Questions:
If throwing a coin 32 times and counting all possible outcomes, what is the probability of getting exactly 16 of each compared to not getting exactly 16 of each?
If counting all the possible outcomes, what will it tell us about the probability of throwing heads? 
If throwing a coin 32 times over and over, what would be the average of |heads-tails| (absolute difference)? Hint: it will not be 0 as you keep claiming.
If increasing the number of throws, how will that affect the average of |heads-tails|? Hint: it will not approach zero as you keep claiming.
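For the last two questions, a small simulation sketch (the throw counts are my choice; the exact mean of |heads-tails| for 32 throws works out to 32*C(32,16)/2^32, about 4.48):

[code]
# Average |heads - tails| over repeated runs of n fair throws
import random

def avg_abs_diff(n, trials=100000):
    total = 0
    for _ in range(trials):
        heads = bin(random.getrandbits(n)).count("1")
        total += abs(2 * heads - n)  # heads - tails = 2*heads - n
    return total / trials

for n in (32, 128, 512):
    print(n, avg_abs_diff(n))  # grows roughly like sqrt(2n/pi), not toward 0
[/code]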

Is the above a feasible approach according to your question regarding using "all combinations"? Should we do some calculations and compare with simulations?

I'm not going to put time into a scenario if you are going to reject it anyway, so this post is the first in a possible sequence. 

 


17 hours ago, Conjurer said:

 It is because summations are an obsolete form of mathematics.  We might as well be hitting people over the head with clubs and living in caves or something.


Moderator Note

I have to admit that it's been a number of years since I took calculus class, but this is nonsense. Under most circumstances I would say I'm going to need a reference to show that summations aren't used in mainstream mathematics, but it's trivial to find that this isn't the case. And pushing a speculative view is not permitted in mainstream discussions.

You asked a question. You can ask for clarification of the answers, but you don't get to argue an alternative (i.e. non-mainstream) viewpoint.

 

17 hours ago, Ghideon said:

Is the above a feasible approach according to your question regarding using "all combinations"? Should we do some calculations and compare with simulations?

I'm not going to put time into a scenario if you are going to reject it anyway, so this post is the first in a possible sequence. 

 

I think everyone should have probably realized by now that there doesn't exist an equation we could easily plug probabilities (fractions) into, keep adding probabilities (fractions) to, and end up getting closer and closer to 1/2 or any probability.  If there was, someone would have already spit it out by now.

I wouldn't waste your time, because you will most likely just end up posting a wall of text that will make my eyes bleed. 

I find it interesting how you showed that there are the greatest number of combinations where it is the same number of heads and tails.  It seems like you are making some kind of progress, but let's face it, one of you two would have to be like the next Isaac Newton to provide me an equation like that.  I don't think Uncool's equations could accept probabilities of 1/2 that could add up to 1/2 or even approach it, so I doubt he is going to even be the next Riemann.

Even though it is most likely for them to be a part of those roughly 600 million combinations, there are still almost 4.3 billion combinations in total.  That is still a small fraction of all of the possibilities.  Even if you added the roughly 500 million combinations on each side of it, that would still only total about 1.1 billion.  That is still only about 1/4 of the total outcomes.  We were trying to approach 1/2 here, or a probability of 1 of approaching 1/2.

Edited by Conjurer

19 minutes ago, Conjurer said:

I think everyone should have probably realized by now that there doesn't exist an equation we could easily plug probabilities (fractions) into, keep adding probabilities (fractions) to, and end up getting closer and closer to 1/2 or any probability.  If there was, someone would have already spit it out by now.

I wouldn't waste your time, because you will most likely just end up posting a wall of text that will make my eyes bleed. 

In other words, there is literally nothing that could possibly convince you, and you aren't even going to try to understand the arguments we post. Do I have that right?

19 minutes ago, Conjurer said:

Even though it is most likely for them to be a part of those roughly 600 million combinations, there are still almost 4.3 billion combinations in total.  That is still a small fraction of all of the possibilities.  Even if you added the roughly 500 million combinations on each side of it, that would still only total about 1.1 billion.  That is still only about 1/4 of the total outcomes.  We were trying to approach 1/2 here, or a probability of 1 of approaching 1/2.

I repeat:

If you flip a coin 200 times, the probability of (number of heads/number of flips) being between 0.495 and 0.505 is about 17%.

If you flip a coin 2000 times, the probability of (number of heads/number of flips) being between 0.495 and 0.505 is about 36%.

If you flip a coin 20000 times, the probability of (number of heads/number of flips) being between 0.495 and 0.505 is about 84%.

This is an example where I literally chose the numbers (.495, .505, 200, 2000, 20000) arbitrarily (because they were easy for me). The probability of being close to 1/2 approaches 1 - for any reasonable choice of "close". 

Edited by uncool

1 minute ago, uncool said:

In other words, there is literally nothing that could possibly convince you, and you aren't even going to try to understand the arguments we post. Do I have that right?

If you gave me an equation I could plug probabilities or fractions into that made the final result approach the same probability by summing up more possible outcomes in a row, I would recommend you for the Nobel Prize in mathematics.

It would be like what calculus did to gravity, and you would be the next Isaac Newton of probabilities.   


This topic is now closed to further replies.