Everything posted by Conjurer

  1. I was wondering if there were any hidden variables that could describe it, but apparently not. It is just the Planck Length cubed over the Planck Mass times the Planck Time squared. All of which should be constants, so I guess that proves that the gravitational constant should remain constant.
  2. So this is a thing that has been established? Have you ever heard about the mystery surrounding G, and how no one knows what it could represent? In other words, is this how it is defined on a fundamental basis? Or are all of these dumb questions? https://en.wikipedia.org/wiki/Gravitational_constant
  3. Right? Wouldn't it be absolutely crazy, just like I mentioned? If something were changing there, I would vote for the gravitational constant of the universe. It could be possible that dark energy is due to the number of alternate universes, and maybe it could have a measurable effect here on Earth counteracting gravity. I admit, science has not advanced far enough to make a real conclusion about it. We can only hypothesize about the cause. The only reason the gravitational constant has the value it does is that it is the number you need to multiply two masses together, divided by the radius squared, to get the correct amount of force. I believed this was done by Newton, but the wiki now claims it was done by C. V. Boys. https://en.wikipedia.org/wiki/Gravitational_constant I already know I will convince nobody of this now. That is not the reason I stated it. I have given up on trying to memorize those values as the new values, because any time I go back to them, everyone believes they are a different value. I just mentioned it because I have experienced this problem. Then someone just happened to make a thread wondering what the historic values were. Then they are not what I originally remembered them to be, and they are not even what people tried to correct me to believe they were before. One day, people might look back on this thread and wonder why you two never corrected me to say that I did not actually post what is currently in the wiki. Then I will be wrong about being able to do that as well. I actually discovered the Mandela Effect due to this, and the fact that Galileo was no longer killed for his blasphemy against religious doctrine that the Earth, not the sun, was at the center of the solar system; he was only imprisoned instead, under house arrest. Something just occurred to me. 
I really have no idea how the gravitational constant of the universe even comes into the calculation, or why you would choose to divide by c^5 in the calculation for the Planck Time and c^3 for the Planck Length. Does that even give you the correct units for length and time? Isn't the gravitational constant of the universe unit-less?
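For what it's worth, G is not unit-less (in SI it carries m^3 kg^-1 s^-2), and the factors of c^3 and c^5 do give the correct units for length and time. A quick sketch in Python, using rounded SI values for the constants, checks both Planck definitions and the identity G = l_P^3 / (m_P t_P^2) mentioned earlier in the thread:

```python
import math

# Approximate CODATA SI values
hbar = 1.054571817e-34  # J*s = kg*m^2/s
G = 6.67430e-11         # m^3 / (kg * s^2)
c = 2.99792458e8        # m/s

planck_length = math.sqrt(hbar * G / c**3)  # metres,  ~1.6e-35
planck_time = math.sqrt(hbar * G / c**5)    # seconds, ~5.4e-44
planck_mass = math.sqrt(hbar * c / G)       # kilograms

print(planck_length, planck_time, planck_mass)

# Sanity check: G = l_P^3 / (m_P * t_P^2) recovers the input value
print(planck_length**3 / (planck_mass * planck_time**2))  # ~6.674e-11
```

So the units do work out, and the identity holds exactly because it is just the definitions rearranged, not evidence that G must be constant.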
  4. I am sure everyone will think I am crazy for saying this, and I doubt it will change anyone's opinion since I am new and have more down-votes, but I will just go on and say it. I remembered the Planck Time and Planck Length being 10^-33 sec and 10^-34 cm. Checking the wiki: it says the Planck Time is 10^-44 sec, and the Length is currently 10^-35 m, which is also 10^-33 cm. When I was looking at the wiki a couple of weeks ago, the Planck Length was in the mid negative 40 decimal places, and the Planck Time was off by only one decimal place. Either someone is screwing around with the wiki here, or the Planck Length and Time don't seem to remain constant. https://en.wikipedia.org/wiki/Planck_length https://en.wikipedia.org/wiki/Planck_time It makes me wonder if this figure could somehow be tied to the number of alternate universes there are, and this number is changing, altering the values of these constants through time...
  5. In a light bulb, light is given off when excited atoms relax and their electrons emit photons. In a magnetic field, two electrons exchange a virtual photon to create repulsion from each other. Don't virtual particles have a different amount of mass? Virtual particles just link together the mathematical theory, even though they are not the same as the particle they are described as being. That is basically the only reason why we even think they are photons. Couldn't they be some other particle that just behaves like a photon that we cannot even detect?
  6. Who is to say what range we should be looking in? It was incredibly hard to read, because you didn't use LaTeX format, and you didn't define the variables you used in a way I could understand. As I said at the start of the thread, I just started teaching 7th grade probabilities. I never had to take probability in the past, because I was told this proof never existed. Then apparently the Common Core Standards have introduced it into the curriculum, despite that, to my knowledge, it didn't exist.
  7. I am sure they would make an exception. Well, I still don't know how I could plug probabilities into an equation to sum an increasing number of outcomes and obtain a result where the average outcome is the same as the probability of that single event occurring. I see no way the equations you provided could accomplish that. Every time I find the probability of an event occurring in succession, I only get a lower probability of that event occurring. That is still all I know from the information you have provided.
  8. If you gave me an equation, I could plug probabilities or fractions into, that made the final result approach the same probability by summing up more possible outcomes in a row, I would recommend you for the Nobel Prize in mathematics. It would be like what calculus did to gravity, and you would be the next Isaac Newton of probabilities.
  9. I think everyone should have probably realized by now that there doesn't exist an equation where we could easily plug probabilities (fractions) in, keep adding probabilities (fractions) to it, and end up getting closer and closer to 1/2 or any probability. If there were, someone would have already spit it out by now. I wouldn't waste your time, because you will most likely just end up posting a wall of text that will make my eyes bleed. I find it interesting how you showed that there is the greatest number of combinations where it is the same number of heads and tails. It seems like you are making some kind of progress, but let's face it, one of you two would have to be like the next Isaac Newton to provide me an equation like that. I don't think Uncool's equations could accept probabilities of 1/2 that could add up to 1/2 or even approach it, so I doubt he is going to be the next Riemann. Even though it is most likely for them to be a part of those roughly 600 million combinations, there are still a total of almost 4.2 billion of them. That is still a small fraction of all of the possibilities. Even if you added 500 million other combinations on each side of it, that would still only total about 1.1 billion. That is still only about 1/4 of the total outcomes. We were trying to approach 1/2 here, or a probability of 1 of approaching 1/2.
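On the point about combinations in the middle: the two numbers being argued over behave differently, and both can be computed directly. The probability of getting exactly half heads shrinks as n grows, while the probability of landing within any fixed band around half grows toward 1. A rough Python sketch (the 5% band is an arbitrary choice for illustration):

```python
from math import comb

def prob_exact_half(n):
    """P(exactly n/2 heads in n fair flips)."""
    return comb(n, n // 2) / 2**n

def prob_within_band(n, eps=0.05):
    """P(fraction of heads within eps of 1/2)."""
    lo, hi = int(n * (0.5 - eps)), int(n * (0.5 + eps))
    return sum(comb(n, k) for k in range(lo, hi + 1)) / 2**n

for n in (10, 100, 1000):
    print(n, prob_exact_half(n), prob_within_band(n))
# exact-half probability falls toward 0; band probability rises toward 1
```

So both claims in the thread are right about different quantities: any single count becomes rarer, but the fraction of heads still concentrates around 1/2.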
  10. Sure! Why not? You waiting for a drum-roll?
  11. That is basically what you are doing if you take a summation or an integral. Then an area can be finite even though the line is infinitely long, so you can get a finite answer. The line of the equation of the average of all probabilistic events approaches the x-axis as the values get closer to infinity. That makes sense, because there would be an extremely low probability of any specific outcome, so the average of all of those would be close to zero. Then the law of large numbers states that the average of the outcomes should be close to the expected value. You should be able to find the probability of a single event from all of the random outcomes, and that shouldn't be close to zero. It is because summations are an obsolete form of mathematics. We might as well be hitting people over the head with clubs and living in caves or something. They would both have their benefits and problems. I already told you like 50 times that it didn't matter to me if it only applied to one specific example that can only use integers. It would just be much easier to not even worry about that. You are just making it more complicated than it needs to be to find ANY answer or example where it could work. I still haven't seen anyone able to do it, and they would never be able to get away with it if they did. They would have to incorporate some way to weight probabilities to get closer to a desired outcome. Then they would not be able to, because that would mean they committed the gambler's fallacy. The more times an event occurs, any outcome just becomes less likely to occur. You do not end up with an average of the base, starting probability.
  12. It is the total area under the curve of a function. It is more precise, because it doesn't jump from one step to another like a summation does. I haven't seen probabilities being summed to show the same result as the law of large numbers yet. I think my calculation actually was accurate, because the more times an event occurs in succession, the lower the probability of ANY outcome becomes. Then the limit approaches zero, because the probability of every possible outcome approaches zero, and the area under the curve becomes zero. I am saying that there is no known way to add probabilities with a method that prefers an outcome similar to what you get from the law of large numbers. PROBABILITIES WERE NOT ADDED TO GET THE LAW OF LARGE NUMBERS!!! I DON'T KNOW HOW ELSE TO EXPLAIN THIS TO YOU, BESIDES USING BOLD AND ALL CAPS. YOU STILL DON'T SEEM TO UNDERSTAND THAT THE LAW OF LARGE NUMBERS ISN'T THE FINAL ANSWER OF A PROOF OF SOMEONE ADDING TOGETHER PROBABILITIES!!!
  13. I don't understand why you would think that an integral isn't the limit of a summation as n approaches infinity. It seems like you have a different definition of an integral; I don't know what that is. Where do you think you should use an integral? I still haven't seen anyone be able to sum probabilities to come anywhere close to the expected value or probability that would result from finding a probability after any number of events, let alone infinitely many. I still haven't been shown an answer where a summation of probabilities comes out to be the probability of an event after it occurred any number of times. How can I disagree with a proof that I haven't seen or that does not exist? I don't see any math here to disagree with.
  14. I really don't know why I would even bother to ask, at this point.
  15. That is close to what I would expect to happen, but if you calculate the probability of getting the same number of heads and tails, you end up getting a smaller and smaller chance of getting the same number of them. That seems contradictory.
  16. So you pulled a rabbit out of a hat. This should warrant some kind of round of applause or something?
  17. No, the accepted range is defined by the law of large numbers. It should have a probability of 1 of approaching an average of the probability of the event happening a single time. The problem is that, even if you did use summation, you couldn't sum up a series of probabilities to get the probability of it occurring once. Like I said at the start of the thread, you should get approximately the same number of heads and tails, so the average number of either heads or tails is half of the total number. Then if you start summing up probabilities, you don't get anything close to that; it actually diverges away from that value and approaches zero. Whether you use summations or integrals, it isn't going to change that.
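To make the distinction in this exchange concrete: the sum the law of large numbers is about is a sum of outcomes (count the heads, divide by the number of flips), not a sum of probabilities. A quick simulation sketch, seeded so it is repeatable:

```python
import random

def running_average(n_flips, seed=0):
    """Average of n fair coin flips (1 = heads, 0 = tails)."""
    rng = random.Random(seed)
    heads = sum(rng.randint(0, 1) for _ in range(n_flips))
    return heads / n_flips

for n in (10, 1000, 100000):
    print(n, running_average(n))
# the averages cluster ever more tightly around 0.5 as n grows
```

Nothing here weights outcomes toward a desired result; the concentration around 0.5 falls out of plain counting.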
  18. Then for some reason you just have a strong urge to voice your opinion about something you don't know about or know how to actually do yourself?
  19. To me, it is like you are just fantasizing about some kind of mathematics where summation is used in probability to show something. It should be a matter of choice, because the integral is just the summation as it approaches infinity for all the numbers in between. Why should I use summation just to satisfy your opinion? I wanted something that can be true for all cases; I didn't want to make a statement that is only true for one individual case and then have to go back and use an integral to make it true for all cases. Even though, it could possibly make it easier, with how stubborn you are about this. Then that problem could also be solved by me just trying to figure this out on my own. I take it this means to imply that you are able to make a summation of probabilities that can show P(H)=1/2 (IFF you can even be satisfied with that statement).
  20. I don't think you have. You just keep saying that without giving a real reason. You should start watching this video at three minutes. He says they are the same. https://www.khanacademy.org/math/ap-calculus-ab/ab-integration-new/ab-6-1/v/introduction-to-integral-calculus That still says nothing about how an increasing number of probabilities could come into play here. It is just stating that the expected value is the same as the average of random occurrences of data.
  21. I wanted to add all of the possible outcomes. I wasn't concerned with it only applying to that one single example. I am looking for the proof of the addition and multiplication of probabilities in a series. Where does it have a variable for probabilities in it?
  22. Wouldn't you have to take the derivative of the integral to get the same result as the sum? You already forgot what we just talked about. The law of large numbers doesn't consider probabilities! Did you even watch the video?
  23. https://www.khanacademy.org/math/ap-calculus-ab/ab-integration-new/ab-6-1/v/introduction-to-integral-calculus
  24. If you wanted to accomplish that goal, wouldn't you then try to take the average of the probabilities of an event occurring as it continued on to infinity? I am saying that first, you would have to consider how many combinations there are of the event: nCr = n!/(r!(n - r)!). Then to find the probability of that event, you would have to divide by the total number of possible outcomes, or permutations with replacement: nPr = n^r. Then you get nCr/nPr = n!/(n^r r!(n - r)!). Then you would want to add all of those probabilities together so you can take the average: lim_(n->∞) ∫ n!/(n^r r!(n - r)!) dn. Then you would want to divide by the total number of trials to find the average probability: lim_(n->∞) (1/n) ∫ n!/(n^r r!(n - r)!) dn. This would be like saying that (1/n)(nCr/nPr + nCr/nPr + ...) = lim_(n->∞) (1/n) ∫ n!/(n^r r!(n - r)!) dn. Then today, Wolfram Alpha seems to suggest that it is actually 0. How could the average probability be zero when it should come out to what we would expect the probability for an event to occur a single time to be?
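One possible way to reconcile this with the zero result: averaging the probabilities themselves really does tend to zero (they sum to 1, so dividing by n gives 1/n), but the law of large numbers averages the outcomes, each weighted by its probability, and for a fair coin that weighted average is 1/2 at every n. A sketch comparing the two, assuming fair flips:

```python
from math import comb

def expected_fraction_heads(n):
    """E[heads/n] = sum over r of (r/n) * P(exactly r heads in n fair flips)."""
    return sum((r / n) * comb(n, r) / 2**n for r in range(n + 1))

def average_of_probabilities(n):
    """Plain average of the probabilities themselves, as in the post above."""
    return sum(comb(n, r) / 2**n for r in range(n + 1)) / n

for n in (10, 100, 1000):
    print(n, expected_fraction_heads(n), average_of_probabilities(n))
# the outcome-weighted average stays at 0.5; the plain average of
# probabilities is exactly 1/n and shrinks toward 0
```

So the two calculations answer different questions, and only the first one is the quantity the law of large numbers talks about.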
  25. I am not trying to say that the law of large numbers needs to be proven. I am saying that the way we calculate probabilities needs to be proven, and arriving at the law of large numbers could be considered that proof. You just turned everything I said on its head and arrived at a conclusion completely backwards from my intent.