joigus

Senior Members

  1. This kind of thinking has a tendency to snowball; as I remember, it reached a peak somewhere around 2008, bordering on horror sci-fi: https://cerncourier.com/a/the-day-the-world-switched-on-to-particle-physics/ I remember arguments both from theory (evaporation of black holes) and experiment (the existence of high-energy particles from cosmic rays) putting the matter to rest.
  2. Depends on how you define a reality. It would be essentially different from 'red, blue, and green' reality if we have to make room for quantum mechanics of spin. But that would take us off on a tangent.
  3. I think you're getting confused here. Space itself is not of a statistical nature. Things going on in space are. Space is just a backcloth of everything else. You cannot set up a proper statistical question that involves only space with nothing in it. Wally can be here or Wally can be there. But there's nothing to be done with here and there without some kind of Wally.
  4. Ok. I must say I'm quite less sophisticated when it comes to probability than you probably picture me to be, and than you yourself are. The cases you comment on (disqualified horses, dead heats, etc.) would, IMO, be completely smoothed out to zero by means of the 'frequentist approach': they almost never happen. The way I always understood probabilities is: you first make a hypothesis based on symmetries, known features, engineering specs, direct exploration, and so on. Then you do thousands upon thousands of experiments and apply the 'frequency test'. That way, you see whether your statistical hypothesis was correct. In some cases, like physics, you have physical principles that allow you not to guess totally in the dark. In your horse-race example, your hypothesis would be based on a priori conditions on the horses: their physique, breed, biometrics, and so on. Then you would have them race with different riders, atmospheric conditions, etc. Something like that. I think it's fair to say Bayesian methods give you equal probabilities at first, but that's precisely because the first-order approach is to assume no bias, and then correct your hypothesis as you learn more about the different odds (the heart and soul of Bayesian methods or, as I like to say, probing more and more deeply into the sample space). So the first assignment of probabilities doesn't give you any better insight than the other ones. I think you're right in your conclusion. But I don't think it's because infinity has limits. I think it's because infinity is not a number; it's more of a topological notion (the boundary of all numbers), so you cannot reach it numerically, which is quite the opposite of what you said in words, even if your intuition might have been right.
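The 'frequency test' described in that post can be sketched in a few lines. A minimal sketch, assuming a fair die as the illustrative system (the die, the 1/6 symmetry hypothesis, and the function name are mine, not anything from the thread):

```python
import random

def frequency_test(trials=100_000, seed=0):
    """Relative frequency of rolling a six with a simulated fair die."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials) if rng.randint(1, 6) == 6)
    return hits / trials

# Hypothesis from symmetry: P(six) = 1/6 ≈ 0.1667.
# The relative frequency should settle near that value as trials grow.
estimate = frequency_test()
```

The same loop stands in for "thousands upon thousands of experiments": the symmetry argument supplies the hypothesis, and the relative frequency either confirms it or sends you back to revise it.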
  5. Yes. https://en.wikipedia.org/wiki/Frequentist_probability No. Please, explain.
  6. Just today, I was fumbling for the word 'tetration' as a binary operation. Thank you.
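Since tetration came up: as a binary operation it is just iterated exponentiation, easy to write down. A minimal sketch using the standard right-associative convention (the function name is mine):

```python
def tetration(a, n):
    """Compute a↑↑n: a ** (a ** (... ** a)) with n copies of a, right-associative."""
    result = 1
    for _ in range(n):
        result = a ** result
    return result

# tetration(2, 3) = 2 ** (2 ** 2) = 16
```

Right-associativity matters: evaluating left-to-right would give (2 ** 2) ** 2 = 16 here by coincidence, but the two conventions diverge already at n = 4.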
  7. In response to your first question: yes, that's exactly what I mean by a frequentist definition. In response to your second question: I assume by 'my scenario' you mean the molecule-speed vs probability scenario...? In that case, F as absolute frequency (the number of times it produced a certain value) would be, say, 1 or 2, while the number of times it's been tried would be (ideally) infinitely many. And therefore, the relative frequency would be f = (1 or 2)/infinity = 0. => Zero probability does not imply zero occurrences when infinitely many tries are involved. I hope we're talking about the same thing. If not, it's probably my fault, and I apologise. Please point out to me, if you can, where you made this qualification, as it escaped me. It's a very interesting point to me, as I think many misunderstandings when talking odds come from this, as 'random' could mean Laplace (finite sample space), binomial, Poisson, Gaussian, or who knows what...
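A toy illustration of that point, assuming a continuous uniform draw as a stand-in for the molecule speed (the setup is mine): any particular value that comes out did occur, yet for a continuous variable it had probability zero, and its relative frequency over further tries stays at zero.

```python
import random

rng = random.Random(1)
x = rng.random()  # one observed value in [0, 1) -- it did occur

# Relative frequency F/N of re-observing exactly x over N fresh tries:
N = 10**6
F = sum(1 for _ in range(N) if rng.random() == x)
relative_frequency = F / N  # 0 for any realistic finite run, and 0 in the limit
```

So a zero relative frequency in the limit is perfectly compatible with a nonzero (even infinite) absolute count of occurrences, which is the f = F/infinity point above.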
  8. Therefore I should have said 'at T = 3000 K the Xenon molecule is much more random (much less predictable) than at T = 298 K'. I hope I didn't make my argument completely unintelligible. Sorry. I sometimes think I may well have been misdiagnosed as 'cognitively normal' when I may well belong on the 'cognitively-exceptional' spectrum. For some uncanny reason, I tend to express things the opposite way I mean to.
  9. Yes. Context means a lot for us humans. You were right to say it's not a tool for the masses. I've recently thought of this metaphor: AI, if used properly, should become some kind of pseudopodium stemming from our intelligence, not a prosthesis for it. Unfortunately, we see too many examples of the latter, and too few of the former.
  10. Yes, but simply declaring 'lack of information' doesn't determine the probability distribution, does it? Did you happen to take a look at Bertrand's circle paradox? In the case you provide, a relatively simple change of variables to a certain s = f(pi) renders the probability distribution deterministic in s. If you're not given this information, every digit has equal probability of occurring (as far as we know, because nobody can decode pi in terms of digits), so it's a good generator for the prescription 'equal probability for every digit from 0 to 9'. I draw your attention to the fact that 'equal probability for every digit from 0 to 9' is just one way of defining 'random' in this context. Yes, but (I insist) 'random' by itself doesn't mean much. Here's the distribution of probabilities for the speed of a Xenon molecule at two temperatures, T = 298 K and T = 3000 K. Both are random, and yet at T = 3000 K the Xenon-molecule speed is much less random (much more predictable) than at T = 298 K: I should have said 'much more random'. Sorry.
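The two curves referred to there (the plot itself didn't survive the copy) are Maxwell-Boltzmann speed distributions. A sketch of how the peak shifts and the spread grows with temperature for Xenon, using standard constants (the function names are mine):

```python
import math

K_B = 1.380649e-23                    # Boltzmann constant, J/K
M_XE = 131.29e-3 / 6.02214076e23      # mass of one Xe atom, kg

def maxwell_speed_pdf(v, T, m=M_XE):
    """Maxwell-Boltzmann speed distribution f(v) at temperature T (kelvin)."""
    a = m / (2 * math.pi * K_B * T)
    return 4 * math.pi * a**1.5 * v**2 * math.exp(-m * v**2 / (2 * K_B * T))

def most_probable_speed(T, m=M_XE):
    """Peak of f(v): v_p = sqrt(2 k_B T / m)."""
    return math.sqrt(2 * K_B * T / m)

# As T rises the peak moves to higher speeds and the curve flattens:
# at 3000 K the speed is spread over a wider range than at 298 K,
# hence less predictable -- the 'much more random' correction above.
```

Since v_p scales as sqrt(T) and the peak height as 1/sqrt(T), the hotter distribution is both shifted and flattened, which is exactly the 'more random' sense intended.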
  11. You have to work a lot on your prompt, and diagnose mistakes like these to rephrase your next prompt. And I would add, never venture into concepts that you know even the experts have not reached an agreement on. Machines lack context, while we are context machines. A simple question like, 'how old are you?' could be understood to mean 'how long have you as an individual been around?' or 'how long have you humans been around?'
  12. The infinite-monkeys setup is a metaphor to illustrate the arguably paradoxical nature of probability. In that case, you take an extremely unlikely event and flood your laboratory with attempts to obtain a successful outcome. What paradox does it try to illustrate? That, in the limit, even an event with probability = 0 is possible if there is a continuum of outcomes accessible. The metaphor makes the point clear even though no matter how big a number of monkeys, they will never produce a continuum of outcomes (books written at random). But that has little to do with what I was trying to argue. Namely: that the word 'random' doesn't necessarily mean something precise in a number of cases.
  13. Maybe so, but not by a humorous observation on the nature of our expectations. 'Everything that can go wrong will go wrong' is no probabilistic law. Starting with: it's manifestly false. The nature of our expectations is quite irrelevant to the laws of probability anyway...
  14. Unfortunately, no. Murphy's law is not an actual law of probability, but a humorous observation on the nature of our expectations.
