Genady

Everything posted by Genady

  1. By definition, logical OR is inclusive, i.e., it is true if either part is true, regardless of the other. An exclusive OR, which does not allow both parts to be true together, is usually written as XOR (see the truth-table sketch after this list).
  2. The developers of ChatGPT acknowledge these limitations of the system: (ChatGPT: Optimizing Language Models for Dialogue (openai.com))
  3. This article lists "best" uses for ChatGPT, and the last one is similar to what you did, I think. It also links to another article, about ChatGPT's limitations. The 5 Best Uses (So Far) for ChatGPT's AI Chatbot (cnet.com)
  4. I like that saying. Don't remember seeing it before and will use it often. Thank you, @swansont. Any known source?
  5. Correction: it is rather about 45 billion light years.
  6. The average distance a (general) particle travels, about 14 billion light years.
  7. One reason is that the moving parts of a device accelerate, thus creating internal tensions.
  8. Yes, I have. The answers differed in wording, not in content.
  9. In my opinion, this "importance" function does not have a global maximum.
  10. We can conclude that this is a generic feature of science forums because this guy doesn't know anything about scienceforums.net specifically:
  11. I don't think it is normal or common to everybody. Perhaps it is not rare either, as a list of possible disorders with such symptoms comes to mind. As an example, see Auditory Processing Disorder.
  12. Not all is bad in the ChatGPT world. Look at this:
  13. @joigus I guess that the crucial difference between a human's experience and ChatGPT's is in their context: the latter is an experience of language, while the former is an experience of language-in-real-life. For example, we easily visualize a daughter and her mother, and in this mental picture the mother is clearly older than the daughter. ChatGPT, by contrast, knows only how age comparisons appear in texts. Yes, you're right: the red box denotes what I say and the green one denotes what the AI says. No, sometimes it says that it cannot answer, with some explanation of why.
  14. Yes, @joigus, the experience, i.e., the training-statistics emphasis of your hypothesis, seems to me the right way to analyze this behavior. It is the following specification that looks unsupported:
  15. @joigus, it doesn't look like a result of logical assumptions, because on one hand, it derives truth from the question itself in this example: and on the other hand, it is incapable of a simple syllogism in this example:
  16. @studiot, there are thousands of articles about ChatGPT, here is one from the horse's mouth: ChatGPT: Optimizing Language Models for Dialogue (openai.com)
  17. I like this guess, @joigus. Here is a little evidence supporting it:
  18. Also, we should not assume that A and B are necessarily persons. If they are, say, cells or some asexually reproducing organisms, then B is the only answer. Anyway, by Occam's principle, with the data given, B is the best answer, isn't it? Update: with the follow-up questions, this AI becomes ridiculous and self-contradictory:
  19. I've asked ChatGPT a question and got an answer, which is correct, but ... Here it is: Why doesn't it consider B herself?
  20. If this discovery is confirmed, then structures in our universe are shaped akin to the wire black corals:
  21. I know how the magnetic field was set on Earth when the other entangled electron was measured 'up'. I want to set the magnetic field here (M87) so that the electron here will definitely measure 'down'. If 'here' were in the same lab, I would set the magnetic field just parallel to the other.
  22. So far, I don't see any conflict between science and philosophy. Certainly, philosophy can help scientists. As well as music and sport.
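
A minimal Python sketch (my own illustration, not taken from any post above) of the inclusive-OR vs. XOR distinction mentioned in item 1: the two operators agree everywhere except when both operands are true.

    # Print the truth tables for inclusive OR and exclusive OR (XOR).
    for a in (False, True):
        for b in (False, True):
            inclusive = a or b   # True when at least one operand is True
            exclusive = a != b   # True when exactly one operand is True
            print(f"a={a!s:5} b={b!s:5}  OR={inclusive!s:5}  XOR={exclusive!s:5}")

Running it shows the tables differ only in the a=True, b=True row, where OR is True and XOR is False.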
