
In light of the wave of novel ideas we’re getting, most likely fueled by AI, I think I/we have to jump in more quickly to demand specific predictions/falsifiability and math where appropriate. We’re getting walls o’ text that are pretty much all blather and responses are more of the same.

39 minutes ago, swansont said:

In light of the wave of novel ideas we’re getting, most likely fueled by AI, I think I/we have to jump in more quickly to demand specific predictions/falsifiability and math where appropriate. We’re getting walls o’ text that are pretty much all blather and responses are more of the same.

Amen to that. It's clear these walls of text are not written by the poster; they are very tedious to read and generally disguise rather poor ideas. Lipstick on a pig, in many cases.

And if they can't provide these basics, they at least need to listen to criticism. Most of the references cited are bogus in my experience. If the poster hasn't bothered to check them, I think that's posting in bad faith. I've been rather stunned that nobody bothers to see if the LLM cited anything meaningful about their idea. The OPs should be excited to read mainstream material that supports them, but few seem to know what's in those citations.

7 minutes ago, Phi for All said:

And if they can't provide these basics, they at least need to listen to criticism. Most of the references cited are bogus in my experience. If the poster hasn't bothered to check them, I think that's posting in bad faith. I've been rather stunned that nobody bothers to see if the LLM cited anything meaningful about their idea. The OPs should be excited to read mainstream material that supports them, but few seem to know what's in those citations.

I agree. Anyone citing a document should have read at least the abstract. If not, they can’t know whether it supports their argument or not, so it’s just a bad faith bluff. That it has been put forward as relevant by an LLM is not good enough.

Maybe indeed checking citations is a good way to smoke out botshit.

I’m chary of the idea of insisting on maths, though. That works for a mathematical science like physics but would not be appropriate for biology or geology, say.

1 hour ago, exchemist said:

I’m chary of the idea of insisting on maths, though. That works for a mathematical science like physics but would not be appropriate for biology or geology, say.

As swansont said, where appropriate the math should be an easy way to support an idea, but there are other ways to model. Unfortunately, a lot of folks want us exposed to the whole idea at once so we take it in the way they imagine it. They don't like it when we take small bites and chew thoughtfully before replying, but that's really the only intellectually honest way to discuss ideas like this. Nobody wants to continue to scale the wall of text once they find flaws with the first few bricks. Fix these, please, and then we'll continue, thanks very much.

2 hours ago, Phi for All said:

And if they can't provide these basics, they at least need to listen to criticism.

Well, AI is frequently used to avoid the arduous task of thinking. If they were able to listen and respond to criticism, they likely wouldn't blindly copy the LLM output in the first place. The best you can hope for is that they enter your criticism into their chat and then paste whatever abomination of a response it generates. And of course, if the idea is to outsource the thinking and reading, it follows that the citations are not going to be read, either.

The whole approach takes away the joy of discussing and arguing, and it simply becomes a bad-faith performance for its own sake. In a broader sense, I am not sure whether this might eventually be the end of online discussion fora. What is the point, if sooner or later you could have the exact same discussion with a chatbot?

It might be a Luddite way of thinking, but it seems to me that the new technologies almost make it necessary to go back to face-to-face discussion to maintain the human connection.
