Is AI making us luddites?

Featured Replies

  • Author
15 hours ago, MigL said:

I'm not a Luddite; I was building 8-bit computers in 1979, programming Z80 assembly, faithfully reading BYTE magazine (and Steve Ciarcia's hardware column) along with Microprocessor Report in the university library, and waiting for the micro-computer revolution, which came 15 years later.

I had a discussion with younger students, and one of the things they noted is that they feel unequipped to navigate the whole social media and now AI landscape, as they lack the knowledge gained by folks who grew up in the "before-times".

6 hours ago, dimreepr said:

I said we'll lose students as AI improves, so the pathways will reduce, but the quality of students will increase because they will stand out, thinking beyond the typical AI answer.

What is happening now is the opposite: the quality of students goes down, while the number of folks going through the system increases. Few, if any, imagine being able to outperform AI.

On 2/20/2026 at 5:59 AM, Markus Hanke said:

I think the best case scenario would be that it strengthens one’s ability to ask the right questions.

I cannot stress how important this is in what concerns both pedagogy and scientific methodology.

There are very interesting comments on societal and economic derivations, but I'll make no comment on that because I don't feel competent enough.

AI should, and will, be included in education at some point. It's simply a matter of time. New standards for exams will have to be put in place. The further the field develops (and it will), the truer the previous two statements will become.

Taking up suggestions from others, one possibility (among many) would be something like:

a) Verbal (oral or written) examination to probe the student's mind and simply see whether they've gained a reasonable grasp of the subject

b) Exploration with the AI tool to see that they are actually able to formulate significant questions, clarify context, highlight nuances, and follow up on answers by refining the prompt

c) Presentation of results, again in a verbal (oral or written) format

Needless to say, the worst-case scenario would be to start using AI as some kind of oracle, which too many have been doing for some time now.

Edited by joigus
minor correction

  • Author
13 minutes ago, joigus said:

AI should, and will, be included in education at some point. It's simply a matter of time. New standards for exams will have to be put in place. The further the field develops (and it will), the truer the previous two statements will become.

They are already part of curricula. However, I do feel that there is a disconnect between how some educators, and especially administrators, frame it and how it affects learning in practice. Essentially, there is an emphasis on academic misconduct and plagiarism to counter entirely AI-generated work, alongside the promotion of beneficial uses. Yet quite clearly, we can see that this has had little (positive) impact.

15 minutes ago, joigus said:

a) Verbal (oral or written) examination to probe the student's mind and simply see whether they've gained a reasonable grasp of the subject

That is already the case, though oral examination is problematic for large classes and usually draws a high level of complaints regarding subjectivity. Furthermore, there is a strong administrative push to let students pass, which is a long-standing problem. Essentially, if too many students fail, the assumption is that the prof is at fault rather than that learning abilities have declined. This has led to at least a decade of grade inflation.

17 minutes ago, joigus said:

b) Exploration with the AI tool to see that they are actually able to formulate significant questions, clarify context, highlight nuances, and follow up on answers by refining the prompt

18 minutes ago, joigus said:

c) Presentation of results, again in a verbal (oral or written) format

These are already being done, but frequently, if you probe for more details (see a)), massive gaps appear. Together with a system that disincentivizes failing students (largely because of tuition) and the fact that students fail to build basic skills, there is not a lot of material to work with.

The easiest way for educators is to go with the flow, and we are already seeing that in high schools and increasingly at universities. Thus, the overall issue here is how we disincentivize the easy ways of using AI and promote the better ones. Students frequently do not see the benefit of taking valuable time away from social media to do homework or exercises.

19 minutes ago, CharonY said:

They are already part of curricula. However, I do feel that there is a disconnect between how some educators, and especially administrators, frame it and how it affects learning in practice. Essentially, there is an emphasis on academic misconduct and plagiarism to counter entirely AI-generated work, alongside the promotion of beneficial uses. Yet quite clearly, we can see that this has had little (positive) impact.

Let me concentrate on this point, because somehow it's the closest to my heart. An important aspect of scientific endeavour consists of (or at least implies) interacting with other minds. I think putting AI to good use would entail having the student interact with another working mind. It is conceivable (and very understandable) that educators will need to develop the necessary criteria to extract pedagogical benefits from this. IOW: I want to see your interaction with the automaton mind.

A. N. Whitehead (as reflected in my signature) once wrote: "Civilisation advances by extending the number of important operations which we can perform without thinking about them". This goes to show that the problem in a general context is not new. The question we face now is a new one: What about the important operation being thinking itself? Can we perform thinking without thinking about thinking? I think we can't. We must think about this new machine thinking, and refine the criteria. That's all. And nobody says it's easy.

  • Author
1 hour ago, joigus said:

An important aspect of scientific endeavour consists of (or at least implies) interacting with other minds. I think putting AI to good use would entail having the student interact with another working mind. It is conceivable (and very understandable) that educators will need to develop the necessary criteria to extract pedagogical benefits from this. IOW: I want to see your interaction with the automaton mind.

An issue is that AI is not a human mind. It does not think like one and it mostly simulates. Avi Loeb has dubbed it "Alien Intelligence" to make the distinction.

I agree with the majority of the posts here. AI has its uses, but clearly some groups are weaponizing it to destroy society.

And the data centers: the only reason I can think of why they need to be built is to control Trump's missile defense. Maybe it will also work on asteroids.

It is like the irony in Mutual Assured Destruction, where more bombs make us safer. Let's have a computer that learns (rips off your intellectual property) and can make thousands of copies to your one, to ensure you work more productively. Your job is more productive, and the more AI can do, the more job security you have.

4 hours ago, joigus said:

I cannot stress how important this is in what concerns both pedagogy and scientific methodology.

The following may be an unpopular opinion, but I’ll say it anyway.

When I went to school, we spent the first five years of math education doing pretty much nothing else but pen-on-paper arithmetic. Addition, subtraction, multiplication, division…over and over and over again, with increasingly large numbers and more decimal points.

Since I left school 30+ years ago, never even once was I in a situation where I in fact had to do pen-and-paper arithmetic, at least to the best of my recollection. I'm unsure if now I even remember how to do it. The reality is that we live in the Information Age, and it's a skill that's basically never needed anymore. Of course one needs to have an understanding of what those operations mean, but being proficient in working out 6537.45/765.44 by hand on paper has kind of lost its relevance, IMHO. I think it's enough to spend at most a year on this.

On the other hand, had we gone further on the upper end, beyond single-variable calculus, perhaps into differential equations, calculus on manifolds etc, it would have been very helpful.

Just my personal opinion. You can crucify me for it :)

16 hours ago, CharonY said:

What is happening now is the opposite: the quality of students goes down, while the number of folks going through the system increases. Few, if any, imagine being able to outperform AI.

Assuming we're ruling out sentience, a true AI would create a society that is completely free to think; Maslow's hierarchy will be met by the machines, including levelling the playing field for plebs like me. Dyslexia means text is effectively my second language; I love predictive text because it even tells me where to put an apostrophe, to convey meaning in a language I find natural orally.

The definition of "student", in the context of a working AGI, will be a matter for future etymology.

We don't need to outperform AI, we just need to understand how and when to apply the brakes and ride the wave.

9 hours ago, CharonY said:

An issue is that AI is not a human mind. It does not think like one and it mostly simulates. Avi Loeb has dubbed it "Alien Intelligence" to make the distinction.

Agreed. And a safe rule of thumb might be that it only works efficiently in simulating when it follows well-trodden paths, which would make it suitable for education if used properly.

7 hours ago, Markus Hanke said:

[...] On the other hand, had we gone further on the upper end, beyond single-variable calculus, perhaps into differential equations, calculus on manifolds etc, it would have been very helpful.

I know exactly what you mean. I remember being taught to use tables of logarithms... yuck!

Some contact with these old tools might be interesting: knowing what they are and how the craft got underway. But dwelling on such techniques is a bit like insisting on learning to cook with bifaces and grinding stones. Getting an intuition for what numbers are is what's important, but when you learn about p-adic numbers, just to take one example, you realise numbers are not the sequences of digits your teachers taught you about, but something much more abstract.

  • Author
12 hours ago, Markus Hanke said:

The following may be an unpopular opinion, but I’ll say it anyway.

When I went to school, we spent the first five years of math education doing pretty much nothing else but pen-on-paper arithmetic. Addition, subtraction, multiplication, division…over and over and over again, with increasingly large numbers and more decimal points.

Since I left school 30+ years ago, never even once was I in a situation where I in fact had to do pen-and-paper arithmetic, at least to the best of my recollection. I'm unsure if now I even remember how to do it. The reality is that we live in the Information Age, and it's a skill that's basically never needed anymore. Of course one needs to have an understanding of what those operations mean, but being proficient in working out 6537.45/765.44 by hand on paper has kind of lost its relevance, IMHO. I think it's enough to spend at most a year on this.

On the other hand, had we gone further on the upper end, beyond single-variable calculus, perhaps into differential equations, calculus on manifolds etc, it would have been very helpful.

Just my personal opinion. You can crucify me for it :)

I am not too worried about specific exercises or skills. What worries me most is the decline in fundamentals, which to me ultimately boils down to reading or, perhaps in a more abstract sense, information comprehension. In my mind, other media (video, audio) are so far not as well suited to transmitting complex information. The only exception I can think of is oral traditions. However, the cultures that used them had specialized Knowledge Keepers with highly focused training in retaining, contextualizing and transmitting information; the opposite of how such media are used in modern times.

1 hour ago, CharonY said:

What worries me most is the decline in fundamentals, which to me ultimately boils down to reading or, perhaps in a more abstract sense, information comprehension.

I think this has been said many times in these forums (often by you 🙂), and repetition doesn't hurt. My own experience, at work and with younger relatives and acquaintances, is very much that what's lost is spending hours with one book or other source, reflecting carefully on its meanings and so on. A lot of real knowledge can't be found in a web soundbite or wiki entry or AI summary, and cannot be absorbed in some intuitive Augenblick. The other problem arising from so much information, a virtual firehose pointed at us, is not having the techniques for filtering out what's important; filtering is such an important aspect of comprehension. (If one reads a long novel, for example, there comes a point where you realize how to focus on who the characters really are and have less need to recall every setting description or minor plot point.)

Edited by TheVat
Fixt

  • Author
5 hours ago, TheVat said:

I think this has been said many times in these forums (often by you 🙂), and repetition doesn't hurt. My own experience, at work and with younger relatives and acquaintances, is very much that what's lost is spending hours with one book or other source, reflecting carefully on its meanings and so on. A lot of real knowledge can't be found in a web soundbite or wiki entry or AI summary, and cannot be absorbed in some intuitive Augenblick. The other problem arising from so much information, a virtual firehose pointed at us, is not having the techniques for filtering out what's important; filtering is such an important aspect of comprehension. (If one reads a long novel, for example, there comes a point where you realize how to focus on who the characters really are and have less need to recall every setting description or minor plot point.)

I think one of the issues you're touching on is that getting to the point of understanding often requires boring repetition. However, we have done everything to eliminate even a second of boredom, and thus these boring tasks are becoming unendurable for many.

I have students who I would consider decently intelligent, and they really do work hard in terms of investing time. But by the time they come to university, many basic abilities have not been developed, and things that would have been routine even 15 years ago require a massive time investment on their part, often resulting in immense stress, which many are also not equipped to deal with.

The number of folks showing burnout symptoms just from regular coursework is far too high. Little wonder, though, that the relief AI offers is so attractive.

21 hours ago, Markus Hanke said:

The reality is that we live in the Information Age, and it’s a skill that’s basically never needed anymore.

The irony is that 'manual' laborious jobs cannot be replaced by AI, while information processing jobs are already being replaced by relatively 'primitive' AI.
A lot of young people have been lied to.

2 hours ago, MigL said:

The irony is that 'manual' laborious jobs cannot be replaced by AI

I'm not so sure about this, actually. If you equip a properly designed robot with a suitably trained AI or AGI, I think this is precisely what will happen. It's just a matter of time. In Japan they already have restaurants entirely staffed by robotic waiters.

20 hours ago, CharonY said:

I am not too worried about specific exercises, or skills. What worries we most is the decline in fundamentals, which to me ultimately boils down to reading, or perhaps in a more abstract sense, information comprehension. In my mind, so far other media (video, audio) are not as well suited to transmit complex information. The only exception I can think of are oral traditions. However, cultures who used those have specialized Knowledge Keepers with highly focused training to retaining, contextualizing and transmitting information. It would be the opposite of how it is used in modern times.

The fundamentals in this context will change, but one thing that history teaches us is that some humans will always rage against the machine/societal norms with a new line of thinking that contradicts the accepted fundamentals.

Perhaps the rise of AI fakes will be what teaches the next generation to think critically and rise above the average.

It strikes me that this question runs parallel to the 'was Marx right' question.

Is AI to blame? The movie Idiocracy was made long before AI.

4 minutes ago, Genady said:

Is AI to blame? The movie Idiocracy was made long before AI.

On a related note, my concern when I was a lot younger was that knowledge would be "cul-de-sac'd" into specialities, so that individually and as a group we would lose ourselves in the areas we were proficient in, and no one would have a decent overview of the knowledge we had gained about the world around us.

It feels as if this concern has either become hopelessly outdated or morphed into this monster where 99% of our critical faculties seem to be at risk of being outsourced to some kind of reckoning machine that we are "free" to rubberstamp (if the oligarchs have not rubberstamped it before us).

Not sure if that makes me a Luddite.

13 hours ago, CharonY said:

I think one of the issues you're touching on is that getting to the point of understanding often requires boring repetition.

I think @TheVat was referring to repetition of your principle of insisting on understanding the underlying concepts. I favour the use of repetition too, to hammer home particular ideas, or even just sequences of words (like the title of an influential work, the name of the author, dates, etc.) that are central to the development of the topic. The reason I say this is that all of us also have an automaton inside of us, the hippocampus, that does things for us without us ever thinking about them.

In fact, I favour a two-pronged system: 1) synthesising the main ideas ("filtering", as TheVat was saying --> writing an outline of the topic at hand, which requires understanding) and 2) automating some items (memorising names, dates, difficult-to-spell-and-remember new words).

The use of AI (or AGI) should come later, and should be presented from the start as a dialogue, with questions such as, "How do you know what the machine told you is actually correct?"

  • Author
12 hours ago, Markus Hanke said:

I’m not so sure about this actually. If you equip a properly designed robot with a suitably trained AI or AGI , I think this is precisely what will happen. It’s just a matter of time. In Japan they already have restaurants entirely staffed by robotic waiters.

I am not sure what the current status is, but interestingly there were some cafes with robots that were remotely controlled, in part by elderly folks.

Only somewhat related, but I have read somewhere that the Chinese strategy for AI is very different from that of the US. The latter aims to generate AGI on the assumption that it would create an enormous strategic advantage in almost all realms (how that would work is a bit unclear to me; most of the articles I have read are a bit like sci-fi mixed with handwaving and not terribly concrete). The Chinese strategy, on the other hand, seems to be more about implementing AI in somewhat more specialized tasks, such as improving fabrication (they already have the most automated workforce, from what I understand).

4 hours ago, dimreepr said:

Perhaps the rise of AI fakes will be what teaches the next generation to think critically and rise above the average.

If that happens, it will require yet another generation. The two youngest generations right now are struggling. As I mentioned before, the issue many of them are seeing is that they have only a vague concept of life before the internet, and social media has become the single most dominant element in their lives. I have likened that to the fish-in-water issue, where younger folks struggle to define their position relative to the internet, as its absence has become inconceivable. The fear they have now is that AI is going to take over the internet, pushing them even out of those spaces.

19 hours ago, CharonY said:

If that happens, it will require yet another generation. The two youngest generations right now are struggling. As I mentioned before, the issue many of them are seeing is that they have only a vague concept of life before the internet, and social media has become the single most dominant element in their lives. I have likened that to the fish-in-water issue, where younger folks struggle to define their position relative to the internet, as its absence has become inconceivable. The fear they have now is that AI is going to take over the internet, pushing them even out of those spaces.

The media is muddying the waters, though; it only takes notice of the negative effects, like the guy who formed an intimate relationship with a relatively primitive AI chatbot, which reinforced his delusion about assassinating Queen Elizabeth II with a crossbow.

For every example of that type of extreme human behaviour, there will be an example of an extreme at the other end of the spectrum.

Most of us don't need a rampant influence to be a luddite... 😉

  • Author
2 hours ago, dimreepr said:

For every example of that type of extreme human behaviour, there will be an example of an extreme at the other end of the spectrum.

Well, this is how the tech industry responds:


Einstein - AI Education Companion

Einstein logs into Canvas and does your homework automatically. He has his own computer — he can watch lectures, read essays, write papers, and participate in discussions.

I will also note that what I am talking about is not media reports. I am talking about the experience of myself and colleagues over the last decades (social media), the acceleration of trends over the pandemic, and the rapid adoption of AI by students to get out of work. While it is possible that things will level out at some point, we are still in the midst of figuring out what is going on in the first place. Even more worrisome, I don't think we have even started to address the issues in education that began among younger folks about a decade ago. I.e., we are already a decade behind, and now things are accelerating in the wrong direction.

For many years I have been wondering whether this is just a normal generational change in attitudes. There are pathways where ways of learning might change but the final output (e.g. academic work) still largely maintains its quality. I will also say that the simple fact that education has been expanded will result in some decline in quality, but that should level out at some point. However, all indicators point to a sustained decline ranging from basic skills to mental health. It has come to the point where the benefits and purpose of academia become questionable. And that trend is not entirely AI-driven (but will likely accelerate because of it).

Underlying everything, it seems that Gen Z is the first generation less capable of adapting to new tech than the previous generations, and at this point I think the data suggest that something different is happening to our brains than what previous technologies did.

I also think that this development is happening because we keep imagining how great these tools are going to be for productivity and are not sufficiently clear about the human side of the equation. Most importantly, I don't think that leadership, whether in academia, business/tech, or government, sees the value of proper, deep development of human intellect; they are only too happy to outsource that task to boost the economy.

Or I might just be grumpy because I will have to grade exams that are barely intelligible and will have to curve so hard that the space-time continuum becomes infinite.

1 hour ago, CharonY said:

Underlying everything, it seems that Gen Z is the first generation less capable of adapting to new tech than the previous generations, and at this point I think the data suggest that something different is happening to our brains than what previous technologies did.

During my chem eng degree course in the '70s, I had a similar exposure to computers as @MigL (mainframe FORTRAN IV, leading to in-house development of mini- and microcomputer applications in later years). The combined understanding of the strengths and weaknesses of software applications and 'old-fashioned' hand calculation of standard chem eng problems earned me a lot of work in the '90s with major engineering design companies, helping develop and evaluate engineering software applications that, on the face of it, were intended to put chemical engineers out of a job.

I doubt if anybody has routinely done the long-winded iterative stuff like distillation column tray calculations manually for a quarter of a century now, but software whether AI or conventional remains ill-equipped to deal with novelty or unpredictability, the areas where experienced chemical engineers really earn their paychecks, and I don't see the scope of those reducing in the foreseeable future.

Academia, particularly that of the developing nations, has become highly proficient at churning out low cost CAD monkeys (pray forgive me!) over the last few decades, and perhaps that's no bad thing. But there are much more rewarding jobs available for those who are prepared to knuckle down and gain a thorough understanding of the fundamentals.

In essence, that division between steady Eddies and bright sparks has always existed. And both are necessary; they have a symbiotic relationship.

19 hours ago, CharonY said:

Well, this is how the tech industry responds:


Einstein - AI Education Companion

Einstein logs into Canvas and does your homework automatically. He has his own computer — he can watch lectures, read essays, write papers, and participate in discussions.

I will also note that what I am talking about is not media reports. I am talking about the experience of myself and colleagues over the last decades (social media), the acceleration of trends over the pandemic, and the rapid adoption of AI by students to get out of work. While it is possible that things will level out at some point, we are still in the midst of figuring out what is going on in the first place. Even more worrisome, I don't think we have even started to address the issues in education that began among younger folks about a decade ago. I.e., we are already a decade behind, and now things are accelerating in the wrong direction.

For many years I have been wondering whether this is just a normal generational change in attitudes. There are pathways where ways of learning might change but the final output (e.g. academic work) still largely maintains its quality. I will also say that the simple fact that education has been expanded will result in some decline in quality, but that should level out at some point. However, all indicators point to a sustained decline ranging from basic skills to mental health. It has come to the point where the benefits and purpose of academia become questionable. And that trend is not entirely AI-driven (but will likely accelerate because of it).

Underlying everything, it seems that Gen Z is the first generation less capable of adapting to new tech than the previous generations, and at this point I think the data suggest that something different is happening to our brains than what previous technologies did.

I also think that this development is happening because we keep imagining how great these tools are going to be for productivity and are not sufficiently clear about the human side of the equation. Most importantly, I don't think that leadership, whether in academia, business/tech, or government, sees the value of proper, deep development of human intellect; they are only too happy to outsource that task to boost the economy.

Or I might just be grumpy because I will have to grade exams that are barely intelligible and will have to curve so hard that the space-time continuum becomes infinite.

I largely agree; it's a scary future. We are either sitting on the precipice of a brave new world, where all the boring tasks are done for us and we're free to think, or an infinitely sustainable autocracy with AI as the kingmaker, where no one is allowed to think.

I honestly don't know which would be preferable, when the dust settles.

Soma tastes so good.

  • Author
23 hours ago, sethoflagos said:

During my chem eng degree course in the '70s, I had a similar exposure to computers as @MigL (mainframe FORTRAN IV, leading to in-house development of mini- and microcomputer applications in later years). The combined understanding of the strengths and weaknesses of software applications and 'old-fashioned' hand calculation of standard chem eng problems earned me a lot of work in the '90s with major engineering design companies, helping develop and evaluate engineering software applications that, on the face of it, were intended to put chemical engineers out of a job.

I doubt if anybody has routinely done the long-winded iterative stuff like distillation column tray calculations manually for a quarter of a century now, but software whether AI or conventional remains ill-equipped to deal with novelty or unpredictability, the areas where experienced chemical engineers really earn their paychecks, and I don't see the scope of those reducing in the foreseeable future.

Academia, particularly that of the developing nations, has become highly proficient at churning out low cost CAD monkeys (pray forgive me!) over the last few decades, and perhaps that's no bad thing. But there are much more rewarding jobs available for those who are prepared to knuckle down and gain a thorough understanding of the fundamentals.

In essence, that division between steady Eddies and bright sparks has always existed. And both are necessary; they have a symbiotic relationship.

I think the difference here is that traditionally, new developments or technologies have made certain tasks obsolete, which often is not a bad thing. However, one of the stated goals of AGI and similar movements is to make human thinking obsolete.

1 hour ago, CharonY said:

However, one of the stated goals of AGI and similar movements is to make human thinking obsolete.

In my experience (albeit limited to ANI), the reality is merely a shift in boundaries. If anything, the coastline - the boundary where human inspiration is necessary - grows larger as established knowledge expands and new operations come within paddling distance of the beach.

On a philosophical level, the universe is driven by a hugely diverse array of PDEs, most of which are non-analytic in their practical realisation, and each of which is subject to a (literally!) infinite variety of boundary conditions and consequent infinity of plausible solutions. Gathering the experimental data necessary to confirm each advance is not a trivial exercise. I think we're safe for a few millennia on that count alone.

This afternoon while doing a bit of background study on another thread, I asked one of Google's demonic offspring to back calculate the 1D wave equation from the general solution as a check. It proceeded to use the shorthand symbol f'' for the second partial derivative of f(r,t) irrespective of whether it was with respect to the spatial or time variable, and thus convinced itself that the two quantities must be equal. (They don't even have the same dimensions).

Wouldn't accept that it was wrong. And if one lacks the capability to question one's results, then how can one possibly advance?
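The chatbot's error above is easy to check symbolically. The sketch below is an illustration only (it assumes SymPy is available and is not a reconstruction of the actual session): for the d'Alembert solution f(x, t) = g(x - ct) + h(x + ct), the second space and time derivatives satisfy the wave equation but differ from each other by a factor of c^2, so writing both as f'' conflates two quantities with different dimensions.

```python
import sympy as sp

x, t, c = sp.symbols('x t c', positive=True)
g, h = sp.Function('g'), sp.Function('h')

# d'Alembert's general solution of the 1D wave equation
f = g(x - c*t) + h(x + c*t)

f_xx = sp.diff(f, x, 2)  # second partial derivative w.r.t. space
f_tt = sp.diff(f, t, 2)  # second partial derivative w.r.t. time

# The wave equation f_tt = c**2 * f_xx holds identically:
assert sp.simplify(f_tt - c**2 * f_xx) == 0

# But f_tt and f_xx are NOT the same quantity; they differ by c**2
# (and hence in dimensions), which abbreviating both as f'' hides:
assert sp.simplify(f_tt - f_xx) != 0
```

The point the assertions make is precisely the one missed in the exchange: the two second derivatives are proportional, not equal, and the proportionality constant c^2 carries dimensions.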
