
I was reading a pdf paper about protons when this little fella popped up

Aiagain1.jpg

Save time be damned. How will anyone ever learn anything serious?

I get this kind of thing at work:

  • People using AI to flesh out their emails.

  • Which they send to other people.

  • Who use AI to summarise those emails.

All seems a bit weird.

1 hour ago, studiot said:

I was reading a pdf paper about protons when this little fella popped up

Aiagain1.jpg

Save time be damned. How will anyone ever learn anything serious?

Welcome to my world. For a while now I don't think any of my students at the undergrad level have been reading the assigned material (and it is getting doubtful for the grads, too).

I can see who is accessing the free open-source books on the course website, and in a class of about 60 I get a single-digit number of folks reading. The books available at the library as part of the course have not been accessed in years.

16 minutes ago, pzkpfw said:

I get this kind of thing at work:

  • People using AI to flesh out their emails.

  • Which they send to other people.

  • Who use AI to summarise those emails.

All seems a bit weird.

Yep, and we have AI writing fake news and there is the proposal to use AI to flag those. We are actively removing humans from all human endeavors.

I heard about this incident this morning (though it happened a few months ago).
In the US, in Baltimore to be exact, they have a program to combat gun violence.
This program uses AI to 'detect' students who may be armed and notifies the police.
As many as 8 police cruisers showed up, all with weapons drawn, threw a 16-year-old kid to the ground, and handcuffed him before realizing he was actually carrying a bag of Doritos (corn chips).

Armed police handcuff teen after AI mistakes crisp packet for gun in US

AI seems to be the new 'lazy' way to get 'easy' results/solutions to problems.
I hope the kid's family sues the sh*t out of the police, the school, and the Board of Education or the City that allowed such a project.

What is this world coming to ...

Edited by MigL

Well, they're just following the example of your president... ;)

The Atlantic
No image preview

The President Who Doesn't Read

Trump’s allergy to the written word and his reliance on oral communication have proven liabilities in office.

(I didn't feel like reading this article either, so I asked ChatGPT to summarize it for me :P)

Wolff quotes economic adviser Gary Cohn writing in an email: “It’s worse than you can imagine … Trump won’t read anything—not one-page memos, not the brief policy papers, nothing. He gets up halfway through meetings with world leaders because he is bored.”

I remember my grandad (b. 1919) frowning when calculators came out and when the UK went metric. Everyone seems to become a Luddite at some point.

10 hours ago, studiot said:

How will anyone ever learn anything serious?

They will read the paper.

42 minutes ago, StringJunky said:

I remember my grandad (b. 1919) frowning when calculators came out and when the UK went metric. Everyone seems to become a Luddite at some point.

And Bach thought the piano would never catch on...

But then, don't you also remember the dotcom bubble? AI will of course have its place, but there is little doubt it is currently being hyped and misapplied.

What makes this hype cycle especially dangerous is that it is a product liable to induce dependency (cf. cigarettes) that is being bolted onto the internet and thereby sold to billions of people by the tech capitalists, willy-nilly, without any regulatory guard rails. I even get an AI answer now every time I use my web browser. I get no say in this, even though AI responses are of zero interest to me and in fact I actively do not want them, because of the damage to the climate caused by their outsized power requirements.

Edited by exchemist

19 minutes ago, exchemist said:

I actively do not want them, because of the damage to the climate caused by their outsized power requirements.

They require this huge power for training, but not for answering queries.

(I skip the LLM responses just as I skip commercial ads.)

46 minutes ago, exchemist said:

because of the damage to the climate caused by their outsized power requirements.

Then instead of demagogy, learn something.

"Microsoft Azure

  • Microsoft contracted about 13.7 GW of renewable energy capacity for its cloud services by the end of 2023, part of its effort to power operations sustainably.

It met its goal of matching 100 % of its electricity use with renewable purchases in 2025 for the first time — meaning it buys enough renewable energy to cover all its energy consumption, even as it scales up cloud and AI infrastructure.

Microsoft aims by 2030 to match 100 % of its consumption “24/7 with carbon‑free energy” and become carbon negative, not just carbon neutral."

"Amazon Web Services (AWS)

  • AWS reached 100 % renewable energy matching for its operations in 2023, ahead of its original 2025 target. This means its total purchase of renewable energy equals its electricity consumption.

AWS still uses renewable credits and purchases on grids far from actual usage locations — and industry debate continues about how fully this reflects “clean energy on the ground” at every hour."

"Google Cloud

  • Google has also matched 100 % of its electricity use with renewable energy in recent years (e.g., 2023), making it one of the largest corporate buyers of renewables globally."

ps. Energy does not disappear anywhere; the heat generated by data centers can be used to heat houses, greenhouses, etc.

34 minutes ago, Genady said:

They require this huge power for training, but not for answering queries.

(I skip the LLM responses just as I skip commercial ads.)

Is this really true, though? For instance:

QUOTE

Conventional search engines are the result of decades of optimization. They are designed to process billions of queries every day with incredible efficiency. A typical Google search, for example, is estimated to use between 0.3 and 0.5 watt-hours (Wh) of energy and returns results in milliseconds. These results are drawn from massive, pre-indexed databases of the web, meaning the system doesn’t have to “think” in real time — it simply retrieves the most relevant links based on your keywords.

In contrast, LLMs operate on a completely different model. Instead of fetching pre-written content, an LLM processes your query and then generates a custom answer, word by word, using massive neural networks with billions of parameters. This process, known as inference, is computationally expensive and energy-intensive. Depending on the size of the model and the hardware it runs on, a single LLM query may consume between 2 and 5 watt-hours — sometimes even more. In some comparisons, LLM queries have been found to require up to 100 times more energy than a standard search engine query.

UNQUOTE

From: https://best.baffour.digital/llms-vs-conventional-search-engines-energy-efficiency-requirements-and-use-cases/
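The per-query figures in that quote can be turned into a quick back-of-envelope comparison. This is an illustrative sketch only: the watt-hour ranges are the ones quoted above, not measurements of mine, and real figures vary by model and hardware.

```python
# Back-of-envelope comparison using the per-query energy ranges quoted above.

SEARCH_WH = (0.3, 0.5)   # Wh per conventional search query (quoted range)
LLM_WH = (2.0, 5.0)      # Wh per LLM query (quoted range)

def ratio_range(llm, search):
    """Smallest and largest possible LLM/search energy ratios."""
    return llm[0] / search[1], llm[1] / search[0]

lo, hi = ratio_range(LLM_WH, SEARCH_WH)
print(f"An LLM query uses roughly {lo:.0f}x to {hi:.0f}x a search query")
```

With those quoted ranges the ratio comes out at roughly 4x to 17x per query; the "up to 100 times" figure in the article evidently refers to larger models or less favourable comparisons.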

Edited by exchemist

ps2. By buying LEDs instead of incandescent bulbs you don't save at all; the "wasted" energy from the bulbs heated your house, so you paid less for heating.

1 minute ago, Sensei said:

ps2. By buying LEDs instead of incandescent bulbs you don't save at all; the "wasted" energy from the bulbs heated your house, so you paid less for heating.

Irrelevant.

Just now, Sensei said:

ps2. By buying LEDs instead of incandescent bulbs you don't save at all; the "wasted" energy from the bulbs heated your house, so you paid less for heating.

Not everyone and not all the time needs to heat their house.

I never need it, and neither does anyone else in my country.

2 minutes ago, exchemist said:

Irrelevant.

Nonsense. This is an analysis of what happens to the energy from any electronic device (such as a server in a data center) after it has done its calculations.

The heat from the light bulb or the heat from the server can be used at a later stage.

7 minutes ago, Sensei said:

Then instead of demagogy, learn something.

"Microsoft Azure

  • Microsoft contracted about 13.7 GW of renewable energy capacity for its cloud services by the end of 2023, part of its effort to power operations sustainably.

It met its goal of matching 100 % of its electricity use with renewable purchases in 2025 for the first time — meaning it buys enough renewable energy to cover all its energy consumption, even as it scales up cloud and AI infrastructure.

Microsoft aims by 2030 to match 100 % of its consumption “24/7 with carbon‑free energy” and become carbon negative, not just carbon neutral."

"Amazon Web Services (AWS)

  • AWS reached 100 % renewable energy matching for its operations in 2023, ahead of its original 2025 target. This means its total purchase of renewable energy equals its electricity consumption.

AWS still uses renewable credits and purchases on grids far from actual usage locations — and industry debate continues about how fully this reflects “clean energy on the ground” at every hour."

"Google Cloud

  • Google has also matched 100 % of its electricity use with renewable energy in recent years (e.g., 2023), making it one of the largest corporate buyers of renewables globally."

ps. Energy does not disappear anywhere; the heat generated by data centers can be used to heat houses, greenhouses, etc.

You don't seem to know what demagogy means.

The offsetting trick does not invalidate my point. While these corporations may pat themselves on the back for buying up "green" power, that leaves less to go round for everyone else. In other words, because of the net increase in power demand LLMs create, a lot of other users are forced to source power from fossil fuel sources instead of renewable sources. So the climate suffers - because of LLM usage.

Just now, Sensei said:

Nonsense. This is an analysis of what happens to the energy from any electronic device (such as a server in a data center) after it has done its calculations.

The heat from the light bulb or the heat from the server can be used at a later stage.

Yes we know. And it is irrelevant.

8 minutes ago, exchemist said:

Is this really true, though? For instance:

QUOTE

Conventional search engines are the result of decades of optimization. They are designed to process billions of queries every day with incredible efficiency. A typical Google search, for example, is estimated to use between 0.3 and 0.5 watt-hours (Wh) of energy and returns results in milliseconds. These results are drawn from massive, pre-indexed databases of the web, meaning the system doesn’t have to “think” in real time — it simply retrieves the most relevant links based on your keywords.

In contrast, LLMs operate on a completely different model. Instead of fetching pre-written content, an LLM processes your query and then generates a custom answer, word by word, using massive neural networks with billions of parameters. This process, known as inference, is computationally expensive and energy-intensive. Depending on the size of the model and the hardware it runs on, a single LLM query may consume between 2 and 5 watt-hours — sometimes even more. In some comparisons, LLM queries have been found to require up to 100 times more energy than a standard search engine query.

UNQUOTE

From: https://best.baffour.digital/llms-vs-conventional-search-engines-energy-efficiency-requirements-and-use-cases/

I've learned things like this:

Though the size of the computation for training can be large, the trained neural network can be quite small. In our MNIST example, training the network involves a reasonable amount of computational power to find the optimal values of the parameters. But the network only has 11,935 parameters. It is relatively small. This observation tells us chips containing trained neural networks can be small and cheap. It will be easy to install them into everyday devices.

Bernhardt, Chris. Beautiful Math (p. 185). MIT Press. 2024.
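The 11,935 figure in that quote is consistent with a classic small fully connected MNIST network with one hidden layer of 15 neurons (784 input pixels, 10 output classes) — that architecture is my assumption, not something the quote states, but the layer-by-layer count of weights plus biases lands on exactly that number:

```python
# Parameter count of a small fully connected MNIST network:
# 784 input pixels -> 15 hidden neurons -> 10 output classes.
# Each dense layer has (inputs * outputs) weights plus one bias per output.

def dense_params(n_in, n_out):
    return n_in * n_out + n_out  # weights + biases

total = dense_params(784, 15) + dense_params(15, 10)
print(total)  # 11935, matching the figure quoted from Bernhardt
```

At roughly 12k parameters the trained network fits comfortably in a few tens of kilobytes, which is the point being made: inference on a small trained model is cheap even when training was not.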

Irrelevant (and untrue, as I pointed out) to this thread was your claim that LLMs/data centers have a negative impact on the climate. When each data center has its own solar panels and wind farm, its impact on the climate is zero. So please do not mislead people with your false statements.

Edited by Sensei

25 minutes ago, Sensei said:

ps2. By buying LEDs instead of incandescent bulbs you don't save at all; the "wasted" energy from the bulbs heated your house, so you paid less for heating.

Because electric heat is always cheaper and more environmentally friendly? Because we never want to cool our houses?

33 minutes ago, Sensei said:

Then instead of demagogy, learn something.

"Microsoft Azure

  • Microsoft contracted about 13.7 GW of renewable energy capacity for its cloud services by the end of 2023, part of its effort to power operations sustainably.

...

"Amazon Web Services (AWS)

  • AWS reached 100 % renewable energy matching for its operations in 2023, ahead of its original 2025 target. This means its total purchase of renewable energy equals its electricity consumption.

"Google Cloud

  • Google has also matched 100 % of its electricity use with renewable energy in recent years (e.g., 2023), making it one of the largest corporate buyers of renewables globally."

Cloud storage is not AI.

33 minutes ago, Sensei said:

ps. Energy does not disappear anywhere; the heat generated by data centers can be used to heat houses, greenhouses, etc.

Can be ≠ is

 

15 minutes ago, Sensei said:

Irrelevant (and untrue, as I pointed out) to this thread was your claim that LLMs/data centers have a negative impact on the climate. When each data center has its own solar panels and wind farm, its impact on the climate is zero. So please do not mislead people with your false statements.

When is doing a lot of heavy lifting here. We’re talking about what’s actually happening now with LLMs, and you’re talking about hypotheticals, and shifting the goalposts by bringing datacenters into it.

11 hours ago, pzkpfw said:

I get this kind of thing at work:

  • People using AI to flesh out their emails.

  • Which they send to other people.

  • Who use AI to summarise those emails.

All seems a bit weird.

I agree here. However, between forums such as this, the fediverse, and other places, I feel there are pockets of people who expect, and work to, higher standards.

Learning and knowledge take effort and, in many cases, hard work. If we write at an academic level, we have to cite sources of information, and there are constraints on that (Wikipedia is not an academic source).

Thanks to social media etc., attention spans seem to have got shorter, which is not good when you really want to deep-dive into a topic.

I am not ready to abandon all hope just yet, but do share your sentiment.

Edited by paulsutton
wrote reply in wrong place.

7 minutes ago, swansont said:

Cloud storage is not AI.

ChatGPT does not have its own servers; it runs in the cloud, in third-party data centers.

The LLM mentioned by the OP is Google's, so it runs in Google Cloud data centers.

Edited by Sensei

Just now, Sensei said:

ChatGPT does not have its own servers; it runs in the cloud, in data centers.

The LLM mentioned by the OP is Google's, so it runs in Google Cloud data centers.

The faulty generalization fallacy. One example doesn’t rebut all the others. “It did not rain on Tuesday” does not mean “It rained last week” is false.

8 minutes ago, swansont said:

The faulty generalization fallacy. One example doesn’t rebut all the others. “It did not rain on Tuesday” does not mean “It rained last week” is false.

You always come up with this nonsense when you're out of arguments.

So let's look at the LLM statistics.

ChatGPT 64.5%, Google Gemini 21.5%. 64.5+21.5=86% of the LLM worldwide market.

Grok 3%. 86+3=89%.

Are you all okay?

32 minutes ago, swansont said:

When is doing a lot of heavy lifting here. We’re talking about what’s actually happening now with LLMs, and you’re talking about hypotheticals, and shifting the goalposts by bringing datacenters into it.

I don't understand what you wrote here. I didn't write about anything hypothetical.

A data center is the place where these LLM queries are processed.

1.png

If ChatGPT (64.5% of the world market) runs on Azure, and Microsoft claims that Azure is 100% renewable, that means 64.5% of the world's LLM usage runs on renewable energy.

https://datacenters.microsoft.com/globe/powering-sustainable-transformation/

Same with Google Gemini. So 86% of LLMs are on renewable energy.

Repeat with the smaller models.

It is simply unprofitable to run a data center without renewable energy.

Edited by Sensei

13 hours ago, studiot said:

I was reading a pdf paper about protons when this little fella popped up

Aiagain1.jpg

Save time be damned. How will anyone ever learn anything serious?

We've given birth to a problem child, a precocious little smart arse that's got a lot of growing up to do; maybe it'll be a good teacher, if we guide it down the right path.

But one thing's for sure, we can't kill it now...
