this post was submitted on 09 Apr 2025
-29 points (27.7% liked)

Technology

Advances in AI are making us reconsider what intelligence is and giving us clues to unlocking AI’s full potential.

top 9 comments
[–] Reverendender@sh.itjust.works 32 points 1 week ago (1 children)

No it isn’t. No they aren’t.

[–] db2@lemmy.world 10 points 1 week ago

Couldn't have said it better.

[–] hendrik@palaver.p3x.de 13 points 1 week ago* (last edited 1 week ago) (1 children)

It's a long article, but I'm not sure about the claims. Will we get more efficient computers that work like a brain? I'd say that's sci-fi. Will we get artificial general intelligence? Current LLMs don't look like they're able to fully achieve that. And how would AI learn continuously? That's an entirely unsolved problem at the scale of LLMs. And if we ask whether computer science is a science... why compare it to engineering? I found it to be much more aligned with maths at university level...

I'm not sure. I didn't read the entire essay. It sounds to me like it isn't really based on reality. But LLMs are certainly challenging our definition of intelligence.

Edit: And are the history lessons in the text correct? Why do they say a Turing machine is an imaginary concept (which is correct), then say ENIAC became the first one, but then maybe not? Did we invent binary computation because of reliability issues with vacuum tubes? This is the first time I've read that, and I highly doubt it. The entire text just looks like a fever dream to me.

[–] technocrit@lemmy.dbzer0.com 1 points 1 week ago (1 children)

Why do they say a Turing machine is an imaginary concept (which is correct), then say ENIAC became the first one, but then maybe not?

Thanks for pointing out this hilarious section. A Turing machine is an "imaginary concept" just like any other piece of mathematics. An abacus is also an "imaginary concept". But people can still build them (at least finite versions).

When they start talking about "imaginary concepts", it's pretty clear that the author has no understanding about the relationship between math, science, engineering, etc. That lack of understanding is a necessary prerequisite for writing this kind of article.

[–] hendrik@palaver.p3x.de 1 points 1 week ago* (last edited 1 week ago)

Yes. Plus the Turing machine has an infinite memory tape to write to and read from. That's within the scope of mathematics, but we don't have any infinite tapes in reality. That's why we call it a mathematical model and imaginary... and it's a useful model, but not a real machine. Whereas an abacus can actually be built. But an abacus or a real-world "Turing machine" with a finite tape doesn't teach us a lot about the halting problem and the important theoretical concepts. It wouldn't be such a useful model without those imaginary definitions.
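
For concreteness, here's a minimal Python sketch of the diagonalization argument that makes the infinite, idealized model worth having; the `halts` oracle is hypothetical, which is exactly the point:

```python
# Sketch of the classic halting-problem contradiction.
# 'halts' is a hypothetical oracle that decides whether program(arg) halts;
# the argument shows no such total function can exist for the idealized,
# unbounded machine model -- it says nothing about any finite physical device.

def halts(program, arg):
    """Hypothetical: return True iff program(arg) eventually halts."""
    raise NotImplementedError("assumed only for the sake of contradiction")

def paradox(program):
    # Do the opposite of whatever 'halts' predicts for program run on itself.
    if halts(program, program):
        while True:   # predicted to halt, so loop forever
            pass
    return            # predicted to loop forever, so halt immediately

# paradox(paradox) can neither halt nor run forever without contradicting
# the oracle's answer, so 'halts' cannot exist.
```

None of that reasoning survives if the tape is bounded: a finite machine has only finitely many configurations, so its halting behaviour is, in principle, decidable by enumeration.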

(And I don't really see how someone would confuse that. Knowing what models are, what we use them for, and what maths is, is kind of high-school level science education...)

[–] cecilkorik@lemmy.ca 11 points 1 week ago (2 children)

At least 1% of the money being poured into "AI research" nowadays seems to be spent on spewing these breathless puff pieces everywhere. The other 99% is spent on datacenter costs, probably. I am so excited for the day this bubble finally pops. Just imagine the fire sales on GPUs and rack space. It'll be glorious.

[–] vane@lemmy.world 4 points 1 week ago* (last edited 1 week ago)

Those datacenter GPUs won't be for sale. They'll be destroyed so the corporations can write them off on their taxes. You will pay for them twice.

[–] BrightCandle@lemmy.world 3 points 1 week ago

So many of those GPUs have been crippled for gaming purposes, and there is zero incentive for Nvidia to produce drivers that would let those cards do anything else. Alas, they will just end up in landfill.

[–] MonkderVierte@lemmy.ml 4 points 1 week ago* (last edited 1 week ago)

Insufferable headline.

are compelling us to rethink our understanding of what intelligence truly is.

Oh, there was already a generally agreed-on understanding of it?

Dropped.