this post was submitted on 15 Oct 2025
-18 points (31.2% liked)

Technology

top 10 comments
[–] SnoringEarthworm@sh.itjust.works 14 points 5 days ago* (last edited 5 days ago) (3 children)

In fact, according to The Register, the GPU computing performance of the GB10 chip is roughly equivalent to an RTX 5070. However, the 5070 is limited to 12GB of video memory, which limits the size of AI models that can be run on such a system. With 128GB of unified memory, the DGX Spark can run far larger models, albeit at a slower speed than, say, an RTX 5090 (which ships with 32GB of RAM). For example, to run the larger 120-billion-parameter version of OpenAI's recent gpt-oss language model, you'd need about 80GB of memory, which is far more than you can get in a consumer GPU.
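For a rough sense of where a figure like 80GB comes from, here is a minimal back-of-the-envelope sketch; the bytes-per-parameter and overhead values are illustrative assumptions, not published specs for gpt-oss or the DGX Spark.

```python
# Back-of-the-envelope estimate of the memory needed to run a large LLM locally.
# The bytes-per-parameter and overhead figures are assumptions for illustration.

def model_memory_gb(params_billion: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Approximate memory to hold the weights, plus ~20% (assumed) for the
    KV cache, activations, and runtime buffers."""
    weights_gb = params_billion * bytes_per_param  # billions of params * bytes each = GB
    return weights_gb * overhead

# A 120B-parameter model at roughly 4-bit weights lands near the quoted ~80GB,
# far beyond a 12GB consumer GPU but comfortably inside 128GB of unified memory.
for bits in (4, 8, 16):
    print(f"{bits:>2}-bit weights: ~{model_memory_gb(120, bits / 8):.0f} GB")
```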

Or you could've just made GPUs, and then we'd all be gaming and calling each other shitheads in Valorant instead of - checks notes - literally stealing the water from poor communities.

[–] monogram@feddit.nl 1 points 2 days ago

But this device will be air-cooled. The freshwater argument is a huge problem, but it only applies to hyperscalers and cloud AI.

This could actually be a good way to lower the demand for building more AI server farms.

[–] mctoasterson@reddthat.com 5 points 5 days ago (1 children)

If I had to come up with a steelman argument for small "AI focused" systems like this, I'd say that more development in this space makes the cost of entry cheaper and could eventually starve out big tech garbage like OpenAI/Google/Microsoft.

If everyone who wants to use AI can run queries against a locally hosted open-source model with "good enough" results, that cuts out the big tech douchebags, or at least gives people an option not to participate in their data-collection panopticon ecosystem.
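As a concrete illustration of that local setup, here is a minimal sketch that sends a prompt to a model served on your own machine through Ollama's REST API; the port, endpoint, and model name are assumptions about a typical local install.

```python
# Minimal sketch of "locally processing queries": send a prompt to a model
# served on your own machine via Ollama's HTTP API. The server address,
# endpoint, and model name are assumptions about a typical local setup;
# nothing leaves localhost.
import json
import urllib.request

def ask_local_model(prompt: str, model: str = "gpt-oss:20b") -> str:
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_local_model("Summarize this article in two sentences."))
```

Swap the model name for whatever you've pulled locally; the point is that the query never touches a cloud provider.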

[–] NGram@piefed.ca 3 points 5 days ago

Unfortunately, Nvidia is also big tech, so starving out its (sort of) competitors doesn't help get rid of the douchebags. It actually carries the added risk of handing some of those douchebags a monopoly.

Buying one of those AMD Ryzen AI Max chips actually makes more sense now...

[–] Lembot_0004@discuss.online -3 points 5 days ago (1 children)

You can game with bricks. Or a ball.

And throw away your notes. They are a completely disgraceful waste of paper.

[–] SnoringEarthworm@sh.itjust.works -3 points 5 days ago (1 children)
[–] Engywuck@lemmy.zip 3 points 5 days ago (1 children)

Well, he's right. Most likely you're "wasting" energy and water as well, just in a different manner.

[–] SnoringEarthworm@sh.itjust.works 2 points 5 days ago* (last edited 5 days ago)

The difference is your comment managed to say that without being a dick about it.

[–] MonkderVierte@lemmy.zip 4 points 5 days ago (1 children)

Ok, but can you use it as a PC?