technocrit

joined 1 year ago
[–] technocrit@lemmy.dbzer0.com 36 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

Classic pseudo-science for the modern grifter. Vague definitions, sloppy measurements, extreme bias, wild unsupported predictions, etc.

[–] technocrit@lemmy.dbzer0.com 5 points 2 weeks ago

there’s something to it though, being crammed on the sidewalk in the pouring rain, alongside a million other people on this tiny little sidewalk, around all the various hidden and famous shops and importers.

Yeah for me this was the feeling of "fuck seattle" and "i'm never coming back here." But now it's looking much better.

[–] technocrit@lemmy.dbzer0.com 5 points 2 weeks ago

At least now it's a possibility.

[–] technocrit@lemmy.dbzer0.com 2 points 2 weeks ago

It's not really a place for shipping.

[–] technocrit@lemmy.dbzer0.com 11 points 2 weeks ago (1 children)

Her Worship

Is that a real title or sarcasm? It's hard to tell when the state regularly uses these kinds of absurd clown titles (e.g. her honor).

https://en.wikipedia.org/wiki/Civil_religion

[–] technocrit@lemmy.dbzer0.com 26 points 2 weeks ago (5 children)

Cars (like any technology under capitalism) are meant to keep people dependent, desperate, and exploitable.

[–] technocrit@lemmy.dbzer0.com 19 points 2 weeks ago* (last edited 2 weeks ago)

These millionaire homeowners, who could not persuade Charlottesville residents and could not win at the ballot box, decided they would throw everything they had to nullify their defeat. And it worked 😠

The usual tale of how the state violently serves capital.

[–] technocrit@lemmy.dbzer0.com 3 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

You're overcomplicating this shit.

Weight loss is primarily just calories burned minus calories eaten...

(times some factor, plus/minus some constant, ignoring higher order terms, excluding exogenous variables, etc.)
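The comment's caricature of the energy-balance model can be put in code. This is a toy sketch only: the adjustment factor, constant, and the rough 7,700 kcal/kg conversion are the kind of fudge terms the comment is mocking, not real physiological parameters.

```python
# Toy sketch of the caricatured energy-balance model from the comment.
# The factor and constant are hypothetical placeholders, not physiology.
def weekly_weight_change_kg(calories_eaten, calories_burned,
                            factor=1.0, constant=0.0):
    """First-order estimate using ~7700 kcal per kg of body fat."""
    daily_deficit = calories_burned - calories_eaten  # kcal/day
    return -(daily_deficit * 7 * factor / 7700.0) + constant

# A 500 kcal/day deficit sustained for a week:
print(weekly_weight_change_kg(2000, 2500))  # roughly -0.45 kg
```

The "higher order terms" and "exogenous variables" the comment waves away are exactly what this linear sketch ignores.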

[–] technocrit@lemmy.dbzer0.com 6 points 2 weeks ago (3 children)

We didn’t abandon Newtonian physics when we accepted Einstein’s model ~~was proven~~

[–] technocrit@lemmy.dbzer0.com 2 points 2 weeks ago* (last edited 2 weeks ago)

Harris, who is also pretty moderate

Pretty moderate by imperial standards. Absolute fascist by objective measurement.

I doubt that moderate Democrats are especially upset at the moment.

Ofc most dems are not upset. They're completely fine with 99% of Trump's fascism. That's why we're here.

The second problem is that the US electoral system always stabilizes around two big-tent parties.

AKA it's an excellent system for violent control but a terrible joke of a "democracy".

It’s not clear to me that introducing a new party solves problems here.

They're not trying to solve problems that we care about. They're trying to maintain their control. That's the "problem" here.

[–] technocrit@lemmy.dbzer0.com 2 points 2 weeks ago* (last edited 2 weeks ago)

If it's working, they'll get rid of it sooner or later.

[–] technocrit@lemmy.dbzer0.com 2 points 2 weeks ago* (last edited 2 weeks ago)

The old school dems will team up with the majority of Rs

Already been happening for a long time.

nothing will change.

Already living under increasing fascism.

 

Mike German, an ex-FBI agent, said immigration agents hiding their identities ‘highlights the illegitimacy of actions’

Some wear balaclavas. Some wear neck gaiters, sunglasses and hats. Some wear masks and casual clothes.

Across the country, armed federal immigration officers have increasingly hidden their identities while carrying out immigration raids, arresting protesters and roughing up prominent Democratic critics.

It’s a trend that has sparked alarm among civil rights and law enforcement experts alike.

Mike German, a former FBI agent, said officers’ widespread use of masks was unprecedented in US law enforcement and a sign of a rapidly eroding democracy. “Masking symbolizes the drift of law enforcement away from democratic controls,” he said.

 

It was the hardest day of my life. I’ve never felt humiliation like I did that day.

I hope food can get through soon and be distributed in a respectful way, without humiliation and killing. The current system is chaotic and deadly.

There’s no justice in it. Most end up with nothing, because there’s no organised system and there’s too little aid for too many people.

 

A hacker working for the Sinaloa drug cartel was able to obtain an FBI official's phone records and use Mexico City's surveillance cameras to help track and kill the agency's informants in 2018, the U.S. Justice Department said in a report issued on Thursday.

 

We are constantly fed a version of AI that looks, sounds and acts suspiciously like us. It speaks in polished sentences, mimics emotions, expresses curiosity, claims to feel compassion, even dabbles in what it calls creativity.

But what we call AI today is nothing more than a statistical machine: a digital parrot regurgitating patterns mined from oceans of human data (the situation hasn’t changed much since it was discussed here five years ago). When it writes an answer to a question, it literally just guesses which letter and word will come next in a sequence – based on the data it’s been trained on.
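The "digital parrot" description is next-token prediction. A minimal sketch of the idea, using a tiny made-up bigram probability table rather than a real trained model:

```python
import random

# Minimal sketch of next-token prediction: choose the next word in
# proportion to probabilities learned from text. This table is made up.
bigram_probs = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 1.0},
    "dog": {"ran": 1.0},
    "sat": {"down": 1.0},
    "ran": {"away": 1.0},
}

def next_word(word, rng):
    """Sample the next word weighted by its learned probability."""
    candidates = bigram_probs[word]
    return rng.choices(list(candidates), weights=list(candidates.values()))[0]

rng = random.Random(0)
sentence = ["the"]
for _ in range(2):
    sentence.append(next_word(sentence[-1], rng))
print(" ".join(sentence))
```

Real models predict over subword tokens with billions of learned parameters, but the mechanism (sample what probably comes next) is the one the article describes.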

This means AI has no understanding. No consciousness. No knowledge in any real, human sense. Just pure probability-driven, engineered brilliance — nothing more, and nothing less.

So why is a real “thinking” AI likely impossible? Because it’s bodiless. It has no senses, no flesh, no nerves, no pain, no pleasure. It doesn’t hunger, desire or fear. And because there is no cognition — not a shred — there’s a fundamental gap between the data it consumes (data born out of human feelings and experience) and what it can do with them.

Philosopher David Chalmers calls the mysterious mechanism underlying the relationship between our physical body and consciousness the “hard problem of consciousness”. Eminent scientists have recently hypothesised that consciousness actually emerges from the integration of internal, mental states with sensory representations (such as changes in heart rate, sweating and much more).

Given the paramount importance of the human senses and emotion for consciousness to “happen”, there is a profound and probably irreconcilable disconnect between general AI, the machine, and consciousness, a human phenomenon.

https://archive.ph/Fapar

 


Google’s carbon emissions have soared by 51% since 2019 as artificial intelligence hampers the tech company’s efforts to go green.

While the corporation has invested in renewable energy and carbon removal technology, it has failed to curb its scope 3 emissions, which are those further down the supply chain, and are in large part influenced by a growth in datacentre capacity required to power artificial intelligence.

The company reported a 27% increase in year-on-year electricity consumption as it struggles to decarbonise as quickly as its energy needs increase.

Datacentres play a crucial role in training and operating the models that underpin AI products such as Google’s Gemini and OpenAI’s GPT-4, which powers the ChatGPT chatbot. The International Energy Agency estimates that datacentres’ total electricity consumption could double from 2022 levels to 1,000TWh (terawatt hours) in 2026, approximately Japan’s level of electricity demand. AI will result in datacentres using 4.5% of global energy generation by 2030, according to calculations by the research firm SemiAnalysis.
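The quoted IEA figures imply a baseline the article doesn't state directly: if roughly 1,000 TWh in 2026 is double the 2022 level, 2022 consumption sat near 500 TWh. A quick arithmetic check (the 2022 figure here is inferred, not from the article):

```python
# Back-of-envelope check of the IEA figures quoted above.
twh_2026 = 1000.0         # projected datacentre consumption, TWh
twh_2022 = twh_2026 / 2   # "could double from 2022 levels"
print(twh_2022)           # implied 2022 consumption: 500.0 TWh
```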

 

Spotify, the world’s leading music streaming platform, is facing intense criticism and boycott calls following CEO Daniel Ek’s announcement of a €600m ($702m) investment in Helsing, a German defence startup specialising in AI-powered combat drones and military software.

The move, announced on 17 June, has sparked widespread outrage from musicians, activists and social media users who accuse Ek of funnelling profits from music streaming into the military industry.

Many have started calling on users to cancel their subscriptions to the service.

“Finally cancelling my Spotify subscription – why am I paying for a fuckass app that works worse than it did 10 years ago, while their CEO spends all my money on technofascist military fantasies?” said one user on X.

 

When I started working on this video about Palantir, I didn’t expect that it would make me want to have a panic attack. Then again, maybe panic is the appropriate response to learning that an artificial intelligence and surveillance company is actively collecting data on every American citizen in order to establish a technological dystopia.

 

An industry-backed researcher who has forged a career sowing doubt about the dangers of pollutants is attempting to use artificial intelligence (AI) to amplify his perspective.

Louis Anthony “Tony” Cox Jr, a Denver-based risk analyst and former Trump adviser who once reportedly claimed there is no proof that cleaning air saves lives, is developing an AI application to scan academic research for what he sees as the false conflation of correlation with causation.

 

Advocates call the CTC a rubber stamp for highway widening. The body didn't do anything to dispel that notion yesterday.
