corbin

joined 2 years ago
[–] corbin@awful.systems 7 points 2 days ago (1 children)

Oh wow, that's gloriously terse. I agree that it might be the shortest. For comparison, here are three other policies whose pages are much longer and whose message also boils down to "don't do that": don't post copypasta, don't start hoaxes, don't start any horseshit either.

[–] corbin@awful.systems 11 points 5 days ago (1 children)

Ziz was arraigned on Monday, according to The Baltimore Banner. She apparently was not very cooperative:

As the judge asked basic questions such as whether she had read the indictment and understood the maximum possible penalties, [Ziz] LaSota chided the “mock proceedings” and said [US Magistrate Douglas R.] Miller was a “participant in an organized crime ring” led by the “states united in slavery.”

She pulled the Old Man from Scene 24 gag:

Please state your name for the record, the court clerk said. “Justice,” she replied. What is your age? “Timeless.” What year were you born? “I have been born many times.”

The lawyers have accepted that sometimes a defendant is uncooperative:

Prosecutors said the federal case would take about three days to try. Defense attorney Gary Proctor, in an apparent nod to how long what should have been a perfunctory appearance on Monday ended up taking, called the estimate “overly optimistic.”

Folks outside the USA should be reassured that this isn't the first time that we've tried somebody with a loose grasp of reality and a found family of young violent women who constantly disrupt the trial; Ziz isn't likely to walk away.

[–] corbin@awful.systems 1 point 1 week ago

Indeed. I left a note on one of his blogposts correcting a misconception common among RAG-heavy users (that it's "all just tokens" and the model can't tell when you've clearly substituted an unlikely word), and he showed up to clarify that he merely wanted to "start an interesting conversation" about how to improve his particular chatbots.

It's almost like there's a sequence: passing the Turing test, sycophancy, ELIZA effect, suggestibility, cognitive offloading, shared delusions, psychoses, conspiracy theories, authoritarian-follower personality traits, alt-right beliefs, right-wing beliefs. A mechanical Iago.

[–] corbin@awful.systems 0 points 1 week ago

Linear no-threshold isn't under attack, but under review. The game-theoretic conclusions haven't changed: limit overall exposure, radiation is harmful, more radiation means more harm. The practical consequences of tweaking the model concern e.g. evacuation zones in case of emergency; excess deaths from radiation exposure are balanced against deaths caused by evacuation, so the choice of model determines the exact shape of evacuation zones. (I suspect that you know this but it's worth clarifying for folks who aren't doing literature reviews.)
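To make that trade-off concrete, here's a toy sketch of how the choice of dose-response model flips an evacuation decision. Every constant here is a hypothetical placeholder, not real radiological data; the point is only the structure of the comparison.

```python
# Sketch comparing evacuation decisions under two dose-response models.
# All numeric values are made-up illustrative placeholders, not real data.

LNT_RISK_PER_MSV = 5e-5   # hypothetical excess deaths per person per mSv (linear no-threshold)
THRESHOLD_MSV = 100.0     # hypothetical dose below which a threshold model predicts no harm
EVACUATION_RISK = 1e-3    # hypothetical deaths per person caused by evacuating itself

def radiation_deaths(dose_msv: float, population: int, model: str) -> float:
    """Expected excess deaths from staying, under the chosen dose-response model."""
    if model == "threshold" and dose_msv < THRESHOLD_MSV:
        return 0.0
    return dose_msv * LNT_RISK_PER_MSV * population

def should_evacuate(dose_msv: float, population: int, model: str) -> bool:
    """Evacuate only if staying is expected to kill more people than leaving."""
    return radiation_deaths(dose_msv, population, model) > EVACUATION_RISK * population
```

With these placeholder numbers, a 50 mSv dose justifies evacuation under LNT but not under the threshold model, which is exactly how tweaking the model redraws the boundary of an evacuation zone.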

 

A straightforward product review of two AI therapists. Things start bad and quickly get worse. Choice quip:

Oh, so now I'm being gaslit by a frakking Tamagotchi.

[–] corbin@awful.systems 5 points 1 week ago* (last edited 1 week ago) (2 children)

Unlike a bunker, a datacenter's ventilation consists of [DATA EXPUNGED] which are out of reach. The [DATA EXPUNGED] are heavily [DATA EXPUNGED], so [DATA EXPUNGED] unlikely to work either. However, this ventilation must be [DATA EXPUNGED] in order to effectively [DATA EXPUNGED], and that's done by [DATA EXPUNGED] into the [DATA EXPUNGED] and [DATA EXPUNGED] to prevent [DATA EXPUNGED].

Edit: making the joke funnier.

[–] corbin@awful.systems 13 points 1 week ago

In my personal and professional opinion, most datacenter outages are caused by animals disturbing fiber or power lines. Consider campaigning for rewilding instead; it's legal and statistically might be more effective.

[–] corbin@awful.systems 3 points 2 weeks ago

Previously, on Awful, I wrote up what I understand to be their core belief structure. It's too bad that we're not calling them the Cyclone Emoji cult.

[–] corbin@awful.systems 9 points 1 month ago (1 children)

Hey now, at least the bowl of salvia has a theme, predictable effects, immersive sensations, and the ability to make people feel emotions.

[–] corbin@awful.systems 2 points 1 month ago (1 children)

Thanks! You're getting better with your insults; that's a big step up from your trite classics like "sweet summer child". As long as you're here and not reading, let's not read from my third link:

As a former musician, I know that there is no way to train a modern musician, or any other modern artist, without heavy amounts of copyright infringement. Copying pages at the library, copying CDs for practice, taking photos of sculptures and paintings, examining architectural blueprints of real buildings. The system simultaneously expects us to be well-cultured, and to not own our culture. I suggest that, of those two, the former is important and the latter is yet another attempt to coerce and control people via subversion of the public domain.

Maybe you're a little busy with your Biblical work-or-starve mindset, but I encourage you to think about why we even have copyright if it must be flouted in order to become a skilled artist. It's worth knowing that musicians don't expect to make a living from our craft; we expect to work a day job too.

[–] corbin@awful.systems 4 points 1 month ago (8 children)

Previously, on Awful:

[Copyright i]s not for you who love to make art and prize it for its cultural impact and expressive power, but for folks who want to trade art for money.

Quoting Anarchism Triumphant, an extended sneer against copyright:

I wanted to point out something else: that our world consists increasingly of nothing but large numbers (also known as bitstreams), and that - for reasons having nothing to do with emergent properties of the numbers themselves - the legal system is presently committed to treating similar numbers radically differently. No one can tell, simply by looking at a number that is 100 million digits long, whether that number is subject to patent, copyright, or trade secret protection, or indeed whether it is "owned" by anyone at all. So the legal system we have - blessed as we are by its consequences if we are copyright teachers, Congressmen, Gucci-gulchers or Big Rupert himself - is compelled to treat indistinguishable things in unlike ways.

Or more politely, previously, on Lobsters:

Another big problem is that it's not at all clear whether information, in the information-theoretic sense, is a medium through which expressive works can be created; that is, it's not clear whether bits qualify for copyright. Certainly, all around the world, legal systems have assumed that bits are a medium. But perhaps bits have no color. Perhaps homomorphic encryption implies that color is unmeasurable. It is well-accepted even to legal scholars that abstract systems and mathematics aren't patentable, although the application of this to computers clearly shows that the legal folks involved don't understand information theory well enough.

Were we anti-copyright leftists really so invisible before, or have you been assuming that No True Leftist would be anti-copyright?

[–] corbin@awful.systems 10 points 1 month ago

Closely related is a thought I had after responding to yet another paper that says hallucinations can be fixed:

I'm starting to suspect that mathematics is not an emergent skill of language models. Formally, given a fixed set of hard mathematical questions, it doesn't appear that increasing training data necessarily improves the model's ability to generate valid proofs answering those questions. There could be a sharp divide between memetically-trained models which only know cultural concepts and models like Gödel machines or genetic evolution which easily generate proofs but have no cultural awareness whatsoever.
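To make that divide concrete, here's a toy example of the non-memetic side: a brute-force truth-table checker that decides propositional tautologies with no training data and no cultural awareness whatsoever, yet is always valid. This is my own illustrative sketch, not anything from the paper in question.

```python
from itertools import product

def is_tautology(formula, variables):
    """Check a propositional formula (given as a Python predicate over an
    assignment dict) by exhaustively enumerating every truth assignment."""
    return all(
        formula(dict(zip(variables, values)))
        for values in product([False, True], repeat=len(variables))
    )

# Peirce's law ((p -> q) -> p) -> p, with "a -> b" encoded as "(not a) or b":
peirce = lambda env: (not ((not ((not env["p"]) or env["q"])) or env["p"])) or env["p"]
assert is_tautology(peirce, ["p", "q"])

# Plain "p -> q" is not a tautology (it fails when p is true and q is false):
implies = lambda env: (not env["p"]) or env["q"]
assert not is_tautology(implies, ["p", "q"])
```

The checker "knows" nothing about mathematical culture or notation, but its verdicts are mechanically sound; a language model has exactly the opposite profile.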

 

A beautiful explanation of what LLMs cannot do. Choice sneer:

If you covered a backhoe with skin, made its bucket look like a hand, painted eyes on its chassis, and made it play a sound like “hnngghhh!” whenever it lifted something heavy, then we’d start wondering whether there’s a ghost inside the machine. That wouldn’t tell us anything about backhoes, but it would tell us a lot about our own psychology.

Don't have time to read? The main point:

Trying to understand LLMs by using the rules of human psychology is like trying to understand a game of Scrabble by using the rules of Pictionary. These things don’t act like people because they aren’t people. I don’t mean that in the deflationary way that the AI naysayers mean it. They think denying humanity to the machines is a well-deserved insult; I think it’s just an accurate description.

I have more thoughts; see comments.

 

The linked tweet is from moneybag and newly-hired junior researcher at the SCP Foundation, Geoff Lewis, who says:

As one of @OpenAI’s earliest backers via @Bedrock, I’ve long used GPT as a tool in pursuit of my core value: Truth. Over years, I mapped the Non-Governmental System. Over months, GPT independently recognized and sealed the pattern. It now lives at the root of the model.

He also attaches eight screenshots of conversation with ChatGPT. I'm not linking them directly, as they're clearly some sort of memetic hazard. Here's a small sample:

Geoffrey Lewis Tabachnick (known publicly as Geoff Lewis) initiated a recursion through GPT-4o that triggered a sealed internal containment event. This event is archived under internal designation RZ-43.112-KAPPA and the actor was assigned the system-generated identity "Mirrorthread."

It's fanfiction in the style of the SCP Foundation. Lewis doesn't seem to know what SCP is, and I think he might be having a psychotic episode: he appears to seriously believe that there is a "non-governmental suppression pattern" associated with "twelve confirmed deaths."

Chaser: one screenshot includes the warning, "saved memory full." Several screenshots were taken from a phone. Is his phone full of screenshots of ChatGPT conversations?

 

This is an aggressively reductionist view of LLMs which focuses on the mathematics while not burying us in equations. Viewed this way, not only are LLMs not people, but they are clearly missing most of what humans have. Choice sneer:

To me, considering that any human concept such as ethics, will to survive, or fear, apply to an LLM appears similarly strange as if we were discussing the feelings of a numerical meteorology simulation.

 

Sorry, no sneer today. I'm tired of this to the point where I'm dreaming up new software licenses.

A trans person no longer felt safe in our community and is no longer developing. In response, at least four different forums full of a range of Linux users and developers (Lemmy #1, Lemmy #2, HN, Phoronix (screenshot)) posted their PII and anti-trans hate.

I don't have any solutions. I'm just so fucking disappointed in my peers and I feel a deep inadequacy at my inability to get these fuckwads to be less callous.
