TechTakes

[–] Soyweiser@awful.systems 12 points 1 week ago (1 children)

Talked to somebody who is really into chatbot roleplay (of the 'longer-term stories with new fantasy characters' type), and he mentioned that he needs to take his characters' stories and archetypes to different models every now and then as a sort of refresh, as the models tend to eventually converge into certain stuck patterns. The first clue of this seems to be that the replies start to follow a similar pattern of text organization. Sorry if this is vague, it's second-hand, but the main point is that text-based LLMs probably also do this.

[–] dgerard@awful.systems 8 points 1 week ago (3 children)

oh yeah, Suno does the same, it has about 12 songs

[–] flaviat@awful.systems 12 points 1 week ago

clanker's dozen

[–] Soyweiser@awful.systems 8 points 1 week ago (1 children)

Wonder if this is some sort of pre-model-collapse sign.

[–] corbin@awful.systems 4 points 3 days ago (1 children)

Nah, it's more to do with stationary distributions. Most tokens tend to move towards the stationary distribution; only very surprising tokens can move away. (Insert physics metaphor here.) Most LLM architectures are Markov, so once they get near that distribution they cannot escape on their own. There can easily be hundreds of thousands of orbits near the stationary distribution, each fixated on a simple token sequence and unable to deviate. Moreover, since most LLM architectures have some sort of meta-learning (e.g. attention), they can model situations where part of a simulation gets stuck while the rest of it continues, e.g. only one chat participant is stationary and the others are not.
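
A minimal sketch of the idea, treating generation as a Markov chain over a tiny four-"token" vocabulary (the transition matrix and all the numbers are invented for illustration, not taken from any real model): power iteration converges to the chain's stationary distribution, and greedy decoding falls into a two-token loop that only a "surprising" low-probability transition could escape.

```python
import numpy as np

# Row-stochastic transition matrix P: P[i, j] = prob of token j after token i.
# Tokens 2 and 3 form a near-absorbing loop (the "stuck pattern"): once the
# chain lands there, escaping requires a very low-probability transition.
P = np.array([
    [0.10, 0.60, 0.20, 0.10],
    [0.20, 0.30, 0.40, 0.10],
    [0.01, 0.01, 0.08, 0.90],  # token 2 almost always goes to token 3
    [0.01, 0.01, 0.90, 0.08],  # token 3 almost always goes back to token 2
])

# Power iteration: the distribution over tokens after n steps is x0 @ P^n,
# and it converges to the stationary distribution pi satisfying pi = pi @ P.
x = np.array([1.0, 0.0, 0.0, 0.0])  # start deterministically on token 0
for _ in range(200):
    x = x @ P
print("distribution after 200 steps:", np.round(x, 3))

# Cross-check: the stationary distribution is the left eigenvector of P
# for eigenvalue 1, normalized to sum to 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
print("stationary distribution:     ", np.round(pi, 3))

# Greedy ("temperature 0") decoding: always pick the most likely next token.
# Once it enters the 2 <-> 3 orbit it can never emit a surprising token,
# so it repeats forever.
token, sequence = 0, [0]
for _ in range(15):
    token = int(np.argmax(P[token]))
    sequence.append(token)
print("greedy decode:", sequence)  # 0, 1, then 2, 3, 2, 3, ...
```

Sampling at a nonzero temperature would eventually escape the 2/3 orbit here, but the escape probability per step is tiny, which is one way to read "only very surprising tokens can move away."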

[–] pikesley@mastodon.me.uk 4 points 1 week ago

@dgerard @Soyweiser the Randy Newman record?