How have I never seen this lmao
JustAPenguin
The thing that annoys me most is that there have been studies done on LLMs showing that, when trained on their own output, they produce increasingly noisy output.
Sources (unordered):
- What is model collapse?
- AI models collapse when trained on recursively generated data
- Large Language Models Suffer From Their Own Output: An Analysis of the Self-Consuming Training Loop
- Collapse of Self-trained Language Models
Whatever nonsense Muskrat is spewing is factually incorrect. He won't be able to successfully retrain any model on generated content. At least, not an LLM, if he wants a successful product. If anything, he will be producing a model that is heavily trained on censored datasets.
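The failure mode those papers describe can be sketched with a toy simulation. This is my own analogy, not code from any of the cited studies: a self-consuming loop where each generation's training pool is drawn entirely from the previous generation's output. Because resampling with replacement can only lose distinct values and never regain them, the "knowledge" in the pool shrinks over generations.

```python
import random

def self_consuming_loop(pool, generations, seed=0):
    """Repeatedly 'retrain' on the previous generation's output and
    track how many distinct values survive each round."""
    rng = random.Random(seed)
    diversity = [len(set(pool))]
    for _ in range(generations):
        # new training set is sampled only from the last generation
        pool = [rng.choice(pool) for _ in pool]
        diversity.append(len(set(pool)))
    return diversity

# start from 20 distinct "real" data points
diversity = self_consuming_loop(list(range(20)), generations=100)
print(diversity[0], "->", diversity[-1])
```

The diversity count is monotonically non-increasing by construction, which is the toy version of why recursive training degrades a model: errors and omissions compound, and nothing in the loop reintroduces the lost tail of the original distribution.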
*checks notes* yes
I'm an Australian. We have a similar phrase with the same translation. In our language, we say: "cunts fucked"
Why can't you just peg each other like a normal couple?
I've been trying to reach you about your car's extended warranty
I want to use Thunderbird but my university won't let me log into my email outside of Outlook... So dumb.
In my country, the most renowned chefs all work at a hardware store