DigitalAudio

joined 2 years ago
[–] DigitalAudio@sopuli.xyz 11 points 3 months ago (4 children)

That has been me in the past. Not to my wife, but as a younger person I only read history books (still do) and felt superior because of it (I don't do that anymore, of course), so I would sneer at my friends' fiction because it was "worthless" compared to "real history" where you "actually learned stuff".

It's a dumb mindset, and I definitely don't feel like that anymore. I still don't read or enjoy fiction, but it's just a hobby like any other, same as my thing with history.

[–] DigitalAudio@sopuli.xyz 7 points 3 months ago* (last edited 3 months ago)

That is me. I have a poor sense of color and have needed to be restrained in the past.

Joke's on my wife though, because her sense of pitch is shaky, while I sure can sing.

Then again, she's an artist and I'm a musician. She has taught me how to avoid the really bad combinations and some color theory, while I have taught her to stay on pitch when a background voice is doing something else.

[–] DigitalAudio@sopuli.xyz 12 points 3 months ago

The funniest part is that, based on what people are saying about GPT-5, the ending where AIs get super bored of humans' stupidity and dump them seems so likely.

[–] DigitalAudio@sopuli.xyz 9 points 3 months ago

Damn, imagine the levels of segregation, speciesism and genocide we would see if other human species had thrived and grown like us.

[–] DigitalAudio@sopuli.xyz 4 points 3 months ago (1 children)

It may also be correlated with the population, though. Specifically the working age population.

I imagine that, as populations decrease and you have fewer people available to actually do research, technological advancement stagnates and slows down too. If populations ever start increasing again in the future, then I imagine technological development will pick back up as well.

[–] DigitalAudio@sopuli.xyz 2 points 3 months ago (1 children)

Honestly, with adequate governance, companies would be required to report how much of their labor is being done by AI, and to pay the equivalent wages either to their employees or into a sort of "universal income" fund to support families in poverty. It should be called the AI tax.

The problem is that, in the current state of affairs, asking anyone for regulation is impossible, and even if such a law were enacted, getting the money from the companies to the people who need it rather than to the ultra-rich would be a major hurdle.

But at the very least, I don't think we should allow companies to simply cut down on human labor without also contributing economically to the employees they cut off.

I don't think anyone is dying to fill in Excel spreadsheets or write corporate emails. No one is complaining about AI doing those jobs; the complaint is about the people who lost their livelihoods because of it.

[–] DigitalAudio@sopuli.xyz 2 points 3 months ago

But I don't think that's necessarily a problem that can't be solved. LLMs and the like are ultimately just statistical analysis, and if you refine and train them enough, they can absolutely summarise at least one paper right now. Google's Notebook LM is already capable of it; I just don't think it can quite pull off many papers at once yet. But the current state of LLMs is not that far off.

I agree that AI is way overhyped, and I also have a general dislike for it because of the way it's being used, the people who gush over it, and the surrounding culture. But I don't think that means we should simply ignore reality altogether. The LLMs from two or even one year ago are not even comparable to the ones today, and that trend will probably keep going for a while. The main issue lies with the ethics of training, copyright and, of course, the replacement of labor in exchange for what amounts to simply a cool tool.

[–] DigitalAudio@sopuli.xyz 3 points 3 months ago (2 children)

The problem is that you do need to keep training models for this to make sense.

And you always need at least some human curation of what goes into the models, otherwise the model will just say whatever, learn from its own output and degrade over time. This can't be done by other AIs, so for now you still need humans to make sure the models are actually getting useful information.

The problem with this, which many have already pointed out, is that it makes AIs just as unreliable as any traditional media. But if you don't oversee their datasets at all and just allow them to learn from everything, then they're even more useless, basically just replicating social media bullshit, which nowadays is like at least 60% AI generated anyway.

So yeah, the current model is, not surprisingly, completely unsustainable.

The technology itself is great, though. Imagine having an AI that you can easily train at home on hundreds of different academic papers, then run specific analyses or find patterns that would be too big for humans to see at first. Also imagine the impact on the medical field: early cancer detection, modelling how viruses spread, or even DNA analysis for certain diseases.

It's also super good when used for creative purposes (not just for generating pictures or music). For example, AI makes it possible for you to sing a song, then sing the melody for every member of a choir and fine-tune each voice so it sounds unique. You can be your own choir, which makes a lot of cool production techniques more accessible.

I believe that once the initial hype dies down, AI stops being used as a cheap marketing tactic, and the bubble bursts, the real benefits of AI will become apparent, and hopefully we will learn to live with it without destroying each other lol.

[–] DigitalAudio@sopuli.xyz 4 points 3 months ago

It's quite poetic innit

[–] DigitalAudio@sopuli.xyz 8 points 3 months ago

Isn't it better to have specific stations where you can leave them and pick them back up?

I've seen that model in Bogota, Buenos Aires and Tokyo, and people still absolutely use them all the time, and they don't make as much of a mess. It's pretty good.

[–] DigitalAudio@sopuli.xyz 2 points 3 months ago

This is pretty common in China. My Chinese classmates say that some of their friends quit their jobs as teachers because they wanted to travel abroad and have a passport.

Apparently it's been that way for a long time, only made more evident now by the increasing relevance of Chinese geopolitics and China's stronger economy compared to its neighbours.

[–] DigitalAudio@sopuli.xyz 4 points 4 months ago

The videos are all really touching. Such an awesome player to have watched throughout the years, even if I don't follow the PL.
