this post was submitted on 11 May 2025
59 points (94.0% liked)

What goes through the minds of people working at porn companies profiting from videos of children being raped?

Thanks to a filing error in a Federal District Court in Alabama that released thousands of pages of internal Pornhub documents meant to be sealed, we now know. The documents, mostly dating from 2020 or earlier, show some employees laughing off what's on their site.

“I hope I never get in trouble for having those vids on my computer LOOOOL,” one messaged another.

Others are somber, with one messaging another, “There is A LOT of very, very obvious and disturbing CSAM here.” CSAM stands for child sexual abuse material.

One internal document indicates that Pornhub as of May 2020 had 706,000 videos available on the site that had been flagged by users for depicting rape or assaults on children or for other problems. That was partly because, the documents suggest, Pornhub did not necessarily review a video for possible removal until it had been flagged at least 16 times.

The company also made it much more difficult to flag problem videos by allowing only registered users to do so. One internal message noted: This “will greatly reduce overall flag volume.”

Pornhub and other “tube” websites that are part of the same company — like Redtube, Tube8 and YouPorn — don’t make sex videos themselves. Rather, they provide a platform for users to post videos.

Pornhub executives and owners told me they couldn’t comment on the discovery documents, which I was able to see on a court website, or anything related to current litigation. But they emphasized that the company has tightened its policies since the period covered by the documents, and they argued that it is now working hard to keep nonconsensual material off the site. And in fairness, it does seem that there has been significant improvement.

Yet these documents lift the curtain on what the company was doing behind the scenes up to that point. And that was: a relentless pursuit of market share without much concern for the well-being of those in the videos.

To me, the documents underscore how primal the pursuit of profits can be and why we should never trust tech companies to police themselves. And there’s evidence that suggests that, despite changes in the past few years, Pornhub has not gone far enough in eliminating from the platform videos that appear to be of child rapes.

In the message traffic, one employee advises another not to copy a manager when they find sex videos with children. The other has the obvious response: “He doesn’t want to know how much C.P. we have ignored for the past five years?” C.P. is short for child pornography.

Indeed, one private memo acknowledged that videos with apparent child sexual abuse had been viewed 684 million times before being removed.

Internal memos seem to show executives obsessed with making money by attracting the biggest audiences they could, pedophiles included. In one memo, Pornhub managers proposed words to be banned from video descriptions — such as “infant” and “kiddy” — while recommending that the site continue to allow “brutal,” “childhood,” “force,” “snuffs,” “unwilling,” “minor” and “wasted.”

One internal note says that a person who posted a sexual video of a child shouldn’t be banned from the site because “the user made money.”

[–] KoboldCoterie@pawb.social 17 points 1 day ago (1 children)

Yeah, the part about most videos having 16+ flags before review isn't really surprising (or, in my opinion, concerning). The rest of it, though, is pretty damning, especially the executive response to it all. While I agree with the laws (in my country, at least) that say online platforms generally can't be held responsible for what their users post, those executives should absolutely be held accountable for choosing money over doing anything about it.

[–] Wrufieotnak@feddit.org 3 points 1 day ago* (last edited 1 day ago) (2 children)

As soon as you want to earn money with it, you are most definitely responsible for it in my eyes. If it truly is a non-commercial thing like Lemmy, then I'm on your side.

[–] p03locke@lemmy.dbzer0.com 4 points 23 hours ago (1 children)

Okay, now apply that argument to the next levels upstream: the ISP, the backbone providers, the CDN providers, the domain registrars, the SSL certificate authorities. They all earn money with it.

You see how ridiculous that argument becomes.

[–] Wrufieotnak@feddit.org 1 points 17 hours ago* (last edited 16 hours ago)

No, because they provide different services.

To explain with an analogy:
The road builders and car manufacturers are not responsible if I drive to the British Museum to look at stolen art, but the museum itself most definitely is, even though it didn't steal the art.

[–] KoboldCoterie@pawb.social 6 points 1 day ago

The problem is that that sort of policy would make the internet simply cease to function. User-generated content makes up a massive portion of what's on the internet, and it can't possibly all be policed before being posted, unless you want to make a post on Bluesky or whatever and then wait weeks for it to be approved after manual review. The law requires companies to respond promptly to takedown requests, but as long as they do, they aren't responsible for the content posted by their users.