this post was submitted on 11 May 2025
59 points (94.0% liked)

Technology

What goes through the minds of people working at porn companies profiting from videos of children being raped?

Thanks to a filing error in a Federal District Court in Alabama that released thousands of pages of internal documents from Pornhub meant to be sealed, we now know. The documents, mostly dating from 2020 or earlier, show some employees laughing off what’s on their site.

“I hope I never get in trouble for having those vids on my computer LOOOOL,” one messaged another.

Others are somber, with one messaging another, “There is A LOT of very, very obvious and disturbing CSAM here.” CSAM stands for child sexual abuse material.

One internal document indicates that Pornhub as of May 2020 had 706,000 videos available on the site that had been flagged by users for depicting rape or assaults on children or for other problems. That was partly because, the documents suggest, Pornhub did not necessarily review a video for possible removal until it had been flagged at least 16 times.

The company also made it much more difficult to flag problem videos by allowing only registered users to do so. One internal message noted that this “will greatly reduce overall flag volume.”

Pornhub and other “tube” websites that are part of the same company — like Redtube, Tube8 and YouPorn — don’t make sex videos themselves. Rather, they provide a platform for users to post videos.

Pornhub executives and owners told me they couldn’t comment on the discovery documents, which I was able to see on a court website, or anything related to current litigation. But they emphasized that the company has tightened its policies since the period covered by the documents, and they argued that it is now working hard to keep nonconsensual material off the site. And in fairness, it does seem that there has been significant improvement.

Yet these documents lift the curtain on what the company was doing behind the scenes up to that point. And that was: a relentless pursuit of market share without much concern for the well-being of those in the videos.

To me, the documents underscore how primal the pursuit of profits can be and why we should never trust tech companies to police themselves. And there’s evidence that suggests that, despite changes in the past few years, Pornhub has not gone far enough in eliminating from the platform videos that appear to be of child rapes.

In the message traffic, one employee advises another not to copy a manager when they find sex videos with children. The other has the obvious response: “He doesn’t want to know how much C.P. we have ignored for the past five years?” C.P. is short for child pornography.

Indeed, one private memo acknowledged that videos with apparent child sexual abuse had been viewed 684 million times before being removed.

Internal memos seem to show executives obsessed with making money by attracting the biggest audiences they could, pedophiles included. In one memo, Pornhub managers proposed words to be banned from video descriptions — such as “infant” and “kiddy” — while recommending that the site continue to allow “brutal,” “childhood,” “force,” “snuffs,” “unwilling,” “minor” and “wasted.”

One internal note says that a person who posted a sexual video of a child shouldn’t be banned from the site because “the user made money.”

[–] BrikoX@lemmy.zip 10 points 1 day ago

@Pro@programming.dev please add the [Opinion] prefix in the title.

[–] Pro@programming.dev 4 points 1 day ago

[–] BrikoX@lemmy.zip 11 points 1 day ago

It's posted in the opinion section.