What goes through the minds of people working at porn companies profiting from videos of children being raped?
Thanks to a filing error in a Federal District Court in Alabama, releasing thousands of pages of internal documents from Pornhub that were meant to be sealed, we now know. The documents, mostly dating from 2020 or earlier, show some employees laughing off what’s on their site.
“I hope I never get in trouble for having those vids on my computer LOOOOL,” one messaged another.
Others are somber, with one messaging another, “There is A LOT of very, very obvious and disturbing CSAM here.” CSAM stands for child sexual abuse material.
One internal document indicates that Pornhub as of May 2020 had 706,000 videos available on the site that had been flagged by users for depicting rape or assaults on children or for other problems. That was partly because, the documents suggest, Pornhub did not necessarily review a video for possible removal until it had been flagged at least 16 times.
The company also made it much more difficult to flag problem videos by allowing only registered users to do so. One internal message noted that this “will greatly reduce overall flag volume.”
Pornhub and other “tube” websites that are part of the same company — like Redtube, Tube8 and YouPorn — don’t make sex videos themselves. Rather, they provide a platform for users to post videos.
Pornhub executives and owners told me they couldn’t comment on the discovery documents, which I was able to see on a court website, or anything related to current litigation. But they emphasized that the company has tightened its policies since the period covered by the documents, and they argued that it is now working hard to keep nonconsensual material off the site. And in fairness, it does seem that there has been significant improvement.
Yet these documents lift the curtain on what the company was doing behind the scenes up to that point. And that was: a relentless pursuit of market share without much concern for the well-being of those in the videos.
To me, the documents underscore how primal the pursuit of profits can be and why we should never trust tech companies to police themselves. And there’s evidence that suggests that, despite changes in the past few years, Pornhub has not gone far enough in eliminating from the platform videos that appear to be of child rapes.
In the message traffic, one employee advises another not to copy a manager when they find sex videos with children. The other has the obvious response: “He doesn’t want to know how much C.P. we have ignored for the past five years?” C.P. is short for child pornography.
Indeed, one private memo acknowledged that videos with apparent child sexual abuse had been viewed 684 million times before being removed.
Internal memos seem to show executives obsessed with making money by attracting the biggest audiences they could, pedophiles included. In one memo, Pornhub managers proposed words to be banned from video descriptions — such as “infant” and “kiddy” — while recommending that the site continue to allow “brutal,” “childhood,” “force,” “snuffs,” “unwilling,” “minor” and “wasted.”
One internal note says that a person who posted a sexual video of a child shouldn’t be banned from the site because “the user made money.”