- Pornhub is now removing all videos that were not uploaded by verified users.
- Before the massive purge, the site hosted around 13.5 million videos. As of Monday morning, there were only 2.9 million videos left.
- The move is part of a series of sweeping changes the company made days after The New York Times published a shocking op-ed detailing numerous instances of abuse on the site, including nonconsensual uploads of underage girls.
- Following the article, numerous businesses cut ties with the company, including Mastercard and Visa, which both announced Thursday that they will not process any payments on the site.
Pornhub Purges Videos
Pornhub removed the vast majority of its existing videos Monday, just hours after the company announced that it would take down all existing videos uploaded by non-verified users.
Before the new move was announced Sunday night, Pornhub hosted about 13.5 million videos, according to the number displayed on the site’s search bar. As of writing, that search bar shows just over 2.9 million videos.
The decision comes less than a week after the company announced it would only allow video uploads from content partners and members of its Model program.
At the time, Pornhub claimed it made the decision following an independent review launched in April to eliminate illegal content. However, many speculated that it was actually in large part due to an op-ed published in The New York Times just days before. That piece, among other things, found that the site had been hosting videos of young girls uploaded without their consent, including some content where minors were raped or assaulted.
The article prompted a wave of backlash against Pornhub and calls for other businesses to cut ties with the company. On Thursday, both Visa and Mastercard announced that they would stop processing all payments on the site.
“Our investigation over the past several days has confirmed violations of our standards prohibiting unlawful content on their site,” Mastercard said in a statement.
Less than an hour later, Visa tweeted that it would also be suspending payments while it completed its own investigation.
Pornhub Claims It’s Being Targeted
However, in its blog post announcing the most recent decision, Pornhub claimed that it was being unfairly targeted.
Specifically, the company noted that Facebook’s own transparency report found 84 million instances of child sexual abuse content over the last three years. By contrast, a report by the third-party Internet Watch Foundation found 118 similar instances on Pornhub in the same time period.
Notably, the author of The Times report, Nicholas Kristof, specifically said the Internet Watch Foundation’s findings represented a massive undercount, and that he was able to find hundreds of these kinds of videos on Pornhub in just half an hour.
Still, the site used the disputed numbers to point a finger at others.
“It is clear that Pornhub is being targeted not because of our policies and how we compare to our peers, but because we are an adult content platform,” the statement continued.
“Every piece of Pornhub content is from verified uploaders, a requirement that platforms like Facebook, Instagram, TikTok, YouTube, Snapchat and Twitter have yet to institute,” the company added.
However, Pornhub’s implication that it is somehow more responsible because it only lets verified users post content is a flawed comparison. First, Pornhub is a platform created exclusively for porn, content that the social media companies it name-checked explicitly prohibit.
Second, the vast majority of people who use those platforms are not verified, and it would be impossible for a company like Facebook or YouTube to limit content to only verified users without entirely undermining their own purposes.
Even beyond that, there are still questions about Pornhub’s verification process. According to its site, all someone needs to do to become verified is have a Pornhub account with an avatar and upload a selfie of themselves holding a piece of paper with their username and Pornhub.com written on it.
While the company did tell reporters the process would be made more thorough sometime next year, it did not provide any specific details, prompting questions about how exhaustive the verification process will ultimately be.
That question is highly important because, at least under current policies, verification makes users eligible to monetize their videos as part of the ModelHub program.
If the new verification process is still weak or has loopholes, people could easily slip through the cracks and continue to profit. On the other side, there are also big concerns among sex workers that if the process is too restrictive, they will not be able to make money on the platform.
That concern has already been exacerbated by some of the other actions taken since The Times article was published. For example, after Mastercard and Visa made their announcements, numerous sex workers and activists condemned the decision, saying it would seriously hurt how porn performers collect income — not just on Pornhub, but on other platforms as well.
“By targeting Pornhub and successfully destroying the ability for independent creators to monetize their content, they have made it easier to remove payment options from smaller platforms too,” model Avalon Fey told Motherboard last week. “This has nothing to do with helping abused victims, and everything to do with hurting online adult entertainers to stop them from creating and sharing adult content.”
Other performers expressed similar concerns that the move could spill over to smaller platforms.
“I am watching to see if my OnlyFans will be their next target and sincerely hoping not,” amateur performer Dylan Thomas also told the outlet.
“Sex workers are scared by this change, despite not having uploaded any illegal content,” Fey continued, “because we have seen these patterns before and have had sites and payment processors permanently and unexpectedly shut down.”
See what others are saying: (Motherboard) (The Verge) (Bloomberg)
Tech Ethicist Tristan Harris Talks Council For Responsible Social Media, TikTok, Twitter, and More
Harris is part of a bipartisan group that is aiming to reform social media for good.
The Council For Responsible Social Media
Tristan Harris, the co-founder of the Center for Humane Technology, understands why many people view TikTok as a harmless app with jokes and dances. Harris, however, sees the Chinese-owned platform as a national security risk.
“During the Cold War, would you have allowed the Soviet Union to control television programming for the entire western world, including Saturday morning cartoons, the ‘Teletubbies’ and ‘Sesame Street?’” he said during an interview with Rogue Rocket.
That’s what he argues is happening with TikTok. The app, which is the most downloaded in the world, is owned by ByteDance, a Chinese tech company with ties to the Chinese Communist Party. Harris says we are “effectively outsourcing our media environment to, in the case of the United States, the number one geopolitical competitor.”
National security issues with TikTok, the extreme polarization caused by Facebook and Twitter, and a slew of other issues are among the reasons Harris and several other bipartisan leaders formed The Council For Responsible Social Media last month.
Co-Chaired by former congressman Dick Gephardt and former Lieutenant Governor of Massachusetts Kerry Healey, the group was made in partnership with the nonprofit IssueOne. Other members include Facebook whistleblower Frances Haugen, former Sen. Claire McCaskill, former Defense Secretary Chuck Hagel, and Harris.
It aims to pressure tech companies and politicians to make social media less harmful in every facet.
“What are the wins we can get on the scoreboard?” Harris explained. “Things like, frankly, banning TikTok or otherwise forcing a total sale of TikTok?…Can we do things like pass the Platform Accountability and Transparency Act?”
The TikTok Problem
When it comes to TikTok, the idea of banning it is not new. Former president Donald Trump attempted to do so in 2020, and earlier this month, a Federal Communications Commission official urged the U.S. to do away with it.
In Harris’ eyes, the threat posed by TikTok looms much larger than just mindless entertainment.
“When we outsource our media environment to a CCP-controlled company, we are effectively outsourcing our voting machine to the CCP,” Harris said. “How do you know who to vote for? Why is it that you know more about Marjorie Taylor Greene and [Alexandria Ocasio-Cortez] than the other hundreds of members of Congress? Because the attention economy rewards certain people to rise to the top.”
Social media apps, TikTok included, favor people who are more likely to be divisive, on either end of the political spectrum. Harris referred to this as “amplifiganda,” something the CCP can use to interfere with another nation’s political and cultural happenings.
“It’s strategically amplifying who are the voices I want to hear from and who are the voices I don’t want to hear from,” he added. “Without firing a single shot, without creating a single piece of new propaganda, I can simply amplify the politicians and videos that I want you to be seeing.”
In China, domestic users receive what Harris calls the “spinach” version of the app, which largely features educational content, science experiments, and patriotic videos. He says it is very different from the scroll-for-hours version the U.S. and other international markets receive.
Harris, however, does not think this was part of “a deliberate plan” or that there’s a “large mustache that’s being twirled somewhere in China.” Rather, this is just an after-the-fact consequence of TikTok succeeding at being highly addictive, and China simply regulating it for itself.
Banning the app is not the only solution, Harris noted. Officials could also attempt to force a purchase of TikTok. A similar case happened in the past with Grindr. After a U.S. foreign investment commission said the app’s Chinese ownership was a security risk, the dating app was sold to a U.S.-based group.
“And now it’s not that the company is partially in China or partially in the U.S., or the data is on an American server while the design decisions are made in Beijing, it’s not like that,” Harris explained. “They forced the entire sale.”
“Anything less than that with TikTok would be insufficient.”
Despite the numerous issues posed by nearly every social media platform, enacting meaningful change will be no small feat. The Council For Responsible Social Media has outlined several steps it plans on taking, including awareness campaigns and hearings that could inspire action.
On the legislative front, this could involve the passage of the aforementioned Platform Accountability and Transparency Act, which was introduced by bipartisan senators last year and would “require social media companies to provide vetted, independent researchers and the public with access to certain platform data.”
While Harris does not think this bill is a cure-all, he does think it should be a no-brainer for politicians to pass.
“It won’t change the DNA of the cancer cell that is social media, it’ll be more like the cancer cell is printing quarterly reports about what it is doing to society, but that’s still a better world than having a cancer cell where you don’t know what it’s doing,” he said.
Many advocates believe transparency is key when it comes to reforming social media, as it educates the general public about what these apps are really doing.
The Future of Twitter
Harris thinks education about social media has inadvertently grown over the last several weeks as billionaire Elon Musk took over Twitter. The process has proven to be quite chaotic, but it has also forced people to learn about Twitter’s problems.
“Twitter has already been a chaos-making, inflammation-for-profit machine. Elon buying Twitter doesn’t change that, he’s just running the inflammation-for-profit machine,” Harris said.
Musk’s acquisition has created a substantial financial bind and forced the mogul into a position where he has to turn up engagement and revenue. This has involved cutting back on content moderation and laying off staff who worked on trust and safety.
“He has to figure out a way to lower costs and increase revenue, which unfortunately basically moves the whole system into a more and more dangerous direction,” Harris claimed, though he did say he does not view this as a character flaw on Musk’s part, rather just the reality of how these apps operate.
When it comes to fixing the root problems at Twitter, Harris thinks Musk has his eyes on the wrong target by focusing on censorship and free speech.
“It has to do with Twitter being a bad video game in which citizens earn or score the most points by adding inflammation to cultural fault lines,” he explained.
“If we’re playing a video game, and you earn the most points by finding a new cultural war faultline and inflaming it better than some other guy, you’re an inflammation entrepreneur,” he continued. “Turning citizens into inflammation entrepreneurs for profit is how we destroy democracies.”
Harris said that if Musk wants to change Twitter for the better, he has to “change the video game of what Twitter is” so that people are not rewarded for inflammation, but for consensus.
Meta Fined $24.7 Million for Campaign Finance Violations As Profits Fall 50%
A judge found the company violated Washington State’s campaign finance law more than 800 times since 2020 despite having previously settled a lawsuit for identical violations in 2018.
Judge Fines Facebook
A judge in Washington state slapped Meta with a $24.7 million fine on Wednesday after finding it had intentionally violated the state’s campaign finance disclosure laws.
In a statement, Washington Attorney General Bob Ferguson described the judgment as “the largest campaign finance penalty anywhere in the country — ever.”
According to the judge, Meta violated Washington’s Fair Campaign Practices Act 822 times. Each count carries a maximum fine of $30,000.
The law, which was passed in 1972, requires entities that sell political ads to make certain information public, including the names and addresses of ad buyers, the targets of the ads, how the ads were financed, and the total number of views. While TV stations and newspapers have followed this law for decades in Washington, Meta has continually refused to comply with the law, even arguing unsuccessfully in court that the act is unconstitutional because it “unduly burdens political speech” and is “virtually impossible to fully comply with.”
The matter has been a long, ongoing battle for Meta. In 2018, when Meta was still Facebook, Ferguson sued the platform for violating the same law. As part of a settlement, the social media network agreed to pay $238,000 and commit to transparency in political advertising.
At the time, Facebook said it would rather stop selling ads in Washington state than adhere to the law, but it continued to sell ads while also still refusing to comply. Ferguson responded by filing another suit in 2020, which resulted in the Wednesday ruling.
Meta’s Financial Woes
Although $24.7 million may seem like pocket change to a multi-billion dollar corporation, the fines come as Meta is facing unprecedented financial troubles.
Also on Wednesday, the company reported a 50% drop in profits for the third quarter of 2022. The decline follows a recent trend as Meta’s earnings continue to suffer from slowing ad sales, fierce competition from platforms like TikTok, and CEO Mark Zuckerberg’s decision to spend massive amounts of money on developing the metaverse.
In July, the tech giant posted its first-ever sales decline since becoming a public company. Meta’s stock has also nose-dived over 60% this year. The market reacted poorly to the reported drop in profits Wednesday, sending the stock down nearly 20%.
Despite the fact that the past year has been one of the worst ever for the business following Zuckerberg’s decision to rebrand as Meta and go all-in on the metaverse, his commitment remains fervent.
According to reports, during a call with analysts Wednesday, the CEO argued that people would “look back decades from now” and “talk about the importance of the work that was done here” in regards to the metaverse and virtual reality.
See what others are saying: (The Associated Press) (Axios) (The New York Times)
ByteDance Looks To Expand Music Streaming Service in Potential Threat to Spotify
The move could strengthen the power TikTok currently wields over the music industry.
Talks With Music Labels
TikTok parent company ByteDance is looking to expand its music streaming service, Resso, in a move that could shift both music consumption and marketing, according to The Wall Street Journal.
In a report on Wednesday, the Journal said that ByteDance is currently in talks with music labels about bringing Resso to over a dozen new markets. Currently, the platform is only available in Brazil, India, and Indonesia. While the United States would not be part of this next growth phase, the China-based company has its eyes on an eventual global expansion.
According to the Journal’s sources, in the long run, ByteDance hopes to integrate Resso and TikTok so that users who discover music on the video app can then subscribe and listen on the audio platform. Such a move could pose a threat to audio streaming giants like Spotify.
Over the past several years, TikTok has become increasingly powerful in the music industry. Its short videos paired with snappy soundbites make it prime for songs to go viral, and as a result, it has launched the careers of some of today’s biggest stars.
Lil Nas X was propelled to fame after releasing “Old Town Road” to TikTok. Millions of users began using the track on the app for their viral videos, leading the song to dominate both radio play and streaming. It eventually broke the record as the longest-running song atop the Billboard Hot 100.
Likewise, Olivia Rodrigo went from a Disney+ actress to one of the biggest names in music overnight after her debut single “drivers license” blew up on TikTok. That song, as well as her follow-up singles, topped the charts and landed her multiple Grammy Awards.
Because TikTok is where so many young people discover music, expanding Resso would allow ByteDance to keep its user base under its own umbrella. It could also consolidate work for artists who already market their music on TikTok.
This expansion, however, will likely not come without complications. Sources told the Journal that even though this could potentially serve as another revenue source for TikTok, the biggest hurdle will be figuring out how much to pay out to labels. Some record companies have even expressed direct doubt about Resso to ByteDance.
While TikTok has seen exponential revenue growth over the years, making money from music streaming is a challenge. As a result, Spotify has had to lean heavily on podcasting.
When it comes to Resso, reports say most users do not actually pay for it. Like Spotify, it has an ad-supported free tier. According to the Journal, very few free users become paid subscribers.
The app’s popularity is increasing in the three countries where it is available, though. According to Insider, in Jan. 2021, the app had just a 4.8% market share of monthly active users in music streaming in India, a fraction of the 18% held by Spotify at the time.
By Jan. 2022, that gap had narrowed significantly: Resso’s 17% share was only slightly less than Spotify’s 22.8%.
Wednesday’s news about ByteDance’s intentions to grow the app sent Spotify’s stock sliding, though it had picked up again by mid-day Thursday.