- Multiple people have accused YouTuber Jared “ProJared” Knabenbauer of sending sexually explicit pictures to fans and soliciting them in return, including from at least two people who were underage at the time.
- The accusations surfaced after Knabenbauer’s wife, Heidi O’Ferrall, accused him of being abusive and of “soliciting nudes from his fans for years.”
- The creator collective Knabenbauer belonged to, NormalBoots, said in a statement that they were first informed of the allegations on April 4, which prompted them to start an investigation and eventually cut ties with Knabenbauer.
- Knabenbauer has lost more than 200,000 subscribers since the allegations were made public.
Popular YouTube gamer Jared “ProJared” Knabenbauer has been accused of sending sexually explicit pictures to his fans and soliciting them in return, including from at least two individuals who were underage.
The allegations first came to light on Wednesday when Knabenbauer, who had over one million YouTube subscribers, announced in a statement on Twitter that he was divorcing his wife, cosplayer Heidi O’Ferrall.
O’Ferrall responded in a series of tweets accusing Knabenbauer of cheating on her and repeatedly lying about it. She claimed that he “gaslit” her for months and acted abusively.
“He told his friends a version of events that omits his infidelity and portrays me as an aggressive and unreasonable person. As though I’m randomly angry and upset all the time, for no reason at all,” O’Ferrall tweeted. “It’s because he’s abusing me.”
In a separate post, O’Ferrall also said that Knabenbauer “has been soliciting nudes from his fans for years.”
“I used to think that among consenting adults, it was fine,” she wrote on Twitter. “Now I see it as an abuse of power for him to intentionally manipulate anyone to show him their naked body on the basis that he’s a popular internet man.”
“I would like to apologize for my role in enabling this,” she added.
Others Come Forward
Following O’Ferrall’s post, several others came forward to share their experiences; some said that he often sent them unsolicited explicit pictures.
Those who came forward included at least two people who claim that they were underage at the time of their interactions with Knabenbauer. One fan, who goes by Chai, tweeted details, but later set his Twitter account to private.
“ProJared sexually manipulated me via Tumblr when I was 16,” Chai wrote, according to a reuploaded screenshot of his post. “I sent him many actual, real nudes. He knew I was 16.”
Chai also said that he sent lewd pictures to ProJared’s blog “with the explicit instruction ‘do not post this online. I’m 16.’ He posted them anyway.”
Chai noted in his statement that he had another friend who was also underage and “had been manipulated in the same way.” Chai said this individual was planning on coming forward.
Shortly after, a user who goes by Charlie came forward with a story similar to Chai’s. In a Twitter thread, Charlie shared a statement of their experience with Knabenbauer and provided screenshots of their conversations.
In the statement, Charlie writes that they were a “really big fan of ProJared” when they were “around the same age as Chai (15-16).” Charlie said that Knabenbauer asked fans on his Tumblr account to send naked pictures, noting that the blog was “in no way age restricted.”
“Jared is a grown man, I’m sure he understands that a significant chunk of his audience are middle and high schoolers,” Charlie wrote. “An open invitation to send him nudes is oddly predatory […] especially when there is a risk for minors (again, a good portion of his following) sending nudes.”
Charlie then goes on to say that they eventually sent Knabenbauer naked pictures. “He messaged me and thanked me and told me I was beautiful,” Charlie wrote. “I don’t think he ever asked for my age.”
Since Chai and Charlie made their statements, others have come forward to talk about the Tumblr blog Knabenbauer created to send and share explicit pictures with his fans.
Rogue Rocket spoke to a Twitter user who goes by Bren, who told us they never interacted with Knabenbauer personally but described themself as a witness to the blog. Bren said they found Knabenbauer’s blog in a group chat for the game “Asagao Academy.”
“I remember that a handful of people in the group chat were into the blog, many were minors,” Bren said. “I would say that him creating the whole blog was incredibly manipulative. His fan base was made up of so many underage fans.”
Bren described how Knabenbauer would refer to his fans who submitted naked pictures as “sinners.”
“Two adults wouldn’t say they’re ‘sinning’ when sharing nudes, but a teenager who hasn’t had the chance to explore their sexuality might have that view,” Bren said. “So to me, all of the talk about sinning and sinners felt like he was well aware that there was a large gap in the levels of sexual experience between him and his fans, and I think he got off to this idea of him, a 30 something year old man, being ‘naughty’ and ‘sinful’ with people half his age… essentially grooming a whole group of fans.”
“Not to mention the power dynamic between a famous Youtuber and his audience, which is only amplified when much of that audience is underage,” Bren continued. “So when he was asking for nudes on his tumblr, it only made sense that his underage fans would answer that call.”
Bren’s experience was echoed by a person who goes by Asa, who spoke to The Daily Beast about their experience messaging Knabenbauer.
“I was freshly 18, and I had absolutely no romantic or sexual experiences,” Asa said. “He asked if I was over 18, and I said yes, mentioning that my birthday was scarcely a few months prior. He said that he was honored I would want to ‘use’ my newfound legality with him.”
Asa also claimed that it was “barely a secret” that underage people were interacting with Knabenbauer’s blog, adding, “I know a few of my friends had taken their ages off of their page so that Jared couldn’t verify that they were under 18.”
In response to the allegations, NormalBoots, the creator collective that Knabenbauer is a part of, posted a statement on Twitter, saying that they had been informed of the allegations in an email a month prior.
“On April 4, 2019, an unsolicited e-mail was sent to the NormalBoots business account containing allegations of inappropriate conduct pertaining to Jared Knabenbauer,” the statement said. “Upon receiving the e-mail, NormalBoots Manager, Jacque Khalil took immediate action to alert the appropriate parties, including Mr. Knabenbauer and the NormalBoots legal team, of the allegations. No other members of NormalBoots were made aware of the allegations while the investigation into the allegations was ongoing.”
The statement goes on to say that NormalBoots had planned to terminate their relationship with Knabenbauer before the allegations were made public, but have since “mutually agreed to part ways.”
In a Twitter post, Chai said that he and Charlie sent the email to NormalBoots and shared a screenshot of the reply the two received from the group.
Knabenbauer himself has yet to make a statement, but many of his fans and others in the community have started to distance themselves from him. According to SocialBlade, Knabenbauer’s YouTube channel lost more than 100,000 subscribers in the first 24 hours alone after the allegations came out. Since then, he has lost a total of more than 200,000 subscribers.
Other gaming creators have also distanced themselves from Knabenbauer, including the popular creator collective Game Grumps, which has started deleting videos that Knabenbauer appeared in.
The accusations against Knabenbauer represent part of the broader problem of popular YouTubers using their position to prey on underage fans. Just two weeks ago, famous YouTuber Austin Jones was sentenced to ten years in federal prison for soliciting sexually explicit photos from multiple underage fans, some of whom were as young as 14.
Updates: This article was originally posted on May 10, but has been updated to include statements from Bren and Asa, the correct pronouns for Charlie, and updated records of ProJared’s subscriber loss as of May 14.
See what others are saying: (The Daily Beast) (The Verge) (Kotaku)
South Korea’s Supreme Court Upholds Rape Case Sentences for Korean Stars Jung Joon-young and Choi Jong-hoon
- On Thursday morning, the Supreme Court in Seoul upheld the sentences of Jung Joon-young and Choi Jong-hoon for aggravated rape and related charges.
- Jung will serve five years in prison, while Choi will serve two-and-a-half years.
- Videos of Jung, Choi, and others raping women were found in group chats uncovered during investigations into Seungri, of the k-pop group BigBang, as part of the Burning Sun Scandal.
- The two stars tried to claim that some of the sex was consensual, but the courts ultimately found testimony from survivors trustworthy. Courts did, however, have trouble finding victims who were willing to come forward over fears of social stigma.
Burning Sun Scandal Fallout
South Korea’s Supreme Court upheld the rape verdicts against stars Jung Joon-young and Choi Jong-hoon on Thursday after multiple appeals by the stars and their co-defendants.
Both Jung and Choi were involved in an ever-growing scandal involving the rapes and sexual assaults of multiple women. Those crimes were filmed and distributed to chatrooms without the victims’ consent.
The entire scandal came to light in March of 2019 when Seungri from the k-pop group BigBang was embroiled in what’s now known as the Burning Sun Scandal. As part of an investigation into the scandal, police found a chatroom that featured some stars engaging in what seemed to be non-consensual sex with various women. Police found that many of the messages in the KakaoTalk chatroom (the major messaging app in South Korea) sent between 2015 and 2016 came from Jung and Choi.
A Year of Court Proceedings
Jung, Choi, and five other defendants found themselves in court in November 2019 facing charges related to filming and distributing their acts without the consent of the victims, as well as aggravated rape charges. In South Korea, this means a rape involving two or more perpetrators.
The court found them all guilty of the rape charge. Jung was sentenced to six years behind bars, while Choi and the others were sentenced to five years. Jung was given a harsher sentence because he was also found guilty of filming and distributing the videos of their acts without the victims’ consent.
During proceedings, the court had trouble getting victims to tell their stories. Many feared being shamed or judged because of the incidents and didn’t want the possibility of that information going public. Compounding the court’s problems was the fact that other victims were hard to find.
For their part, the defendants argued that the sexual acts with some of the victims were consensual, though this did not rule out the possibility that there were still victims of their crimes. However, the court found that the testimony of survivors was trustworthy and contradicted the defendants’ claims.
Jung and Choi appealed the decision, which led to more court proceedings. In May 2020, the Seoul High Court upheld their convictions but reduced their sentences to five years for Jung and two and a half years for Choi.
Choi’s sentence was reduced because the court found that he had reached a settlement with a victim.
The decision was appealed a final time to the Supreme Court. This time, Jung and Choi argued that most of the evidence against them, notably the KakaoTalk chatroom messages and videos, was illegally obtained by police.
On Thursday morning, the Supreme Court ultimately disagreed with Jung and Choi and said their revised sentences would stand.
Jung, Choi, and the other defendants will also still have to complete 80 hours of sexual violence treatment courses and are banned from working with children for five years.
See What Others Are Saying: (ABC) (Yonhap News) (Soompi)
YouTube Says It Will Use AI to Age-Restrict Content
- YouTube announced Tuesday that it would expand its use of machine learning to automatically age-restrict content.
- The decision has been controversial, especially after news that other AI systems employed by the company took down videos at nearly double the normal rate.
- The decision likely stems from both legal responsibilities in some parts of the world and practical considerations about the sheer amount of content uploaded to the site.
- It might also help with moderator burnout, since the platform is currently understaffed and struggles with extremely high turnover.
- In fact, the platform still faces a lawsuit from a moderator claiming the job gave them Post Traumatic Stress Disorder. They also claim the company offered few resources to cope with the content they were required to watch.
YouTube announced Tuesday that it will use AI and machine learning to automatically apply age restriction to videos.
In a recent blog post, the platform wrote, “our Trust & Safety team applies age-restrictions when, in the course of reviewing content, they encounter a video that isn’t appropriate for viewers under 18.”
“Going forward, we will build on our approach of using machine learning to detect content for review, by developing and adapting our technology to help us automatically apply age-restrictions.”
Flagged videos would effectively be blocked from being viewed by anyone who isn’t signed into an account or whose account indicates they are below the age of 18. YouTube stated these changes were a continuation of its efforts to make YouTube a safer place for families. It initially rolled out YouTube Kids as a dedicated platform for those under 13, and now it wants to sterilize the platform site-wide. Notably, though, it doesn’t plan to turn the entire platform into a new YouTube Kids.
It’s also no coincidence that this move helps YouTube better fall in line with regulations across the world. In Europe, in addition to the AI age restrictions, users may face extra steps if YouTube can’t confirm their age, such as providing a government ID or credit card to prove they are over 18.
YouTube did say that if a video is age-restricted, there will be an appeals process that gets the video in front of an actual person for review.
On that note, just days before announcing that it would implement AI to age-restrict, YouTube also said it would be expanding its moderation team after it had largely been on hiatus because of the pandemic.
It’s hard to say how much these changes will actually affect creators or how much money they can make from the platform. The only assurances YouTube gave were to creators who are part of the YouTube Partner Program.
“For creators in the YouTube Partner Program, we expect these automated age-restrictions to have little to no impact on revenue as most of these videos also violate our advertiser-friendly guidelines and therefore have limited or no ads.”
In other words, most of the affected videos from Partner Program creators already make little or nothing from ads, and that’s unlikely to change.
Every time YouTube makes a big change, there are a lot of reactions, especially when it involves using AI to automate processes. Tuesday’s announcement was no different.
On YouTube’s tweet announcing the changes, common responses included complaints like, “what’s the point in an age restriction on a NON kids app. That’s why we have YouTube kids. really young kids shouldn’t be on normal youtube. So we don’t realistically need an age restriction.”
“Please don’t implement this until you’ve worked out all the kinks,” one user pleaded. “I feel like this might actually hurt a lot of creators, who aren’t making stuff for kids, but get flagged as kids channels because of bright colors and stuff like that”
Worries about hiccups in the rollout of this new system were common among users, though it’s possible that YouTube’s Sept. 20 announcement that it would bring back human moderators was made to help balance out how much damage a new AI could do.
In a late-August transparency report, YouTube found that AI moderation was far more restrictive. When the moderation team was first downsized between April and June, YouTube’s AI largely took over and removed around 11 million videos. That’s double the normal rate.
YouTube did allow creators to appeal those decisions; about 300,000 videos were appealed, and roughly half were reinstated. Facebook had a similar problem and will also bring back moderators to handle both restrictive content and the upcoming election.
Other Reasons for the Changes
YouTube’s decision to expand its use of AI not only falls in line with various laws regarding the verification of a user’s age and what content is widely available to the public, but was also likely made for practical reasons.
The site gets over 400 hours of content uploaded every minute, which works out to roughly 576,000 hours every day. Even with staggered schedules across different time zones, at eight hours of viewing per person per day, YouTube would need to employ over 70,000 people just to check what’s uploaded to the site.
Outlets like The Verge have done a series about how YouTube, Google, and Facebook moderators are dealing with depression, anger, and Post Traumatic Stress Disorder because of their job. These issues were particularly prevalent among people working in what YouTube calls the “terror” or “violent extremism” queue.
One moderator told The Verge, “Every day you watch someone beheading someone, or someone shooting his girlfriend. After that, you feel like wow, this world is really crazy. This makes you feel ill. You’re feeling there is nothing worth living for. Why are we doing this to each other?”
That same individual noted that since working there, he began to gain weight, lose hair, have a short temper, and experience general signs of anxiety.
On top of these claims, YouTube is also facing a lawsuit filed in a California court on Monday by a former YouTube content moderator.
The complaint states that Jane Doe “has trouble sleeping and when she does sleep, she has horrific nightmares. She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind.”
“She cannot be in crowded places, including concerts and events, because she fears mass shootings. She has severe and debilitating panic attacks,” it continued. “She has lost many friends because of her anxiety around people. She has trouble interacting and being around kids and is now scared to have children.”
These issues weren’t just for people working on the “terror” queue; they extended to anyone training to become a moderator.
“For example, during training, Plaintiff witnessed a video of a smashed open skull with people eating from it; a woman who was kidnapped and beheaded by a cartel; a person’s head being run over by a tank; beastiality; suicides; self-harm; children being rapped [sic]; births and abortions,” the complaint alleges.
“As the example was being presented, Content Moderators were told that they could step out of the room. But Content Moderators were concerned that leaving the room would mean they might lose their job because at the end of the training new Content Moderators were required to pass a test applying the Community Guidelines to the content.”
During their three-week training, moderators allegedly don’t receive much resilience training or wellness resources.
These kinds of lawsuits aren’t unheard of. Facebook faced a similar suit in 2018, where a woman claimed that during her time as a moderator she developed PTSD as a result of “constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace.”
That case hasn’t yet been decided in court. Currently, Facebook and the plaintiff have agreed to settle for $52 million, pending approval from the court. The settlement would only apply to U.S. moderators.
Chinese State Media Calls TikTok-Oracle Deal “Reasonable” as Trump Signals Approval
- On Friday, the United States Commerce Department issued an order that would ban U.S. downloads of TikTok and WeChat starting Sunday night.
- The order for TikTok was delayed for one week on Saturday after President Donald Trump gave his preliminary approval on a deal between TikTok and the software company Oracle.
- A federal judge also issued a temporary injunction Sunday against the WeChat ban, which would have largely destroyed the app’s functionality.
- Oracle and Walmart have since released more details of the deal, including that TikTok Global will likely pay $5 billion in U.S. taxes. This does not seem to be the same as a commission from the deal, even though Trump has suggested as much.
- On Monday, Chinese state media called the deal “unfair” to ByteDance, TikTok’s parent company. However, it also described it as “reasonable,” suggesting the Chinese government may approve the deal.
U.S. and China Signal Support for Deal
What began as a tumultuous weekend for TikTok ended with both the U.S. and Chinese governments potentially signaling approval of its deal with Oracle.
Last week, TikTok’s parent company, ByteDance, struck a deal with Oracle to avoid a U.S. ban. On Monday, Chinese state media called the deal “more reasonable to ByteDance,” and said it’s less costly than a shutdown.
“The plan shows that ByteDance’s moves to defend its legitimate rights have, to some extent, worked,” it added.
While not officially confirmed, this seems to suggest that the Chinese government may approve the deal.
It also came on the heels of Saturday’s news: President Donald Trump, after suggesting unhappiness with the deal last week, said he had given his approval “in concept.” He will still need to officially sign off before the deal is set into motion.
Because of that, the U.S. Commerce Department delayed a download ban that was set for Sunday, pushing it back to this coming Sunday, Sept. 27.
Some Republicans, such as Senator Marco Rubio (R-Fla.), have still expressed concern because ByteDance won’t be handing over its secretive algorithm as part of the deal.
What’s in the Deal?
On Saturday, Oracle released more details of its deal with TikTok. Under it, Oracle and Walmart would take a combined 20% stake in TikTok Global.
Still, there’s been much back and forth over how much control ByteDance will have under the agreement. For his part, Trump has claimed that TikTok Global will “be a brand new company… It will have nothing to do with China.”
However, ByteDance has maintained that it will retain 80% of the stake. The discrepancy here seems to be because 40% of ByteDance is owned by U.S. venture capital firms. Therefore, Trump could technically claim that TikTok Global will be majority-owned by U.S. money.
Trump doubled down Monday and said that he would not approve the deal if ByteDance retained ownership. He added that the Chinese-owned company will “have nothing to do with it, and if they do, we just won’t make the deal.”
Later, Oracle announced that ByteDance will not have any stake in TikTok Global, though this statement heavily conflicts with what is being reported in China.
“Upon creation of TikTok Global, Oracle/Walmart will make their investment and the TikTok Global shares will be distributed to their owners, Americans will be the majority and ByteDance will have no ownership in TikTok Global,” the company said.
According to Walmart and Oracle, if this deal goes through, TikTok Global will pay $5 billion in new tax dollars to the U.S. Treasury over the next few years. As both companies noted, this is just a projection of future corporate taxes, and that estimate could change.
The water around that $5 billion figure was later muddied as Trump claimed that TikTok Global would be donating “$5 billion into a fund for education so we can educate people as to [the] real history of our country — the real history, not the fake history.”
To be clear, Trump is referring to his plans to establish a “patriotic education” commission.
On Sunday, ByteDance said in a statement that this was the first it had heard about a $5 billion education fund.
In fact, TikTok Global never promised to start an education fund. Instead, it promised to create an “educational initiative to develop and deliver an AI-driven online video curriculum to teach children from inner cities to the suburbs a variety of courses from basic reading and math to science, history and computer engineering.”
That initiative doesn’t seem to have anything to do with the $5 billion tax figure. Since he began pursuing a ban, Trump has vowed that the U.S. will receive some form of commission from a deal with TikTok. As far as is known, the $5 billion figure is not that commission either.
As previously reported, this deal will allow Oracle to host TikTok’s user data on its cloud service and review TikTok’s code for security. According to Treasury Secretary Steven Mnuchin, it would also shift TikTok’s global headquarters from China to the U.S.
On top of that, TikTok’s board members would reportedly have to be approved by the U.S. government, with one being an expert in data security. That person would also hold a top-secret security clearance.
Commerce Department Announces Download Ban
Friday seemed like the beginning of the end for TikTok. That morning, the Commerce Department issued an order that would ban U.S. downloads of not only TikTok but also WeChat starting Sunday night.
Both bans were a result of concerns the Trump administration has that ByteDance and WeChat’s parent company, Tencent, are either already giving or could give U.S. user data to the Chinese government.
The Trump administration has repeatedly said that both apps pose a national security threat.
TikTok and ByteDance have consistently denied these claims, saying that U.S. user data is stored domestically with a backup in Singapore. WeChat, for its part, has also made similar statements.
The download ban was announced in response to two Aug. 6 executive orders from Trump. Those orders ban any U.S.-based transactions with TikTok and WeChat starting on Sept. 20, which is why the Commerce Department set the deadline for this past Sunday.
While this ban would have been much more restrictive for WeChat because a large part of its functionality relies heavily on in-app transactions, for TikTok at least, it would only affect new downloads and updates to the app.
“So if that were to continue over a long period of time, there might be a gradual degradation of services, but the basic TikTok will stay intact until Nov. 12,” Commerce Secretary Wilbur Ross told Fox Business on Friday.
“If there’s not a deal by Nov. 12, under the provisions of the old order, then TikTok would also be, for all practical purposes, shut down.”
What Happens on Nov. 12?
Ross is referring to another executive order, this one signed on Aug. 14. Notably, it gives ByteDance 90 days to divest from its American assets and any data that TikTok had gathered in the U.S. As Ross pointed out, that requirement could be satisfied if a deal is reached before the deadline.
If that doesn’t happen, the TikTok app could begin to see lags, lack of functionality, and sporadic outages.
However, it’s not just the U.S. government that must sign off. One of the big questions that loomed after Oracle and ByteDance confirmed their deal last week was whether China would also need to approve it. ByteDance later confirmed that it will need approval from the Chinese government, despite the deal not involving a technology transfer.
Downloads Soar and TikTok Sues
On Friday, downloads for both apps soared. TikTok was downloaded nearly a quarter of a million times that day, up 12% from the previous day. WeChat was downloaded 10,000 times, up 150%.
The same Friday, TikTok as a company criticized the Commerce Department order, saying it had already committed to “unprecedented levels of additional transparency.”
TikTok added that the order “threatens to deprive the American people and small businesses across the US of a significant platform for both a voice and livelihoods.”
Later Friday, TikTok sued the Trump Administration to stop the download ban.
On Sunday, a federal judge also halted the download ban for WeChat with a preliminary injunction. The injunction additionally blocks the Commerce Department’s attempt to bar transactions on the app.
The Commerce Department responded by saying that it’s preparing for a long legal battle.
TikTokers: “Scared, angry, and confused”
“I’ve mostly just been feeling scared, angry, and confused,” TikToker Isabella Avila, known online as onlyjayus, told Rogue Rocket on Monday. “Those are just the main things.”
Avila has amassed a following of 8.7 million followers on TikTok in a relatively short amount of time. She’s also gained about half a million followers on YouTube and Instagram each.
A couple of months ago, Avila said she thought a potential ban was all just talk; however, as the situation progressed, she said she became more worried.
While she said that she personally thought her career could survive a TikTok ban (thanks in part to a Netflix podcast deal), she added, “The people in-between a 100,000 to a million [followers], they have a platform right now, and if TikTok’s were to be gone, their platform’s pretty much gone if they haven’t built an audience on anything else.”
“This is where we go to express ourselves,” she said. “This is where we go to make videos. I don’t know, TikTok gave everybody a chance to kind of get famous and have a following. That’s what people liked about it. YouTube, it’s really hard to get followers and subscribers. TikTok was a lot easier.”
Avila also expressed that a ban wouldn’t just be detrimental to creators.
“I feel like my generation needed an app,” Avila said. “There was Instagram and Twitter, but it was kind of like for the millennials. Gen Z didn’t really have an app, and TikTok kind of fit that spot, so if TikTok’s gone, I don’t know, I feel like Gen Z isn’t really going to have a place.”
Avila now says she is largely hopeful that TikTok will not be banned in the U.S.