Meghan Rienks’ Channel Hack Highlights YouTube Support Issues

  • For two months, YouTuber Meghan Rienks has been struggling to get YouTube Support’s help to recover her hacked vlog channel.
  • After several confusing email exchanges with the company that presented her with no real solutions, Rienks said she only began to see more helpful and rapid responses after Shane Dawson and Gigi Hadid spoke up or offered their own connections.
  • Rienks said she spoke on the phone with YouTube on Wednesday and learned she may not be able to get her videos back. She also said that she worries about smaller creators who are left with even fewer options when they have issues with their channels.

Rienks Battles with YouTube After Hack

After months of battling with YouTube to regain access to her hacked channel, YouTuber Meghan Rienks said that a call with the company revealed that she may not be able to get her videos back.

On Tuesday, she confirmed via Twitter that YouTube agreed to talk over the phone. The sudden help from the platform came just one day after she posted a 45-minute video detailing the company’s disappointing response to her vlog channel being hacked in January. That call, however, did not go the way she had hoped.

She posted on Twitter that the call “wasn’t great.” In a Wednesday night Instagram story, she told her followers that she would likely lose the content she had on the channel, some of which is a decade old.

Her problems with YouTube’s support date back even further than this phone call. Rienks’ Monday video starts with her explaining that in October, she realized her main channel was not appearing online for viewers, despite looking fine from her end while logged in. Solving this with YouTube took roughly two weeks. During that time, the two went back and forth, with YouTube repeatedly telling her nothing was wrong with her channel.

The company eventually realized they had been looking into her vlog channel instead and had also sent her the wrong link to solve her main channel issues. During this time, she did notice a suspicious upload on her vlog channel but kept that on the back burner so she could focus on her main channel. 

Her vlog channel came back to the forefront on January 2, when Rienks realized it had been fully hacked and rebranded. Her videos were gone, and even though the channel still had her URL, it was now called “Beauty Dior” and had new logos and images.

The page was now full of several newly posted videos, all of which appeared to be re-uploads of beauty tutorials which she suspects are also stolen. On top of that, the email she had associated with the channel was deleted, preventing her from recovering it and regaining control of the account.

Exchanges With YouTube Continue For Two Months

Rienks reached out to YouTube the following morning, thinking this would be an easy fix given how obvious the hacking was. Instead, it led to a series of seemingly empty exchanges between YouTube, Rienks, her manager, and others on her team. In some emails sent from YouTube, Rienks was not even included and had to be kept in the loop via her manager.

In one, the YouTube support person addresses the email to “Alex.” However, no one involved in these communications is named Alex, or anything remotely similar. Rienks stated multiple times that she felt she was not in contact with a real person.

Substantial news did not come until February 22, when YouTube told Rienks they had found no signs of abnormal activity on the channel. When she followed up, emphasizing that the channel had been fully rebranded, they maintained their findings in a grammatically messy email.

“Hi there, thanks for your reply. I understand why you’re wondering that the investigation resulted that no highjacking activity happened on the channel,” they wrote. “However, I can assure you that our internal team carefully investigated this and didn’t found any.” 

They advised her to strengthen her password and account security, a measure she had already been actively taking on all of her channels and social media accounts since the original incident in October.

Rienks Takes to Twitter

The next morning, she emailed them at 9 AM to request a phone call so she could guarantee immediate contact with a real person. She also hopped on Twitter to express her frustrations.

At around the same time she sent her email, she shared YouTube’s response on Twitter alongside proof that her account had clearly been hacked. She also said she had seen a substantial loss in subscribers on the channel since January.

While those posts gained a decent amount of traction when she uploaded them, they blew up when YouTuber Shane Dawson shared one a little after 2 p.m. Dawson mentioned several YouTube Twitter accounts in his message, which included a plea for help.

Just 45 minutes after Dawson sent his tweet, Rienks saw action from YouTube. She received an email saying that phone support was not an option, but that her case was now being marked high priority. She also began direct messaging Team YouTube, which led to more confusing back-and-forths.

After initially claiming that YouTube had looked into her main channel instead of her vlog, an excuse similar to one given during the first situation in October, Team YouTube said they were “not sure why [internal teams] came to that conclusion,” referring to the finding that there was no abnormal activity on her vlog. They assured Rienks that she had been in contact with real people at YouTube and apologized for the delay in solving her problem.

“I am sorry you had to take to twitter to get more help with this,” one of the messages read. “That shouldn’t be the case at all.”

Around the same time, another well-known face slid into Rienks’ DMs: supermodel Gigi Hadid. Hadid, who follows Rienks, told her that she was sorry about her situation and had a friend at YouTube who might be able to help.

“This is the only time that I’m getting help,” a frustrated Rienks said in her video. “Is when Shane Dawson and Gigi Hadid help me. Thanks guys.”

On this day, Beauty Dior was still posting content on her channel. She also noted that the account was listed for sale on a website for $500.

Rienks’ Frustrations with YouTube

While Rienks was recording her video, she got an update from YouTube. 

“The email YouTube just sent is that I can have my channel transferred over to me, I just have to agree to not sue them,” Rienks explained. “And also, I can’t have any of the videos that were privated. Which is all of them.” 

She spoke to her attorney about the email and was told that YouTube’s message contained no legal document or legally binding clauses.

“This is a failed system and it’s not working,” she said, explaining her overall anger about YouTube’s response. “And also through all of this I found, if it’s not working for me, it is not working for so many creators who have much smaller channels.” 

In the description of the video, she further expressed that while she wants her channel back, she also wants larger-scale change at YouTube. 

“I want a meeting at Youtube. With REAL HUMANS. With the ‘people’ who run the support team & *personally* investigate hijacked channels,” she wrote. “Because it is a broken system and it needs to be changed. I know this is a long shot, but this has been happening for far too long, to far too many creators.”

“There’s no way that Youtube has coded & built software to pickup on less than 10 seconds of skewed pitch copyrighted song, yet they’re still unable to accurately verify a compromised channel,” she added. “This needs to change.”

Visitors to Rienks’ vlog channel today will still find it branded as Beauty Dior.

Update: This article was updated from its original form to include new information about Rienks’ phone call with YouTube.

Influencers Exposed for Posting Fake Private Jet Photos

  • A viral tweet showed a studio set in Los Angeles, California that is staged to look like the inside of a private jet.
  • Some influencers were called out for using that very same studio to take social media photos and videos.
  • While some slammed them for faking their lifestyles online, others poked fun at the behavior and noted that this is something stars like Bow Wow have been caught doing before.
  • Others have even gone so far as to buy and pose with empty designer shopping bags to pretend they went on a massive spending spree.

A tweet went viral over the weekend exposing the secret behind some influencer travel photos.

“Nahhhhh I just found out LA ig girlies are using studio sets that look like private jets for their Instagram pics,” Twitter user @maisonmelissa wrote Thursday.

“It’s crazy that anything you’re looking at could be fake. The setting, the clothes, the body… idk it just kinda of shakes my reality a bit lol,” she continued in a tweet that quickly garnered over 100,000 likes.

The post included photos of a private jet setup that’s actually a studio in California, which you can rent for $64 an hour on the site Peerspace.

As the tweet picked up attention, many began calling out influencers who they noticed have posted photos or videos in that very same studio.

[Embedded TikTok from @the7angels: “Come fly with the angels 👼,” set to “Hugh Hefner” by ppcocaine]

Perhaps the most notable influencers to be called out were the Mian Twins, who eventually edited their Instagram captions to admit they were on a set.

While a ton of people were upset about this, others pointed out that it’s not exactly a new idea. Even Bow Wow was famously called out in 2017 for posting a private plane photo on social media before being spotted on a commercial flight.

Twitter users also noted other ridiculous things some people do for the gram, like buying empty designer shopping bags to pretend they’ve gone on a shopping spree.

Meanwhile, others poked fun at the topic, like Lil Nas X, who is never one to miss out on a viral internet moment. He photoshopped himself into the fake private jet, sarcastically writing, “thankful for it all,” in his caption.

So ultimately, it seems like the moral of this story is: don’t believe everything you see on social media.

See what others are saying: (LADBible) (Dazed Digital) (Metro UK)

South Korea’s Supreme Court Upholds Rape Case Sentences for Korean Stars Jung Joon-young and Choi Jong-hoon

  • On Thursday morning, the Supreme Court in Seoul upheld the sentences of Jung Joon-young and Choi Jong-hoon for aggravated rape and related charges.
  • Jung will serve five years in prison, while Choi will go to prison for two and a half years.
  • Videos of Jung, Choi, and others raping women were found in group chats that stemmed from investigations into Seungri, of the k-pop group BigBang, as part of the Burning Sun Scandal.
  • The two stars tried to claim that some of the sex was consensual, but the courts ultimately found testimony from survivors trustworthy. Courts did, however, have trouble finding victims who were willing to come forward over fears of social stigma.

Burning Sun Scandal Fallout

South Korea’s Supreme Court upheld the rape verdicts against stars Jung Joon-young and Choi Jong-hoon on Thursday after multiple appeals by the stars and their co-defendants.

Both Jung and Choi were involved in an ever-growing scandal involving the rapes and sexual assaults of multiple women. Those crimes were filmed and shared in chatrooms without the victims’ consent.

The entire scandal came to light in March of 2019, when Seungri of the k-pop group BigBang was embroiled in what’s now known as the Burning Sun Scandal. As part of an investigation into the scandal, police found a chatroom that featured some stars engaging in what seemed to be non-consensual sex with various women. Police found that many of the messages in the chatroom on KakaoTalk (the major messaging app in South Korea), sent between 2015 and 2016, came from Jung and Choi.

A Year of Court Proceedings

Jung, Choi, and five other defendants found themselves in court in November 2019, facing aggravated rape charges as well as charges related to filming and distributing their acts without the victims’ consent. In South Korea, aggravated rape means a rape involving two or more perpetrators.

The court found them all guilty of the rape charge. Jung was sentenced to six years behind bars, while Choi and the others were sentenced to five years. Jung was given a harsher sentence because he was also found guilty of filming and distributing videos of their acts without the victims’ consent.

During proceedings, the court had trouble getting victims to tell their stories. Many feared being shamed or judged because of the incidents and didn’t want the possibility of that information going public. Compounding the court’s problems was the fact that other victims were hard to find.

The defendants argued that the sexual acts with some of the victims were consensual, though this did not rule out the possibility that there were still victims of their crimes. However, the court found the testimony of survivors trustworthy, and it contradicted the defendants’ claims.

Jung and Choi appealed the decision, which led to more court proceedings. In May 2020, the Seoul High Court upheld their convictions but reduced their sentences to five years for Jung and two and a half years for Choi.

Choi’s sentence was reduced because the court found that he had reached a settlement with a victim.

The decision was appealed a final time to the Supreme Court. This time, Jung and Choi argued that most of the evidence against them, notably the KakaoTalk chatroom messages and videos, had been obtained illegally by police.

On Thursday morning, the Supreme Court ultimately disagreed with Jung and Choi and said their revised sentences would stand.

Jung, Choi, and the other defendants will also still have to do 80 hours of sexual violence treatment courses and are banned from working with children for five years.

See what others are saying: (ABC) (Yonhap News) (Soompi)

YouTube Says It Will Use AI to Age-Restrict Content

  • YouTube announced Tuesday that it would be expanding its use of machine learning to handle age-restricting content.
  • The decision has been controversial, especially after news that other AI systems employed by the company took down videos at nearly double the normal rate.
  • The decision likely stems from both legal responsibilities in some parts of the world and practical concerns about the amount of content uploaded to the site.
  • It might also help with moderator burnout, since the platform is currently understaffed and struggles with extremely high turnover.
  • In fact, the platform still faces a lawsuit from a moderator claiming the job gave them Post Traumatic Stress Disorder. They also claim the company offered few resources to cope with the content they were required to watch.

AI Age Restrictions

YouTube announced Tuesday that it will use AI and machine learning to automatically apply age restrictions to videos.

In a recent blog post, the platform wrote, “our Trust & Safety team applies age-restrictions when, in the course of reviewing content, they encounter a video that isn’t appropriate for viewers under 18.”

“Going forward, we will build on our approach of using machine learning to detect content for review, by developing and adapting our technology to help us automatically apply age-restrictions.”

Flagged videos would effectively be blocked from being viewed by anyone who isn’t signed into an account or whose account indicates they are under 18. YouTube stated these changes were a continuation of its efforts to make the platform a safer place for families. It initially rolled out YouTube Kids as a dedicated platform for those under 13, and now it wants to sanitize the main site more broadly, though notably it doesn’t plan to turn the entire platform into a new YouTube Kids.

It’s also no coincidence that this move helps YouTube fall in line with regulations around the world. In Europe, in addition to the AI-driven age restrictions, users may face extra verification steps if YouTube can’t confirm their age, such as providing a government ID or credit card to prove they are over 18.

The company did say that if a video is age-restricted, there will be an appeals process that puts the video in front of an actual person for review.

On that note, just days before announcing that it would use AI to age-restrict content, YouTube also said it would be expanding its moderation team, which had largely been on hiatus because of the pandemic.

It’s hard to say how much these changes will actually affect creators or how much money they can make from the platform. The only assurances YouTube gave were to creators who are part of the YouTube Partner Program.

“For creators in the YouTube Partner Program, we expect these automated age-restrictions to have little to no impact on revenue as most of these videos also violate our advertiser-friendly guidelines and therefore have limited or no ads.”

In other words, the videos most likely to be hit by these restrictions already generate limited or no ad revenue, so creators in the Partner Program are unlikely to see a change.

Community Backlash

Every time YouTube makes a big change there are a lot of reactions, especially when it involves AI automatically handling a process. Tuesday’s announcement was no different.

On YouTube’s tweet announcing the changes, common responses included complaints like, “what’s the point in an age restriction on a NON kids app. That’s why we have YouTube kids. really young kids shouldn’t be on normal youtube. So we don’t realistically need an age restriction.”

“Please don’t implement this until you’ve worked out all the kinks,” one user pleaded. “I feel like this might actually hurt a lot of creators, who aren’t making stuff for kids, but get flagged as kids channels because of bright colors and stuff like that”

Worries about hiccups in the rollout of the new system were common among users. It’s possible that YouTube’s Sept. 20 announcement that it would bring back human moderators was made in part to balance out how much damage a new AI could do.

In a late-August transparency report, YouTube found that AI moderation was far more restrictive. When the moderation team was first downsized between April and June, YouTube’s AI largely took over and removed around 11 million videos, double the normal rate.

YouTube did allow creators to appeal those decisions; about 300,000 videos were appealed, and about half were reinstated. Facebook faced a similar problem and will also bring back moderators to handle both restricted content and the upcoming election.

Other Reasons for the Changes

YouTube’s decision to expand its use of AI not only falls in line with various laws on verifying users’ ages and on what content is widely available to the public, but is also likely driven by practical concerns.

The site gets over 400 hours of content uploaded every minute. Even setting aside time zones and staggered schedules, YouTube would need to employ over 70,000 people just to watch what’s uploaded to the site.
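
For readers who want to see where a figure like that comes from, here is a minimal back-of-envelope sketch. The 8-hour viewing shift per moderator is an assumption for illustration, not a number from YouTube.

```python
# Rough back-of-envelope check of the staffing figure cited above.
# Assumption (not from the article): each moderator watches uploads at
# normal playback speed for an 8-hour shift per day, with no overlap.

UPLOAD_HOURS_PER_MINUTE = 400        # figure cited in the article
MINUTES_PER_DAY = 60 * 24            # 1,440 minutes in a day
SHIFT_HOURS = 8                      # assumed viewing hours per moderator per day

hours_uploaded_per_day = UPLOAD_HOURS_PER_MINUTE * MINUTES_PER_DAY   # 576,000 hours
moderators_needed = hours_uploaded_per_day / SHIFT_HOURS             # 72,000 people

print(f"{hours_uploaded_per_day:,} hours of video uploaded per day")
print(f"~{moderators_needed:,.0f} moderators needed just to watch it all once")
```

Even under these generous assumptions, watching everything only once would take roughly 72,000 full-time viewers, which lines up with the figure above.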

Outlets like The Verge have published series of reports about how YouTube, Google, and Facebook moderators are dealing with depression, anger, and Post Traumatic Stress Disorder because of their jobs. These issues were particularly prevalent among people working in what YouTube calls the “terror” or “violent extremism” queue.

One moderator told The Verge, “Every day you watch someone beheading someone, or someone shooting his girlfriend. After that, you feel like wow, this world is really crazy. This makes you feel ill. You’re feeling there is nothing worth living for. Why are we doing this to each other?”

That same individual noted that since working there, he began to gain weight, lose hair, have a short temper, and experience general signs of anxiety.

On top of these claims, YouTube is also facing a lawsuit filed in a California court on Monday by one of its former content moderators.

The complaint states that Jane Doe “has trouble sleeping and when she does sleep, she has horrific nightmares. She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind.

“She cannot be in crowded places, including concerts and events, because she fears mass shootings. She has severe and debilitating panic attacks,” it continued. “She has lost many friends because of her anxiety around people. She has trouble interacting and being around kids and is now scared to have children.”

These issues weren’t limited to people working on the “terror” queue; they affected anyone training to become a moderator.

“For example, during training, Plaintiff witnessed a video of a smashed open skull with people eating from it; a woman who was kidnapped and beheaded by a cartel; a person’s head being run over by a tank; beastiality; suicides; self-harm; children being rapped [sic]; births and abortions,” the complaint alleges.

“As the example was being presented, Content Moderators were told that they could step out of the room. But Content Moderators were concerned that leaving the room would mean they might lose their job because at the end of the training new Content Moderators were required to pass a test applying the Community Guidelines to the content.”

During their three-week training, moderators allegedly don’t receive much resilience training or wellness resources.

These kinds of lawsuits aren’t unheard of. Facebook faced a similar suit in 2018, where a woman claimed that during her time as a moderator she developed PTSD as a result of “constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace.”

That case hasn’t yet been decided in court. Instead, Facebook and the plaintiff have agreed to settle for $52 million, pending approval from the court.

The settlement would only apply to U.S. moderators.

See what others are saying: (CNET) (The Verge) (Vice)
