
An Activist Hedge Fund Wants Jack Dorsey Out as Twitter CEO. Could That Change the Site?


  • Last week, it was reported that conservative activist investor Elliott Management had purchased over $1 billion in Twitter shares, or about 4% of the company.
  • Now, Elliott Management wants to replace Twitter’s co-founder, Jack Dorsey, as CEO. 
  • This is largely viewed as an attempt to boost Twitter’s stock, which has been underperforming since Dorsey reclaimed his CEO position in 2015.
  • According to Fox News, a Dorsey ousting by Elliott Management could “raise the prospect that some of the changes to Twitter could make the platform a friendlier place for pro-Trump users.”

Hedge Fund Plans to Push Dorsey Out of Twitter

Twitter employees took to the platform Monday night in support of CEO Jack Dorsey after it was reported that an activist investment fund was trying to unseat him.

Last week, it was reported that the fund, known as Elliott Management, had bought roughly $1 billion in Twitter stock. According to Business Insider, that’s nearly 5% of the company and also enough to allow it to pressure Dorsey out of his CEO role.

Elliott Management wants to oust Dorsey for a number of reasons, but perhaps the most significant is that Twitter is underperforming. Dorsey previously served as CEO of Twitter until he was fired in 2008. He then returned in 2015. Since then, Twitter’s shares have fallen by 6.2%. Facebook, by contrast, has gained more than 121% over the same timeframe.

In November, Dorsey also announced that he’s preparing to move to Africa for 3-6 months this year.

That’s on top of Dorsey already splitting his time between Twitter and Square, Inc., where he is also CEO.

Elliott Management’s main argument will likely be that a full-time CEO could devote more attention to Twitter, helping raise its stock value and grow the business.

This, however, isn’t the first time someone has announced a plan to oust Dorsey. In fact, such a move seemed bound to happen because unlike Facebook CEO Mark Zuckerberg and Snap Inc. CEO Evan Spiegel, Dorsey does not have voting control of Twitter. 

In December, New York University marketing professor Scott Galloway penned a letter calling for Dorsey’s removal.

“As of 12/6 I am the direct and beneficial owner of approximately 334,000 shares in Twitter,” Galloway said. “To be clear, my primary objective is the replacement of CEO Jack Dorsey.”

“Weak governance, a part-time CEO, relocation to Africa, damage to the commonwealth, and poor returns,” he added. “Stakeholders deserve a board and CEO that command the opportunity Twitter occupies.”

Could a Dorsey Ouster Make Twitter More “Trump-Friendly”?

The reasons why Elliott Management is trying to push Dorsey out may not stop there.

The hedge fund was founded by Paul Singer, a conservative billionaire mega-donor. In 2016, Singer donated $24 million to Republican and right-leaning groups.

It is possible, as Fox News points out, that Elliott Management’s increased presence within Twitter could, at least in part, ease conservatives’ concerns that Twitter has a left-leaning bias.

According to the outlet, Elliott Management’s stake “[raises] the prospect that some of the changes to Twitter could make the platform a friendlier place for pro-Trump users.”

Last year, California GOP Representative Devin Nunes filed a $250 million lawsuit against Twitter and several users. In that lawsuit, he accused the platform of “shadow-banning conservatives” and hiding their posts. 

#WeBackJack Trends on Twitter

Following all of this, many Twitter employees posted stories of their interactions with Dorsey in support of him as CEO, using the hashtag #WeBackJack. Later Monday night, the tag began to trend.

“I’ve worked [for] many major corporations,” one user said. “Never did the CEO take 3 minutes to talk with me 1:1. Jack did (more than 3 mins might I add) & he didn’t treat me like someone below him. Ppl speak highly of him in rooms he’s not in. He’s not pretentious or egocentric. So yea #WeBackJack”

Tesla CEO Elon Musk also offered his support for Dorsey on Twitter Monday night, saying Dorsey “has a good [heart].”

Elliott Management Nominates Four Directors

While Elliott Management has not yet ousted Dorsey, it has nominated four people to Twitter’s board of directors.

Notably, only three seats will be available at this year’s annual meeting, but Elliott Management reportedly wants to ensure it has nominated enough people to fill all three seats and any vacancies that may unexpectedly arise.

Elliott Management’s move to remove Dorsey comes ahead of several major events, including the worsening coronavirus outbreak, the U.S. presidential election, and the upcoming Olympic Games in Tokyo.

Those events will likely attract more users to the site and could, in turn, drive more advertisers, thus increasing the company’s stock value.

Twitter, however, has fallen behind other social media platforms despite its widespread use. Reportedly, it has decided to focus on its core services even though other platforms have added features such as filters and stories.

It is unknown if a Dorsey ousting could change that policy as Twitter’s board of directors tries to increase its stock value.

See what others are saying: (Bloomberg) (Business Insider) (Fox News)


South Korea’s Supreme Court Upholds Rape Case Sentences for Korean Stars Jung Joon-young and Choi Jong-hoon


  • On Thursday morning, the Supreme Court in Seoul upheld the sentences of Jung Joon-young and Choi Jong-hoon for aggravated rape and related charges.
  • Jung will serve five years in prison, while Choi will go to prison for two-and-a-half.
  • Videos of Jung, Choi, and others raping women were found in group chats that stemmed from investigations into Seungri, of the k-pop group BigBang, as part of the Burning Sun Scandal.
  • The two stars tried to claim that some of the sex was consensual, but the courts ultimately found testimony from survivors trustworthy. Courts did, however, have trouble finding victims who were willing to come forward over fears of social stigma.

Burning Sun Scandal Fallout

South Korea’s Supreme Court upheld the rape verdicts against stars Jung Joon-young and Choi Jong-hoon on Thursday after multiple appeals by the stars and their co-defendants.

Both Jung and Choi were involved in an ever-growing scandal involving the rapes and sexual assaults of multiple women. Those crimes were filmed and distributed to chatrooms without the victims’ consent.

The entire scandal came to light in March of 2019, when Seungri of the k-pop group BigBang was embroiled in what’s now known as the Burning Sun Scandal. As part of an investigation into that scandal, police found a chatroom that featured some stars engaging in what appeared to be non-consensual sex with various women. Police found that many of the messages in the KakaoTalk chatroom (KakaoTalk is the major messaging app in South Korea), sent between 2015 and 2016, came from Jung and Choi.

A Year of Court Proceedings

Jung, Choi, and five other defendants found themselves in court in November 2019, facing charges related to filming and distributing footage of their acts without the victims’ consent, as well as charges of aggravated rape, which in South Korea means a rape involving two or more perpetrators.

The court found them all guilty of the rape charge. Jung was sentenced to six years behind bars, while Choi and the others were sentenced to five years. Jung was given a harsher sentence because he was also found guilty of filming and distributing videos of their acts without the victims’ consent.

During proceedings, the court had trouble getting victims to tell their stories. Many feared being shamed or judged because of the incidents and didn’t want to risk that information becoming public. Compounding the court’s problems was the fact that other victims were hard to find.

For their part, the defendants argued that the sexual acts with some of the victims were consensual, though that claim did not rule out the possibility that there were still other victims of their crimes. The court, however, found the survivors’ testimony trustworthy and in contradiction to the defendants’ claims.

Jung and Choi appealed the decision, which led to more court proceedings. In May 2020, the Seoul High Court upheld their convictions but reduced their sentences to five years for Jung and two and a half years for Choi.

Choi’s sentence was reduced because the court found that he had reached a settlement with a victim.

Jung and Choi appealed the decision a final time to the Supreme Court, this time arguing that most of the evidence against them, notably the KakaoTalk chatroom messages and videos, had been obtained illegally by police.

On Thursday morning, the Supreme Court ultimately disagreed with Jung and Choi and said their revised sentences would stand.

Jung, Choi, and the other defendants will also still have to complete 80 hours of sexual violence treatment courses and are banned from working with children for five years.

See What Others Are Saying: (ABC) (Yonhap News) (Soompi)


YouTube Says It Will Use AI to Age-Restrict Content


  • YouTube announced Tuesday that it would be expanding its machine learning to handle age-restricting content.
  • The decision has been controversial, especially after news that other AI systems employed by the company took down videos at nearly double the normal rate.
  • The decision likely stems from both legal responsibilities in some parts of the world, as well as practical reasons regarding the amount of content loaded to the site.
  • It might also help with moderator burnout, since the platform is currently understaffed and struggles with extremely high turnover.
  • In fact, the company still faces a lawsuit from a former moderator claiming the job gave them Post-Traumatic Stress Disorder and that the company offered few resources to cope with the content moderators are required to watch.

AI Age Restrictions

YouTube announced Tuesday that it will use AI and machine learning to automatically apply age restrictions to videos.

In a recent blog post, the platform wrote, “our Trust & Safety team applies age-restrictions when, in the course of reviewing content, they encounter a video that isn’t appropriate for viewers under 18.”

“Going forward, we will build on our approach of using machine learning to detect content for review, by developing and adapting our technology to help us automatically apply age-restrictions.”

Flagged videos would effectively be blocked from being viewed by anyone who isn’t signed into an account or whose account indicates they are below the age of 18. YouTube stated these changes were a continuation of its efforts to make YouTube a safer place for families. Initially, it rolled out YouTube Kids as a dedicated platform for those under 13, and now it wants to try to sterilize the platform site-wide, though notably, it doesn’t plan to make the entire platform a new YouTube Kids.

It’s also no coincidence that this move helps YouTube better fall in line with regulations around the world. In Europe, in addition to the AI age restrictions, users may face further verification steps if YouTube can’t confirm their age, such as providing a government ID or credit card to prove they are over 18.

YouTube did say that if a video is age-restricted, there will be an appeals process that puts the video in front of an actual person for review.

On that note, just days before announcing that it would implement AI to age-restrict, YouTube also said it would be expanding its moderation team after it had largely been on hiatus because of the pandemic.

It’s hard to say how much these changes will actually affect creators or how much money they can make from the platform. The only assurances YouTube gave were to creators who are part of the YouTube Partner Program.

“For creators in the YouTube Partner Program, we expect these automated age-restrictions to have little to no impact on revenue as most of these videos also violate our advertiser-friendly guidelines and therefore have limited or no ads.”

In other words, most of the videos likely to be age-restricted already earn little or nothing from ads, and that’s unlikely to change.

Community Backlash

Every time YouTube makes a big change there are a lot of reactions, especially if the change involves AI automatically handling processes. Tuesday’s announcement was no different.

On YouTube’s tweet announcing the changes, common responses included complaints like, “what’s the point in an age restriction on a NON kids app. That’s why we have YouTube kids. really young kids shouldn’t be on normal youtube. So we don’t realistically need an age restriction.”

“Please don’t implement this until you’ve worked out all the kinks,” one user pleaded. “I feel like this might actually hurt a lot of creators, who aren’t making stuff for kids, but get flagged as kids channels because of bright colors and stuff like that”

Worries about hiccups in the rollout of the new system were common among users, though it’s possible that YouTube’s Sept. 20 announcement that it would bring human moderators back to the platform was made to help balance out how much damage a new AI could do.

In a late-August transparency report, YouTube found that AI moderation was far more restrictive. When the moderation team was first downsized between April and June, YouTube’s AI largely took over and removed around 11 million videos, roughly double the normal rate.

YouTube did allow creators to appeal those decisions; about 300,000 videos were appealed, roughly half of which were reinstated. Facebook had a similar problem and will also bring back moderators to handle both restricted content and the upcoming election.

Other Reasons for the Changes

YouTube’s decision to expand its use of AI not only falls in line with various laws regarding the verification of users’ ages and what content is widely available to the public, but was also likely made for practical reasons.

The site gets over 400 hours of content uploaded every minute. Even accounting for different time zones and staggered schedules, YouTube would need to employ over 70,000 people just to check what’s uploaded to the site.

Outlets like The Verge have done a series about how YouTube, Google, and Facebook moderators are dealing with depression, anger, and Post Traumatic Stress Disorder because of their job. These issues were particularly prevalent among people working in what YouTube calls the “terror” or “violent extremism” queue.

One moderator told The Verge, “Every day you watch someone beheading someone, or someone shooting his girlfriend. After that, you feel like wow, this world is really crazy. This makes you feel ill. You’re feeling there is nothing worth living for. Why are we doing this to each other?”

That same individual noted that since working there, he began to gain weight, lose hair, have a short temper, and experience general signs of anxiety.

On top of these claims, YouTube is also facing a lawsuit filed in a California court Monday by one of its former content moderators.

The complaint states that Jane Doe “has trouble sleeping and when she does sleep, she has horrific nightmares. She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind.

“She cannot be in crowded places, including concerts and events, because she fears mass shootings. She has severe and debilitating panic attacks,” it continued. “She has lost many friends because of her anxiety around people. She has trouble interacting and being around kids and is now scared to have children.”

These issues weren’t limited to people working in the “terror” queue; they also affected anyone training to become a moderator.

“For example, during training, Plaintiff witnessed a video of a smashed open skull with people eating from it; a woman who was kidnapped and beheaded by a cartel; a person’s head being run over by a tank; beastiality; suicides; self-harm; children being rapped [sic]; births and abortions,” the complaint alleges.

“As the example was being presented, Content Moderators were told that they could step out of the room. But Content Moderators were concerned that leaving the room would mean they might lose their job because at the end of the training new Content Moderators were required to pass a test applying the Community Guidelines to the content.”

During their three-week training, moderators allegedly don’t receive much resilience training or wellness resources.

These kinds of lawsuits aren’t unheard of. Facebook faced a similar suit in 2018, where a woman claimed that during her time as a moderator she developed PTSD as a result of “constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace.”

That case hasn’t been decided in court; instead, Facebook and the plaintiff have agreed to settle for $52 million, pending approval from the court.

The settlement would only apply to U.S. moderators.

See what others are saying: (CNET) (The Verge) (Vice)


Chinese State Media Calls TikTok-Oracle Deal “Reasonable” as Trump Signals Approval


  • On Friday, the United States Commerce Department issued an order that would ban U.S. downloads of TikTok and WeChat starting Sunday night.
  • The order for TikTok was delayed for one week on Saturday after President Donald Trump gave his preliminary approval on a deal between TikTok and the software company Oracle.
  • A federal judge also issued a temporary injunction Sunday against the WeChat ban, which would have largely destroyed the app’s functionality.
  • Oracle and Walmart have since released more details of the deal, including that TikTok Global will likely pay $5 billion in U.S. taxes. This does not appear to be the same as a commission from the deal, even though Trump has suggested as much.
  • On Monday, Chinese state media called the deal “unfair” on ByteDance, TikTok’s parent company. However, it also described it as “reasonable,” suggesting the Chinese government may approve the deal.

U.S. and China Signal Support for Deal

What began as a tumultuous weekend for TikTok ended with both the U.S. and Chinese governments potentially signaling approval of its deal with Oracle. 

Last week, TikTok’s parent company, ByteDance, struck a deal with Oracle to avoid a U.S. ban. On Monday, Chinese state media called the deal “more reasonable to ByteDance,” and said it’s less costly than a shutdown.

“The plan shows that ByteDance’s moves to defend its legitimate rights have, to some extent, worked,” it added.

While not officially confirmed, this seems to suggest that the Chinese government may approve the deal. 

It also came on the heels of Saturday, when President Donald Trump, after having suggested unhappiness with the deal last week, said he had given his approval “in concept.” He will still need to officially sign off on it before the deal is set into motion.

Because of that, the U.S. Commerce Department delayed a download ban that had been set for Sunday, pushing it back to this coming Sunday, Sept. 27.

Some Republicans, such as Senator Marco Rubio (R-Fla.), have still expressed concern because ByteDance won’t be handing over its secretive algorithm as part of the deal.

What’s in the Deal?

On Saturday, Oracle released more details of its deal with TikTok. Under it, Oracle and Walmart would take a combined 20% stake in TikTok Global.

Still, there’s been much back and forth over how much control ByteDance will have under the agreement. For his part, Trump has claimed that TikTok Global will “be a brand new company… It will have nothing to do with China.”

However, ByteDance has maintained that it will retain 80% of the stake. The discrepancy here seems to be because 40% of ByteDance is owned by U.S. venture capital firms. Therefore, Trump could technically claim that TikTok Global will be majority-owned by U.S. money.

Trump doubled down Monday and said that he would not approve the deal if ByteDance retained ownership. He added that the Chinese-owned company will “have nothing to do with it, and if they do, we just won’t make the deal.”

Later, Oracle announced that ByteDance will not have any stake in TikTok Global, though this statement heavily conflicts with what is being reported in China.

“Upon creation of TikTok Global, Oracle/Walmart will make their investment and the TikTok Global shares will be distributed to their owners, Americans will be the majority and ByteDance will have no ownership in TikTok Global,” the company said.

According to Walmart and Oracle, if this deal goes through, TikTok Global will pay $5 billion in new tax dollars to the U.S. Treasury over the next few years. As both companies noted, this is just a projection of future corporate taxes, and that estimate could change.

The water around that $5 billion figure was later muddied as Trump claimed that TikTok Global would be donating “$5 billion into a fund for education so we can educate people as to [the] real history of our country — the real history, not the fake history.”

To be clear, Trump is referring to his plans to establish a “patriotic education” commission.

On Sunday, ByteDance said in a statement that this was the first it had heard about a $5 billion education fund.

In fact, TikTok Global never promised to start an education fund. Instead, it promised to create an “educational initiative to develop and deliver an AI-driven online video curriculum to teach children from inner cities to the suburbs a variety of courses from basic reading and math to science, history and computer engineering.” 

That initiative doesn’t seem to have anything to do with that $5 billion tax figure. Since he began pursuing a ban, Trump has vowed that the U.S. will receive some form of commission from a deal with TikTok. As far as it is known, this $5 billion figure is also not that commission.

As previously reported, this deal will allow Oracle to host TikTok’s user data on its cloud service and review TikTok’s code for security. According to Treasury Secretary Steven Mnuchin, it would also shift TikTok’s global headquarters from China to the U.S.

On top of that, TikTok’s board members would reportedly have to be approved by the U.S. government, with one being an expert in data security. That person would also hold a top-secret security clearance.

Commerce Department Announces Download Ban

Friday seemed like the beginning of the end for TikTok. That morning, the Commerce Department issued an order that would ban U.S. downloads of not only TikTok but also WeChat starting Sunday night.

Both bans were a result of concerns the Trump administration has that ByteDance and WeChat’s parent company, Tencent, are either already giving or could give U.S. user data to the Chinese government.

The Trump administration has repeatedly said that both apps pose a national security threat.

TikTok and ByteDance have consistently denied these claims, saying that U.S. user data is stored domestically with a backup in Singapore. WeChat, for its part, has also made similar statements.

The download ban was announced in response to two Aug. 6 executive orders from Trump. Those orders ban any U.S.-based transactions with TikTok and WeChat starting on Sept. 20, which is why the Commerce Department set the deadline for this past Sunday.

While this ban would have been much more restrictive for WeChat, since a large part of its functionality relies heavily on in-app transactions, for TikTok it would only affect new downloads and updates to the app.

“So if that were to continue over a long period of time, there might be a gradual degradation of services, but the basic TikTok will stay intact until Nov. 12,” Commerce Secretary Wilbur Ross told Fox Business on Friday.

“If there’s not a deal by Nov. 12, under the provisions of the old order, then TikTok would also be, for all practical purposes, shut down.” 

What Happens on Nov. 12?

Ross is referring to another executive order, this one signed on Aug. 14. Notably, it gives ByteDance 90 days to divest its American assets and any data that TikTok had gathered in the U.S. As Ross pointed out, that requirement could be satisfied if a deal is reached before the deadline.

If that doesn’t happen, the TikTok app could begin to see lags, lack of functionality, and sporadic outages.

It’s not just the U.S., however. One of the big questions that loomed after Oracle and ByteDance confirmed their deal last week was whether China would also need to approve it. ByteDance later confirmed that it will need the Chinese government’s approval, even though the deal does not involve a technology transfer.

Downloads Soar and TikTok Sues

On Friday, downloads for both apps soared. TikTok was downloaded nearly a quarter of a million times that day, up 12% from the previous day. WeChat was downloaded 10,000 times, up 150%.

The same Friday, TikTok as a company criticized the Commerce Department order, saying it had already committed to “unprecedented levels of additional transparency.”

TikTok added that the order “threatens to deprive the American people and small businesses across the US of a significant platform for both a voice and livelihoods.”

Later Friday, TikTok sued the Trump Administration to stop the download ban. 

On Sunday, a federal judge also halted the download ban for WeChat with a preliminary injunction. The injunction additionally blocks the Commerce Department’s attempt to bar transactions on the app.  

The Commerce Department responded by saying that it’s preparing for a long legal battle.

TikTokers: “Scared, angry, and confused”

“I’ve mostly just been feeling scared, angry, and confused,” TikToker Isabella Avila, known online as onlyjayus, told Rogue Rocket on Monday. “Those are just the main things.” 

Avila has amassed a following of 8.7 million followers on TikTok in a relatively short amount of time. She’s also gained about half a million followers on YouTube and Instagram each.

A couple of months ago, Avila said she thought a potential ban was all just talk; however, as the situation progressed, she said she became more worried.

While she said that she personally thought her career could survive a TikTok ban (thanks in part to a Netflix podcast deal), she added, “The people in-between a 100,000 to a million [followers], they have a platform right now, and if TikTok’s were to be gone, their platform’s pretty much gone if they haven’t built an audience on anything else. 

“This is where we go to express ourselves,” she said. “This is where we go to make videos. I don’t know, TikTok gave everybody a chance to kind of get famous and have a following. That’s what people liked about it. YouTube, it’s really hard to get followers and subscribers. TikTok was a lot easier.” 

Avila also expressed that a ban wouldn’t just be detrimental to creators. 

“I feel like my generation needed an app,” Avila said. “There was Instagram and Twitter, but it was kind of like for the millennials. Gen Z didn’t really have an app, and TikTok kind of fit that spot, so if TikTok’s gone, I don’t know, I feel like Gen Z isn’t really going to have a place.” 

Avila now says she is largely hopeful that TikTok will not be banned in the U.S.

See what others are saying: (The Washington Post) (NBC News) (Axios)
