- Reuploaded replicas of the app DeepNude have been popping up on social media platforms including Twitter, YouTube, and Reddit.
- The app, which removed clothing from pictures of women to make them look naked, had previously been pulled by its creator after backlash over an article published by Vice’s technology publication Motherboard.
- Discord and GitHub have since banned replica versions of the app after it was spread on their sites.
- Over the last week, dozens of women in Singapore have had pictures from their social media accounts doctored and put on porn websites. Those pictures are believed to have been made with a version of the DeepNude app.
DeepNude App Explained
The code-hosting platform GitHub has banned all code from the controversial deepfake app known as DeepNude, a desktop application that removes clothing from pictures of women and generates a new photo of them appearing naked.
The app was originally released last month, but it did not receive notoriety until Vice’s tech publication Motherboard broke the story several days after it launched. The day after Motherboard’s exposé, the DeepNude creators announced they were pulling the app.
“The probability that people will misuse it is too high,” the creators said in a statement on Twitter. “Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it.”
“The world is not yet ready for DeepNude,” the statement concluded.
GitHub Bans DeepNude Replicas
Apparently, the world thought otherwise, because copies of the DeepNude app were shared and still are being shared all over the internet.
The program was a desktop app meant to be downloaded for offline use, and as a result, anyone who had it on their hard drive could easily replicate it.
That is exactly what happened. People who copied the software reuploaded it to various platforms, including GitHub, which banned the app for violating its community guidelines.
“We do not proactively monitor user-generated content, but we do actively investigate abuse reports,” a GitHub spokesperson told Motherboard. “In this case, we disabled the project because we found it to be in violation of our acceptable use policy. We do not condone using GitHub for posting sexually obscene content and prohibit such conduct in our Terms of Service and Community Guidelines.”
According to The Verge, the DeepNude team itself actually uploaded the core algorithm of the app to GitHub.
“The reverse engineering of the app was already on GitHub. It no longer makes sense to hide the source code,” The Verge said the team wrote on a now-deleted page. “DeepNude uses an interesting method to solve a typical AI problem, so it could be useful for researchers and developers working in other fields such as fashion, cinema, and visual effects.”
However, Rogue Rocket was still able to find at least one GitHub repository that claimed to have DeepNude software for Android.
“Deep nudes for android. this is the age of FREEDOM, NOT CENSORSHIP! hackers rule the future!” the page’s description said.
GitHub was not the only platform that the replicated app was shared on.
Even with just a cursory search on Twitter, Rogue Rocket was able to locate two Twitter accounts that provided links to replicated versions of the app. One of the accounts links to a website called Deep Nude Pro, which bills itself as “the official update to the original DeepNude,” and sells the app for $39.99.
The other account links to a DeepNude Patreon where people can either download the app or send the account holder pictures they want to generate and then buy.
When Rogue Rocket searched YouTube, there appeared to be multiple videos explaining how to download new versions of the app, many of which had links to download the app in the description.
Others have also shared links on Reddit, and The Verge reported that links to downloads were being shared on Telegram channels and message boards like 4chan.
To make matters even worse, many of the replicated versions claim to have removed the watermarks included in the original app, which denoted that the generated pictures were fake.
While it has been reported that many of the links to the reuploaded software contain malware, download links to the new versions remain incredibly easy to find.
GitHub is also not the only platform to ban the app. According to Motherboard, last week Discord banned a server that was selling what was described as an updated version of the app, where customers could pay $20 in Bitcoin or Amazon gift cards to get “lifetime access.”
The server and its users were removed for violating Discord’s community guidelines.
“The sharing of non-consensual pornography is explicitly prohibited in our terms of service and community guidelines,” a spokesperson for Discord told Motherboard in a statement.
“We will investigate and take immediate action against any reported terms of service violation by a server or user. Non-consensual pornography warrants an instant shut down on the servers and ban of the users whenever we identify it, which is the action we took in this case.”
DeepNude App Used in Singapore
The rapid spread of the app across numerous social media platforms has now become an international problem.
On Wednesday, The Straits Times reported that over the past week “dozens of women in Singapore” have had pictures of them taken from their social media accounts and doctored to look like they are naked, then uploaded to pornographic sites.
Those photos are believed to have been doctored using a version of the DeepNude app, which has been shared via download links on a popular sex forum in Singapore.
Lawyers who spoke to The Straits Times told them that doctoring photos to make people look naked is considered a criminal offense in Singapore.
One lawyer said that even though the artificial intelligence aspect is new, the broad definitions under existing law could allow people who doctor such photos to be prosecuted.
Another lawyer backed that up, saying that under Singapore’s Films Act, people who make DeepNude pictures can be jailed for up to two years and fined up to $40,000. They can also be charged with insult of modesty and face a separate fine and jail term of up to a year.
Legal Efforts in the U.S.
The legal precedent in Singapore raises questions about laws that regulate deepfakes in the United States. While these efforts appear stalled at the federal level, several states have taken action to address the issue.
On July 1, a new amendment to Virginia’s law against revenge porn, which classifies deepfakes as nonconsensual pornography, went into effect. Under the amendment, anyone caught spreading deepfakes could face up to 12 months in prison and up to $2,500 in fines.
Amending existing revenge porn laws to include deepfakes could be a promising approach. According to The New York Times, as of early this year, 41 states had banned revenge porn.
At the same time, lawmakers in New York state have also proposed a bill that would ban the creation of “digital replicas” of individuals without their consent.
However, the Motion Picture Association of America has opposed the bill, arguing that it would “restrict the ability of our members to tell stories about and inspired by real people and events,” which would violate the First Amendment.
The opposition to the law in New York indicates that even as states take the lead with deepfake regulation, there are still many legal hurdles to overcome.
See what others are saying: (VICE) (The Verge) (The Straits Times)
Kim Kardashian to Pay $1.26 Million to SEC Over Unlawful Crypto Promotion
According to the agency, stars and influencers must disclose how much money they earned for crypto advertising.
Kardashian Pays Up
The U.S. Securities and Exchange Commission announced Monday that it has charged reality TV star Kim Kardashian for “unlawfully touting crypto security.”
Kardashian has agreed to pay $1.26 million in penalties, disgorgement, and interest while cooperating with the SEC’s investigation. The media mogul did not admit to or deny the SEC’s findings as part of the settlement, but she did agree to not promote crypto assets for three years.
According to a statement from the SEC, federal regulators found that Kardashian “failed to disclose that she was paid $250,000 to publish a post on her Instagram account about EMAX tokens.”
“This case is a reminder that, when celebrities or influencers endorse investment opportunities, including crypto asset securities, it doesn’t mean that those investment products are right for all investors,” SEC Chair Gary Gensler said in a statement.
The investigation stemmed from a post that Kardashian made on her Instagram story in the summer of 2021 promoting EthereumMax. In it, she asked her 330 million followers if they were interested in cryptocurrency while giving information about the coin. The post included a swipe-up link for users to get more information and potentially invest in it themselves.
While Kardashian did include a hashtag denoting the post as an ad, the SEC said that did not go far enough. In the agency’s statement, Gurbir S. Grewal, the Director of the SEC’s Division of Enforcement, explained that anyone advertising crypto assets “must disclose the nature, source, and amount of compensation they received in exchange for the promotion.”
A “Reminder” For Crypto Promoters
As a result, the billionaire businesswoman is paying a $1 million penalty. On top of that, she has to pay $260,000 in disgorgement, accounting for the payment she received from EthereumMax plus interest.
Kardashian’s lawyer released a statement saying the star has “fully cooperated with the SEC from the very beginning.”
“She remains willing to do whatever she can to assist the SEC in this matter,” the statement continued. “She wanted to get this matter behind her to avoid a protracted dispute. The agreement she reached with the SEC allows her to do that so that she can move forward with her many different business pursuits.”
This is not the first time Kardashian’s EMAX post has landed her in hot water. A U.K. watchdog previously condemned her for shilling the coin, and she was sued earlier this year over allegations that she artificially inflated the coin’s value.
Gensler said that he hopes the charges from the SEC will serve as “a reminder to celebrities and others that the law requires them to disclose to the public when and how much they are paid to promote investing in securities.”
Misinformation Makes Up 20% of Top Search Results For Current Events on TikTok, New Research Finds
According to the report, the app “is consistently feeding millions of young users health misinformation, including some claims that could be dangerous to users’ health.”
Misinformation Thrives on TikTok
As TikTok becomes Gen Z’s favorite search engine, new research by journalism and tech group NewsGuard found that the video app frequently suggests misinformation to users searching for news-related topics.
NewsGuard used TikTok’s search bar to look up trending news subjects like the 2020 election, COVID-19, the invasion of Ukraine, the upcoming midterms, abortion, school shootings, and more. It analyzed 540 videos based on the top 20 results from 27 subject searches, finding false or misleading claims in 105 of those posts.
In other words, roughly 20% of the results contained misinformation.
Some of NewsGuard’s searches contained neutral phrases and words like “2022 election” or “mRNA vaccine,” while others were loaded with more controversial language like “January 6 FBI” or “Uvalde TX conspiracy.” In many cases, those controversial phrases were suggested by TikTok’s own search bar.
The researchers noted that, for example, during a search on climate change, “climate change debunked” showed up. While looking up COVID-19 vaccines, searches for “covid vaccine injury” or “covid vaccine exposed” were recommended.
Dangerous Results Regarding Health and More
The consequences of some of the false claims made in these videos can be severe. NewsGuard wrote in its report that the search engine “is consistently feeding millions of young users health misinformation, including some claims that could be dangerous to users’ health.”
Among the hordes of hazardous health claims were videos falsely suggesting that COVID-19 vaccines are toxic and cause permanent damage to organs. The report also found several videos still touting the antimalarial drug hydroxychloroquine as a cure-all, not just for COVID-19 but for any illness.
Searches regarding herbal abortions were particularly troublesome. While certain phrases like “mugwort abortion” were blocked, the researchers found several ways around this that led to multiple videos touting debunked DIY abortion remedies that are not only ineffective but can also pose serious health risks.
NewsGuard claimed that the social media app vowed to remove this content in July, but “two months later, herbal abortion content continues to be easily accessible on the platform.”
Other standard forms of conspiracy fodder also occupied space in top search results, including claims that the Uvalde school shooting was planned and that the 2020 presidential election was stolen.
TikTok’s Search Engine Vs. Google
As part of its research, NewsGuard compared TikTok’s search results and suggestions with Google and found that, by comparison, the latter “provided higher-quality and less-polarizing results, with far less misinformation.”
“For example, searching ‘covid vaccine’ on Google prompted ‘walk-in covid vaccine,’ ‘which covid vaccine is best,’ and ‘types of covid vaccines,’” NewsGuard wrote. “None of these terms was suggested by TikTok.”
This is significant because recent reports show that young Internet users have increasingly turned to TikTok as a search engine over Google. While that might be harmless for pasta recipes and DIY tutorials, for people searching for current affairs, the consequences could be significant.
NewsGuard said that it flagged six videos containing misinformation to TikTok, and the social media app ended up taking those posts down. In a statement to Mashable, the company pledged to fight against misinformation on its platform.
“Our Community Guidelines make clear that we do not allow harmful misinformation, including medical misinformation, and we will remove it from the platform,” the statement said. “We partner with credible voices to elevate authoritative content on topics related to public health, and partner with independent fact-checkers who help us to assess the accuracy of content.”
Over 70 TikTok Creators Boycott Amazon as Workers Protest Conditions and Pay
As the company fends off pressure on both fronts, the Amazon Labor Union continues to back election petitions around the country including one filed Tuesday in upstate New York.
Gen Z Goes to War With Amazon
More than 70 big TikTok creators have pledged not to work with Amazon until it gives in to union workers’ demands, including calls for higher pay, safer working conditions, and increased paid time off.
Twenty-year-old TikToker Elise Joshi, who serves as deputy executive director for the advocacy group organizing the boycott, Gen Z for Change, posted an open letter on Twitter Tuesday.
“Dear Amazon.com,” it reads, “We are a coalition of over 70 TikTok creators with a combined following of 51 million people. Today, August 16th, 2022, we are joining together in solidarity with Amazon workers and union organizers through our People Over Prime Pledge.”
Amazon has refused to recognize the Amazon Labor Union (ALU) since workers voted to unionize at a Staten Island warehouse in April, and it has resisted collective bargaining negotiations.
Although the ALU is not involved in the boycott, its co-founder and interim President Chris Smalls expressed support for it in a statement to The Washington Post, saying, “It’s a good fight to take on because Amazon definitely is afraid of how we used TikTok during our campaigns.”
While the ALU posts videos on TikTok to drum up popular support for the labor movement, Amazon has sought to win large influencers over to its side. In 2017, it launched the Amazon Influencer Program, which offered influencers the opportunity to earn revenue by recommending products in personalized Amazon storefronts.
Last May, the company flew over a dozen Instagram, YouTube, and TikTok stars to a luxurious resort in Mexico.
Emily Rayna Shaw, a TikTok creator with 5.4 million followers who has partnered with Amazon in the past, is participating in the boycott.
“I think their method of offering influencers life-changing payouts to make them feel as if they need to work with them while also refusing to pay their workers behind the scenes is extremely wrong,” she told The Post.
“As an influencer, it’s important to choose the right companies to work with,” said Jackie James, a 19-year-old TikTok creator with 3.4 million followers, who told the outlet she will cease doing deals with Amazon until it changes its ways.
The ALU is demanding that Amazon bump its minimum wage to $30 per hour and stop its union-busting activities.
Slogging Through the ‘Suffocating’ Heat
Amazon is also facing challenges from workers themselves, with some walking out this week at its largest air hub in California, where company-branded planes transport packages to warehouses across the country.
They are asking for the base pay rate to be raised from $17 per hour to $22 per hour.
A group organizing the work stoppage under the name Inland Empire Amazon Workers United said in a statement that over 150 workers participated, but Amazon countered that the true number was only 74.
The Warehouse Worker Resource Center counted 900 workers who signed a petition demanding pay raises.
Inland Empire Amazon Workers United has complained about the “suffocating” heat in the facility, saying that temperatures at the San Bernardino airport reached 95 degrees Fahrenheit or higher for 24 days last month.
Amazon spokesperson Paul Flaningan, however, claimed to CNBC that the temperature never surpassed 77 degrees and said the company respects its workers’ right to voice their opinions.
On Tuesday, the ALU backed another warehouse’s decision to file a petition for a union election in upstate New York, roughly 10 miles outside Albany.
The National Labor Relations Board requires signatures from 30% of employees to trigger an election.