
Apple Defends Removal of Rival Apps


  • Apple has removed several apps from its App Store that allowed parents to track and limit their children’s screen time.
  • The New York Times published a report featuring complaints from competing app makers, who suggested Apple was engaging in anti-competitive behavior.
  • Apple denied this claim and said the removal was because the apps were abusing technology that enables third-party access to sensitive information.

Apple’s Statement

Apple said Sunday that it has removed several parental control apps from its App Store because they put user privacy and security at risk, not because the company wants to stifle competition.

Apple removed apps that it says were abusing a technology called Mobile Device Management (MDM), which gives app developers access to sensitive information. “The information includes things like user location, browsing history, saved images and video, and more,” the company said in a statement on its website.

Apple released its statement in response to a New York Times report published Saturday, which suggested that the company removed the apps for anti-competitive reasons. “Apple has always supported third-party apps on the App Store that help parents manage their kids’ devices,” Apple said.

“Contrary to what The New York Times reported over the weekend, this isn’t a matter of competition. It’s a matter of security. In this app category, and in every category, we are committed to providing a competitive, innovative app ecosystem.”

The company went on to say that it had given the affected apps 30 days to submit updated software that did not violate its policies. It said many of the companies did make changes and remained on the App Store; those that did not were removed.

New York Times Report

On Saturday, the New York Times released a report that included statements from several app makers who complained that Apple had removed their products from the App Store after it launched its own similar tools.

The latest version of Apple’s iOS mobile operating system includes a feature called Screen Time, which allows users to see how much time they spend on their phone. The tool also allows for content and privacy restrictions that parents often use to control how much time their children can spend on certain apps and websites.

Apple’s Screen Time Tool – Source: Apple

According to the Times, over the past year, Apple has removed or restricted at least 11 of the 17 most downloaded screen-time and parental-control apps. The report also says the company has cracked down on several other lesser-known apps and, in some cases, forced companies to remove features that allowed for parental controls.

Features for time and content restrictions. Source: Apple

The newspaper noted that on Thursday, the apps Kidslox and Qustodio filed a complaint with the European Union’s competition office. The two companies were among the most popular parental-control apps on the market, but Kidslox says business has dropped drastically since Apple forced changes to its app that made it less useful than Apple’s own Screen Time tool.

Apple is also facing an antitrust complaint in Russia from Kaspersky Lab, a Russian cybersecurity firm that American security officials say has ties to the Russian government. The firm says Apple removed important features from its parental control app, and in addition to its antitrust complaint in Russia, it is now exploring a similar complaint in Europe.

Accusations that Apple engages in anti-competitive behavior are not new, nor are they exclusive to the Screen Time tool. In fact, Spotify and other streaming services have long complained that Apple gives its Apple Music service an unfair advantage over competitors. In March, Spotify filed an antitrust complaint against Apple with the European Union, alleging that Apple is hurting innovation and consumer choice with its Apple “tax” and restrictive rules in its App Store.

See what others are saying: (CNN) (Gizmodo) (CNBC)


Uber and Lyft Drivers Offered Incentives to Fight Bill That Targets the Gig Economy


  • On July 9, hundreds of Uber and Lyft drivers gathered outside the California State Capitol for a rally about Assembly Bill 5, which would change how the state determines whether a worker is an employee or an independent contractor.
  • On Monday, the Los Angeles Times reported that the I’m Independent Coalition, a group that works closely with Uber and Lyft, offered to pay drivers to attend the rally against Assembly Bill 5.
  • Drivers say they also received emails and in-app offers from Uber and Lyft for attending the rally against the bill.

The Rally

Drivers for Uber and Lyft say the ride-share companies offered incentives to workers who lobbied against a proposed bill that would classify drivers as employees instead of independent contractors.

On Monday, the Los Angeles Times reported that drivers for Uber and Lyft who attended the July 9 rally outside California’s State Capitol were compensated for their “travel, parking and time.” 

According to the report, an email from the I’m Independent Coalition was sent to drivers, offering them anywhere from $25 to $100 if they rallied on the group’s behalf. I’m Independent is a coalition funded by the California Chamber of Commerce that works to change the proposed legislation. According to its website, both Uber and Lyft are supporters of I’m Independent.

The LA Times says that following the rally, another email was sent out assuring workers that their compensation would arrive soon.

“We want to thank you again for taking time to attend the State Capitol Rally on July 9,” the email states. “Your voice had an impact and the Legislature heard loud and clear that you want to keep your flexibility and control over your work! Please expect a driver credit in the next five business days for your travel, parking, and time.”

I’m Independent later confirmed to the paper that the drivers who attended the rally had been paid.

However, the report says the coalition was not the only group offering vouchers and compensation for attending the rally. A Lyft spokesperson confirmed that the company had offered drivers $25 to help cover parking, while Uber sent a $15 lunch voucher through their app and told drivers it was for them, their families, “and anyone you know who also has a stake in maintaining driver flexibility.”

The Bill

The rally outside of the State Capitol was held ahead of a Senate labor hearing for Assembly Bill 5, a bill that states it “would provide that the factors of the ‘ABC’ test be applied in order to determine the status of a worker as an employee or independent contractor.”

The “ABC” test comes from an April 2018 California Supreme Court case, Dynamex Operations West, Inc. v. Superior Court. In that case, the Court ruled that a worker can be treated as an independent contractor only if three requirements are met. According to court documents, those requirements are:

“(A) that the worker is free from the control and direction of the hirer in connection with the performance of the work, both under the contract for the performance of such work and in fact; (B) that the worker performs work that is outside the usual course of the hiring entity’s business; and (C) that the worker is customarily engaged in an independently established trade, occupation, or business of the same nature as the work performed for the hiring entity.”

Under AB5, drivers for both Uber and Lyft would no longer be classified as independent contractors but instead as employees. The main difference between an independent contractor and an employee is the set of regulations and requirements the employer must follow. If a worker is determined to be an employee, they receive things like sick pay, a required minimum wage, and a limit on the hours they can work.

However, Assembly Bill 5 states that certain occupations are exempted from the “ABC” test, such as health care professionals like doctors and dentists, among others.  

In May, the bill passed the State Assembly in a 59 to 15 vote. Earlier this month, the State Senate Committee on Labor, Public Employment, and Retirement voted to advance the bill.

Uber and Lyft on AB5 

Uber has previously said the company will not take a side when it comes to the bill, but it believes there are better solutions than Assembly Bill 5.

However, at the beginning of June, Uber sent an email to its drivers saying the bill could “threaten your access to flexible work with Uber.”

Lyft has taken a similar approach and also sent an email to its drivers, telling the workers that the ride-share company is trying to “protect” their jobs. 

“Legislators are considering changes that could cause Lyft to limit your hours and flexibility, resulting in scheduled shifts,” the email, which was later shared by Lyft, states. “We’re advocating to protect your flexibility with Lyft, in addition to establishing an earning minimum, offering protections and benefits and giving drivers representation so that you have a voice in the company.”

Previous Responses to AB5

In May, Uber and Lyft drivers around the world went on strike, asking for requirements similar to those employees receive, such as a minimum hourly wage. The strikes took place just three weeks before the State Assembly voted to pass Assembly Bill 5.


Even though the strikes did not lead to any major changes at the companies, a June 2019 Ipsos study found that the majority of drivers for both Uber and Lyft still want “the same workers’ rights as those in more traditional employment positions.”

Assembly Bill 5 advanced to the appropriations committee earlier this month, but the committee is currently in summer recess.

See what others are saying: (Los Angeles Times) (International Business Times) (SF Gate)


DeepNude App Banned on GitHub After Spreading to Multiple Platforms


  • Reuploaded replicas of the app DeepNude have been popping up on social media platforms including Twitter, YouTube, and Reddit.
  • The app, which removed clothing from pictures of women to make them look naked, had previously been pulled by its creators after an article published by Vice’s technology publication Motherboard created backlash.
  • Discord and GitHub have since banned replica versions of the app after it was spread on their sites.
  • Over the last week, dozens of women in Singapore have had pictures from their social media accounts doctored and put on porn websites. Those pictures are believed to have been made with a version of the DeepNude app.

DeepNude App Explained

The open source software platform GitHub has banned all code from the controversial deepfake app known as DeepNude, a desktop application that removes clothing from pictures of women and generates a new photo of them appearing naked.

The app was originally released last month, but it did not gain notoriety until Vice’s tech publication Motherboard broke the story several days after it launched. The day after Motherboard’s exposé, the DeepNude creators announced they were pulling the app.

“The probability that people will misuse it is too high,” the creators said in a statement on Twitter. “Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it.”

“The world is not yet ready for DeepNude,” the statement concluded.

GitHub Bans DeepNude Replicas

Apparently, the world thought otherwise, because copies of the DeepNude app were shared and still are being shared all over the internet. 

The program was a desktop app meant to be downloaded for offline use, and as a result, it could easily be replicated by anyone who had it on their hard drive.

That is exactly what happened. People who replicated the software reuploaded it to various platforms, including GitHub, which banned the app for violating its community guidelines.

“We do not proactively monitor user-generated content, but we do actively investigate abuse reports,” a GitHub spokesperson told Motherboard. “In this case, we disabled the project because we found it to be in violation of our acceptable use policy. We do not condone using GitHub for posting sexually obscene content and prohibit such conduct in our Terms of Service and Community Guidelines.”

According to The Verge, the DeepNude team itself actually uploaded the core algorithm of the app to GitHub.

“The reverse engineering of the app was already on GitHub. It no longer makes sense to hide the source code,” The Verge said the team wrote on a now-deleted page. “DeepNude uses an interesting method to solve a typical AI problem, so it could be useful for researchers and developers working in other fields such as fashion, cinema, and visual effects.”

However, Rogue Rocket was still able to find at least one GitHub repository that claimed to have DeepNude software for Android.

“Deep nudes for android. this is the age of FREEDOM, NOT CENSORSHIP! hackers rule the future!” the page’s description said. 

DeepNude Spreads

GitHub was not the only platform that the replicated app was shared on. 

Even with just a cursory search on Twitter, Rogue Rocket was able to locate two Twitter accounts that provided links to replicated versions of the app. One of the accounts links to a website called Deep Nude Pro, which bills itself as “the official update to the original DeepNude,” and sells the app for $39.99.

The other account links to a DeepNude Patreon where people can either download the app or send the account holder pictures they want doctored and then purchase the results.

When Rogue Rocket searched YouTube, there appeared to be multiple videos explaining how to download new versions of the app, many of which had links to download the app in the description.

Others have also shared links on Reddit, and The Verge reported that links to downloads were being shared on Telegram channels and message boards like 4chan.

To make matters worse, many of the replicated versions claim to have removed the watermarks included in the original app, which were used to denote that the generated pictures were fake.

While it has been reported that many of the links to the reuploaded software contain malware, download links to the new versions are still remarkably easy to find.

GitHub is also not the only platform to ban the app. According to Motherboard, last week Discord banned a server that was selling what was described as an updated version of the app, where customers could pay $20 in Bitcoin or Amazon gift cards to get “lifetime access.”


The server and its users were removed for violating Discord’s community guidelines.

“The sharing of non-consensual pornography is explicitly prohibited in our terms of service and community guidelines,” a spokesperson for Discord told Motherboard in a statement.

“We will investigate and take immediate action against any reported terms of service violation by a server or user. Non-consensual pornography warrants an instant shut down on the servers and ban of the users whenever we identify it, which is the action we took in this case.”

DeepNude App Used in Singapore

The rapid diffusion of the app on numerous social media platforms has now become an international problem.

On Wednesday, The Straits Times reported that over the past week “dozens of women in Singapore” have had pictures of them taken from their social media accounts and doctored to look like they are naked, then uploaded to pornographic sites.

Those photos are believed to have been doctored using a version of the DeepNude app that has been shared via download links on a popular sex forum in Singapore.

Lawyers who spoke to The Straits Times said that doctoring photos to make people look naked is considered a criminal offense in Singapore.

One lawyer said that even though the artificial intelligence aspect is new, the law’s broad definitions could allow people who make such images to be prosecuted.

Another lawyer backed that up, saying that under Singapore’s Films Act, people who make DeepNude pictures can be jailed for up to two years and fined up to $40,000. They can also be charged with insulting a person’s modesty and face a separate fine and a jail term of up to a year.

Legal Efforts in the U.S.

The legal precedent in Singapore raises questions about laws that regulate deepfakes in the United States. While these efforts appear stalled on the federal level, several states have taken actions to address the issue.

On July 1, an amendment to Virginia’s law against revenge porn, which classifies deepfakes as nonconsensual pornography, went into effect. Under that amendment, anyone caught spreading deepfakes could face up to 12 months in prison and up to $2,500 in fines.

Amending existing revenge porn laws to cover deepfakes could prove a promising approach if it is effective. According to The New York Times, as of early this year, 41 states have banned revenge porn.

At the same time, lawmakers in New York state have also proposed a bill that would ban the creation of “digital replicas” of individuals without their consent. 

However, the Motion Picture Association of America has opposed the bill, arguing that it would “restrict the ability of our members to tell stories about and inspired by real people and events,” which would violate the First Amendment.

The opposition to the law in New York indicates that even as states take the lead with deepfake regulation, there are still many legal hurdles to overcome.

See what others are saying: (VICE) (The Verge) (The Straits Times)


Blue Bell Says It’s Working With Authorities to Track Down Viral Ice Cream Licker


  • Footage has gone viral that shows a woman in a grocery store opening a container of ice cream, licking the top, and then placing it back into a store freezer for another customer to purchase. 
  • The ice cream brand Blue Bell now says they are working with authorities to track down the woman. 
  • The incident also prompted many to ask why the company does not have protective seals on its products.
  • Blue Bell says a “natural seal” is created when the ice cream hardens upside down during production.

Viral Video 

Blue Bell Ice Cream is looking for the woman seen in a viral clip opening a container of ice cream, licking the top, and putting it back into the store freezer.

Footage of the incident went viral on Twitter after a user who goes by the screen name Optimus Primal shared the clip on Saturday with the caption, “What kinda psychopathic behavior is this?!”

In the clip, which has over 11 million views, an individual offscreen can be heard encouraging the woman to lick the company’s Tin Roof flavored ice cream. 

Who is she?

In a statement to Time Magazine, the user who shared the clip said he doesn’t know the woman in it and actually found the video on Instagram as part of a story shared by the actress Alexis Fields-Jackson. 

The actress apparently doesn’t know the woman either. In a post on her Instagram, she wrote: “I am not the girl in the disgusting ice cream video. Leave me alone.” She also explained that she reposted the video like many others have and said it was removed by Instagram. The actress also promised to report those who have been defaming or threatening her over the confusion. 

Some social media users have identified the ice cream licker as a woman from San Antonio, Texas named Asia. According to a report from Heavy, she was allegedly the owner of the Instagram account “xx.asiaaaa.xx” while it was still active. Twitter users say she boasted about becoming famous over the video and said she recently had the flu, which sparked even more outrage.

“Now you can call it Flu Bell ice cream ’cause I was a lil sick last week,” a screenshot of one comment by the account reads. The post goes on to encourage others to follow her lead and use the hashtag #TinRoofChallenge writing, “Let’s see if we can start an epidemic (literally).”

Blue Bell Responds 

People on Twitter have also made sure to make the ice cream brand aware of the situation. Blue Bell responded to several users on Twitter saying they “take the issue very seriously.”

In a statement on its website, the company said it is “currently working with law enforcement, retail partners, and social media platforms” to investigate.

“This type of incident will not be tolerated. Food safety is a top priority, and we work hard to provide a safe product and maintain the highest level of confidence from our consumers,” the company added.

Many users are calling for police to take action over the food tampering incident.

If the incident did happen in Texas, state law makes it a felony to tamper with consumer products if someone could be injured as a result. Depending on the degree of the charge, punishments can range from fines to jail time. 

One viral tweet said the ice cream licker was charged with a felony. However, as of now police have not confirmed any arrest. 

In fact, San Antonio police told KSAT that they can’t confirm that the incident even happened in their jurisdiction and say it does not appear that the girl in the video lives in San Antonio. So as of now, where the incident happened and who is responsible remains unclear.

Protective Seals

Aside from the outrage directed at the woman in the video, the incident also prompted many social media users to question why Blue Bell does not have protective seals on its ice cream containers.

Some even say the woman’s actions don’t warrant an arrest but instead show that the company has a safety issue.

The company responded to those complaints in its statement by saying, “During production, our half gallons are flipped upside down and sent to a hardening room where the ice cream freezes to the lid creating a natural seal.”

“The lids are frozen tightly to the carton,” the statement continued. “Any attempt at opening the product should be noticeable.”

See what others are saying: (Time) (Heavy) (Insider)  

