Those who believe that their smartphones “listen in” on their conversations aren’t sounding so crazy these days. U.S. District Judge Jeffrey White just allowed a proposed class-action lawsuit against Apple, one of the country’s major Big Tech companies, to move forward. He ruled that California plaintiffs can pursue claims that Apple violated the federal Wiretap Act and California privacy law and committed breach of contract, though he dismissed their unfair competition claim.
The lawsuit claims that Apple’s voice-activated Siri assistant violates users’ privacy by recording their private conversations. It cites “accidental activations” through which Apple allegedly disclosed users’ private conversations to third parties, such as advertisers.
This type of Big Tech spying first came to light in 2019, when it was revealed that Apple had hired contractors to listen to and grade Siri conversations for the purpose of “improving products and services.” Apple acknowledged that customers were concerned about people listening in during the Siri quality evaluation process, known as “grading,” and said it had made changes to Siri. With the release of iOS 13.2, Apple announced that users could delete their voice assistant history and opt out of sharing audio recordings. At this point, any tech presence means that the big companies know everything about you.
The lawsuit was brought by California users who accused Apple of recording their conversations. Apple has repeatedly tried to get the suit thrown out, but Judge White ruled that the plaintiffs could continue pursuing their claims against the company’s voice assistant. The court has also warned Apple against passing user data to third parties in violation of user privacy.
Apple continues to deny that Siri violates users’ privacy, maintaining that the assistant listens only for “wake words” and commands. An Amazon spokesperson has similarly said that only a “small fraction” of user audio is reviewed when a wake word is used. Judge White, however, held that a device listening in when a user has not engaged with Siri can still amount to a breach of privacy.
Apple could still argue that “accidental activations” of the “Hey Siri” prompt caused the recordings, but users maintain that those recordings are being shared with third parties.
One user says that a private conversation with his doctor about a brand-name treatment led him to receive ads for that exact treatment. Others describe conversations about shoes, sunglasses, and restaurants that were later followed by targeted ads for those very products.
Apple has also run into trouble over its planned “watchdog” feature aimed at limiting the spread of child sexual abuse material. The software would scan files before they were uploaded to iCloud, but privacy experts warned that it could be used to frame innocent people. Apple has since delayed its plans to roll out the feature.
“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said.
Privacy experts also raised concerns about another feature Apple was looking to implement: an alert system that would notify parents if their children received or sent sexually explicit images. Critics again argued that innocent users could be framed if someone sent them images designed to “trigger matches” for child pornography.
Matthew Green, a Johns Hopkins University cryptography researcher, warned that Apple needed to build public support before launching something like this, noting that the company had gone from “scanning almost nothing” to scanning private files.
Apple has rubbed too many people the wrong way, between its censorship tactics and its anti-competitive approach. It’s time for Big Tech to be held accountable, starting with the conversations they listen in on.