
Apple Settles Claim for Siri Eavesdropping

Published 10 hours ago · 4 minute read

Apple is paying $95 million over claims that Siri secretly recorded private chats and fed targeted ads

Person holding a phone with the Siri logo in the background

Artur Widak/NurPhoto via Getty Images

Sex, drug deals and doctor visits: according to allegations, Apple’s Siri eavesdropped on these and much more—on people’s iPhones, HomePods and Apple Watches—and used the content to target advertisements on users’ devices. Despite having denied selling our pillow talk to marketers, Apple just cut a $95-million check to settle a lawsuit in which plaintiffs reported eerie coincidences: discussing Air Jordan sneakers and immediately seeing ads for them; mentioning Olive Garden only to be served pasta commercials; talking privately with a doctor about a surgical procedure before seeing a promo for that very treatment. In early May the settlement administrator opened a claims website, allowing U.S. owners of every Siri-enabled gadget bought between September 2014 and December 2024 (essentially the lifespan of “Hey, Siri”) to request a payout of up to 20 bucks per affected device—enough for a drink and a wary glance at your phone.

The lawsuit, Lopez v. Apple, dates back to July 2019, when the Guardian published the allegations of an anonymous whistleblower—an Apple subcontractor whose job was to listen to Siri recordings to determine if the voice-activated assistant was being correctly triggered. The whistleblower claimed that accidental Siri activations routinely captured sensitive audio. Despite Apple’s promises that Siri listens only when invited, background noises (often just the sound of a zipper, according to the whistleblower) could switch it on. The contractor said user location and contact information accompanied recordings.

Apple had never explicitly told users that humans might review their Siri requests, and within a week of the Guardian report, the company halted the program. The first Lopez v. Apple complaint was filed in August 2019, and two weeks later Apple issued a public apology in which it promised to make human review opt-in-only and to stop retaining audio by default. That apology was framed to allay customer concerns—not as an admission of wrongdoing. Apple denied all allegations in the lawsuit, which is common in class-action settlements in U.S. courts.

If the situation sounds familiar, your memory works. In 2018 Amazon’s Alexa recorded a married couple’s conversation about hardwood floors and sent it to one of the husband’s employees. Amazon blamed an unlikely chain of misheard cues—basically, it came down to Alexa butt-dialing someone with living room chatter. The following year Bloomberg reported that Amazon had thousands of workers transcribing clips to fine-tune the assistant. Later Google faced similar allegations. The pattern was clear: robots needed to be trained to make sure that they were hearing voice commands correctly, and this training needed to come from humans who, in the process, inevitably heard things they shouldn’t via consumer gadgets. Even TVs were implicated: in 2015 Samsung warned owners not to discuss secrets near its smart sets because voice commands were sent to unnamed third parties, a disclaimer that could have been written by George Orwell.

This isn’t tin-foil-hat territory. A 2019 survey found that 55 percent of Americans believe their phones listen to them to collect data for targeted ads, and a 2023 poll pushed the number north of 60 percent. In the U.K., a 2021 poll found two thirds of adults had noticed an ad that they felt was tied to a recent real-life chat. But psychologists say this perception of “conversation-related ad creep” often relies on a feedback loop driven by confirmation bias: we ignore the thousands of ads that form a constant backdrop to our lives but build a campfire legend from the one time we mentioned “fire” and an app promptly tried to sell us tiki torches. The result is a low-grade cultural fear, with people placing masking tape over device mics and TikTokers begging Siri to stop stalking them. Knowing how ravenous tech companies are for data, people can hardly be blamed for this attitude.

As for Apple, which once put “What happens on your iPhone, stays on your iPhone” on a Las Vegas billboard, the settlement doesn’t force it to admit fault—but it does put a dent in the company’s titanium halo: if the Cupertino, Calif.–based firm can’t keep a lid on hot-mic moments, who can?

(Asked for comment by Scientific American, Apple shared information on the settlement and emphasized its commitment to privacy. And Amazon reiterated its commitment to privacy, writing, “Access to internal services is highly controlled, and is only granted to a limited number of employees who require these services to train and improve the service.” Samsung and Google had not responded to requests for comment by the time of publication.)

Origin: Scientific American