Apple offers $95 million in Siri privacy violation settlement

Apple has agreed to pay $95 million to settle a class action lawsuit in the U.S. alleging that its Siri assistant recorded private conversations and shared them with third parties.

The lawsuit alleges that the audio data was disclosed without users' consent to a network of third-party marketers and advertisers.

Users complained that, after Siri had been activated by accident, they were targeted on their Apple devices with advertisements for products related to sensitive and highly specific matters they had discussed in private conversations.

The case, submitted by Fumiko Lopez, John Troy Pappas, and David Yacubian, on behalf of others similarly situated, accuses Apple of violations of the federal Wiretap Act and California's Invasion of Privacy Act. 

Settlement agreement

According to court documents, Apple is set to create a $95 million non-reversionary fund to cover payments to class members, attorney fees, awards for class representatives, and administrative costs.

The settlement applies to all U.S.-based current or former owners of Siri-enabled devices, like iPhones, iPads, and Macs, whose communications were obtained or shared without consent due to unintentional Siri activations between September 17, 2014, and December 31, 2024.

Class members can claim up to $20 per Siri-enabled device for up to five devices, while the plaintiffs may receive up to $10,000 for their efforts.

The preliminary approval hearing is scheduled for February 14, 2025. If the case moves forward, the deadline for submitting claims will be set 135 days later, on June 29, 2025.

In addition to the monetary aspect of the settlement, Apple is also required to permanently delete, within six months of the settlement's effective date, all Siri audio recordings obtained in violation of those laws.

In the future, Apple is also expected to provide clear and easily understandable disclosures about how users can manage Siri settings to protect their data from unintentional disclosure.

It is important to note that this is a proposed settlement. Depending on the objections submitted and how the court handles them, the final terms may be adjusted, or the settlement may be rejected altogether.

BleepingComputer has contacted Apple to ask for a comment on this case, and we will update this post as soon as we hear back.

Putting Siri to sleep

Siri is integral to Apple's software, but users can take steps to adjust its sensitivity or restrict its use by certain apps.

The first step is to disable the voice-based "Hey Siri" activation in your device's settings by toggling off the 'Listen for "Hey Siri"' option. On the Apple Watch, 'Raise to Speak' is a similarly risky setting to leave enabled.

For apps that handle sensitive data or are often active on the device, go to 'Settings > Siri & Search' and disable 'Use with Siri' for them. Also, toggle off 'Suggestions on Lock Screen' and 'Suggestions in Search.'

Finally, it's useful to regularly delete Siri and dictation history via 'Settings > Siri & Search > Siri & Dictation History' to wipe potentially sensitive data.
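For managed or supervised devices, administrators can enforce the same restrictions centrally: Apple's MDM Restrictions payload (`com.apple.applicationaccess`) includes keys that disable Siri outright. The sketch below shows only the payload dictionary, with placeholder identifiers; in practice it would be wrapped in a complete configuration profile and deployed through an MDM server.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Restrictions payload; identifier and UUID below are placeholders -->
    <key>PayloadType</key>
    <string>com.apple.applicationaccess</string>
    <key>PayloadIdentifier</key>
    <string>com.example.disable-siri</string>
    <key>PayloadUUID</key>
    <string>REPLACE-WITH-A-UUID</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <!-- Disables Siri entirely (requires a supervised device) -->
    <key>allowAssistant</key>
    <false/>
    <!-- Blocks Siri from the lock screen -->
    <key>allowAssistantWhileLocked</key>
    <false/>
</dict>
</plist>
```

For personal devices, the Settings toggles described above remain the only option, since restriction payloads like this apply only to devices enrolled in device management.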
