Apple APOLOGIZES for letting contractors listen to Siri recordings without users’ knowledge and says the program will now only apply to those who opt-in

  • After backlash, Apple is making major changes to its Siri listening program
  • Users will be defaulted out of the program and can choose to opt in
  • It will now use its own employees to listen to snippets, not contractors 
  • In a blog post, Apple apologized, saying it hasn’t lived up to its ‘high ideals’

Apple said it is now, by default, opting customers out of a program that listens to audio snippets scraped up by its voice-assistant, Siri.

The company announced the decision in a blog post this week, marking the most significant step since it suspended the program earlier this summer.

Apple has been identified as one of many companies that were harvesting audio snippets from users in an effort to improve the accuracy of their voice assistants. Unbeknownst to most users, those snippets were then reviewed by human contractors.

‘As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize,’ wrote Apple.

Apple will make major changes to a program that scraped up audio from its voice-assistant Siri


According to the company, it will resume the program later this fall with several major changes.

In addition to opting users out by default, Apple said it will also use strictly Apple employees to pore over the data. Previously, the company had hired third-party contractors to do the work.

Some of those contractors were the first people to speak to The Guardian anonymously in July when the program was first reported.

Part of the impetus for making the practice of harvesting audio snippets public, they said, was the intrinsically personal nature of some of the content. 

Not infrequently, devices like the Apple Watch or HomePod would inadvertently capture audio not intended for them, including conversations about sex, business, or medical issues.

While the contractors were reportedly encouraged to report accidental triggers, they said the review process was purely technical and that they were given no firm procedure for handling sensitive information.

As part of the new changes, Apple said it will work to ensure that any audio it accidentally captures is deleted.


Apple said that all of the information picked up in its program was anonymized, but whistleblowers said the intrinsically personal nature of some of the recordings puts that anonymity at risk.

Apple is just one of many companies recently found to be using human contractors to analyze voice-assistant recordings.

Other services found to be recording users include Amazon’s wildly popular Alexa assistant, Google Home, and Facebook’s audio messaging feature.

Similarly, these services have regularly harvested data that most would consider private, including conversations about sex, business dealings, and even medical information.

IS YOUR SMARTPHONE LISTENING TO YOU TO TARGET ADS?

For years smartphone users have complained of the creepy feeling their gadget is recording their every word, even when it is sat in their pocket.

Many share a similar story: They were chatting about a niche product or holiday destination with friends, and soon afterwards an advertisement on the same theme appears in their social media apps.

According to Dr Peter Henway, a senior security consultant for cybersecurity firm Asterisk, these oddly pertinent ads aren’t merely a coincidence: your phone regularly listens to what you say.

It’s not known exactly what triggers the technology, but Dr Henway claims the technique is completely legal and is even covered in the terms of your mobile apps’ user agreements.

Most modern smartphones are loaded with AI assistants, which are triggered by spoken commands, like ‘Hey Siri’ or ‘OK, Google’.


These assistants listen constantly for the designated wake word or phrase and discard everything else.
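
To make that pattern concrete, below is a heavily simplified, hypothetical sketch in Python of a wake-word loop: incoming audio is checked only for the trigger phrase, and everything else is thrown away immediately. It is not Apple’s or Google’s actual code, and it substitutes transcribed text for raw audio so it can run anywhere.

WAKE_PHRASES = ("hey siri", "ok google")  # illustrative wake words, not a real device list

def handle_command(phrase: str) -> None:
    # Only audio that follows a wake phrase is processed further (and, under
    # Apple's new policy, reviewed by humans only if the user opts in).
    print(f"processing command: {phrase!r}")

def listen(stream) -> None:
    for phrase in stream:  # stand-in for a live microphone buffer
        text = phrase.lower().strip()
        if any(text.startswith(wake) for wake in WAKE_PHRASES):
            handle_command(text)
        # anything that does not start with a wake phrase is discarded

if __name__ == "__main__":
    listen(["what a lovely day", "hey Siri, what's the weather?", "ok google set a timer"])

In a real assistant the check runs on-device against audio features rather than text, which is why accidental triggers like the ones the contractors described can still slip through.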

However, keywords and phrases picked up by the gadget can be accessed by third-party apps, like Instagram and Twitter, when the right permissions are enabled, Dr Henway told Vice.

This means when you chat about needing new jeans, or plans for a holiday, apps can plaster your timeline with adverts for clothes and deals on flights.

Facebook categorically denies it uses smartphone microphones to gather information for the purposes of targeted advertising.

The company has previously said that the eerie feeling that your phone is listening to you is merely an example of heightened perception, or the phenomenon whereby people are more likely to notice things they’ve recently talked about.

A number of other companies, including WhatsApp, also deny bugging private conversations, describing any anecdotal evidence as pure coincidence. 

 


