Amazon forced to admit it may keep hold of your data even AFTER you delete audio clips

Amazon has confirmed that its Alexa voice assistant sometimes stores your data indefinitely, even after any corresponding audio clips have been deleted.

The admission comes after inquiries from US Senator Chris Coons, who asked the tech firm to explain what happens to voice records and data gathered by Alexa.

The senator, a Democrat, wrote to Amazon following a CNET investigation in May which revealed that the company retains voice recordings unless users delete them.

The probe had also suggested that, regardless, written transcripts of those voice recordings may also be kept indefinitely.

Amazon’s device – along with Apple’s Siri and, until recently, Google’s Assistant – saves every single interaction a person has with the device, with some unintentional snippets also being recorded

HOW DOES ALEXA WORK? 

Any time audio is sent to the cloud, a visual indicator appears on the Echo device – a light ring on Amazon Echo will turn blue or a blue bar will appear on Echo Show.  

Amazon also says that voice recordings are kept until a customer chooses to delete them. 

The recordings are used to increase the diversity with which Alexa is trained to help it better understand customer requests.

For example, differentiating between YouTube and U2 and using historical context, such as the Olympics, to know what the user is referring to.   

Amazon maintains the device is not activated until the wake word is said: this can be configured to be ‘Alexa’, ‘Echo’ or ‘Computer’. 

It also records when the microphone button is manually pressed.  

Senator Coons published Amazon’s response to his letter on his website.

In its statement, the tech firm confirmed the findings of the CNET investigation, revealing that it does store voice recordings until users elect to go through and manually delete them.

This means that the devices do not delete the recordings they make by default.

The issue of whether Amazon retains corresponding transcripts, however, has not been clearly resolved.

According to CNET, transcripts of voice records are deleted from Alexa’s ‘main system’, but the files remain in other systems on Amazon’s servers, offering users no option to have these additional copies deleted. 

In Amazon’s statement, they confirmed that transcripts are deleted from Alexa’s ‘primary storage systems’ when users elect to delete corresponding voice recordings through Alexa’s Privacy Hub dashboard.

The tech firm said there is ‘an ongoing effort’ to ensure such transcripts are deleted from all of its systems.  

Other data, however, may also be retained.   

‘We do not store the audio of Alexa’s response,’ a spokesperson for Amazon said. 

‘However, we may still retain other records of the customers’ Alexa interactions, including records of actions Alexa took in response to the customer’s request.’

These action records could be kept either by Amazon or by other developers, in instances where Alexa owners use a third-party ‘skill’ voice application. 

‘For example, for many types of Alexa requests — such as when a customer subscribes to Amazon Music Unlimited, places an Amazon Fresh order, requests a car from Uber or Lyft, orders a pizza from Domino’s, or makes an in-skill purchase of premium digital content — Amazon and/or the applicable skill developer obviously need to keep a record of the transaction,’ Amazon wrote.

Amazon also said that for requests such as setting recurring alarms or reminders, scheduling events or messaging a friend, users would not expect or want their data to be deleted, as doing so could prevent Alexa from completing the task.

The tech firm also explained that transcripts are used to help train and improve Alexa’s machine learning systems, and to provide customers with logs of what they have said to the smart speakers and how the device replied.

Amazon confirmed that the devices stop recording when customers stop speaking — a move which is represented through either the device’s light or a tone — and that the local recording buffer on each device itself is overwritten frequently.

Alexa has also been designed, they wrote, to record and process as little customer audio as possible, as processing non-essential recordings would be expensive and of no value to Amazon.

HOW TO FIND OUT WHAT YOUR ALEXA HAS RECORDED ABOUT YOU?  

Open the Alexa app to which the devices are synced, or go to this link

Select the icon in the top left corner – often dubbed the ‘hamburger’

Press ‘Settings’ at the bottom of the menu 

Select ‘Alexa Account’ located at the top of the menu 

Press ‘Alexa Privacy’ at the bottom of the menu 

In this section, a range of options will appear in a different-looking menu – select ‘Review Voice History’

Here, the entries of all Alexa-enabled devices attached to an account will be listed in reverse chronological order, with the most recent at the top. 

To view all entries, select the ‘All History’ option from the drop down menu and scroll through the pages. 

It will show all entries; those that Amazon says were recorded but not intended for Alexa are not transcribed, and instead read ‘Text not available – audio was not intended for Alexa’.

These can still be listened to by selecting the drop-down arrow on the right-hand side and pressing play, located on the left. 

For users who want to remove all trace of these recordings, pressing the ‘Delete All Recordings for All History’ button will do so. 

There is currently no way of saving the data yourself and taking it off Amazon’s servers.  

‘I appreciate that Amazon responded promptly to my concerns, and I’m encouraged that their answers demonstrate an understanding of the importance of and a commitment to protecting users’ personal information,’ Senator Coons said in a statement published on his website.

The senator had originally given Amazon a deadline of June 30; the company’s response was dated June 28.

‘However, Amazon’s response leaves open the possibility that transcripts of user voice interactions with Alexa are not deleted from all of Amazon’s servers, even after a user has deleted a recording of his or her voice,’ Senator Coons continued.

‘What’s more, the extent to which this data is shared with third parties, and how those third parties use and control that information, is still unclear. 

‘The American people deserve to understand how their personal data is being used by tech companies, and I will continue to work with both consumers and companies to identify how to best protect Americans’ personal information.’ 

‘Amazon isn’t alone in retaining user data indefinitely,’ privacy advocate Paul Bischoff of Comparitech.com told MailOnline.

‘Most companies don’t put expiration dates on the data you give them. Google doesn’t delete your search history unless you tell it to. Facebook doesn’t remove old posts or photos unless you manually remove them.

‘Removing all this data on your own isn’t difficult, but most people don’t bother, and those who do probably don’t do so regularly.

‘If privacy policies don’t explicitly mention how long data is retained, then you should assume your data is held and used indefinitely.’

WHY ARE PEOPLE CONCERNED OVER PRIVACY WITH AMAZON’S ALEXA DEVICES?

Amazon devices have previously been activated when they’re not wanted – meaning the devices could be listening.

Millions are reluctant to invite the devices and their powerful microphones into their homes out of concern that their conversations are being heard.

Amazon devices rely on microphones listening out for a key word, which can be triggered by accident and without the owner realising. 

The camera on the £119.99 ($129) Echo Spot, which doubles up as a ‘smart alarm’, will also probably be facing directly at the user’s bed. 

The device has such sophisticated microphones it can hear people talking from across the room – even if music is playing. 

Last month a hack by British security researcher Mark Barnes saw 2015 and 2016 versions of the Echo turned into a live microphone.

Fraudsters could then use this live audio feed to collect sensitive information from the device.   

 


