Alexa Records Still Accessible to Developers, Even After Deletion


In a statement that will surprise more than a few people, Amazon says that records of interactions with its virtual assistant Alexa remain in its databases indefinitely, even after a user has specifically deleted the voice recordings. We've written several articles detailing how Amazon stores your data and how you can take back control of it, but it turns out the tools Amazon provides for this task don't completely erase everything.

As TheNextWeb points out, Amazon will retain all Alexa data indefinitely unless a user manually deletes the recordings. The problem is that it's not just recordings sitting on Amazon's servers; it's also the transcripts, and there's no way to delete a transcript once it's been made. Amazon creates these transcripts through a combination of AI-powered processing and good old-fashioned human typing, and they're designed to help developers create those famous Alexa Skills that people have come to love using.

Since these transcripts are a core part of how Alexa Skills are created and improved, Amazon doesn't give users a way to delete them and, thus, has created a concern for folks worried that their entire lives are sitting in a database somewhere.


The response from Amazon has been a bit confusing, but much of the confusion comes down to wording. In May, CNet reported that Amazon keeps Alexa recordings and transcriptions with no way to delete them. More recently, the company rolled out new voice commands that let users delete all of their voice interaction data without even visiting the website. At the same time, Amazon also says that it only keeps a record of customers' interactions with Alexa rather than the whole transcript or recording, meaning only the command issued and the actions Alexa took are retained.

An Amazon representative confirmed that this last point was the case and provided a bit more clarity in the form of a statement Amazon made to Senator Chris Coons on the matter:

"When a customer deletes a voice recording, we delete the transcripts associated with the customer's account of both the customer's request and Alexa's response. We already delete those transcripts from all of Alexa's primary storage systems, and we have an ongoing effort to ensure those transcripts do not remain in any of Alexa's other storage systems."


The entire problem stems from a lack of transparency around our data and how companies like Amazon handle and retain that information. When records are stored, how much personally identifiable information can be traced back to individual users when that data is later reviewed for development (or other) purposes? The biggest issue may not even be Amazon employees being able to hear entire conversations, but the fact that this information could easily be linked to several other bits of information during a security breach.

While many will say that Amazon's size and sophisticated infrastructure make it nearly impossible to hack or breach in any way, the recent massive data breach of 500 million people's records through Marriott's servers shows that size simply doesn't matter in these sorts of circumstances. Personal privacy has become a huge topic in all corners of the tech sector, especially as our lives become increasingly connected every day.


Copyright ©2019 Android Headlines. All Rights Reserved.

Assistant Editor

Nick has written for Android Headlines since 2013 and has traveled to many tech events across the world. He's got a background in IT and loves all things tech-related. Nick is the VR and Home Automation Editor for the site and manages the Android Headlines YouTube channel. He is passionate about VR and the way it can truly immerse players in different worlds. In addition, he also covers the gamut of smart home technology and home automation. Contact him at [email protected]
