Amazon Alexa, Google Assistant, and many other voice assistants may be coming under fire from the United States' Federal Trade Commission, if the words of Bureau of Consumer Protection director Andrew Smith are any indication.
At a recent press conference, Smith indicated that his department is interested in tackling "the collection and use of audio files". The press conference itself concerned a fine levied against Google in the wake of a YouTube scandal involving children's privacy, which lends the remark added weight.
In context, this almost certainly means that Smith was alluding to the possibility that children's privacy laws are being violated by audio-driven systems like voice assistants.
Smith noted that current policies covering voice assistants and similar products allow children's audio recordings to be used only as needed, after which they must be erased as soon as possible.
Quite recently, it was found that many leading voice assistant providers were not only keeping recordings on hand, but also having live humans listen to them and transcribe key bits in order to help build out and train their language processing AI programs.
By all accounts, the practice was hastily stopped after being exposed, but there is no sure way to know whether the tech giants are telling the truth, or whether other problematic behaviors are being addressed. Amazon, for example, has officially stated that Alexa saves all recordings indefinitely unless the user manually deletes them, treating that inaction as permission to keep them.
Whether that arrangement holds when parents purchase AI assistant products that their children then use is a delicate issue, since children cannot legally consent to data collection on their own.
The rather thorny subject, in this case, seems to hinge chiefly on what would constitute consent from a parent to keep a child's recordings, whether that consent would supersede FTC policies, and what ways, if any, voice assistants and other voice-driven services are differentiating data collected from children so that it can be treated accordingly.
The worries in this area are compounded in several ways. One big problem is that the products do nothing to deter children from using them, and in some instances even encourage it. That isn't always a bad thing, of course; Amazon, for example, created a kid-friendly version of Alexa designed to help with compliance with existing and future policies, and packed it into the Echo Dot Kids Edition. Another worry is the review and transcription process.
While most big firms have stopped or severely curtailed that process, there is still a chance it continues without anyone knowing, and data already collected is out in the ether, at risk of falling into the wrong hands.
Third-party workers and contractors leaking or mishandling sensitive data is hardly new, or even uncommon, in the tech space at this point, and this sort of technology carries plenty of other inherent trouble spots. That essentially leaves tech firms like Google, Apple, Amazon, Samsung, and others to wait out the results of the FTC's probe, figure out what needs to change, and adjust accordingly.