Amazon gives customers a way to immediately delete Alexa voice recordings — here’s how to do it
The new Echo Show 10 features a panning camera that moves to look at the user. (Amazon Photo)

Privacy was front and center for Amazon at the company’s big devices and services event this week.

The most noteworthy privacy-related announcement was a new feature for Alexa that lets users automatically and immediately delete their voice recordings. Amazon rolled out options last year to automatically delete recordings after three months or 18 months, but not immediately after Alexa processes a request.


The change is worth highlighting given Amazon’s recent history with Alexa and privacy.

The company last year faced public backlash over human review of voice recordings from Alexa devices. It responded with a privacy setting that let users opt out of human review — following similar announcements from Apple and Google — and sought to offer assurances about how those recordings were used. The tech giant considered a more extreme measure that would have opted users out of the practice by default, but decided against it.

Dave Limp, Amazon’s devices and services chief, defended the decision at the GeekWire Summit last year, saying that human review is “critically important to making Alexa better.”

Toni Reid, Amazon’s vice president of Alexa Experience & Echo Devices, offered a similar sentiment this week when asked why Amazon does not immediately delete Alexa recordings by default.

“It’s important that the service continue to get better for customers,” she said in an interview on Thursday. “Data does help improve the service.”

Reid added: “We do want to give customers choice. By defaulting it to ‘off,’ it’s actually in some ways making the decision for them.”

In a blog post, Karthik Mitta, Amazon’s director of Alexa Privacy, said “by choosing to save your voice recordings, you have access to more personalized features, Alexa can better understand requests, and we can continue to improve the service.”

Amazon’s new Echo Dot devices. (Amazon Photo)

Assuring customers that the company is thinking about their privacy was a common thread during Thursday’s event. Amazon is investing more in building privacy-specific features, Reid said. For example, users can now also ask, “Alexa, how do I review my privacy settings?” and will be sent a direct link in the Alexa app to Alexa Privacy Settings.

Amazon also unveiled a new feature this week that lets users say, “Alexa, delete everything I’ve said,” to delete all saved voice recordings.

Amazon customers can have Alexa delete recordings immediately by visiting Alexa Privacy Settings online, or by going to “Your Content and Devices,” then “Privacy Settings,” then “Alexa Privacy,” and then “Manage Your Alexa Data.”

Users can also open the Alexa app and navigate to the More menu, then “Settings,” then “Alexa Privacy,” and then “Manage Your Alexa Data.”

The new Echo.

The listening capacity of digital assistants such as Alexa and Apple’s Siri has also become a major privacy sticking point in the last year. A group of researchers out of Northeastern University and Imperial College London has been studying smart speakers to learn more about what triggers them, and whether they are “listening” all the time. The ongoing study has found “no evidence to support” the possibility that digital assistants are always listening.


The definition of “listening” can get confusing, even for the people who make the devices. Under questioning on an episode of PBS Frontline earlier this year, Limp was asked how Amazon could convince millions of people to install “listening devices” in their home. Limp appeared to misstep when answering the question, insisting that Alexa isn’t a listening device, before describing how it’s “listening,” then backtracking.

“I would first disagree with the premise. It’s not a listening device,” Limp said. “The device in its core has a detector on it — we call it internally a ‘wake word engine’ — and that detector is listening — not really listening, it’s detecting one thing and one thing only, which is the word you’ve said that you want to get the attention of that Echo.”

The devices can actually listen for different noises beyond the “wake” word. The Alexa Guard service can detect suspicious noises while you’re away from home. Amazon this week revealed Guard Plus, a subscription version of the service.

The question of how virtual assistants are monitoring for wake words will become even more important as they spread to different types of devices and beyond the home. It will also be pertinent as Amazon expands Alexa’s conversational capabilities. With a new “natural turn-taking” feature unveiled Thursday, users will also be able to ask Alexa to join a conversation taking place in the kitchen, for example, chiming in as two people order a pizza and pick a movie for the evening.

Users say “Alexa, join our conversation” to activate the feature. Rohit Prasad, vice president and head scientist of Alexa Artificial Intelligence, said he doesn’t think people should be more concerned about privacy even as Alexa becomes more actively engaged in discussions.

“There are many ways to learn and see if we need to bring even tighter controls,” he told GeekWire. “We believe privacy is foundational and no one has to choose between privacy and utility and delight. We don’t want our customers to make those choices. We believe that privacy should be just baked into everything we do.”

Amazon is bolstering its privacy-related exec team. Last month it hired privacy expert Anne Toth, who is now the company’s director of Alexa Trust. She previously led privacy and policy initiatives at Slack, and spent 13 years at Yahoo. She was most recently head of technology and policy & partnerships for the World Economic Forum.
