Last week some questions were raised about the security of Amazon’s Alexa, as a Portland family claimed their Echo secretly recorded a private conversation and sent it to a friend on their contact list.
According to a report from KIRO7, a local news station in Washington state, the audio recording was sent to a colleague, who notified the family.
Danielle (who did not want to give her last name) said the friend phoned her family to let them know they had been hacked.
“Unplug your Alexa devices right now,” the caller said.
According to the report, Danielle and her husband had previously joked that the devices wired throughout their home, which controlled nearly everything from the heat and lights to the security system, might be recording them.
“My husband and I would joke and say I’d bet these devices are listening to what we’re saying,” Danielle said.
“I felt invaded. A total privacy invasion.”
After receiving that urgent call from her friend and confirming Alexa had indeed recorded a private conversation, Danielle immediately unplugged all the home’s Echo devices and called Amazon repeatedly.
After an investigation by an Alexa engineer, company representatives told Danielle they had reviewed the family’s Echo logs and that her story seemed to line up.
She said Amazon representatives repeatedly apologized, though they were unable to offer specifics on why the issue occurred or whether it is a widespread problem, the report said.
“He told us that the device just guessed what we were saying,” she said.
Though the company has offered to “de-provision” her Echo’s communications so she can continue using its smart home features, Danielle said she feels a refund for her devices is in order; Amazon representatives, however, have been unwilling to provide one.
In a statement, Amazon said the recording was triggered when a word in the background conversation that sounded like “Alexa” woke the Echo.
“Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud, ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right.’ As unlikely as this string of events is, we are evaluating options to make this case even less likely,” the report said.
The incident raises questions about privacy and security when using voice-controlled devices.
According to an article from Risk Management, as people increasingly introduce voice-activated devices into their home and work settings, risks and challenges associated with voice technology also increase.
In the article, Nathan Wenzler, chief security strategist at the cyber risk management company AsTech Consulting, said “The addition of voice absolutely increases the risk level for technology users. When you add more features to a device, you are also adding complexity and more code and, as a result, you are introducing more avenues for people to hack into the device. It’s a major risk component.”
“A husband and wife in the privacy of their home have conversations that they’re not expecting to be sent to someone (in) their address book,” Danielle said in the report.