Locked-down lawyers warned Alexa is hearing calls

Bloomberg

Hey Alexa, stop listening to my client’s information.
As law firms urge attorneys to work from home during the global pandemic, their employees’ confidential phone calls with clients run the risk of being heard by Amazon.com Inc and Google.
Mishcon de Reya LLP, a UK law firm that advises on corporate matters, has issued advice to staff to mute or shut off listening devices like Amazon's Alexa or Google's voice assistant when they talk about client matters at home, according to a partner at the firm. It also suggested keeping such devices away from their work spaces altogether.
Mishcon's warning covers any kind of visual or voice-enabled device, such as Amazon's and Google's smart speakers. But video products such as Ring, which is also owned by Amazon, and even baby monitors and closed-circuit TV systems are also a concern, said Mishcon de Reya partner Joe Hancock, who heads the firm's cybersecurity efforts.
“Perhaps we’re being slightly paranoid but we need to have a lot of trust in these organisations and these devices,” Hancock said. “We’d rather not take those risks.”
The firm worries about the devices being compromised, less so with established products like Alexa and more so with cheap knock-off devices, he added.
Like Wall Street, law firms face challenges in setting up secure work-from-home arrangements for certain job functions. Critical documents, including those that may be privileged, need to be secured. In banking, meanwhile, some traders are being asked to work at alternative locations that banks keep on standby for disaster recovery, rather than at makeshift work-from-home stations, to maintain confidentiality.
Smart speakers, already notorious for activating in error, making unintended purchases or sending snippets of audio to Amazon or Google, have become a new source of risk for businesses. As of last year, the US installed base of smart speaker devices was 76 million units and growing, according to a Consumer Intelligence Research Partners report.
Amazon and Google say their devices are designed to record and store audio only after they detect a word to wake them up. The companies say such instances are rare, but recent testing by Northeastern University and Imperial College London found that the devices can activate inadvertently between 1.5 and 19 times a day.
Tech firms have been under fire for compromising users' privacy by having teams of human auditors listen to conversations without consent in order to improve their AI algorithms. Google has since said that users must opt in before the tech giant keeps any voice recordings made by its devices.
