Google has admitted that, due to a software error earlier this year, its Home speakers recorded users at all times, even when they hadn't said "wake words" such as "OK Google."
"For privacy reasons, the devices normally don’t listen until they hear the ‘wake words’ designed into them, and are not supposed to record until that point, Protocol reported. But users on Reddit started to get notifications when their Google Home devices had recorded events such as glass breaking, The Register reported. The alerts were normally offered only to users who subscribe to the Nest Aware home security service. Google said the feature had been accidentally turned on by a software update," reports Yahoo News.
“The latest major privacy failure at Google is a reminder that when you have a microphone nearby, it is likely recording. The important message to any vendor with active smart microphones is that transparency and consent for users when the device is recording are critical, especially at a time when many employees are working from home and sensitive business details might be leaking via nearby smart devices,” notes Joseph Carson, chief security scientist and Advisory CISO at Thycotic, a Washington, D.C.-based provider of privileged access management (PAM) solutions. “The good news is that Google reported the privacy incident and made an improvement to notify and alert the user when a recording has been made.”
A Google spokesperson told Yahoo News UK, “We are aware of an issue that inadvertently enabled sound detection alerts for sounds like smoke alarms or glass breaking on speakers that are not part of a Nest Aware subscription. The issue was caused by a recent software update and only impacted a subset of Google Home, Google Home Mini, and Google Home Max speakers. We have since rolled out a fix that will automatically disable sound detection on devices that are not part of Nest Aware.”
Mohit Tiwari, Co-Founder and CEO at Symmetry Systems, a San Francisco, Calif.-based provider of Data Store and Object Security (DSOS), says, “Accidentally recording audio/video can stem from mundane errors rather than malicious intent on Google's behalf -- they probably have far more to lose from this kind of news than they could gain from eavesdropping. While it sounds dramatically bad, in most cases the underlying cause is that integration-testing big software systems and putting production-time seatbelts on them is a very hard problem."
What developers need is a tool that can run through a checklist of safety issues -- privacy constraints boiled down into code -- so that if an error occurs, it is automatically caught and developers can fix it as soon as possible, either in pre-production or early in production, Tiwari adds. "However, such tools aren't available at scale today -- automated pre-production tools check for something close to machine-level errors (like 'memory safety') but not for privacy or 'information flow' errors. Production-time privacy checks are even scarcer. More broadly, there are several challenges to users' privacy from smart-speaker systems. Permissions on platforms like Android/iPhone are already very challenging...people just say yes to 'do you want to give this wallpaper app access to SD card and internet'. Things like accelerometer or air-pressure sensors can leak browsing history or location. And speakers add an additional layer: instead of a check-box, the input is a machine learning classifier, which can err in unpredictable ways. So being able to precisely say 'we will only listen to this dictionary of words and delete everything else' is probably some ways away," Tiwari explains.
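To make the idea of "privacy constraints boiled down into code" concrete, here is a minimal, purely hypothetical Python sketch. The `Device` model, the `buggy_update` function, and the fleet data are all invented for illustration; this is not Google's actual release process. The check encodes one rule -- sound detection must stay off for devices without a subscription -- and flags a simulated update that violates it:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Device:
    has_nest_aware: bool           # user subscribes to the alert service
    sound_detection_enabled: bool  # feature that can trigger recordings

def buggy_update(device: Device) -> Device:
    """Simulates a regression like the one reported: the update
    enables sound detection for every device, subscriber or not."""
    device.sound_detection_enabled = True
    return device

def privacy_violations(fleet: List[Device]) -> List[Device]:
    """The privacy constraint expressed as code: sound detection
    must be off unless the user has opted in via a subscription."""
    return [d for d in fleet
            if d.sound_detection_enabled and not d.has_nest_aware]

fleet = [Device(has_nest_aware=True, sound_detection_enabled=False),
         Device(has_nest_aware=False, sound_detection_enabled=False)]
fleet = [buggy_update(d) for d in fleet]

violations = privacy_violations(fleet)
print(f"{len(violations)} device(s) violate the privacy constraint")
# A pre-production gate would fail the release here whenever
# violations is non-empty, catching the error before rollout.
```

Real information-flow checks are far harder than this toy gate, as Tiwari notes, but the pattern is the same: the privacy rule lives in executable form and runs automatically against every candidate release.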
Wendy Foote, Senior Contracts Manager at WhiteHat Security, a San Jose, Calif.-based provider of application security, explains that voice assistant technology, including that used in smart speakers designed for the home, has long been suspected of -- and in some cases confirmed to be -- recording more than voice commands. Google is just the latest company forced to admit recording private conversations and other audio in users’ homes, Foote says.
"The global market for smart speakers in 2020 was approximately $7 billion, and Google is a major player with significant revenue in this market. Assuming they dedicate significant resources to this product line, it is surprising to learn that the inadvertent recording of private interactions in people’s homes was caused by a recent software update that turned on advanced sound detection, bypassing the ‘wake’ words (e.g., “Siri,” “Hey Google”) that normally activate Google’s listening feature,” Foote notes.
Foote adds, “Conducting a ‘Data Privacy Impact Assessment’ before beginning projects that involve high risk to people’s private information is required of companies collecting information in the EU, but not in the U.S. Best practices call for this type of assessment regardless of geographical requirements. Assuming this practice was carried out, why wasn’t this issue discovered during the software development testing phase, given the resources they must dedicate to this product line?”
“U.S., European, and other enforcement agencies will be knocking on Google’s door,” Foote warns. “What private right of action do individuals have? Generally, ‘invasion of privacy’ is the intrusion into the personal life of another, without just cause, which can give the person whose privacy has been invaded a right to bring a lawsuit for damages against the person or entity that intruded. However, proving damages is challenging and critical to obtaining compensation in the U.S.”