Amazon Echo, Google Home Smart Speakers Can Be Hacked With Laser ‘Light Commands’, Researchers Claim
Researchers have found an interesting and scary new way of hacking into smart speakers. It involves using lasers to send commands to smart speakers like Amazon Echo or Google Home without any spoken commands at all. Apparently, the microphones in smart speakers also react to light: by modulating the intensity of a laser beam with an electrical audio signal, attackers can trick the microphones into believing that they are receiving actual spoken commands.
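To make the idea concrete, here is a minimal Python sketch of the kind of amplitude modulation described above: an audio waveform is shifted and scaled into a laser-intensity signal that stays positive (a laser cannot emit "negative" light). All names and numbers here are illustrative assumptions for exposition, not details taken from the researchers' actual equipment.

```python
import numpy as np

SAMPLE_RATE = 44_100  # samples per second, standard audio rate
DURATION_S = 1.0

# Time axis for one second of signal.
t = np.linspace(0, DURATION_S, int(SAMPLE_RATE * DURATION_S), endpoint=False)

# Stand-in for a recorded voice command: a 440 Hz tone in [-0.5, 0.5].
voice = 0.5 * np.sin(2 * np.pi * 440 * t)

# Amplitude modulation: ride the audio on top of a constant laser bias
# so the resulting intensity never goes below zero.
bias = 0.6    # steady laser output (normalised 0..1)
depth = 0.4   # how strongly the audio modulates the beam
laser_intensity = bias + depth * voice

# This intensity waveform would drive the laser; the MEMS microphone
# converts the fluctuating light back into an electrical "audio" signal.
```

The key point the sketch illustrates is that the command audio survives as intensity fluctuations around the bias level, which is why the microphone interprets the light as sound.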
According to a report by Wired, cyber-security researcher Takeshi Sugawara and a group of researchers from the University of Michigan say that they can change the intensity of a laser beam and point it at a smart speaker’s microphone to send any command, which the smart speaker interprets as a normal voice command. The vulnerability is not limited to smart speakers and can be used to hack into any computer that accepts voice commands through a microphone. The researchers, who dubbed the vulnerability Light Commands, also tested the same method on smartphones and tablets and achieved some success.
“Light Commands is a vulnerability of MEMS microphones that allows attackers to remotely inject inaudible and invisible commands into voice assistants, such as Google Assistant, Amazon Alexa, Facebook Portal, and Apple Siri using light,” the researchers wrote on a website dedicated to the Light Commands vulnerability.
The vulnerability can also be used to break into systems connected through a smart speaker, such as smart home devices and connected garage doors, or to trigger actions like online shopping and remotely starting some vehicles.
While the vulnerability certainly sounds serious, exploiting it takes considerable effort from an attacker. It requires specialised equipment, including a laser pointer, a laser driver, a sound amplifier, a telephoto lens, and more. Also, an attack would only work on unattended smart speakers, as an owner could notice a light beam reflecting off the device.
Google and Amazon told Wired that they are reviewing the research team’s findings but did not say whether or how they plan to protect against such attacks.
The best mitigation available against the Light Commands vulnerability right now is to avoid placing your smart speaker, smartphone, or tablet within the line of sight of a possible attacker. You can also activate secondary authentication for online purchases or other sensitive commands on your smart speaker, if available.