Researchers at the University of Michigan and the University of Electro-Communications (Tokyo) have devised a new technique, dubbed “light commands,” to remotely hack Alexa and Siri smart speakers: by aiming a laser beam at the devices, attackers can send them inaudible commands.
The “light commands” attack exploits a design flaw in the smart assistants’ microelectro-mechanical systems (MEMS) microphones. MEMS microphones convert voice commands into electrical signals, but the researchers demonstrated that they also react to laser beams.
The tests conducted by the experts demonstrate that it is possible to send inaudible commands via laser beam from as far as 110 meters (360 feet). Popular voice assistants, including Amazon Alexa, Apple Siri, Facebook Portal, and Google Assistant, are vulnerable to this remote attack.
“We propose a new class of signal injection attacks on microphones,” the researchers explained in their paper.
In a real-life attack scenario, an attacker could stand outside an office or a house, aim a laser at a voice assistant, and instruct it to unlock a door or perform other malicious actions.
MEMS microphones are composed of a diaphragm and an ASIC; when the diaphragm is hit by sound, or by light, it produces electrical signals that are translated into commands.
The experts demonstrated how to “encode” commands in the intensity of a laser beam, causing the diaphragm to move; the diaphragm’s movements generate electrical signals representing the attacker’s commands.
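The encoding step can be pictured as simple amplitude modulation: the audio waveform of the spoken command is mapped onto the laser diode’s drive current, so the beam’s brightness follows the voice signal. The sketch below illustrates the idea; the function name and the current values are hypothetical and are not taken from the researchers’ setup.

```python
import math

def amplitude_modulate(audio, i_dc=0.2, i_pp=0.15):
    """Map normalized audio samples in [-1, 1] onto a laser drive current (amps).

    i_dc (DC bias) and i_pp (peak swing) are illustrative values only;
    the actual currents depend on the laser diode used.
    Because a diode's optical power tracks its drive current, the beam's
    intensity then carries the audio signal as amplitude modulation.
    """
    return [i_dc + i_pp * max(-1.0, min(1.0, s)) for s in audio]

# Example: 10 ms of a 1 kHz test tone sampled at 44.1 kHz
tone = [math.sin(2 * math.pi * 1000 * n / 44100) for n in range(441)]
current = amplitude_modulate(tone)
```

A MEMS microphone illuminated by this modulated beam responds much as it would to the original sound, which is what lets the attacker inject the command.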
The researchers ran various tests: they measured light intensity using a photodiode power sensor and evaluated how the microphones responded to different light intensities.
“We recorded the diode current and the microphone’s output using a Tektronix MSO5204 oscilloscope,” they said. “The experiments were conducted in a regular office environment, with typical ambient noise from human speech, computer equipment, and air conditioning systems.”
Below are some videos demonstrating the attack:
The experts also explored the feasibility of the attack and showed that hackers could mount it with cheap equipment. They used a simple laser pointer, available for as little as $14 on Amazon and eBay, along with a laser driver (which powers the laser diode with a modulated current) and a sound amplifier.
The list of voice assistants that use MEMS microphones and might be vulnerable to the light commands attack includes Amazon Alexa, Apple Siri, Facebook Portal, and Google Assistant.
The good news is that the researchers are not aware of any case in which the attack has been exploited in the wild.
Countermeasures include implementing an additional authentication layer, sensor-fusion techniques, or placing a cover on top of the microphone to prevent the light from hitting it.
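A sensor-fusion defense can exploit the fact that a laser typically illuminates only one microphone, whereas real speech reaches all of a device’s microphones at similar levels. The check below is a minimal illustration of that idea, not any vendor’s actual logic; the function name and threshold are assumptions.

```python
def command_is_authentic(mic_levels_db, threshold_db=6.0):
    """Illustrative sensor-fusion check for a multi-microphone device.

    mic_levels_db: signal level (dB) seen by each microphone for the command.
    Real speech produces similar levels on every mic; a laser aimed at one
    aperture drives that mic far harder than the others. Reject the command
    if the loudest mic exceeds the quietest by more than threshold_db.
    """
    spread = max(mic_levels_db) - min(mic_levels_db)
    return spread <= threshold_db

# Speech reaches all four mics at similar levels -> accepted
assert command_is_authentic([60.1, 59.5, 60.8, 59.9])
# A laser drives one mic far harder than the rest -> rejected
assert not command_is_authentic([72.0, 31.0, 30.5, 30.9])
```

The threshold would need tuning per device, since room acoustics also create some level spread across microphones.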
“An additional layer of authentication can be effective at somewhat mitigating the attack,” they concluded. Alternatively, the device could ask the user a simple randomized question before executing a command, so that an attacker who cannot eavesdrop on the device’s response cannot complete the attack.
(SecurityAffairs – light commands attack, voice assistants)