As technology becomes a larger part of our daily lives, trust in tech companies keeps dropping. Over the past few years we have seen data leaks, data privacy violations, and antitrust allegations involving some of the biggest tech giants in the world. Now it seems something as simple as a laser could take control of your smart assistant, a device many people trust with sensitive personal data.
Researchers in Japan and at the University of Michigan jointly announced that they discovered a simple hack that can grant access to digital assistants (Siri, Google Assistant, Alexa, etc.) using nothing more than lasers, or even ordinary flashlights.
Smart speakers with voice assistants all rely on the same MEMS (micro-electro-mechanical systems) microphone technology. These microphones turn out to be sensitive not only to sound but also to light. Carefully modulated light can therefore trick them into "hearing" what sounds like a genuine voice.
As the researchers put it in their statement: "The main discovery behind light commands is that in addition to sound, microphones also react to light aimed directly at them. Thus, by modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio."
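To make the modulation idea concrete, here is a minimal sketch in Python. Everything in it (the function names, the bias and depth values, the 1 kHz test tone) is an illustrative assumption, not the researchers' actual tooling: it simply shows how an audio waveform could be amplitude-modulated onto a non-negative light intensity that a laser driver would turn into drive current.

```python
import math

SAMPLE_RATE = 44_100  # audio samples per second (assumed value)

def modulate_sample(audio_sample, bias=0.5, depth=0.4):
    """Map one audio sample in [-1, 1] to a light intensity in [0, 1].

    'bias' keeps the laser always on; 'depth' scales how strongly the
    audio flickers the beam. Both are illustrative numbers.
    """
    clipped = max(-1.0, min(1.0, audio_sample))
    return bias + depth * clipped

def modulate_onto_light(audio):
    """Amplitude-modulate a whole audio waveform onto light intensity."""
    return [modulate_sample(s) for s in audio]

# Example: a 1 kHz test tone standing in for a recorded voice command.
tone = [math.sin(2 * math.pi * 1000 * n / SAMPLE_RATE) for n in range(441)]
intensity = modulate_onto_light(tone)
```

The MEMS diaphragm then responds to the flickering beam as if it were varying sound pressure, which is why the speaker transcribes it like normal speech.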
The researchers used fairly simple and easy to find materials to construct their experiment.
- Laser Pointer – $14
- Laser Driver – $340
- Sound Amplifier – $28
- Telephoto Lens – $200
- Standard Tripod
Carrying out the attack is still complex, and a novice would not achieve the same results; it is important to note, however, that the materials themselves are relatively cheap. The researchers used this laser setup to target a Google Home from around 230 feet away, and they say smart speakers can be compromised from up to 350 feet.
The attack requires a direct line of sight, as the laser must fall directly on the target smart device. Once aimed, the laser can send "voice commands" via light to the MEMS-based microphones in these smart speakers.
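The line-of-sight requirement also explains the telephoto lens and tripod in the parts list: the beam has to stay on a microphone port only a few millimetres wide. A rough back-of-the-envelope sketch of the aiming precision involved, using assumed numbers (a 5 mm port and 70 m, roughly the 230 ft used against the Google Home):

```python
def required_angle_mrad(target_mm, distance_m):
    """Angular window, in milliradians, that keeps a beam on a target
    of the given size at the given distance (small-angle approximation)."""
    return (target_mm / 1000.0) / distance_m * 1000.0

# Hitting a 5 mm microphone port from 70 m away:
angle = required_angle_mrad(5, 70)  # roughly 0.07 mrad
```

Pointing a beam that steadily by hand is essentially impossible, which is why a stable tripod and optics to keep the spot tight matter as much as the laser itself.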
Implications of this discovery
These days you can control anything from your front door to your Tesla via voice commands to a smart assistant, so the ease with which these assistants can be compromised is a serious issue. Because the hack exploits a design flaw in the way these microphones work, it is unclear whether the problem can be fixed with a software update.
Hopefully, these major companies will focus their R&D on redesigning the microphone technology that makes these devices so susceptible. At least then we can be sure future devices are protected against this type of attack, even if existing ones cannot be fixed retroactively.