In 2019, researchers at the University of Michigan, working in collaboration with the University of Electro-Communications in Tokyo, published a study showing that voice assistants could be hacked via lasers.
Oddly, microphones were shown to interpret the light signals as voice commands, which theoretically left the devices vulnerable to attack. You may be thinking that using a laser to talk to a smart speaker is far-fetched. However, it points out that all technology comes with some level of risk, and that we must weigh the added convenience of new technology against any potential risk. Sometimes the added convenience is worth the risk, and sometimes added precaution is in order. Let’s look at some of the risks and precautions.
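To make the mechanism a little more concrete, here is a minimal, purely illustrative sketch (not the researchers’ actual tooling) of the idea behind the attack: the laser’s brightness is amplitude-modulated with an ordinary audio waveform, so the microphone responds to the flickering light much as it would to sound. The sample rate, modulation depth, and the stand-in tone below are all assumed values.

```python
import numpy as np

# Illustrative sketch only: a laser attack works by modulating the laser's
# brightness with an audio waveform, so the microphone picks up the light
# as if it were a spoken command.

SAMPLE_RATE = 44_100   # samples per second (assumed)
DURATION = 1.0         # seconds of signal
MOD_DEPTH = 0.8        # modulation depth, 0..1 (assumed)

t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

# Stand-in for a recorded voice command; here just a 440 Hz tone in [-1, 1].
voice = np.sin(2 * np.pi * 440 * t)

# Amplitude-modulate the laser intensity: a DC bias keeps the laser on,
# and the audio signal rides on top of it as brightness changes.
laser_intensity = 0.5 * (1.0 + MOD_DEPTH * voice)

print(laser_intensity.min(), laser_intensity.max())  # stays within 0..1
```

In other words, the “voice” never travels through the air at all; it arrives encoded in light aimed at the microphone port.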
How Can Voice Assistants Be Hacked?
Like the laser example, another hack involved ultrasonic waves. Researchers from Washington University demonstrated something they called a “Surfing Attack”. Assistant Professor Ning Zhang found that ultrasonic waves could activate Siri on an iPhone, and that in a controlled environment an intruder could use them to access the contents of a text message.
While hardware manufacturers should design microphones to respond only to human speech (and reject laser and ultrasonic input), according to Professor Ning Zhang, something as simple as putting the phone on a tablecloth would prevent a “Surfing Attack”. The soft surface creates something called an “impedance mismatch”, which keeps the ultrasonic waves from activating the voice assistant.
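For a rough sense of why that works, here is a back-of-the-envelope sketch. The fraction of acoustic energy that crosses the boundary between two materials depends on how closely their acoustic impedances match; the impedance figures below are approximate, illustrative values, not measurements from the study.

```python
# Rough illustration of an "impedance mismatch". The fraction of acoustic
# energy transmitted across a boundary between two materials (at normal
# incidence) is T = 4*Z1*Z2 / (Z1 + Z2)**2, where Z is acoustic impedance.

def transmitted_fraction(z1: float, z2: float) -> float:
    """Intensity transmission coefficient at a material boundary."""
    return 4 * z1 * z2 / (z1 + z2) ** 2

Z_WOOD = 1.6e6    # wooden tabletop, Pa*s/m (approximate, assumed)
Z_GLASS = 12.0e6  # glass/phone body, Pa*s/m (approximate, assumed)
Z_AIR = 415.0     # air trapped in a soft cloth, Pa*s/m (approximate)

# Hard table directly against the phone: much of the energy couples through.
print(f"wood -> glass:   {transmitted_fraction(Z_WOOD, Z_GLASS):.2%}")

# A soft, air-filled cloth in between: almost nothing gets through.
print(f"wood -> air gap: {transmitted_fraction(Z_WOOD, Z_AIR):.4%}")
```

The mismatched impedances mean the ultrasonic vibrations are mostly reflected at the cloth instead of reaching the phone, which is why such a simple countermeasure is effective.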
Voice Assistant Data Breach Incidents
These sensational news reports were not the only voice assistant hacks on record. In 2018, an Amazon customer in Germany received 1,700 audio files from someone else’s Echo. The recordings provided enough information to identify the owner of the Echo as well as his girlfriend. Amazon called the incident an “unfortunate mishap”.
Another incident like this occurred in Portland, Oregon. A woman found that her Echo had sent recordings of her private conversations to one of her husband’s employees. Amazon attributed the incident to the device mishearing several key pieces of information: a wake word, a contact, and a confirmation.
Several other incidents have occurred since then. However, these reports haven’t stopped the rising sales of smart speakers or the growing use of voice assistants in 2020. But the fact that so many people use voice assistants doesn’t mean security should be taken any less seriously.
Given the news reports, it’s not surprising that owners of smart devices have security concerns. In one survey, 41% of respondents said they had concerns about trust, privacy, and passive listening, and 52% worried that their personal information was not secure.
Voice assistant manufacturers need to address these security concerns while also making their devices more functional and convenient for users. Clearly, voice assistant security still needs a lot of improvement.
3 Steps to Increase Smart Speaker & Voice Assistant Privacy
- Set a PIN (especially for voice shopping): PIN protection is standard on most smart devices, and outsiders will find it much harder to control your device when it is PIN protected.
- Turn off the Mic: Mute the device when you don’t want it to listen; it will simply stop listening for commands. You’ll have to turn it back on when you want to use it, of course, but this is particularly useful during personal or confidential conversations.
- Delete Recordings: You can periodically delete your recordings from your smart device. While the device stores everything you say to it, you can always review those recordings and erase them.
At the moment, there isn’t much you can do to secure a voice assistant without impacting its convenience. However, as voice assistants become ubiquitous, solid security protocols will need to be created and implemented.