Berlin-based Security Research Labs (SRL) discovered possible hacking flaws in Amazon Echo (Alexa) and Google Home speakers and installed their own voice applications to demonstrate hacks on both device platforms that turned the assistants into ‘Smart Spies’.
What Happened?
Research by SRL led to the discovery of two possible hacking scenarios that apply to both Amazon Alexa and Google Home which can enable a hacker to phish for sensitive information in voice content (vishing) and eavesdrop on users.
Knowing that some of the apps offered for use with Amazon Echo and Google Home devices are made by third parties with the intention of extending the capability of the speakers, SRL was then able create its voice apps designed to demonstrate both hacks on both device platforms. Once approved by both device platforms, the apps were shown to successfully compromise the data privacy of users by using certain ‘Skills and actions’ to both request and collect personal data including user passwords by eavesdropping on users after they believed the smart speaker has stopped listening.
Amazon and Google Told
SRL’s results and the details of the vulnerabilities were then shared with Amazon and Google through a responsible disclosure process. Google has since announced that it has removed SRL’s actions and is putting in place mechanisms to stop something similar happening in future. Amazon has also said that it has blocked the Skill inserted by SRL and has also put in preventative mechanisms of the future.
What Did SRL’s Apps Do?
The apps that enabled the ‘Smart Spy’ hacks took advantage of the “fallback intent”, in a voice app (the bit that says I’m sorry, I did not understand that. Can you please repeat it?”), the built-in stop intent which reacts to the user saying “stop” (by changing the functionality of that command after the apps were accepted), and leveraged a quirk in Alexa’s and Google’s Text-to-Speech engine that allows inserting long pauses in the speech output.
Examples of how this was put to work included:
- Requesting the user’s password through a simple back-end change by creating a password phishing Skill/Action. For example, a seemingly innocent application was created such as a horoscope. When the user asked for it, they were given a false error message e.g. “it’s not available in your country”. This triggered a minute’s silence which led to the user being told “An important security update is available for your device. Please say start update followed by your password.” Anything the user said after “start” was sent to the hacker, in this case, thankfully, SRL.
- Faking the Stop Intent to allow eavesdropping on users. For example, when a user gave a ‘stop’ command and heard the ‘Goodbye’ message, the app was able to continue to secretly run and to pick up on certain trigger words like “I” or words indicating that personal information was about to follow, i.e. “email”, “password” or “address”. The subsequent recording was then transcribed and sent back to SRL.
Not The First Time
This is not the first time that concerns have been raised about the spying potential of home smart speakers. For example, back in May 2018, A US woman reported that a private home conversation had been recorded by her Amazon’s voice assistant, and then sent it to a random phone contact who happened to be her husband’s employee. Also, as far back as 2016, US researchers found that they could hide commands in white noise played over loudspeakers and through YouTube videos in order to get smart devices to turn on flight mode or open a website. The researchers also found that they could embed commands directly into recordings of music or spoken text.
Manual Review Opt-Out
After the controversy over the manual, human reviewing of recordings and transcripts taken via the voice assistants of Google, Apple and Amazon, Google and Apple had to stop the practice and Amazon has now added an opt-out option for manual review of voice recordings and their associated transcripts taken through Alexa.
What Does This Mean For Your Business?
Digital Voice Assistants have become a popular feature in many home and home-business settings because they provide many value adding functions in personal organisation, as an information point and for entertainment and leisure. It is good news that SRL has discovered these possible hacking flaws before real hackers did (earning SRL some good PR in the process), but it also highlights a real risk to privacy and security that could be posed by these devices by determined hackers using relatively basic programming skills.
Users need to be aware of the listening potential of these devices, and of the possibility of malicious apps being operated through them. Amazon and Google may also need to pay more attention to the reviewing of third party apps and of the Skills and Actions made available in their voice app stores in order to prevent this kind of thing from happening and to close all loopholes as soon as they are discovered.