Your smart speaker is an incredibly handy piece of technology. It helps organize your life, streamlines your household chores, and even lets you shop online. But what are the security risks posed by having an always-on microphone in your living room? Who’s listening to your recorded audio? According to a recent report from Seattle news station KIRO 7, these concerns are not mere conjecture.
A Portland woman identified as ‘Danielle’ told the Seattle news station that her Amazon Echo had sent a recording of a private conversation to a contact. The conversation itself was harmless, apparently a discussion between Danielle and her husband about hardwood floors. Still, the contact who received the audio files called Danielle right away and urged her to disable her Alexa-enabled devices, fearing that she had been hacked and was in danger of having her information stolen.
Danielle immediately contacted Amazon for answers. How did this happen? Was Amazon aware the Echo was doing this? The representative Danielle reached found the relevant logs and confirmed what had happened. According to Danielle, he apologized profusely, calling the situation a freak occurrence. Apparently, the Echo had overheard the conversation happening in another room. Some words in the conversation made the device think it was being asked to send a message, and after mistaking background noise for a confirmation, it sent the overheard conversation to the contact.
An “Echo Butt-Dial”
Wired described the ordeal as an “Echo butt-dial,” a complete fluke. Amazon said it would use data from the incident to tighten Alexa’s recognition protocols and make the scenario less likely. However, this incident isn’t the only smart speaker security concern: if these devices are always listening, how can they be secure? What are some of the other concerns?
Background Sounds Being Recorded
The first issue, and the most pressing in Danielle’s case, is background sound. Smart speakers have to listen for their trigger phrases all the time; if they don’t, they simply don’t work. As Danielle found out, though, this represents a serious security concern. What if the speaker mishears you and sends private conversations to people who were never supposed to hear them? What if a malicious hacker accesses your device and listens in to find out when you’re home? Would Amazon bear responsibility for such a breach?
The answers are currently unclear. While such situations are mostly hypothetical, Danielle’s case shows that they are possible.
Who Can Hear Your Conversations?
If your smart device is always listening, who can hear what it records? Google is well-known for building detailed advertising profiles of its users, and smart speakers are surely part of that effort. Listening for conversations about buying new hardwood floors, for instance, could lead to targeted ads for floor installation on your favorite websites. That alone seems predatory; who would knowingly invite a massive advertising company into their home to gather valuable data about their spending habits?
And that’s before mentioning hackers again. If a malicious hacker were to access the user profiles Google or Amazon build from smart speaker data, they would have a trove of information about potential identity theft targets.
Can Law Enforcement Access It?
If Google and Amazon are listening in, how much risk do users face of law enforcement listening in as well? The NSA’s wiretapping programs are well-known and, surprisingly, well-documented. Suspected criminals with smart speakers may have effectively bugged their own homes. Would law enforcement need a warrant to collect audio from these devices? Would they need a court order to obtain user profile information from Google?
If law enforcement begins treating smart speaker data as a routine means of information-gathering, the private lives of users could be at risk. Everyone had to read 1984 in school, right? Yeah, this would be the first step toward something like that. No thanks.
How Long is Your Data Stored?
Speaking of access to your personal info, there’s also the issue of just how long Google and Amazon store this information. Thankfully, you can delete your audio request history from your user profile. However, you can’t do anything about the profile data Google and Amazon keep about you on their servers.
In fact, even Apple’s Siri stores the data you request through it and provide to it. Apple has confirmed that raw audio collected through Siri is stored for 18 months. That’s kind of ridiculous! And because this data is inaccessible to the user who created it, the issue of law enforcement accessing it, lawfully or otherwise, is only made worse.
Who Else Is Using Your Smart Speaker?
Finally, there’s the issue of someone else using your speaker. The main use for Alexa, as far as Amazon is concerned, is as a storefront. Ordering through Alexa is purposely easy and painless: you just tell her to order something and she does it, no questions asked. So imagine, if you will, that someone breaks into your home and uses your Echo to order a ton of stuff on your credit card. Are you on the hook for that money?
While that scenario is a bit extreme, there are more conventional ways that Alexa’s shopping functions can be used against you. For one, if you have children, they could access your Alexa and order themselves a new tablet or gaming console, spending a ton of your money without telling you. Your friends could try to play a prank on you by ordering embarrassing items with your money. Not that that’s happened to this writer…
How to Protect Yourself
All this doom and gloom raises the question: how do you protect yourself from these potential breaches? Well, one easy way, of course, is to simply not have a smart speaker. Or, if you do have one, sell it and use the money to buy a VPN subscription. No? Alright, alright, real advice, then.
Keep your smart speaker in a room where it will overhear as little as possible, ideally one with a door you can close. The less it overhears, the better. Most of these devices also have a physical microphone mute button; use it when you’re not actively issuing commands. Beyond that, be careful what you say around the device: keep conversations light and avoid discussing sensitive information. Another good security measure is to not pair any of your debit or credit cards to the speaker. It may be less convenient, but it’s much more secure.
The Future of Smart Speaker Security
Hopefully these concerns will be addressed in future updates to the technology. Smart speakers are handy, convenient devices, and many users have come to depend on the features they offer. The companies that make them seem at least nominally dedicated to securing them. While only time will tell what the future holds for this contentious technology, stories like Danielle’s serve as a reminder: trust no soulless machine with your personal information. Even better: trust no soulless corporation.