I’m sorry, Dave. I’m afraid I can’t do that: When virtual assistants go very, very wrong

The Internet of Things brings us convenience and entertainment. But what happens when hackers invade IoT ecosystems or when connected devices malfunction?

The Wall Street Journal estimates that one million IoT devices were recently used by hackers to carry out a glut of DDoS attacks.

But it’s not just hackers you have to worry about. The Internet of Things is just as likely to go rogue all by itself.

Hackers have previously unheard-of access to your data, via teddy bear

As we clearly live in the future, there is a company that sells internet-connected teddy bears.

Said company allows parents to record messages, which are then pushed to the teddy bear over the internet. The bear speaks the words recorded by mom and dad, and presumably, the child gasps in awe and astonishment.

It was all fun and games until hackers published more than two million of the teddy bear recordings online, along with over 800,000 customer credentials. It turns out many customers’ passwords were so weak that they were a breeze to crack.

(We’re not saying she’s a soothsayer, but Lexie has previously warned about the dangers of weak passwords—she even published this excellent guide to better securing your online accounts.)

The whole mess wasn’t technically the teddy bears’ fault, but it’s an issue that would never arise with my 1980s wireless* Zugly.

Wireless teddy bear

*As in he has no wires. He’s just fluff. Adorable, ugly fluff.

Be careful what you ask your virtual assistant

More terrifying, and definitely not funny, are IoT devices that go wild all on their own, without any hacking.

Amazon Echo is in millions of homes worldwide and is always listening. Sold as an “intelligent assistant,” Echo promises to do the shopping, answer your queries, play music, report the weather, and even control the thermostat. Amazon’s all-seeing eye supposedly gets smarter the more you ask of it, learning from your life as you live it.

A black box sitting in the room, silently observing your life in order to please you, seems like a great idea. There’s even a friendly name for the mechanical voice assistant that aids you: she’s called Alexa.

Wonderful! Apart from the fact it’s a bit like the start of a dystopian sci-fi movie (I’m surprised Amazon didn’t call it HAL). Oh, and sometimes it’s not very smart at all.

Not at all.

When Amazon Echo orders what it wants and not what you say

Like the time, one sunny morning in Dallas, Texas, when a six-year-old girl asked Alexa: “Can you play dollhouse with me and get me a dollhouse?” But what Alexa heard was “Can you order four pounds of sugar cookies and a $170 dollhouse, please?”

And so that’s what Alexa did.

But the merriment didn’t end there. When a San Diego TV station reported Alexa’s misstep and used the word “Alexa” on air, viewers’ Echos sprang to life. It seems Alexa can’t tell the difference between a person saying “Alexa” in the room and someone saying it on TV.

Guess what happened next? That’s right, Alexa proceeded to order a dollhouse for everyone who was watching the story on TV. Remember, definitely not funny.

There’s been many a boo-boo, as it turns out. One Twitter user shared an Alexa shopping list featuring a “hunk of poo, big fart, girlfriend, [and] Dove soap” (who knows what he actually wanted?), while another included “150,000 bottles of shampoo” and “sled dogs.”

Other cautionary Alexa tales include the child who asked for a game called Digger Digger and, well, let’s just say that’s not what Alexa delivered.

Still, it could be worse.

Internet assistants versus privacy and the police

U.S. law enforcement has proven itself to be quite demanding when it comes to invasions of privacy. Continuing that trend, U.S. police recently obtained a warrant for data from an Amazon Echo in connection with a murder case. Although Amazon refused to divulge any information, it does raise the question: What happens to all the data collected by Alexa or Google Home (Google’s Echo equivalent)?

It sounds like an episode of Black Mirror, but could Alexa or Google Home ever be called as a witness in a trial?

We’re handing a lot of power to Google and Amazon, and placing a lot of trust in them. The things they know we do in the privacy of our own homes could be incredibly embarrassing if released, possibly causing irreparable damage to lives and careers.

Perhaps we should all heed the wise words of Megan Neitzel, the mother of the dollhouse girl:

“I feel like whispering in the kitchen…”

Secure your accounts and take care online

Turn off Amazon’s one-click ordering and secure your account with a PIN. Doing so should stop Alexa from ordering whatever she wants, whenever she feels like it.

A strong password is also a must. And don’t forget you can anonymize your internet connection with a VPN.
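
If you’d rather not dream up a strong password yourself, a few lines of code can do it for you. Here’s a minimal sketch using Python’s built-in secrets module; the 20-character length and the character set are just illustrative choices, not an official recommendation:

```python
# A quick sketch: generate a random, hard-to-guess password with Python's
# standard "secrets" module (designed for cryptographic use, unlike "random").
# The length and character set below are illustrative; adjust them to
# whatever a given site allows.
import secrets
import string

def generate_password(length: int = 20) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

if __name__ == "__main__":
    # Paste the result straight into your password manager.
    print(generate_password())
```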

There are many privacy benefits to a VPN, as this blog has covered many times. Even better, an ExpressVPN router extends VPN protection to any Wi-Fi-enabled device, even ones that can’t usually run VPN software, like PlayStation, Xbox, TVs, and yes, virtual assistants.

Lovely.

Johnny 5 is the founding editor of the blog and writes about pressing technology issues. From important cat privacy stories to governments and corporations that overstep their boundaries, Johnny covers it all.