Electronic Personal Assistants: Caveat Emptor
We’ve all heard the shocking news stories:
• Approximately one billion user accounts have been stolen from Yahoo, the company confirmed. The data breach is the largest from a single site in history.
• The FBI claims Russian hackers infiltrated DNC computers, stealing documents and emails, ostensibly to influence election outcomes. (Rumor has it the DNC chairperson’s password into the database was – get ready – “password.”)
• 40 million debit and credit card numbers were stolen from Target, along with emails, phone numbers, names, and addresses. Such data can be used “as is” or encoded onto new cards, and is currently being sold in online black markets in bundles of 10,000.
• Myspace announced a breach that reportedly affected 360 million accounts. In a blog post announcing the breach, Myspace said it discovered that email addresses, user names, and passwords for accounts created prior to June 11, 2013 had been posted on an online hacker forum available to anyone.
• And, most recently, a laptop associated with the Vermont electrical grid was infected by Russian malware (malicious software). Perhaps a harbinger of things to come?
As disturbing as these stories are, there is a modicum of comfort in a lesser-known corollary of the statistical Law of Large Numbers: the probability of any single observation being selected from a very large population approaches zero as the population grows. Simply put, if 30 or 40 or 50 million identities are stolen, the chance that any one person’s identity is actually exploited gets smaller and smaller as the pool of purloined identities gets larger and larger. Feel safer? Well, there is a catch.
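That intuition can be sketched in a few lines of Python. The breach sizes and the 10,000-record figure below are illustrative assumptions (the bundle size echoes the Target example above), not data from any actual investigation:

```python
def chance_exploited(records_exploited: int, records_stolen: int) -> float:
    """Probability that one specific stolen record is among those
    actually misused, assuming thieves pick records uniformly at
    random and can only act on a fixed number of them."""
    return records_exploited / records_stolen

# Suppose thieves can realistically exploit 10,000 records per breach.
# As the breach grows, any one victim's odds of being targeted shrink:
for breach_size in (30_000_000, 40_000_000, 50_000_000):
    odds = chance_exploited(10_000, breach_size)
    print(f"{breach_size:,} records stolen -> {odds:.5%} chance per victim")
```

The comfort is real but thin: the odds shrink per victim, yet someone is always in the exploited 10,000.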
What if we choose to provide our own private information to identity thieves? What if we give these criminals permission to steal our home address, passwords, account numbers, security codes, birth date, contact information, names of websites we visit, the content of emails sent and received, names of family members, and even our fingerprints? Furthermore, what if we sign a legal statement agreeing to provide such information on a 24/7 basis? Ridiculous, you say?
Please read on…
Here’s a quick riddle: what’s 9 inches tall, cylindrical, has a blue accent ring at the top, and looks like a speaker but can answer your questions and learn your habits? It’s the Amazon Echo! Doubling as both a speaker and a digital personal assistant, it is reminiscent of Apple’s Siri, but is arguably much better. The device goes by the name “Alexa,” and users can ask it anything. Need to know simple things like the time or the weather? Alexa has the answer. Need to know something more obscure, like how many teaspoons are in a tablespoon? Alexa has the proper response. It can read the news, give the sports scores, play music, remember shopping lists, and even tell jokes. More than that, it has seven built-in microphones and sensors that can detect voice commands from any direction – so there’s no need to yell. It “can fill any room with immersive sound,” Amazon says.
In a unique but somewhat scary feature, the Amazon Echo can actually learn a person’s habits over time. It will get used to the way a person talks, his or her habits and routines, and will save all that data in the cloud. Oh, and it is always on, always listening, and, if you activate its optional camera, always watching. Sort of gives new meaning to the lyrics, “He sees you when you’re sleeping, he knows when you’re awake, he knows if you’ve been bad or good…”
So now let’s look at some substantive evidence.
I am among the approximately 70 million users of Microsoft’s Windows 10 operating system, which incorporates its own personal assistant, known as Cortana (a first cousin to Alexa). To Microsoft’s credit, they are totally upfront about the data they collect about you, ostensibly as a means of providing you with the best experience with their products. (Please note that you can control the amount and type of data you share with Cortana, but you are encouraged to be as forthright as possible to better personalize your interactive experience.)
Microsoft provides a document for Windows 10 users describing the kinds of data being collected. It is easily accessed and read just by clicking the Cortana box in the lower-left corner of the screen.
Here are some examples:
• First and last name, email address, postal address, phone number, and other similar contact data.
• Passwords, password hints, and similar security information used for authentication and account access.
• Age, gender, country, and preferred language.
• Credit card number and security code.
• Interests and favorites, such as: sports teams you follow, stocks you track, weather in other cities of interest.
• Data about your contacts and relationships.
• Subject and body of emails, text or content of an instant message.
• Audio and video recording of messages sent or received.
• Calendar appointments.
• Cortana allows you to connect to third-party services such as Uber. When Cortana is connected to a third-party service it can also send your data to that service to better enable that connected service.
• And it was recently demonstrated that, with the built-in camera activated, if you decide to hold up two fingers in the ubiquitous “peace sign,” the device can capture your fingerprints.
Given this information, I would indeed be remiss if I didn’t point out that you can always sign out of Cortana, or turn off Alexa (or can you?). The question then becomes: is the inherent risk, however great or small, worth the perceived “convenience” of having your electronic assistant order a pizza or check your bank balance for you? Opinions, it seems, are divided.
Although the technology is new, in the end it all seems to come down to the ancient Latin warning “caveat emptor” – let the buyer beware.
Jim Dauer has been a full-time professor of Computer Science and Information Systems at Elmhurst College for 35 years, where he teaches courses in Cybersecurity and Artificial Intelligence.