You know that doll or stuffed bear your young child talks to and that talks back? No less than the FBI has warned in an alert that you might have to swap it out for a more old-fashioned model without voice-recording capabilities -- in other words, a less smart version.
So the doll or bear is somehow breaking the law? Not exactly, but close. The FBI yesterday urged consumers to be security-aware before buying smart, interactive, internet-facing toys to bring into their homes. Why, you ask, would the FBI be worried about toys? Because smart toys and other entertainment playthings that use sensors, microphones, cameras, speech recognition and geo-location to engage with your child may inadvertently be risking their privacy and security.
These features, said the FBI, could put the privacy and safety of children at risk “due to the large amount of personal information that may be unwittingly disclosed.”
“Better watch out for Elmo” is the word on the street among parents.
Digital Toys, Consumer Privacy and Service Providers
The agency’s cautionary flag also came with some advice: “Consumers should examine toy company user agreement disclosures and privacy practices, and should know where their family’s personal data is sent and stored, including if it’s sent to third-party services.”
Let’s think about this for a moment: When we muse about smart homes, isn’t it usually about houses equipped with technology to operate door locks and security cameras, turn lights on and off, open and close garage doors, and the like? Now it seems we have to add our children’s toys into the mix. No offense, but that tonic doesn’t go down so easily. What’s next -- the finger puppets you bought for your child divulging your Social Security number?
While there’s already a law, the Children’s Online Privacy Protection Act, that bars website and online service operators from collecting personal information from children under the age of 13 without parental consent, there’s nothing to stop cyber crooks from digitally prying into smart toys in search of personal material.
We already know sensor-snooping thieves are gearing up to attack smart cars, so toys as a target shouldn’t be all that surprising.
Amazon Alexa: Listening In?
Moreover, toys aren't the only consumer privacy and security worry. Amazon is said to be planning to give developers access to raw transcripts of what consumers say when using Alexa applications, The Information reported, citing three people familiar with the company’s plans.
Amazon’s current stated policy bars developers from accessing Alexa’s "customer-identifiable information" without consumer consent. But it’s easy to see how the online retail colossus could hatch such a plan given how much the competitive landscape for digital assistants is being redrawn. In that sense, Amazon's opt-in approach isn't "particularly reassuring," wrote Lisa Vaas in a Sophos blog post.
“It’s common for data-gorging companies to point to a lack of identity details and equate that lack to a privacy shield,” she said. “But in these days of Big Data, the claim has been proved to be flawed. After all, as we’ve noted in the past, data points that are individually innocuous can be enormously powerful and revealing when aggregated. That is, in fact, the essence of Big Data.”
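Vaas’s aggregation point is easy to demonstrate. Here’s a minimal, hypothetical sketch in Python of the classic re-identification scenario: neither dataset below contains anything obviously sensitive tied to a name on its own, but joining them on a few individually innocuous fields (ZIP code, birth date, sex) pins records to people. The records, names, and field choices are invented for illustration.

```python
# Sketch: two datasets that are each "anonymous" on their own can
# re-identify people once joined on shared quasi-identifier fields.
# All records here are hypothetical.

anonymized_health_records = [
    {"zip": "02139", "dob": "1954-07-31", "sex": "F", "diagnosis": "asthma"},
    {"zip": "02139", "dob": "1961-02-12", "sex": "M", "diagnosis": "diabetes"},
]

public_voter_roll = [
    {"name": "J. Smith", "zip": "02139", "dob": "1954-07-31", "sex": "F"},
    {"name": "R. Jones", "zip": "02141", "dob": "1961-02-12", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "dob", "sex")

def key(record):
    """Combine the quasi-identifier fields into a single join key."""
    return tuple(record[f] for f in QUASI_IDENTIFIERS)

voters_by_key = {key(v): v["name"] for v in public_voter_roll}

# Any "anonymous" record whose quasi-identifiers match a voter record
# is effectively re-identified -- no names needed in the source data.
for record in anonymized_health_records:
    name = voters_by_key.get(key(record))
    if name:
        print(f"{name} -> {record['diagnosis']}")
```

Run it and the first health record resolves to a named person. That, in miniature, is why a lack of identity details is a thin privacy shield once data points start piling up.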
Vaas makes a significant point, one that’s particularly applicable to MSSPs eyeing smart home security management as an open opportunity. On the one hand, as security needs in smart homes grow more complex, consumers undoubtedly will need management experts. Amazon, for example, apparently is already easing into technical support for smart homes, ranging from consulting to installing smart thermostats and lights.
Still, securing toys from digital invaders seems more the job of parents and perhaps of toy makers. Where might MSSPs draw the line to protect smart homes? Or will they need to draw a line at all? Given how fast smart home technology is moving, those questions will need answers relatively soon.