The ‘Teddy Bear and Toaster Act’ is device regulation done wrong
Should the government protect us from snooping teddy bears and untrustworthy toasters? The California State Senate seems to think so.
With traditional computing devices on the decline, laptop and desktop computers now account for less than 25 percent of internet traffic. Indeed, American households now use, on average, seven connected devices every day. As this so-called “internet of things” continues to expand, an array of connected objects, from toasters to lightbulbs to dishwashers, now includes embedded microprocessors, multiplying the number of potential threat vectors for data breaches and cyberattacks.
Notably, security researchers recently revealed that CloudPets, a company that sells connected stuffed-animal toys with voice-recording capabilities, had a security vulnerability that exposed the information of more than 500,000 people. In response to incidents like these, and to concerns about data collection by internet-of-things devices, California is considering S.B. 327, legislation that would require certain security and privacy features for any connected device sold in the Golden State.
Device insecurity is a real threat, and it’s encouraging to see legislators thinking about consumer privacy and security. But this bill, facetiously dubbed the “teddy bear and toaster act” by its critics, would create more problems than it solves. Real as these concerns are, they do not merit such a heavy-handed and wide-reaching legislative response.
First introduced in February, the bill targets a broad range of products that include “any device, sensor, or other physical object that is capable of connecting to the internet, directly or indirectly, or to another connected device.” It would require that their manufacturers “equip the device with reasonable security features.”
The scope and scale of that definition would appear to cover everything from smartphones to cars to tweet-happy toasters. Sweeping such a broad range of connected devices under its rules ignores that all of these items have unique functions, capabilities, and vulnerabilities. What constitutes a “reasonable security feature” for one might be completely unreasonable for another. This one-size-fits-all regulatory approach threatens to chill innovation, as companies from a host of different sectors expend resources just to make sense of the rules.
Should the bill move forward, we should also expect that a range of consumer items will be equipped to blink, buzz and beep in ways more annoying than informative. The bill decrees that “a manufacturer that sells or offers to sell a connected device in this state shall design the device to indicate through visual, auditory, or other means when it is collecting information.”
For some types of devices, such as virtual and augmented reality systems and autonomous vehicles, this requirement is simply infeasible: they use sensors to collect data constantly in order to perform their core functions. For always-on devices like IP security cameras, Amazon Alexa or connected cars, a data-collection indicator would be synonymous with an “on” light. Many of these indicators would be superfluous, misunderstood and costly to implement, and those costs would fall disproportionately on smaller businesses.
Other provisions of the bill would direct sellers of connected devices to notify consumers at checkout about where to find the item’s privacy policy and information about security patches and updates. This is valuable information, but the point of sale may not be the best time to communicate it. For many devices, a verbal or web-based tutorial likely would be more effective. Companies need the flexibility to figure out the best ways to inform their customers, but these design requirements would remove it.
In an interconnected world, balancing privacy rights and security is a hugely difficult undertaking. Enshrining that balance in law requires a nuanced and targeted approach. Policymakers at both the state and federal levels should focus their efforts on provable privacy or security harms, while empowering consumers with baseline information, where appropriate. Applying design requirements and compliance tasks in a haphazard way, as S.B. 327 does, will harm innovation without meaningfully improving data security.