What do you picture when someone refers to a “war room”? Most likely, you see a room filled with screens displaying a crisis scenario, maps of troops’ positions, and a live feed of a military raid. High-ranking military personnel sit in a circle around a table, evaluating risk and plotting their next move.
As a former content moderator for Google, I picture a gathering of my colleagues around a conference table, sitting in nondescript swivel chairs that face a large flat-screen TV, looking over the work we have for this particular session. Someone closes the blinds to prevent passersby, on their way to the micro kitchen, from glimpsing the horrors we are about to witness. Only then do we begin our analysis — of hundreds of pictures of minors depicted in abusive, sexual scenarios. As the children depicted get older, it can become more difficult to tell whether they are minors: predators cake makeup on their faces, dress them up in sexual costumes and pose them in lascivious positions. Sometimes we are certain they are young, because we have seen them before.
I have seen the dangers that exist for children on the internet. The dark truth is that, much like the physical world, the internet is not 100% safe for them, and it never will be. We can take steps, however, to diminish the current risks posed to children online.
For their part, members of Congress have launched offensives on the technology industry to keep children safe. For instance, after a New York Times story showed YouTube’s recommendation algorithm suggesting videos of younger and younger women and girls when a user starts off watching erotic videos, Sen. Josh Hawley, R-Mo., announced plans to introduce a bill that would make it unlawful for video-hosting platforms to recommend videos that feature minors. And during a recent Judiciary Committee hearing on protecting innocence online, Sen. Lindsey Graham, R-S.C., proposed that committee lawmakers establish a set of best practices for tech companies to follow in order to protect young users from explicit content and sexual predators. Companies that fail to follow these practices would lose their Section 230 immunity.
While these are valiant efforts, Congress alone is woefully unprepared to draft legislation to ensure the safety of children online. Simply removing recommendations of content featuring children will not stop a pedophile from finding and watching videos of children. Drafting a set of best practices is a more promising alternative, but not if the practices are developed by members of Congress acting alone — or if the solutions end up causing more problems.
Indeed, too often, we see legislators craft bills that ignore how their provisions will play out in practice. In tech policy — as with any complicated policy issue — the devil is in the details. If lawmakers intend to regulate these platforms, they need to work with tech experts who understand the product and how these measures will be executed.
Best business practices, for example, should be developed by a bipartisan group of lawmakers working alongside industry experts who can give feedback as to whether the ideas proposed are workable. In developing these best practices, there must also be a conversation as to what results are realistically achievable. While it is possible to make sure that all tech companies are on the same page and are properly prioritizing this work, eliminating all the risks to children that exist on the internet would be next to impossible.
Additionally, removing a platform’s Section 230 immunity as a punishment for failure to follow these best practices would only make things worse. Section 230 was crafted to preserve the open nature of the internet while shielding kids from the dangers found online. It does so by enabling platforms to moderate their content and encouraging them to develop filtering technologies that allow parents to make their own decisions as to what their children should be able to see. The removal of this immunity would not teach tech companies a lesson. Instead, it would drown them in a sea of lawsuits while allowing predators and illicit content to go unchecked on their platforms. Ironically, this would make kids less safe online.
Above all, lawmakers should keep in mind that today’s children are incredibly tech savvy. If they want to find harmful content, they will find it — which is why any attempt to keep children safe on the internet should include a robust digital education. In short, tech companies and government alone are not responsible for keeping children safe online; we must also empower children to protect themselves on the web.
Image credit: Chinnapong 
- “New York Times story”: https://www.nytimes.com/2019/06/03/world/americas/youtube-pedophiles.html
- “bill”: https://www.hawley.senate.gov/sites/default/files/2019-06/Protecting-Children-Online-Predators-Act-Highlight.pdf
- “bills”: https://www.washingtontimes.com/news/2019/jul/1/the-stop-internet-censorship-act-would-ironically-/
- “incredibly tech savvy”: https://www.cnbc.com/2014/08/07/are-6-year-olds-more-tech-savvy-than-you.html
- “Chinnapong”: https://www.shutterstock.com/g/noipornpan