Given YouTube’s socially conscious stance that has led to the removal of thousands of gun-related videos, many as innocuous as cleaning demonstrations or depictions of bump-stock-enabled rapid fire, you’d think the Google-owned company might have developed a new system-wide standard for the kind of “responsible” content it’s willing to host. You’d be wrong.
On any given day there are almost 300,000 videos on YouTube providing step-by-step instructions on how to construct bombs: pipe bombs, pressure cooker bombs, you name the type. Some of the videos are the work of teenage backyard pranksters mixing up household chemicals for “Gatorade bottle bombs” (lethal in their own right); others are so-called “film prop” instructional videos showing how to construct bombs with more “boom” than bark for film-making purposes. But the clear majority are military-grade instructional videos painstakingly walking a would-be Mark Conditt or ISIS bomber through the construction of a lethal pipe or pressure cooker bomb.
Yes, that’s only a ping pong ball, but the principle is the same and could easily be scaled up. Something YouTube apparently approves of.
You’re probably thinking, big deal, these are only a bunch of suburban urchins with too much time on their hands. There isn’t anything really dangerous out there, right?
In the past five years, ISIS-inspired bombers have relied heavily on such tactical YouTube videos to build their homemade bombs. According to Boston police and the FBI, the Tsarnaev brothers constructed their Boston Marathon bombs from YouTube videos and Inspire magazine, al Qaeda’s “how to be a terrorist” handbook. So, too, did Syed Farook and Tashfeen Malik, the San Bernardino terrorists who built a bomb factory in their garage.
Oh. Well, that’s OK. Just as long as they didn’t view anything dangerous like an AR unboxing video.
Under the Communications Decency Act of 1996 (CDA), social media companies are immune from liability for uploaded content. Congress has carved out two exemptions to this blanket immunity: child pornography and, just recently, sex trafficking.
How can Congress continue to permit social media companies to arbitrarily determine when they are going to remove content merely in response to public pressure? A tortoise-paced, half-hearted pledge to act is glaringly insufficient, especially when American lives are at stake. It also strains credulity that one of the largest and wealthiest technology companies lacks the software wherewithal to remedy this challenge.
We’re for less regulation and an open forum in virtually all things. It’s when companies set arbitrary standards and apply even those to only certain groups that tech platforms like Facebook and YouTube become the targets of calls for federal regulation. It’s hard to have much sympathy for them.