A proposed amendment to this year’s National Defense Authorization Act (NDAA) strays a little far from the mark of supporting military readiness or countering threats our military may face on the battlefield. Instead, it mandates that social media companies describe in detail exactly how they are handling online terrorist content, costing these companies and the government time and money that would be better spent on actually countering terrorists.
While on its face the proposal may claim to target designated terrorist organizations and individuals, it does nothing to enhance security online. The amendment mandates that social media companies tell their users about their policies regarding terrorism, how to report such content, and what the consequences might be. Most social media companies already do this, so on one hand the requirement is entirely unnecessary. On the other hand, the bill defines “social media platform” in a way that likely sweeps in unintended targets like porn sites.
The amendment also mandates that social media companies provide detailed information to the attorney general on their content moderation practices and policies regarding terrorism. This includes the number of
- “flagged” or reported pieces of content;
- “actioned” items, with a count of exactly how they were actioned (e.g., removed, made less visible, demonetized, or otherwise suppressed);
- actions taken against the users who posted violating content;
- times actioned content was viewed by other users;
- times actioned content was shared;
- appeals and reversals; and
- how these statistics and platform policies change over time.
And all this information must be broken down in many ways. Was it a post, comment, direct message, profile, etc.? Was it text, a still image, or a video? Was it flagged by users, civil society, artificial intelligence, or other sources? And similarly, how was it actioned?
None of this helps the intelligence community or our military stop terrorists and defeat threats to US interests. How does it advance US security for Meta to tell the attorney general that 648,000 people appealed the removal of their content for violating its dangerous organizations and individuals policy? Or for X to report that it removed 33,693 accounts for terrorism, 92 percent of them proactively? All it does is give the government granular information with which it can later berate social media companies for not doing enough or not doing the “right” thing.
Many social media companies already provide some of this information in varying degrees of detail, but this amendment would likely require greater detail than some companies currently track. It also ignores the cooperation that already exists among the government, social media companies, and civil society groups on issues of terrorism.
And while big companies can more easily absorb and manage these rules, every additional unfunded mandate makes it harder and more costly for companies to operate and for new companies to enter the field. This bill is nowhere near as costly as the mandates of the European Union’s various tech laws, but adding it on top of all the other government demands is simply unnecessary and bad for innovation.
We should also be careful whenever we approach an issue where Americans’ speech rights are potentially threatened. While many of the groups and individuals designated on these government lists are dangerous and terrible, that does not strip Americans of their First Amendment right to speak about these groups, however distasteful many may find it. Tech companies can and do often remove such content, but that is their choice.
When arguments for safety and security override expression, we see situations like the recent one in Australia, where the eSafety Commissioner tried to force X to take down, globally, imagery of a terrorist attack that did not violate X’s content policies. The Commissioner eventually backed down after drawing worldwide attention, but this is only the most recent example of security being used to limit speech.
The NDAA should promote peace and security that allows American businesses to prosper. Tacking new reporting mandates onto social media companies fails on all fronts.