It’s Time to Reevaluate Legal Standards for Social Media’s Role in Radicalization


By Shayna Koczur

In December 2015, fourteen people were killed in a mass shooting at the Inland Regional Center in San Bernardino, California. The perpetrators had been radicalized online and pledged allegiance to ISIS on social media. On June 12, 2016, after calling for vengeance for American airstrikes in Syria in a Facebook post, Omar Mateen carried out a mass shooting at an LGBTQ nightclub in Orlando, Florida. It was the deadliest confirmed terrorist attack on U.S. soil since 9/11. The unforeseen rise of social media, combined with case law that shields social media platforms from lawsuits over online radicalization, creates a legal loophole that permits terrorist organizations to use social media as a tool for inspiring attacks on American soil. These legal standards need to be reevaluated in light of how modern terrorist organizations operate. 

Social media is a godsend to foreign and domestic terrorist groups because it is free and cannot be the target of military strikes. “Media jihad” is a term used to describe virtual activism related to the production, distribution, promotion, collection, and redistribution of jihadist multimedia. ISIS, for instance, deploys many tailored tactics online to attract recruits. Domestic extremist organizations like the Ku Klux Klan also recruit new members online. Yet despite recent attacks, social media platforms like Facebook face no liability for negligent policing efforts. Courts view social media platforms as “marketplaces of ideas” and “essential public squares” and are reluctant to burden First Amendment rights, even when the speech at issue incites violence. 

This shield, however, is incompatible with modern technology. Social media platforms enjoy broad immunity under the Communications Decency Act (CDA), under which a provider like Facebook is liable only if it directly publishes content associated with terrorists. The Ninth Circuit held in Megalla et al v. Twitter, Inc., et al that social media organizations cannot be held liable for “aiding and abetting” terrorism because knowing that these groups are likely operating on the platform and ignoring them does not rise to the level of conduct considered to be “assisting” terrorism. 

As a result, terrorist organizations can operate on Facebook or any other social media platform. This immunity extends even to targeted advertising that shares terrorist materials. In Clayborn v. Twitter, Inc., the court found that Twitter’s algorithm, which matched curious web surfers with jihadist online recruiters, was not aiding terrorist organizations because there is too much content for Twitter to police and the algorithm did not play a proximate role in the radicalization process. The court held that “general awareness” of terrorist activity happening on its platform is not culpable conduct. The classification of social media platforms as public forums for speech, together with the CDA, creates tension with national security law because online radicalization would be very difficult without these platforms. 

Under U.S. law, providing “communications equipment” to terrorist organizations constitutes materially aiding terrorist organizations. Although social media platforms are not in the room with terrorists, so to speak, they knowingly provide a communication service for terrorists and give them an audience. Courts seem to ignore the expansiveness of these platforms’ conduct and the fact that terrorists do not use “public forums” in the way the founding fathers imagined. 

International law and artificial intelligence (AI) may offer solutions to online recruitment that would not violate the First Amendment. The United Nations has proposed labeling radicalization materials “propaganda for war,” meaning information that inspires conflict. It also proposes a nuanced plan for restricting materials based on their context, the status of the speaker, the content and reach of the expression, and the intent behind the conduct. The plan could push social media organizations to do more policing under international law. If Congress passed a similar law, the classification of restricted speech would be so narrow that judges might not consider it a burden on the First Amendment. The narrow tailoring of the plan also includes safeguards to distinguish radicalization material from the speech the First Amendment protects. 

Facebook and Twitter can also remove specific material from their platforms by programming software to recognize certain types of content. Facebook already does this, using artificial intelligence to detect sexual misconduct. If Facebook can detect radical content from designated hate groups and review it before it becomes public, it can police terrorist organizations more actively and effectively. Facebook has already stated that its “default is to remove sexual imagery to prevent the sharing of non-consensual or underage content.” This shows Facebook is capable of creating a default mechanism that catches very specific material and deletes it before it is even posted, preventing criminal conduct. 

Given that Facebook can preemptively censor material, and that laws can be tailored to classify radical material, Facebook can no longer hide behind decades-old case law that shields it from liability. It is time to put pressure on social media companies to deter the very real threat of online radicalization and to harmonize their conduct with national security law. It is time to scrutinize Facebook’s deliberate blind eye toward online radicalization and to consider whether social media’s role in the radicalization process amounts to criminal conduct that protects virtual radical communities. 


 

Shayna Koczur is a second-year student at Brooklyn Law School. She holds an undergraduate degree from Bard College, where she studied International Relations. During her time at Bard, she wrote her senior thesis, “The Muslim Question,” on France’s unique relationship with its immigrant communities. Prior to law school, she was an intern at the UN Liaison Office of the International Action Committee for Small Arms and a remote intern at the Hudson Institute.


“facebook” by Stock Photo is licensed under CC BY 2.0
