Facebook Must Apply the Brandenburg Standard to Keep Trump off its Platforms
It has been more than 120 days since Trump was de-platformed, and last Wednesday Facebook’s Oversight Board upheld the company’s ban on Donald Trump. The Board found that “Mr. Trump created an environment where a serious risk of violence was possible.” That finding justified the ban, but the Board also faulted Facebook for a hasty decision made without clear criteria. The decision to uphold Trump’s ban will be reevaluated in six months, during which time Facebook must “review this matter to determine and justify a proportionate response that is consistent with the rules that are applied to other users of its platform.”
For four years Trump weaponized social media, using his platforms to peddle lies and spread disinformation. The threat he poses to American democracy came to a head with the January 6th insurrection at the U.S. Capitol. Immediately following the domestic terror attack, Facebook stripped Trump of the ability to reach his 35 million Facebook followers and 24 million Instagram followers. Upholding the ban is a step in the right direction toward curbing disinformation from one of social media’s biggest culprits.
The Board’s ruling to uphold Trump’s ban has wider implications for how the speech of political leaders should be policed in the digital space. In recent years, social media companies have grappled with regulating speech on their platforms, and Facebook is now at the forefront of developing criteria that promote public safety while respecting freedom of expression.
As Facebook looks to address the regulation of speech online, it would be prudent to review and apply the Supreme Court’s 1969 decision in Brandenburg v. Ohio, which established the “imminent lawless action” test that still defines when inflammatory speech loses First Amendment protection.
In Brandenburg v. Ohio, Clarence Brandenburg, a Ku Klux Klan leader in Ohio, asked a Cincinnati television reporter to cover a KKK rally in Hamilton County. The resulting footage captured people in the Klan’s traditional hooded robes burning a cross and making speeches. The speeches referred to taking revenge on African-Americans and Jews, and to possibly marching on Washington on the Fourth of July to accomplish that goal. They also criticized all three branches of the government for allegedly colluding with non-whites against whites.
Once this footage became public, Ohio authorities charged Brandenburg, who had made one of the speeches, with advocating violence under a criminal syndicalism statute. The law dated back to the First World War era and responded to then-widespread fears of anarchists, socialists, and communists. Brandenburg was convicted, fined, and sentenced to one to ten years in prison. A state appellate court affirmed the conviction, and the Ohio Supreme Court dismissed his appeal. The U.S. Supreme Court, however, reversed the conviction, ruling that “speech which supports law-breaking or violence in general is protected by the First Amendment unless it directly encourages people to take an unlawful action immediately.”
For social media platforms, the Court’s ruling translates into a single question: is the posted content incitement? Does it incite violence or call for imminent violent action? On January 7th, Mark Zuckerberg answered yes.
In the wake of the January 6th insurrection at the U.S. Capitol, Facebook moved to ban Trump from its platforms. Mark Zuckerberg, CEO of Facebook, said Trump had used his account to “incite violent insurrection.” Other platforms followed suit, sending a clear message that hate speech has no place in the social media space.
According to Facebook, hate speech is defined as “anything that directly attacks people based on race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, or serious disability or disease.”
The challenge with hate speech is identifying it across a community of more than 2 billion people spanning different cultures, languages, and circumstances. This problem is not unique to Facebook; it is commonplace across all social media platforms where hate speech is not welcome.
Facebook’s decision to strip Trump of his ability to reach his 35 million Facebook followers and 24 million Instagram followers drew an outcry from Republicans, who cried censorship and claimed the First Amendment was being infringed. However, as Brandenburg v. Ohio makes clear, speech loses its constitutional protection only when “such advocacy is directed to inciting or producing imminent lawless action and is likely to incite or produce such action.”
If Facebook applies the Brandenburg standard to its platforms, it will set a precedent for other social media companies to do the same and allow immediate action to be taken should another person weaponize social media the way Trump did. Applying Brandenburg protects public safety by cutting off an offending user’s access to their account, and it imposes a real consequence for spreading hate speech and disinformation to incite violence.
While public discourse is an important part of a healthy democracy, the sentiments and content shared should not incite violence or be built on disinformation intended to marginalize a portion of society. Hate speech has no place in our society, online or offline, and should not be tolerated.
Social media companies have spent years ignoring the role their platforms have played in emboldening odious actors. The Oversight Board clearly identified the danger Trump poses on social media, underscoring the threat to public safety and American democracy when social media is weaponized with malicious intent.
Social media companies walk a fine line between protecting speech and upholding the First Amendment on one hand, and protecting users from the spread of hateful information on the other. They must not shy away from finding a middle ground, one that starts with applying the Brandenburg decision to their content policies. We must not wait for another domestic terror attack before policies are enacted to protect public safety. The future of American democracy depends on it. Your move, Facebook.