By Kait Pararas
On Wednesday morning, the U.S. Senate Committee on Commerce, Science, and Transportation held a hearing on mass violence, extremism, and digital responsibility. The purpose of the hearing was to examine the proliferation of extremism online and to assess the effectiveness of social media companies’ efforts to remove violent content from their platforms. The senators heard from representatives of Facebook, Twitter, Google, and the Anti-Defamation League.
Monika Bickert, Facebook’s head of global policy management, repeatedly assured senators of Facebook’s commitment to removing terror and hate content from its website. In her opening statement, she said: “We don’t allow any individuals or organizations who proclaim a violent mission, advocate for violence, or are engaged in violence to have any presence on Facebook.”
However, a whistleblower working with the National Whistleblower Center filed a petition in January 2019 with the Securities and Exchange Commission (SEC) contradicting this claim. The petition shows that Facebook not only hosts terror and hate content but has also auto-generated dozens of pages in the names of Middle East extremist and U.S. white supremacist groups, thus facilitating networking and recruitment.