Iconic consumer brands have begun pledging to temporarily or indefinitely halt advertisements on Facebook due to its consistent mishandling of extremist content, hateful rhetoric, and divisive information on its site. The boycott by more than 100 leading advertisers reflects ongoing frustration over Facebook’s ineffective content moderation processes, which allow advertisements to run adjacent to questionable content on its platform.
In 2019, the Counter Extremism Project (CEP) wrote to leadership at the World Federation of Advertisers, the American Association of Advertising Agencies, and the Association of National Advertisers, as well as their members, advocating that their organizations join with other large companies to pull their advertising from platforms identified as hosting extremist or harmful content. CEP warned of the risks posed by specific platforms placing advertisements next to extremist and terrorist content.
Despite assurances to the contrary, Facebook routinely fails to police extremist material. Senior executives have even gone so far as to block measures that would have addressed systemic software flaws that helped promote extremist content on its site.
“Facebook clearly outlines removal policies for extremist content in its Community Standards, but they inconsistently enforce their terms. It’s clear that advertisers have had enough, and they rightly understand that the best way to influence Facebook is through their dollars,” said CEP Executive Director David Ibsen. “Facebook has a responsibility to businesses, lawmakers, and the public alike to provide greater accountability and transparency and should establish—and uphold—more reliable solutions to curbing the content moderation problems on its platform.”
As CEP Senior Advisor Dr. Hany Farid wrote in a March 2019 USA Today op-ed, advertising revenue is a crucial way to influence tech companies—they must be hit on their bottom line until the industry finally decides to take the issue of online extremism seriously. Dr. Farid said that corporate CEOs can move to pause their social media advertising buys and “stand up and say unequivocally: Enough is enough. We will no longer be the fuel that allows social media to lead to deaths of innocents, to interfere in democratic elections, to be the vessel for distributing child sexual abuse material, extremism material, and dangerous conspiracies … These corporate titans should lead not only because it is the right thing to do, but also because it is in their corporate interest. It is bad for business when their product and corporate logo is advertised against terror-related, child sexual abuse, conspiracy, hateful and harmful content.”
Dr. Farid again echoed these concerns in his testimony last week before a joint subcommittee of the U.S. House Committee on Energy & Commerce on the effect online disinformation has had on the country. In his testimony, Dr. Farid called upon major advertisers to halt their ad spending until effective changes are made. Dr. Farid concluded his testimony by saying: “If advertisers, that are the fuel behind social media, took a stand against online abuses, they could withhold their advertising dollars to insist on real change. Standing in the way of this much needed change is a lack of corporate leadership, a lack of competition, a lack of regulatory oversight, and a lack of education among the general public. Responsibility, therefore, falls on the private sector, government regulators, and we the general public.”