CEP Calls on Congress to Demand Answers From Facebook, Zuckerberg

Tech Companies Must Answer For Misuse Of Their Platforms

New York, NY – The Counter Extremism Project (CEP) today called on Members of Congress to hold tech companies accountable on issues of public safety and security as they prepare for upcoming hearings with Facebook CEO Mark Zuckerberg. Ahead of his testimony, CEP is releasing a list of questions for lawmakers to consider asking Zuckerberg, to ensure Facebook and other tech companies are pressed to account for their actions.

“Frustrated advertisers have said ‘enough is enough’ in response to a series of tech industry abuses,” said CEP Executive Director David Ibsen. “Now is the time for Congress to insist that demonstrable action be taken and documented by Facebook and others. Time and again, tech companies like Facebook have promised to do better but have failed to take meaningful, transparent action. Millions of consumers have had their personal information stolen. Many people have died or been injured as a direct result of brutal terror attacks inspired by online extremism. Misinformation and fake news continue to circulate unabated on the Internet. Congress must act decisively to protect the American people from the excesses and risks that continue to be tolerated by tech corporations.”

Mr. Zuckerberg is scheduled to testify at a joint hearing of the U.S. Senate Committees on the Judiciary and Commerce on April 10 and before the U.S. House Committee on Energy and Commerce on April 11. The following is a list of questions CEP believes should be asked of Mr. Zuckerberg:

  1. Isn’t it true that this scandal is only a symptom of a much bigger problem: the unwillingness of Facebook to stop the exploitation of its platform? Why should we believe your promises when you have yet to make good on promises made in response to past problems? Will you provide the public with more transparency, including detailed and frequently updated metrics of your progress in combating these issues?
  2. In 2016, Omar Mateen pledged his allegiance to ISIS on Facebook and proceeded to murder 49 people in cold blood at the Pulse nightclub in Orlando. Six days later, Facebook Vice President Andrew Bosworth reportedly circulated an internal memo that said, “We connect people. That can be good if they make it positive. That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.” Would you agree these words embody a corporate culture in which the pursuit of profits is paramount even when weighed against possible negative results, such as the deaths of your own users?
  3. Given the proven connection between extremist content and terrorist attacks, why have Facebook, Google, and Twitter not come together to develop a standard for what constitutes prohibited extremist or terrorist content, and then adopt software to permanently rid your platforms of that content, as was done to combat child pornography, rather than allowing this same content to be uploaded again and again? 
  4. As recently as March 29, CEP found ISIS content on Facebook, including an execution video, as well as dozens of materials and pages associated with a violent neo-Nazi group. Why does this material persist on your platforms? Further, why have you not yet taken appropriate measures to prevent content like this from being re-uploaded after removal?
  5. In recent interviews, you and your COO Sheryl Sandberg have expressed openness to regulation. What type of regulation would you support that would address the exploitation of your platforms, starting with extremist content that inspires terrorist attacks?
  6. The Communications Decency Act, enacted more than two decades ago, shields tech companies like your own from liability for content posted to your platforms by third parties. But you now create and manipulate your own content, producing news, recommending and promoting certain content, and advocating specific points of view. You sell information and profit from the content posted on your platforms. How, then, can you still claim to be a neutral tool that merely hosts third-party content? And why should lawmakers continue to give you protections under the Communications Decency Act?
