Press Release

Congress must hold Big Tech to account for their rights-abusing business models

March 25, 2021

Ahead of the CEOs of Google, Facebook and Twitter testifying before Congress today, Joe Westby, Acting Deputy Director of Amnesty Tech, said:

“Social media companies have failed to protect people from content on their platforms that incites violence or advocates hatred. And despite growing calls for action, users continue to be bombarded with highly targeted advertising, political messages and propaganda.

“Big Tech’s promises of reform ring hollow when companies fail to acknowledge that the true cause of the problem is their data-hungry platforms themselves.

“The business model of Big Tech firms like Google and Facebook depends on capturing people’s attention to generate ad revenue – to that end, the algorithms that determine what we see on Facebook’s newsfeed or Google’s YouTube frequently amplify discrimination and inflammatory content.

“These companies appeal to our emotions of fear and anger to keep us staring at our screens.

“This can have a devastating effect at a population scale, fueling polarization and division, with serious human rights consequences.

“The inaction of states around the world, which have left these tech giants to self-regulate, has allowed their platforms to grow to a scale where harmful content can spread like wildfire.

“Congress must not fall for Big Tech’s false promises. It must tackle the root causes – unaccountable algorithms and a core business model based on invasive surveillance and profiling.”

Background: 

The heads of Facebook, Google and Twitter will testify before Congress on Thursday over the companies’ role in the proliferation of misinformation online.

Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai and Twitter CEO Jack Dorsey will face questioning from two subcommittees of the House of Representatives’ Energy and Commerce Committee, during a virtual hearing entitled: “Disinformation Nation: Social Media’s Role in Promoting Extremism and Misinformation”.

The scrutiny comes after the social media companies were accused of failing to take action to stop the proliferation of misinformation during the 2020 election. Facebook has been accused of allowing groups linked to the QAnon, boogaloo and militia movements to glorify violence in the weeks leading up to the events of January 6 at the United States Capitol and nationwide.

Ahead of the company’s testimony, Facebook’s VP of Integrity, Guy Rosen, said in a blog post that the company takes “a hard line” against disinformation and blocks millions of fake accounts daily. YouTube states that its recommendation systems “do not proactively recommend” content that “comes close to – but doesn’t quite cross the line of – violating our Community Guidelines”.

Contact: [email protected]