• Press Release

Global: Emerging Technology and AI Are Driving the Crisis of Widening Inequality Around the World

October 2, 2023

Many uses of emerging technology, including AI, are directly contributing to widening social, racial, and economic inequality around the world, according to new Amnesty International research. The report, Digitally Divided: Technology, inequality, and human rights (Digitally Divided), takes a wide look at different uses of emerging technology across many sectors and shows how many of today’s emerging technologies are built upon models of economic extraction that perpetuate historical biases, particularly against racialized people, people experiencing poverty, people with disabilities, and other marginalized groups.

“What we’ve seen around the world, particularly as a result of the pandemic, is a drastic increase in the ways that people and governments rely upon various forms of emerging technology. Many of these technologies, particularly those based on a business model of data extraction, have been shown to harm marginalized groups, in ways that often perpetuate or exacerbate structural inequalities,” said Eliza Aspen, a researcher on technology and inequality at Amnesty International. 

Digitally Divided examines how a variety of new technologies are increasingly central to how people access work and essential services around the world, and can affect people’s ability to access healthcare, housing, education, social protection, and other essential services that are core to human rights. Labor rights and technology are an area of particular focus in Digitally Divided, as governments around the world have grappled with the human labor exploitation that is often at the heart of many generative AI tools.

Previous research by Amnesty Tech has examined the extractive nature of the surveillance-based business model of big tech companies, as well as the unregulated and growing use of algorithmic decision-making in the public sector, particularly in the case of discriminatory systems used to detect fraud in social security enforcement agencies. “Violations of the rights to privacy and non-discrimination are built into the business model of many tech companies,” said Eliza Aspen, “and increasingly, we see how these systems can undermine the rights to social protection and decent working conditions.” 

“The context in which these technologies are developed and deployed is crucial, because, particularly as we’ve seen since 2020, health and economic outcomes are increasingly tied to pre-existing racial, economic, and social inequalities. When big tech companies’ opaque algorithms are crucial to how marginalized people access essential information or services online, or public sector mobile apps are central to how people access their right to asylum at the U.S. border, for example, these are clear examples of how technology, in a certain context, can actually exacerbate existing inequality,” said Eliza Aspen. Both wealth inequality and extreme poverty are on the rise around the world, particularly since the COVID-19 pandemic, which has wiped out almost four years of progress in global poverty reduction and pushed an additional 93 million people into extreme poverty. Amnesty International has called for universal social protection in response to the growing crisis of inequality, which is increasingly a driver of human rights violations.

Digitally Divided also builds upon Amnesty’s previous work to help show how human rights violations are increasingly intersectional in nature, with factors of marginalization like gender, sexuality, economic status, race, and disability being essentially intertwined. The report identifies people who are most impacted by technology and inequality as groups with insecure citizenship status, including refugees and asylum seekers; people who experience structural racism, particularly Black, Indigenous, and other racialized groups; people experiencing poverty or economic insecurity; incarcerated or formerly incarcerated people, as well as others who interact with the criminal justice system; children and young people; and people living with disabilities. “Many marginalized people are forced to rely upon tech platforms for communication, work, and education, even as their ability to protect themselves from privacy violations and discrimination – which is often facilitated by these technologies – continues to be eroded,” said Aspen. 

The report follows other research by Amnesty Tech that has highlighted the growing intersection of technology and social welfare around the world, particularly with regard to the increasing use of algorithmic decision-making. A legal opinion filed by Amnesty and seven other rights organizations in November 2022 raised concerns over the adoption of a law approving the creation of a centralized government database to process personal data for those applying for social security support, which Amnesty researchers said was likely to affect people’s rights to social security, equality, and non-discrimination. The report Xenophobic Machines, published in 2021, exposed how racial profiling was baked into the design of the algorithmic system used to determine whether claims for childcare benefits in the Netherlands were flagged as potentially fraudulent. “New forms of digitization are being rolled out in more areas of public and private life, with cases of particular concern related to migration and asylum, labor and workplace management, and the criminal justice system,” said Aspen. 

Digitally Divided is the first of four briefings on technology and inequality, which are aimed at starting conversations among civil society organizations, governments, and movements concerned with rising inequality around the world, particularly in the context of new technologies like AI.

Contact: [email protected]

Read the full report.