
Islamophobic governments have a new weapon – AI and algorithms are the new surveillance state

New technologies are reinforcing age-old inequalities in the 21st century

Illustration by Heedayah Lockman

In today’s digital age, most decision-making processes are left to the hands of computers and artificial intelligence. From bosses to teachers to doctors, those in positions of power routinely use digital tools to reduce their work and speed things up. 

But in the hands of authoritarian governments, this increasing efficiency is being weaponised against vulnerable populations. Human rights violations that would have previously been committed at the hands of the state are being outsourced to surveillance technologies, making this process swifter and faceless. In particular, the global war on terror has seen Muslims across the globe disproportionately subject to this oppressive form of profiling.

Automated Apartheid

This is what Amnesty International have termed Automated Apartheid. In the context of Israeli settler colonial occupation, Palestinians are surveilled and treated as suspect, with technologies being weaponised by Israel to control their movements and restrict their freedoms. 

The Israeli authorities have operated a closure system since the 1990s, with checkpoints and biometric surveillance used against Palestinian civilians. Israel Defence Forces soldiers have facial recognition technology installed on their phones, which they can use to scan Palestinians’ faces at any moment. CCTV cameras on the streets spy on Palestinians’ movements constantly, suspending them in prison-like conditions on their own land. 

Rasha Abdul Rahim is an Independent Expert on technology, human rights and social justice and current Executive Director of People vs Big Tech. “It is no secret that Israel has long used surveillance, biometric and other types of technologies to facilitate its genocide against Palestinians in Gaza, as well as entrench its oppression of Palestinians in the West Bank,” she says. “Technologies are integrated into Israel’s very system of racial discrimination and domination, violating Palestinians’ basic rights, such as the freedom to move from place to place.” 

Rasha stresses that this automated apartheid will have repercussions beyond Palestine. “Israel is known to use Palestine as a testing ground for all sorts of weapons and surveillance technologies, which it then exports to other governments. So if you think these technologies won’t ever affect you, think again.” 

In the context of the most recent year-long assault on Gaza, this is only being exacerbated further. Just this month, the Biden administration approved the deployment of 1,000 CIA-trained private mercenaries as part of a joint U.S.-Israeli plan to turn Gaza into a “high-tech dystopia”. The plan aims to erect walls around neighbourhoods, forcing residents to enter and exit using CIA-contracted biometric identification. Those who resist will be refused humanitarian aid.

Databases have long been employed as an oppressive technology by those in power.  

In 2021, a report by the Washington Post exposed Israel’s Wolf Pack system, a vast database of information on Palestinians from the West Bank, which is shared with the Israel Security Agency to determine who should be arrested, among other things. During the Holocaust, computer company IBM supplied the Nazis with databases to monitor and track the movements and murder of the Jewish population.

The weaponisation of surveillance technologies against marginalised communities, particularly ethnic groups who have been historically persecuted by the state, is a growing human rights problem in the digital age. 

Big Tech sells war

The companies behind oppressive AI technologies and surveillance tools are in bed with authoritarian governments, offering them unwavering loyalty in exchange for their custom. A 2022 report found that more and more tech companies are becoming military contractors, revealing the insidiously close relationship between the two industries. 

Tech companies have vested interests in the growth of the surveillance state, propping up oppressive regimes with their products. The War on Terror has coincided with the explosive rise of Big Tech, and this is no coincidence. Big Tech Sells War calculated that Amazon, Google, Microsoft, Twitter and Meta have made over $44 billion from contracts with the Pentagon and Department of Homeland Security. In 2021, while Israel bombed Gaza, Amazon Web Services and Google Cloud executives signed a $1.22 billion contract to provide cloud technology to the Israeli military and government. 

An Open Democracy investigation found that US tech firms are lobbying local government and police forces in the UK to scale up their surveillance, pressuring decision makers to implement tools that will intrude on our privacy. One advert in the New York Times in 2023 captured this insidious relationship between Big Tech and surveillance perfectly. “Palantir stands with Israel”, the advert read in bold white text on a black background, published soon after the escalation of the genocide in Gaza. 

Palantir is a US technology contractor that specialises in military and surveillance tools, and Israel is one of its largest clients. It is not surprising, then, that Palantir stands with Israel: it is in Palantir’s capitalist interest that authoritarian states unleash the full force of surveillance on innocent civilians, because these are the interests that line its pockets. As one user tweeted, “MurderTech stands with Israel” would have sufficed. 

Racist profiling

Facial recognition technology is favoured by authoritarian governments worldwide. The technology works by capturing facial features and converting them into numbers, creating facial data which is then compared against other facial data in a database. By scanning the distinct points of our faces, these tools make biometric maps of us without consent, reminiscent of colonial uses of eugenics and phrenology, techniques which used measurements of facial features to compare and make racist assumptions about populations. As author Wendy Wong writes in We The Data, the use of facial recognition technology is problematic not simply because it intrudes on our privacy, but because the way it is used to make (inaccurate) predictions about our behaviour based on our facial features is discriminatory.
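The matching step described above — reducing a face to a vector of numbers and comparing it against stored vectors — can be sketched roughly as follows. This is an illustrative toy, not any vendor’s actual pipeline: the embeddings, the similarity threshold and the database here are all invented for demonstration.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face-embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe, database, threshold=0.8):
    """Return the identity whose stored embedding is most similar to the probe
    embedding, or None if no similarity clears the (arbitrary) threshold."""
    best_id, best_score = None, threshold
    for identity, stored in database.items():
        score = cosine_similarity(probe, stored)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id, best_score

# Hypothetical database: in a real system, embeddings would come from a
# neural network applied to face images, not hand-written vectors.
db = {"person_a": np.array([1.0, 0.0]), "person_b": np.array([0.0, 1.0])}
identity, score = match_face(np.array([0.9, 0.1]), db)
```

The key point, and the source of the discriminatory failures the article describes, is that “recognition” is just a numerical similarity score crossing a threshold: the system does not know who anyone is, it only asserts that two number lists look alike.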

One Muslim population that knows this all too well is the Uyghurs. In East Turkistan, a region of Central Asia which is under the occupation of the Chinese government, the Uyghur population is currently being subjected to a state-sponsored regime of ethnic cleansing. The tools making this possible? Facial recognition technology and artificial intelligence.

Similar to the surveillance infrastructure enforced on Palestinians by the Israeli government, Uyghur people are subject to invasive surveillance and facial recognition technology which is used to single out Uyghurs for the detention camps, where they are sent to be sterilised and ‘re-educated’ to eradicate their culture and existence. In 2018, technology giant Huawei filed for a patent on the ability to determine whether someone was of Uyghur or Han ethnicity. 

Big Tech giants are racing to cash in on invasive technologies that impede human rights, profiting directly from the erasure of and assault on Muslims in Palestine and East Turkistan. It is reported that 1.5 million Uyghur Muslims are in detention in East Turkistan, with Chinese tech companies directly profiting from the inescapable surveillance they’re subject to.

Zubayra Shamseden is the Vice President of the World Uyghur Congress. She believes that facial recognition technology has intensified the repression and genocide of Uyghur people. “With the support of high tech, especially with the support of facial recognition, China was able to ‘single out’ Uyghurs from anywhere in China. This tech has aided China’s targeted repression against Uyghurs in order to track them, arrest them, detain them, either send them to concentration camps for dehumanising, killing, raping, torturing them, or send them to prisons or Chinese factories.” 

These surveillance tools have become synonymous with the repression of minoritised people by authoritarian states. Zubayra explains that tracking software is forcibly installed on Uyghur people’s phones, and facial recognition technology is spread across public spaces, in schools, offices, shopping centres and outside mosques, leaving them no freedom or safety from the constant watch of the Chinese government. 


Breaking up the machine

But tech workers are speaking out against the insidious relationship between today’s tech giants and oppressive governments that target Muslim populations. In June this year, over 1,200 STEM students and workers pledged not to take jobs or internships at Google or Amazon, companies involved in Project Nimbus, a $1.2 billion project that provides cloud computing to the Israeli government. 

No Tech For Apartheid is the global movement responsible for this action. It is made up of Google and Amazon tech workers who condemn the use of technology to fuel settler colonialism and stand in solidarity with the Palestinian people. 

The movement was started by Jewish Google employee Ariel Koren, and the reaction from management was far from friendly: she was told to agree to move to Brazil within 17 days or lose her job. Since the escalation of the Israeli military’s assault on Gaza in 2023, the movement against Big Tech’s war profiteering has only gained more traction, erupting in protests across the world and revealing a growing collective consciousness that Big Tech’s business model profits from human rights abuses across the globe.

As artificial intelligence and emerging technologies become more advanced, tech companies are racing to cash in, and for them, human rights are purely a barrier standing in their way. As digital citizens it’s vital we speak up against the use of AI and algorithms as a new weapon of war.


Illustration by Heedayah Lockman @heedayahlockman, who says: “This illustration depicts a Palestinian individual under constant surveillance, with multiple cameras scrutinising him. His face, scanned biometrically, shows an attempt to catalogue him as a threat due to his nationality and racial identity. Personal data around him, exposed and weaponised, highlights the dehumanising impact of racial profiling and how technology can reinforce discrimination and violate basic human rights.”