Illustration by Rawand Issa for GenderIT

I turned 18 in the first weeks of Y2K. Folks like me, we’re the last Gen Xers and the first millennials. We were kids in another century. Time travellers. Our minds are bridges between analog and digital, and we will likely live to witness quantum. That’s why we love nostalgia although we can’t remember a thing. Our attention is fragile because 20 apps demand it, urgently, at all times. We are still processing the virus, the lockdowns, the isolation. We are anxious people.

And these are anxious times. We know the climate’s changing in dangerous ways but joining an environmental group feels more stressful. We read an article, sandwiched between a puppy and a war crime, where very qualified people warn of an imminent threat to human existence, and we scroll on. Our cognitive dissonance does somersaults all day just so we can cope.

Everything’s Changing Very Quickly and the cart on the rollercoaster is about to become a bullet train. In response, feminists have done phenomenal work. I remember a time women in internet policy were so few, we kept count. Now, we are a swarm, and that’s undoubtedly the fruit of relentless labouring from a feminist tech movement that spans decades and continents.

Yet, I worry. I worry not for resources or knowledge or brilliance; individually, we are terrific. It's our togetherness that worries me. Folks say feminism is experiencing pushback and backlash, rising this and rising that, and sure, always. But in my mind, this analysis could not compute usefully enough to explain these strange, tense times in our spaces. Rather, I think our digital feminist wave of the early 2000s has been captured by surveillance capitalism, the underpinning, seismic economic shift driven by artificial intelligence (AI). I think it threatens our very ability to be a movement rather than a collection of individuals. In this piece, I will explain why.

This is a stressful conversation because we have built so much of our work of the past 15 years on the platforms of surveillance capitalism, the master’s tools. We’ve invested a lot of money and time, endured a lot of violence, and reaped so many wonderful benefits. We witnessed uprisings and exposed crimes and defied platform censorship, got viral attention and support for feminist content, and built so many terrific relationships through these platforms.

But, I will opine, we have done this at the expense of movement-building. Big Tech had its own economic agenda, hidden under neoliberal empowerment campaigns, shifting towards more invasive data practices every year. It does not care for organised movements - it wants data points. And all indications spell worse things ahead. This malicious cooptation is universal, across ideologies and geographies; it is not unique to us. But we are unique to it, because feminism is, undoubtedly, a collective politic. Under secret algorithms and surveillance economies, we become, I worry, surveillant in our own praxis, conformist in our political thinking to this new algorithmic panopticon.


It is only through a brave migration out of this capture that we can build new spaces to heal our movements and ourselves, that we can arrive together, through these foreboding uncertainties, at a mycelial Feminist General Intelligence. And the good news is this migration, this emergence is - if we can seek it through our exhaustion - exciting, unfamiliar, a jolt to our imaginations - as feminist tech has always been. If there ever was a moment for bold imaginations, this is that moment: terrifying and invigorating all at once!

Another world, Arundhati, is hyperventilating.

Part 1: The Capture

I will start with the conclusion: the offering of social justice activism to Big Tech, even with the best intentions, the most revolutionary ideas, and the highest, most positive engagement from audiences, is harmful to the discourse and to the movement in immediate ways.

To understand why, we must unpack the business model of Big Tech and digital economies and examine what is incentivized, using as a guide the seminal 2019 book The Age of Surveillance Capitalism by Shoshana Zuboff. Understanding technology through a political economy lens helps us avoid the pitfalls of trends and techno-solutionism. It helps us set a meaningful, rather than a reactive, agenda.

Optimal Tech

We often assume technology products are designed as optimally and usefully as possible. Advertising bombards us with this fallacy - that your vacuum cleaner was designed with every possible ergonomic, health, and efficiency concern in mind. But these motivations of competitive advantage pale beside the primary motivation of cutting costs and achieving exorbitant profits - especially at the expense of Less Important People. Consider, for example, that there is no technical reason why we cannot send messages or payments across platforms the same way we can browse websites and receive emails across browsers and apps. Protocols like HTTP and SMTP were adopted in an earlier context that valued openness and interoperability; it is technically possible, and indeed beneficial, to adopt the same universality for chat and payments. But Big Tech and Wall Street want to keep their systems proprietary and closed.
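To make the interoperability point concrete: because email is an open, published standard, any client can compose a message that any other client can read, with no platform gatekeeping the exchange. Here is a minimal sketch using only Python’s standard library (the addresses are hypothetical); composing and re-parsing the standard wire format stands in for two independent implementations talking through a shared, open spec.

```python
# Open standards (SMTP for transport, RFC 5322 for the message format)
# are why mail written in one vendor's client is readable in any other's.
from email.message import EmailMessage
from email import message_from_bytes

# One "client" composes a message.
msg = EmailMessage()
msg["From"] = "ada@example.org"       # hypothetical sender
msg["To"] = "grace@example.net"       # hypothetical recipient, different provider
msg["Subject"] = "Interoperability"
msg.set_content("Any standards-compliant client can read this.")

# Serialize to the open wire format...
wire = msg.as_bytes()

# ...and a "different client" (a second, independent parser) reads it back.
parsed = message_from_bytes(wire)
print(parsed["Subject"])
```

Nothing comparable exists for chat or payments today, not for technical reasons, but because closed platforms profit from the lock-in.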

A Brief History of Digital Economies

In the late 90s and early 2000s, internet startups were grossly overvalued because everyone wanted to invest in this disruptive technology (think NFT markets of last year). This led to the dotcom bust, which wiped out a large number of the companies and forced all surviving startups - Google among them - to find new revenue streams. With its primary service being a search engine, Google had access to massive amounts of data about user behaviour through search queries and clicked links. It decided to capitalise on this vast wealth of data and improved its ad placements and click-through rates by aligning ads more accurately with each user’s interests. The results were astounding, and advertising quickly became as significant as search, if not more so.

In optimising for profit, the Google-gated internet has become unbearable. You cannot look up a recipe without wading through dozens of paywalls, ads, auto-playing videos, clickbait, tricks, and lies. This marked the birth, according to Zuboff, of surveillance capitalism, as Google turned user data into a commodity - and got away with it. In this new form of capitalism, revenue is generated by leveraging data acquired through surveillance, whether secretly or explicitly in the terms of service. Others took note and the new wave of internet companies that started in the mid-2000s knew they needed to mine data to make profits for their shareholders: Skype, LinkedIn, MySpace, followed a few months later by Facebook, YouTube, Twitter, and Spotify. Zuboff writes,

“Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioural data. Although some of these data are applied to product or service improvement, the rest are declared as a proprietary behavioural surplus, fed into advanced manufacturing processes known as "machine intelligence."

[T]he competitive dynamics of these new markets drive surveillance capitalists to acquire ever-more-predictive sources of behavioural surplus: our voices, personalities, and emotions. Eventually, surveillance capitalists discovered that the most-predictive behavioural data come from intervening in the state of play in order to nudge, coax, tune, and herd behaviour toward profitable outcomes. Competitive pressures produced this shift, in which automated machine processes not only know our behaviour but also shape our behaviour at scale. With this reorientation from knowledge to power, it is no longer enough to automate information flows about us; the goal now is to automate us.”


We could elaborate on all the claims in Zuboff’s passage above, but the thing is: you know this. You feel it. You experience it. You’ve watched The Social Dilemma. But you are also stuck - as am I. We can make radical personal changes, you and me, sure, but individual choices matter little when not part of a concerted, organised effort, a critical mass. That was one important lesson from measuring the impact of a decade of digital security workshops. Individual attempts matter even less than our choices as parts of a movement: organisations, collectives, networks. To explore what concerted efforts might be possible, let us first look at how we got here.

The 'Social Media' Narrative

Narrow AI - AI focused on a specific task - has, in its many applications, been so seamlessly part of our daily experiences for years that if you now visit a website or download an app that doesn’t recognize you, you feel it might be out of date. It is an exponentially developing technology, radically different from previous new technologies that we could spend years regulating and adapting to. The personal computer has been around for barely 50 years, the smartphone not even 20.

In 2007, with the spreading of social media, many of us who had been bloggers and webmasters got really excited about using these platforms as sites for political engagement. I remember being so excited, I started running workshops and presentations to get women’s organisations like AWID onto Facebook and Twitter. SMEX, the largest digital rights organisation in the MENA, was literally named: social media exchange. By the time Arab Spring protests erupted in 2011, everyone in the world was praising the “hashtag revolutions”.

Only a couple of years later, Facebook and Twitter began deploying manipulative algorithms that eliminated any “social” or “media” offerings. They became, explicitly, data corporations and their interfaces data-stealing platforms. But they kept the ‘freedom’ narrative because every economic shift needs a compelling story, a marketing campaign. In No Logo, Naomi Klein describes how this worked for globalisation and neoliberalism.

Feminists, leftists, progressives, queers, we are wrong to think - like we often do - that we are immune to marketing. In fact, the data corporations sold their story of freedom of expression to us first, because we needed it most. Against our criminal governments and their media industries, of course we chose the internet. But we were not, in fact, on the internet, we were on a privatised section of the internet that was inching towards surveillance capitalism every day.

The companies also positioned themselves as pro-feminism, pro-LGBT, and as sites for our advocacy. They promised they’d listen, created oversight boards, and gave Prominent Activists superuser settings. But the business model of data harvesting needed hate speech; they couldn’t do enough to curb it. When Elon Musk took over Twitter in 2022, he openly admitted in an interview that hate speech (a term he questions) creates drama that brings more engagement. To keep their platforms competitive, Musk is no different from Dorsey, Altman of OpenAI no different from Zuckerberg, even though one might appear more soft-spoken, a little kinder than the other. And now that a platform like Twitter actively promotes hateful content towards queers, we in the Rest of the World pay a heavy price with little agency to stop it within the mechanisms imposed by Big Tech.


Two moments stand out for me in this history. One was the Google doodle for the Sochi Winter Olympics in 2014, and the other was Facebook’s rainbow filter in 2015. At the time, our biggest worry was ‘slacktivism’. Little did we know, a far more sinister move was happening from rainbow capitalism to rainbow surveillance capitalism.

The cyborg, Donna, was forced to choose one of 54 genders.

Part 2: Surveillance Feminism

Social movements arise as a reflection of the economic conditions in which they operate - either to resist or to assuage. I propose, here, that we are living in the time of ‘surveillance feminism’ - a wave heavily influenced by the rules of the algorithms and by the mass electronic surveillance under which it operates.

I focus on three areas: contentified discourse, feminists in the gig economy, and auto-surveillance.

Feminism Contentified

What happens to feminist discourse when it is ‘contentified’ by these corporate platforms? It goes through the same machinery as any other data: analysed, codified, sold. Advertisers need to sell - what does it matter the political consciousness of those they are selling to? They morph to the times. Who would buy feminist content as big data? And for what purpose? How is it used in AI models? I wonder about these questions all the time and worry, most of all, that the platforms have incentivized us away from intellectual honesty and further into the trappings of surveillance capitalism.

One trap is that the discoverability algorithms punish our accounts by reducing reach if we don’t keep feeding the company more and more content. Artists suffer the same challenge on streaming platforms like Spotify. So we offer up all we can: commentary, mantras, trauma, anger, shock, grief, s.o.l.i.d.a.r.i.t.y. What happens to discourse in this context? Does it thrive? Does it revel in the exchange of ideas? No. Over time, it becomes jaded, snappy, tired. It conforms to the dataset. It becomes predictable. Observe the large language model (LLM) writing up several drafts of an article by a young feminist.

Intersectionality feels to me antithetical to the reduction of our selves to data points, and our very understanding of intersectionality becomes, regrettably, reductive and flat. On one hand, there are the tiny rewards we get for posting about violence or trauma, in the form of likes and views, which companies flat-out lie about all the time. On the other hand, there is a gnawing feeling of guilt for benefiting from something terrible, which we appease by thinking: we’ve raised awareness. We know, as we are posting, that change won’t happen here. To expose hate and note our disdain, we amplify it, we fan the flames. How could this, over time and scale, not influence our commitment to ending violence? How, when it becomes our brand? The activist as politician and the politician as activist.

A friend says electronic communication is the problem - that we used to endure the same toxicity in email groups. She is right; there is something about screen-mediated communication that isn’t conducive, ask anyone in a long-distance relationship. But there’s something more manipulative under surveillance that comes on top of this, a kind of engineered performativity that the AI can read and encourage and manipulate.

Another friend says we cannot “cede the space” on these platforms to misogynists and that they are in fact public by sheer volume of their users. I disagree. It is the equivalent, for me, of saying we must hold our protest in a shopping mall. A more suitable analogy for these platforms is “POPS”, privately owned public spaces, which are built to resemble public squares, parks, and gardens but are heavily controlled and commercially designed.

Feminists as Gig Workers

The gig economy, growing by double digits every year, affects the livelihood of 1.1 billion on-demand gig workers around the world. In the mid-2010s, Silicon Valley campaigned for workers to embrace the “freelance revolution”, and tech companies promoted a new economy that takes significant cuts from workers’ earnings and offers no benefits, days off, or health insurance. Surveillance capitalism incentivizes gig work and, once again, radicals and revolutionaries are not immune to its trappings. Young feminists - terrific and brave as their work is - are more incentivized under this economy to be freelance feminists - content creators and influencers - than to work within collectives. Techies who used to fill our movements with experiments in hacker groups and collaborative projects are now incentivized to be “technology fellows.” I, too, am engulfed by this economy.


Perhaps this was also influenced by the proliferation of Master’s Degrees in Gender and - to find livelihoods - the rise of the Gender Expert and the “intersectional feminist” heading on resumes. Perhaps also, NGOization does this to a movement. In many of the feminist NGOs, staff are no different than gig workers: without access to decent employment benefits and fair treatment, vulnerable to the whims of an executive director who’s held her position for 20 years and sees no urgent need to address this corruption. It is the Great Feminist Alienation - from our labour, from our comrades, from our very political selves.

In no way do I imply that the content feminists produce - videos, opinions, explainers, reporting - is not excellent or inspiring or that we should retreat from a confrontation with hatred or violence. Not at all. My objection is the servers to which this knowledge is uploaded precisely because they do a grave injustice to this intelligence.

We must think seriously of ways to migrate towards independent, self-governed, or worker-owned platforms and take our audiences with us. This offers a terrific chance for us to rethink our “public” and how we communicate with them, to rethink our consciousness-building strategies, to draw up plans to gradually break free of these platforms, to rethink how we express support and who we are accountable to. We must re-engage with open-source, anonymity, pseudonymity, and all the ways one can be free in their virtual selves. We must get off of data corporations and back on the internet.

Otherwise, I worry, over time, collective work and movement-building - already very difficult and complicated on their own - retreat into a giggification of activist labour. A migration from these platforms, and new resistances to surveillance capitalism, are a necessary (albeit not sufficient) step to free our movements from these trappings and to free ourselves from the worst effect of this surveillance: fear of each other.

Auto-Surveillance: The Cops Become Us

Surveillance, we remember, is a project of control. Feminists have historically understood this really well, both theoretically and viscerally; is gender not a product of surveillance?

In the panopticon, we, the watched, become the watchers. Norms emerge. Fear, conformity, performance, performativity. We become more afraid of each other than of our political opponents. The cops become us. Under surveillance capitalism, the freelance feminist must distinguish herself from other influencers in what becomes One-Up Feminism, lost in semantics and arguments for their own sake rather than praxis. Our privacy is severely violated and exposed to new threats all the time. We are made to sacrifice our privacy for connectivity.

To do the difficult work we all keep saying we should do: the calling in, the calling out, speaking truth to power, holding the powerful accountable, seeking historical and immediate justice, stopping All This Violence, healing, building alliances, we cannot operate from a place of fear. We cannot operate from within surveillance - it negates the very idea of safe spaces.


When we work from fear, panic takes over and we cannot explore the right things to do in the short term or what we should build towards in the long term. We worry so much about critique on these vast, merciless platforms that we hesitate to try something new. We find more comfort and safety in critique mode. Activism finds better praxis in science, not in logic. It must allow room, always, to play, to pollute politics with practice. In a 2019 interview, feminist scholar Donna Haraway says:

“We need critique; we absolutely need it. But it’s not going to open up the sense of what might yet be. It’s not going to open up the sense of that which is not yet possible but profoundly needed.”

A friend says this mass surveillance brought useful power, like the #metoo movement and the exposing of everyday sexism. Yes, I reply, terrific things have happened on these platforms, just as one can posit that different feminist waves brought great things at great compromise. In my mind, what we gained is not worth the cost of these privacy violations, nor do I think these good things will be sustained. I do not disagree that cameras pointed at the powerful are necessary, like body cams or cameras in the hands of journalists who document crimes and hold power to account. But mass, pervasive, economic, electronic surveillance - there is no truth in it, it does not bend towards justice. For two decades, we have celebrated the camera at the protest pointed at the police, but we have not paid enough attention to the same camera at the protest that we point at each other.

Historical challenges lie ahead of us in the coming weeks and months. Algorithms are a mirror of real world social power. We cannot be focused on beautifying the reflection or thinking about how to change the angle to capture a different reality. We must change the real world balance of power it’s reflecting. We have to win IRL. And to have any chance of discussing how we win and what we need to do differently, we must start with a Great Migration. Naomi Klein says,

“There is a world in which generative AI, as a powerful predictive research tool and a performer of tedious tasks, could indeed be marshalled to benefit humanity, other species and our shared home. But for that to happen, these technologies would need to be deployed inside a vastly different economic and social order than our own, one that had as its purpose the meeting of human needs and the protection of the planetary systems that support all life.”

Part 3: The Emergence

I made a friend last summer at a tech conference and when we presented our work, theirs was about feminist utopias and mine was about the end of everything. I asked how they could see optimism in these conditions, and they looked at me with kind confusion as if utopia was right there, on the chair, sitting between us. It hit me that my pandemic years had me stuck in a pessimism of the intellect because I had lost community.


We cannot think about utopias alone. I wish we could. There will be no Expert Consultant to tell us. Only organised global networks can move rapidly on urgent issues to exert meaningful pressure to uphold our rights and freedoms everywhere, not just in Europe. Our critical ideas on decoloniality, intersectionality, and solidarity will serve us well in this endeavour. Digital freedoms can no longer stand alone, we must cross-cut them across all our work, bring all social and ecological movements into shared goals and strategies. Here, I end with a focus on one critical issue that showcases why this is most urgent.

Labour and Automation

Women in the gig economy endure numerous hurdles of discrimination, violence, and pay gaps. Legislators and civil society organisations alike have been very slow to address and curb these fundamental labour rights violations. Most of the really important work is, understandably, from universities and professional organisations. Feminists are debating proposals like universal basic income and what it could mean for historically invisibilized labour like housework and care work.

What’s amazing is that now, only a decade into the gig economy, we must contend with partial or full automation of work, as AI gets better at various tasks and gradually, inevitably takes over gig jobs. In many cases, the gig worker is, unwittingly and non-consensually, training the algorithm to do so, as the platform harvests and analyses the worker’s data. Linguists, for example, pour tedious underpaid labour into training Google projects like Assistant and Translate. Amazon’s Mechanical Turk service for businesses who need data processing hosts over 200,000 gig workers. Everyday internet users also participate in this, inputting decades of free human labour per day. We are all looped into this.

Friends say we need not panic because this automation worry is recurrent throughout history. Indeed, of course, even Aristotle in the 4th century BC thought of automation:

“If each instrument could do its own work, at the word of command or by intelligent anticipation (...) A shuttle would then weave of itself, and a plectrum would do its own harp playing.”[1]

“In this situation,” Aristotle continues, “managers would not need subordinates and masters would not need slaves.” All consulting firms right now have put massive resources into exactly this angle. Their “Future of Work” discourse is all about opportunities for reducing costs for capitalists. Hundreds of reports and op-eds claim that global wealth will increase, that workers will be 10x more productive, that we can train populations to adapt to new jobs. Who are they saving with “labour-saving technology”?


Recruitment, decision-making, and performance management algorithms are already presenting new modes of suppressing workers’ rights. We cannot afford a lax posture on these issues or to get stuck in the debate of how probable or how soon our livelihoods - and all intersecting issues - will suffer. We must treat it with the mindset that we, personally, undoubtedly, will not have jobs within the next decade.

If we know that during Covid, domestic violence numbers spiked because men stayed at home and were anxious about finances, what do we expect will happen when jobs are lost, en masse, to automation without safety nets, without new, fair ways for decent living?

As we scramble to eliminate data bias on race and gender, the companies are scrambling to remove bias against billionaires. If I were a billionaire, of course I’d know that mass unemployment threatens my status. Perhaps I’d also worry that a superintelligence would find my hoarding of wealth the main problem for any prompt. Maybe OpenAI knew they were committing crimes when secretly scraping “massive amounts of personal data from the internet” in “unprecedented” ways, and took a bet on a race to Artificial General Intelligence (AGI) making them too big to fail.

Oftentimes, social movements come up against the Great Wall of Capitalism and say “oh, well”, and turn back to re-examine something else that suffers within this model. Feminists do not even blink when proclaiming that we can and will uproot 12,000 years of patriarchy. Capitalism, in comparison, has only been around a few hundred years, yet we are conditioned to think we cannot possibly defeat it. We cannot let this happen with surveillance capitalism - it will not be any kinder a system and we – the experts in pointing to structural, systemic problems – must reckon with this one head on.

We are like Schrödinger and his cat. Two worlds exist right now. One is much like this one. In the other, we are free. What our movements decide to do right now will collapse the superposition into one realised state. Let us be the spectre, the memory, and the herald, friends. Let it be liberation.
