There is no shortage of evidence and research showing that online gender-based violence is the same old tale of political, economic, cultural and social power imbalance used to keep a hierarchy of status based on one’s gender – a system that often privileges the experience of cis heterosexual men. Yet there are aspects of digital spaces that incentivise the proliferation of gender-based violence towards cis women, trans people, and queer and gender non-conforming persons. Violence has a way of manifesting itself across different platforms – SMS, Zoom, Telegram, Facebook, as well as newly emerging platforms like TikTok and Clubhouse. The problem, therefore, lies not merely in the technology itself, but in the underlying logic and profit model that propels the modus operandi of the algorithms, the content moderation policies and all other technologies deployed to run the digital ecosystem.


It was never about the people’s power


The laissez-faire approach to internet governance – the self-regulated, unencumbered, open market – has indeed disrupted the power held by traditional institutions, i.e. governments and media conglomerates. New technologies provided alternative spaces for marginalised and geographically dispersed communities to engage, and catalysed new ways of self-expression. Yet the disruptive impact of the internet has been uneven, and in some ways power is ever more obscured, less visible and harder to resist.1 Transnational digital corporations like Facebook and Google, and super apps like Grab in Asia, are increasingly consolidating their power across sectors – banking, advertising, home, news, automobiles and more. This monopolistic concentration of power has led to the emergence of what Shoshana Zuboff calls “surveillance capitalism” – a system that claims every aspect of human experience as free raw material to be translated into behavioural data, with the ultimate aim of predicting our future behaviour and automating us.2 Whether it is buying that lipstick at 2 a.m. on a Friday or a left-wing political video being promoted to you as you mindlessly scroll through your Facebook feed, these corporations and their business visions now mediate the lens through which we view our world.


When human experiences and connections become a means to a commercial end, content and expression that encourage interaction are welcomed and prioritised. Facebook’s algorithms are designed to highlight content that generates strong reactions, whether joy or indignation, whether it is a cute puppy meme or hate speech against the trans community.3 These algorithms then dictate how information, content and news get reconfigured, reassembled, buried or amplified.4 This provides a new avenue for aggressors to collectivise and abuse women and queer people at a much quicker speed. Our social identities – class, gender, sexual orientation, ethnicity, religion, bodily ability and so on – still dictate our ability to access digital spaces and the quality of our engagement in them.5 Being able to say or express something in digital spaces does not always mean you are heard; this is especially true for marginalised voices. The laissez-faire approach to content generation in these spaces means that companies claim no responsibility or duty to eliminate discrimination or to ensure a safe space with truly equal rights to freedom of expression for all.
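To make the point concrete, here is a minimal, hypothetical sketch – written in Python purely for illustration, and not Facebook’s or any platform’s actual code – of what engagement-based ranking looks like when every reaction counts the same. The post titles, weights and scoring function are all invented assumptions.

```python
# A hypothetical sketch of engagement-weighted ranking, not any platform's
# real algorithm. It shows how optimising purely for interaction surfaces
# whatever provokes the strongest response, harmful or not.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    reactions: int   # likes, loves, angry faces: all counted the same
    comments: int
    shares: int

def engagement_score(post: Post) -> int:
    # Outrage and joy are indistinguishable to a metric that only
    # measures how much interaction a post generates.
    return post.reactions + 2 * post.comments + 3 * post.shares

feed = [
    Post("Cute puppy meme", reactions=400, comments=50, shares=20),
    Post("Hate speech against the trans community", reactions=380, comments=900, shares=150),
    Post("Thoughtful essay on feminism", reactions=120, comments=15, shares=5),
]

# The post that generates the most interaction, regardless of harm, rises to the top.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>6d}  {post.title}")
```

In this invented example, the hateful post outranks everything else simply because it provokes the most comments and shares – the metric has no notion of harm, only of attention.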


Case study: an unintended TikTok creator


Allie is a 20-year-old cis woman university student. She started posting random content about her everyday life and thoughts on TikTok early this year. Her account had fewer than 100 followers then, most of them her friends. One day, she posted a video of herself dancing in her room, wearing a cartoon character onesie, without her bra. The next morning, she woke up to a million views on that particular TikTok video, accompanied by thousands of sexually offensive and slut-shaming comments, all focusing on her breasts. For reasons she did not fully understand, her video started appearing on the “For You” page of many men from other countries. Her follower count has since increased, qualifying her for the local TikTok creator programme. At the same time, the interactions and comments on her subsequent videos (especially those where she spoke about feminism and social justice issues) remain low – sometimes as low as 500 views for an account with 71,000 followers.6 In an interview, she expressed how that video has messed up her TikTok account and how she does not know how to reclaim and redirect it to her intended audience. Every now and then, she receives messages from men asking her to produce another video of herself dancing in a onesie.


Allie’s experience highlights the faulty logic underlying the algorithms and content policies of most social media platforms. The objectification of her body is multi-layered too. Sexual objectification of women’s bodies is, unfortunately, nothing new and remains an ongoing struggle even in digital spaces. In addition, Allie’s body is reduced to data points and resources to be harvested and exploited, with no agency over how her body should be treated or viewed by the algorithm. All data exists as part of our embodied selves, and every decision made on the basis of our “data bodies” affects our very physical bodies. Data does not exist outside of our bodies; it is an extension of them. The misuse and abuse of our data is in effect not just a breach of personal data, but a violation of our bodily integrity.7


Our likes and dislikes are tracked, analysed and predicted constantly, yet these data do not fully represent the complexity and fluidity of our lives and society. These systems are designed to mine whatever behavioural data is out there, never to challenge the status quo or its norms. We are forcefully slotted into boxes and presented with content that fits the machine’s prediction of our behaviour. The result is the reinforcement and legitimisation of misogyny and a culture of slut-shaming that has served the capitalist market well for centuries – at the cost of gender equality and equal access to freedom for all. There continues to be little to no economic incentive for corporations to rethink their algorithms and profit models. As human beings, we are susceptible to confirmation bias: we search for and accept content that reinforces our assumptions and dismiss evidence that shows otherwise.8 This matters a great deal when our pre-existing societies are inherently patriarchal and prejudiced against marginalised communities.


Imagining a feminist internet future


Social media companies’ existing mechanisms to combat online gender-based violence remain far from adequate, especially for women and queer people from the global South. Yet resisting the platforms or staying away from digital spaces is no longer a viable option for many of us. The recently established Oversight Board is yet another attempt by Facebook to address the never-ending vitriol, violence and viciousness on its platforms; its effectiveness has been questioned and remains to be seen.

As aptly described by Shoshana Zuboff, our effort to combat online gender-based violence begins with the recognition that we must hunt the puppet master, not the puppet. Unless and until the commodification of attention and data is addressed, no technocratic fix will resolve the core problem. Perhaps the most important conversation is whether we can do away with Facebook, Google and all these giant digital corporations, and transform the current data-driven profit models. My answer is that we start by imagining a feminist internet. A feminist internet does not hold the full solution to everything; instead, it challenges us to ask: “What can I imagine for myself and my community in a world that is truly inclusive, diverse and equal?” It can sound daunting, even impossible, when we are trying to change a system that is already embedded in every aspect of our lives, even within ourselves. However, it can also be a rewarding journey: there are endless opportunities to experiment and learn, multiple collaborations to explore and boundless possibilities awaiting us.



1 Taylor, A. (2015). The People’s Platform. Picador.

2 Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books.

3 Vaidhyanathan, S. (2018). Antisocial Media: How Facebook Disconnects Us and Undermines Democracy. Oxford University Press.

4 Gurumurthy, A., Vasudevan, A., & Chami, A. (2017). A Feminist Perspective on Gender, Media and Communication Rights in Digital Times – Key issues for CSW62 and beyond. IT for Change. https://itforchange.net/index.php/csw62-position-paper

5 Jane, E. A. (2017). Misogyny Online: A Short (and Brutish) History. SAGE Publications.

6 The numbers were recorded as of 12 September 2021.

7 Kovacs, A. & Ranganathan, N. (2019). Data Sovereignty, of whom? Limits and suitability of sovereignty frameworks for data in India. Internet Democracy Project. https://internetdemocracy.in/reports/data-sovereignty-of-whom

8 Vaidhyanathan, S. (2018). Op. cit.
