As the International Day for the Elimination of Violence against Women (IDEVAW) on 25th November has just passed, voices, initiatives and critical reflections are multiplying to revive public attention on the manifold facets of what remains a truly inhumane obstacle to gender equality and the empowerment of women and girls all over the globe. Nonetheless, there are areas that remain largely under the radar – of international institutions as well as of activists, citizens, civil society organizations, and informal platforms of action. One of these areas lies at the intersection of digital media and the long-standing history of gender-based violence (GBV). While over time much attention has gone towards fostering girls’ and women’s access to the internet, far less attention has gone to the myriad discriminations, abuses and inequalities that girls, women, and other gender and sexual minorities constantly face within the digital space.
The digital space is indeed a space that invariably hosts what Emma Jane calls in her book “old and depressingly predictable phenomena [such as] androcentrism, sexism, and misogyny” (Jane 2017:114). It is a fact that we live in an (online) context that is often far from the naïve ideal of a dialogic, respectful internet. Nonetheless, there seems to be a hard-to-overcome difficulty in grasping the gendered component of what Jane calls “online vitriol”. When attempts have been made to understand in more detail how such violent online discourse can affect, penalize, and impair specific categories of individuals, only some categories have been systematically recognized as targets – most notably by characterizing online vitriol as “racist hate speech”, or as “cyberbullying” against young people.
There are objective difficulties in addressing online GBV. First, its intangible and sociotechnical nature: online GBV is intangible and sits at the crossroads between social and technological factors. On the one hand, it ties back to the long-term discrimination, disempowerment, and abuse of girls, women, and other sexual and gender minorities; on the other, it finds in the online space new means for spreading, consolidating, and taking new and unexpected forms. Second, it is a diverse phenomenon: it can take many different forms (e.g., slurs, harassment, threats, doxing, etc.), and these forms vary across contexts – for example, taking advantage of the different features made available by different platforms, but also varying across sociocultural contexts. Third, it is a dynamic phenomenon, as it changes rapidly together with the digital communication technologies that enable it. Finally, it is a complex phenomenon, as different forms of online GBV involve, over time, different actors – victims, perpetrators, but also actors responsible for securing women’s online safety, or at least for restoring justice once violations occur.
Against this background, there is hardly a one-size-fits-all solution. However, uncertainty even about how to begin addressing the issue systematically has often resulted in the stagnation of discussion and, more dramatically, of political strategies. In this context, thorough, high-quality, and systematic research can become a true resource, as it can contribute solid knowledge with which to construct a collective counter-strategy to fight online GBV.
Online discourse as a space of gender-based violence: the All Women Count! project
What type of research activities can contribute the necessary knowledge to counteract online GBV? The academic community is increasingly engaging with the collection and analysis of daily experiences of online abuse, discrimination and even violations (for an example, see the volume edited by Marie Segrave and Laura Vitis), as much as with forms of feminist resistance that take advantage of digital media affordances (for an example, see the work of Emma Jane on feminist digilantism). Similarly, outside academia, civil society organizations are committed to producing critical and systematic knowledge, carrying out research projects that address largely unexplored topics – as was the case for the APC-led project “From impunity to justice: exploring corporate and legal remedies for technology-related violence against women”. Increasingly, academics and activists work hand in hand, combining scientific and grassroots knowledge(s) and methods, and generating unique resources to unveil and fight online GBV.
An endeavour of this type is being carried out in the context of the project “All Women Count!” (AWC – coordinated by the Association for Progressive Communications). The project involves a number of research teams – particularly from Egypt, India, Malaysia and Kenya – working side by side with social and computer scientists to monitor, map, and analyse online GBV, looking at interactions that occur on social media platforms and that constitute concrete practices of violence perpetrated against girls, women, and sexual or gender minorities.
The preferred space of inquiry for the AWC! project is thus online discourse. In the digital space, where every process is invariably sustained by communication networks, discourse becomes a crucial element and expands “from its original roots in interpersonal conversation to the social dialogue which takes place through and across societal institutions, among individuals as well as groups and . . . political institutions themselves” (Donati 1992:138)1. Every act performed by citizens, governments, companies, civil society organizations, etc., through the employment of digital media can legitimately be seen as an active contribution to the collective construction of online discourses which, in most cases, are also public discourses. Therefore, mapping, tracing, and analysing the practices through which these online discourses are created and evolve over time becomes a fruitful entry point for intercepting also those practices, within these same discourses, that are offensive, abusive, or carry a violent component grounded in gender imbalance. Moreover, as seen above, online GBV is intangible in nature – a trait that separates it significantly from “offline” gender-based violence, yet without necessarily making it less severe or important. Underneath this intangibility, however, lies the concreteness of user-generated online discourse – one that is actively shaped by the affordances of the different platforms, that is nurtured by the multimedia nature of contemporary digital media which users practically translate into posts and tweets, and that spreads rapidly thanks to social media’s unprecedented networking potential.
By joining social and computer scientists with teams of researchers and activists from different countries, as well as by exploring different platforms – particularly mainstream social media such as Twitter and Facebook – the AWC! project is pursuing a very ambitious goal: analysing and understanding online GBV practices in connection with the socio-political, technological, and gender-related factors that underpin and generate different understandings of, and degrees of concern with, online GBV in various national contexts and communities.
Online gender-based violence practices: insights from ongoing research activities
In what is unfolding as a unique collective experience of knowledge production, the AWC! project is working to shed light on both overt and covert forms of gender-based offences, abuse and violence online. While research activities are proceeding, the investigation of different chunks of online discourse reveals that, above and beyond local specificities, online GBV (particularly against women) presents some transversal features.
Sexual objectification and non-conformity
In the first place and, perhaps, not too surprisingly, the sexual dimension continues to be the main one along which online discourse is turned into a weapon of abuse. Women are often addressed in ways that recall their sexual characteristics or their body parts – interestingly, either reducing them to these parts (i.e., sexually objectifying them) or shaming them in light of their “non-conformity” to alleged gender-based physical standards. While the usage of body- and sexually-related terms to diminish female subjectivities may or may not be accompanied by overtly violent speech (sometimes in the context of outright threats), techniques such as “body shaming” appear to be increasingly on the rise, as they can be used to mortify individuals without necessarily violating platforms’ community standards. In both cases, whatever the tone, violent discursive practices share a common trait: they reinforce the long-term devaluation of femininity or womanhood by reducing it to a physical and/or sexual component. Whether this component is used to mortify or, alternatively, to underline inadequacy, it remains the main tool through which online discourse is practically turned into a means to discriminate, abuse or harass.
A second set of discursive practices emerged particularly in connection with the study of discourses unfolding around women in public offices or public roles. This second set of practices points to the dynamic of delegitimisation, which is grounded in the assumption that women, as much as individuals who refuse to conform to religious, cultural, or group norms, should not occupy public offices because they are “cognitively inadequate”. Women, but also members of sexual and gender minorities, who are seen to illegitimately occupy public places and spaces are attacked in ways that often underline their supposed absolute lack of cognitive skills – e.g., through the adjective “brainless”. Quite interestingly, this “lack of brain” adds to, rather than takes the place of, the reduction to physical or sexual attributes. Insofar as these persons are deemed unfit to hold public roles, or to lack the capacity to properly interpret the cultural standards of a community, all that is left to them to carry out their public tasks is their sexuality – which, the narrative goes, they evilly use for their own advantage.
Inadequacy and delegitimisation are often used as an excuse to attack women and other gendered subjectivities, frequently against the benchmark of an ideal of “femininity” or “womanhood” that is defined more specifically depending on the context under examination. In some cases, femininity and womanhood are driven by rigid religious norms; in others, they depend more on political beliefs, or are defined in the abstract in terms of a preformatted idea of a “good woman”, a “good mother”, or a “good lover”. Interestingly, the resurgence of abstract, preformatted ideas of femininity and womanhood seems to characterize online discourse especially when it is populist and/or authoritarian in nature. Indeed, populist discourse tends to recover general and hard-to-dismantle stereotypes (including those of women and non-conforming gender subjectivities) and to propose their restoration as absolutely necessary to regain control and certainty over the future, and to overcome current situations of unfairness and injustice. While it may not appear surprising, the entwining of populist and gender-discriminatory discursive practices should nonetheless raise our concern. More often than not, populism is not only a “political feeling” but a true component of governmental arrangements. If one of the mechanisms underneath its political vision and programmes is the re-establishment of gender stereotypes, online discursive practices that attack departures from these same stereotypes can be expected to become the norm, rather than the exception.
Much remains to be done both to understand and to counteract online GBV. The collective and multidisciplinary experience of the AWC! project is providing evidence of the fruitfulness of joining efforts in this domain. As this experience continues to grow, and more fine-grained analyses are produced to complement the trends outlined above, important insights have also emerged about the challenges and critical issues that characterize research endeavours of this sort – namely, those that focus on a still overlooked topic within the digital space, coupling the large amounts of data that can be scraped with the need to generate careful, context-aware insights that are hardly achievable through the automated analyses typical of “big data” approaches.
First, given the impossibility of monitoring and analysing every bit of discourse produced online, it becomes necessary to select “chunks” of discourse to analyse. Researchers need to choose topics, accounts, and individuals to anchor their analysis and, more often than not, this leads to a selection of themes or individuals that enjoy public recognisability. On the one hand, these choices allow researchers to focus on evident violent discursive practices. On the other hand, though, the selection of “exceptional” starting points continues to leave out of empirical investigation the more common, underground, daily dimension of online GBV: that great majority of distributed, intermittent, and fragmented practices carried out within personal niches of the online space, which are hardly recognizable from the outside but, in fact, constantly fuel gender-based discrimination and marginalization.
Second, there are paramount ethical and privacy issues. Every piece of online discourse that is monitored and analysed to unveil online gender-based violence is a piece of “public discourse” – at least, according to the standards of the different platforms upon which it unfolded. However, accessing and analysing only “public data” is not tantamount to protecting someone’s privacy, nor does it guarantee ethical compliance. How to reconcile research questions, curiosities, and necessities with the imperative of not harming anyone remains perhaps one of the most challenging aspects of digitally-grounded research. Targeted individuals, but also perpetrators, need to be protected, to avoid seeing once again the effects of stigmatisation triggered by scientific research on single individuals. Yet, on the other side, disclosing experiences, providing examples, and showing evidence of what online GBV is and the forms it takes are all necessary actions to give flesh to the bones of all those forms of discrimination and abuse that, because they are intangible, are too quickly dismissed.
Third, there are methodological challenges. Established methods for working with “big data” – e.g., network analysis, natural language processing, etc. – enable us to handle massive amounts of data. However, especially in light of the progressive shift towards covert discrimination, these methods can be less effective than expected when it comes to actually unveiling violent online discourse. When violence is nuanced and not explicitly “phrased” as such, how good are pattern-finding methods and related techniques at revealing abuse? How are we to combine big data approaches with more in-depth, qualitative insights, without turning these research endeavours into daunting efforts?
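To make this limitation concrete, here is a minimal, purely illustrative sketch in Python – the lexicon and the example posts are invented for demonstration and are not drawn from the AWC! project’s data or tools – of how a keyword-based first pass, of the kind a large-scale pipeline might apply, catches overtly abusive wording but lets covert, delegitimising phrasing slip through:

```python
# Hypothetical lexicon of explicitly abusive terms (illustrative only).
ABUSIVE_TERMS = {"brainless", "stupid"}

def flags_abuse(post: str) -> bool:
    """Return True if the post contains an explicitly abusive term."""
    # Normalise: lowercase and strip surrounding punctuation from each word.
    words = {w.strip(".,!?").lower() for w in post.split()}
    return not ABUSIVE_TERMS.isdisjoint(words)

overt = "She is brainless and should resign."
covert = "Maybe she should go back to looking pretty instead of governing."

print(flags_abuse(overt))   # True: the explicit term is matched
print(flags_abuse(covert))  # False: covert delegitimisation slips through
```

A real pipeline would of course layer far more sophisticated techniques on top of such a filter, but the gap the sketch illustrates – explicit slurs are machine-detectable while nuanced delegitimisation largely is not – is precisely the methodological challenge described above.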
Finally, there is the element of language. Most of the materials collected to investigate online GBV and, therefore, to produce knowledge about it tend to be in English. Certainly, English remains the most widely spoken language online, and scraping and analytical techniques are better developed for this language than for others. English also continues to be the language through which collaborative transnational endeavours like the one underpinning the AWC! project are coordinated. However, this language-related choice imposes important burdens: online violent discursive practices go hand in hand with language, and different languages support different offensive, abusive and violent practices. While working with English may provide a useful entry point to shed light on online GBV, it also constrains our understanding to a subset of possible practices – one that is not necessarily generalizable or encompassing.
1 Donati, Paolo (1992). “Political discourse analysis”. In Studying Collective Action, edited by Mario Diani and Ron Eyerman, pp. 136–167. London: Sage.