
Illustration: Hate Speech by Jianan Liu. CC BY-NC-ND via Behance.

Facebook, Google, TikTok and Twitter released their commitments to tackle online abuse and improve women’s safety on their platforms at the UN Generation Equality Forum in Paris (https://webfoundation.org/2021/07/generation-equality-commitments/). The commitments translate the findings of the Web Foundation’s policy design workshops (https://ogbv.webfoundation.org) on online gender-based violence and abuse, where tech companies joined experts from civil society, academia and governments. Together they worked to co-create better ways for women to curate their own safety online through privacy settings, safety tools and reporting systems, based on highly visible profiles of women with intersecting identities (e.g. politicians, journalists, activists). Progress is set to be reported annually by the Web Foundation against three areas: test, timebound and transparency.

Amid a global context of rampant gender-based violence on social media platforms (https://www.researchgate.net/publication/327962592_Human_Rights_by_Desi…), the commitments represent a concrete and necessary step for every tech company to meet its responsibilities to respect human rights in the context of gender-based violence and abuse against women on its platform. However, while the onus of safety and protection remains on women, Facebook, Google, TikTok and Twitter fail to provide a comprehensive approach to online gender-based violence. Here are seven reasons why:

1- Women’s experiences of gender-based violence and abuse online should not be seen as a series of isolated incidents, but rather treated as part of the wider context of systemic inequalities that are embedded, reflected and reinforced in code. Building better ways for women to use safety tools will not alter the fact that women still face structural challenges to their full participation in virtual spaces where gender-based violence is normalised, amplified and weaponised.

2- Safety tools should focus on affected women and address their specific realities. There cannot be a one-size-fits-all approach to how “help yourself and take control of your tech” plays out in different scenarios. Safety tools need to be contextualised and intersectional, and as inclusive as possible of users' needs that are varied and shaped by their race, culture, language, gender, disability and other identities.
3- Entrenched gender bias in algorithms puts non-binary people at greater risk of abuse and marginalisation. The experiences of people of diverse genders and sexualities therefore cannot be treated as a monolith alongside those of cis women.

4- The multifaceted nature of experiences of gender-based violence and abuse online makes it hard for women to identify threats, which include a variety of physical or sexual violence targeting one or more aspects of a woman’s identity (https://www.amnesty.org/download/Documents/AMR5129932020ENGLISH.PDF). Sometimes one or more forms of such violence and abuse will be used together as part of a coordinated attack against an individual. While the burden falls principally on women to identify the abuse and report it, the data gathered may underplay the true scale of abuse on the platform.
5- No particular attention is given to the cases of women human rights defenders (WHRDs), whose work is heavily reliant on technological tools and infrastructures. Here it is important to note that WHRDs recently issued a list of demands calling on social media companies to develop and implement policies to end stigma against WHRDs and ensure that rights to privacy are respected through measures that are necessary and proportionate (https://www.apc.org/sites/default/files/IMD-GEF-Demandas-ENG-Final.pdf).

6- Testing is good, but according to Amnesty’s Twitter Report Card, these controls have not been successful in the past. Testing should be done widely, in multiple languages, on multiple devices, in different realities and contexts, and with different women (https://www.amnesty.org/download/Documents/AMR5129932020ENGLISH.PDF) to yield more effective and accurate progress.

7- Transparency reporting mechanisms need to become more disaggregated by category and include additional information such as the average time it takes for moderators to respond to reports of abuse on the platform and the number of moderators employed per region and language (https://www.amnesty.org/download/Documents/AMR5129932020ENGLISH.PDF).

Although Facebook, Google, TikTok and Twitter have brought welcome improvements to their platforms, we do not anticipate any structural change. Privacy and security are still major concerns for women on social media platforms (http://webfoundation.org/docs/2020/10/Womens-Rights-Online-Report-1.pdf), and should thus be prioritised ahead of “retaining” users’ attention and time when designing algorithms that profit off of immediacy, emotional impact and virality. As a result, these algorithms also amplify abusive behaviour (https://pen.org/report/no-excuse-for-abuse/). Commitments to tackling online gender-based violence can only be fulfilled by going beyond tool-focused solutions, with an emphasis on root causes, prevention, protection, liberation, well-being and respect for fundamental rights and freedoms, where tech companies put the onus on themselves and act with enhanced public transparency and accountability.

Written by: Marwa Azelmat (APC). For further information, reach out to marwa@apcwomen.org

Acknowledgments:

Debarati Das, Point of View (POV)

Erika Smith (APC)

Erin Williams, Global Fund for Women (GFW)
