Addressing Image-Based Sexual Abuse in the Global South: A Call for Inclusive Global Trust and Safety Standards

As we continue to navigate the complexities of online safety, there has been a noticeable increase in efforts to address some of the most pervasive and harmful forms of digital abuse, particularly image-based sexual abuse (IBSA). These abuses, which involve the non-consensual sharing, creation, and distribution of intimate images, are made increasingly difficult to combat by the rise of generative AI and sophisticated manipulation tools. The global trust and safety landscape has recognized the need to tackle this issue, and recent efforts, including the announcement of Voluntary Principles to Combat Image-Based Sexual Abuse, are encouraging steps in the right direction.

The principles, which were developed through consultations among companies, civil society organizations, and academic experts, aim to establish guidelines for tech platforms to mitigate the devastating impacts of IBSA. The list of signatories—comprising companies such as Aylo, Bumble, Discord, Hugging Face, Match Group, Meta, Microsoft, and TikTok—and the range of participants in the multistakeholder working group, including organizations like the Center for Democracy & Technology (CDT), the Cyber Civil Rights Initiative (CCRI), and the National Network to End Domestic Violence (NNEDV), reflect a concerted global effort to address this issue. The principles themselves are well-constructed and offer a thoughtful approach to preventing and addressing IBSA, acknowledging its disproportionate impact on marginalized communities, including women, girls, people of color, and the LGBTQI+ community.

However, while these principles are undoubtedly a positive development, I must highlight a recurring concern: the absence of voices from the Global South, particularly from countries like India, in the creation of such frameworks. Once again, global efforts to address technology-facilitated abuse have been made without adequately considering the perspectives of countries where these harms are rampant. When tech platforms operate globally, the principles that guide their conduct must also reflect the global majority’s needs and challenges. Unfortunately, in this case, as in many others, there is a gap—a seat at the table for India and other nations of the Global South is missing.

As a professional deeply involved in the field of online safety in India, I have witnessed firsthand the devastating impacts of image-based abuse on victims. At the Centre for Social Research (CSR), where I work closely with victims of online harassment and image-based abuse, we see how these violations wreak psychological and social havoc, often leaving victims with few avenues for justice. CSR was also one of the first NGOs to partner on StopNCII.org, an initiative aimed at providing immediate relief to victims of non-consensual image sharing. Image-based sexual abuse is not a marginal problem in countries like India—it is a severe and underreported crisis that has not received the international attention it needs. Moreover, efforts to address it are complicated by a limited understanding of the cultural, legal, and social contexts in which it occurs.

In India and other countries across the Global South, victims of image-based abuse face unique challenges. Deeply entrenched cultural norms around modesty, shame, and victim-blaming exacerbate the harm caused by IBSA. Women and girls, in particular, are often stigmatized and ostracized when intimate images are shared without their consent. The psychological toll of this kind of abuse is compounded by societal judgment, making it incredibly difficult for victims to come forward, report the abuse, or seek help. Additionally, the legal frameworks in many of these countries are not equipped to address the specific nature of image-based abuse, often leaving victims with little recourse for justice or support.

The principles for combating IBSA laid out by the working group are comprehensive, addressing crucial issues such as consent, trauma-informed approaches, and accessibility. They call for the integration of consent and autonomy by design, ensuring that users have control over how their likeness is used and shared online. They also emphasize the importance of prevention, harm mitigation, and transparency in platform policies. These are all vital components of a robust response to image-based abuse, but what remains missing is an understanding of the cultural, legal, and social contexts in which these abuses occur, particularly in countries like India.

When developing global standards and voluntary principles for tech platforms, it is essential to recognize that a one-size-fits-all approach is inadequate. The cultural nuances that shape how image-based abuse is perceived, reported, and responded to in different regions must be accounted for in the creation of these frameworks. In countries like India, there are unique barriers to addressing IBSA, including victim-blaming, the lack of robust legal protections, and the intersection of online abuse with offline gender-based violence. For tech platforms operating in these regions, it is not enough to merely adopt global principles—they must engage directly with local stakeholders, including civil society organizations, local law enforcement, and survivor advocacy groups that have a deep understanding of the issues on the ground.

At CSR, we have spent years advocating for stronger responses to online harassment and image-based abuse in India. Our work with victims has shown that IBSA is often used as a tool of control and coercion, particularly in intimate partner violence and cases of revenge porn. In these situations, victims are left vulnerable not only to public shame but also to coercion and blackmail, with perpetrators using threats of exposure to control their actions. This dynamic, while not unique to India, is intensified by cultural norms that prioritize family honor and modesty, often at the expense of individual safety and justice.

As an advisor to Meta’s Women’s Safety Board, I have pushed for greater awareness of these issues within global tech platforms. More effort needs to be made to understand the local realities of image-based abuse in India and other parts of the Global South. While global frameworks and principles are important, they must be informed by local expertise and experiences. The tech industry must commit to working collaboratively with local organizations and governments to develop region-specific interventions that address the cultural, social, and legal barriers to combating IBSA.

The multistakeholder working group formed to combat IBSA is a laudable initiative, and I commend the participants for their commitment to addressing this pressing issue. However, the exclusion of voices from the Global South represents a missed opportunity to create more inclusive, effective principles that reflect the realities of those most affected by these abuses. As we move forward, I urge the working group and its signatories to prioritize the inclusion of countries like India in these discussions.

Creating safe online spaces is a global challenge, and it requires global collaboration. The Global South, particularly countries like India, has much to contribute to this dialogue, and their participation is essential to crafting solutions that are not only effective but also equitable. Without a seat at the table for the Global Majority, we risk perpetuating the very inequities we seek to dismantle. As the global trust and safety landscape continues to evolve, let us ensure that the voices of those most affected by online abuse are not just heard but are central to the solutions we create.
