Rampant growth of image-based sexual abuse needs a coordinated response
Rohini Lakshané
Keerty Nakray

Image for representation.

Credit: iStock Photo

Videos of rape, sexual assault and abuse allegedly committed by the former MP Prajwal Revanna have laid bare a rampant but overlooked phenomenon in India.


The case shows the stark need for policy interventions and a coordinated response across sectors to prevent and mitigate incidents of image-based sexual abuse (IBSA), protect the dignity and safety of victims, and ensure justice for them.

IBSA, colloquially called ‘revenge porn’, refers to the non-consensual capture, creation, publishing or distribution (or the threat to do so) of nude or sexually explicit images or videos of a person.

It is rampant in India and is often driven by monetary gain. IBSA content spreads via pornographic websites, message boards, forums, social media, instant messaging apps, cloud storage services, the dark net, and offline modes, almost all of which monetise the content.

The ‘Shame The Rapist’ campaign and investigations by Al Jazeera and the Times of India show that India has a market for rape videos. 

IBSA ruptures the lives of victims. Its impact is grave and encompasses all aspects of their lives; many consider suicide. Once released, the images disseminate rapidly on the internet.

Because of the virality of IBSA content and how the pornography distribution mechanisms work, the content stays in circulation perpetually. It is challenging to curb its spread and remove it from all locations online and offline. However, it is also necessary to quickly trace and remove the content. Timely action greatly reduces the spread of the content and, in turn, harm to the victims.

The paper Non-consensual Intimate Imagery: An Overview shines a light on various aspects of IBSA, including remedies and strategies available to victims.

Based on our experience of researching and responding to gender-based violence, some possible interventions and remedial actions can be carried out via One-Stop Centres (OSCs) and helplines.

Across India, 769 OSCs, also known as Sakhi Centres, provide emergency response services to women.

The first intervention that IBSA victims need is psychological support. Trained social workers at OSCs provide first-line assistance in seeking medical help, lodging FIRs, psycho-social counselling, legal aid and counselling, and temporary shelter.

IBSA incidents should be brought under the ambit of OSCs, which are an essential support system for women in vulnerable situations.

Calls made to the state-run helpline number 181 are directed to OSCs. It is commendable that the SIT in the Prajwal Revanna case started a helpline for victims. However, this is an ad hoc solution. Strengthening the 181 helpline and OSCs, and synergising them with law enforcement, is vital for supporting victims.

It is critical that the Ministry of Women and Child Development, in collaboration with the Ministry of Health, the World Health Organisation and leading legal and health NGOs, draft standard operating procedures for OSCs to support IBSA victims nationwide.

The procedures should cover the full range of technical, legal, health and psycho-social interventions required. This will require an expansion of capacity and roles at OSCs. There is currently no reliable, public information about the handling of IBSA cases, if any, at OSCs.

Some helplines run by NGOs in India, such as the Vanitha Sahayavani in collaboration with Bengaluru City Police, and the Cyber Wellness Helpline by Responsible Netism, already support IBSA victims.

The entities on the internet that distribute IBSA content - pornographic websites, groups and channels on messaging apps, forums and so on - urge their users to submit new IBSA content. Nowhere do they state that the images and videos were published and distributed with the explicit consent of the persons who can be clearly seen and heard in the content, and they make no mention of age verification to indicate whether those persons are of legal age.

These entities are also hubs where IBSA content gets both deposited and distributed on the internet. The operators of these entities earn money through advertising revenues, direct sales of the content and subscriptions, among other means.

While anyone may report these entities to the police under various sections of the Information Technology Act and the Indian Penal Code, the scale of non-consensual nude or sexually explicit content requires an easier reporting mechanism. Pornographic websites and other sources that solely publish IBSA should be shut down, not merely blocked at the URL level.

‘Nudify’ mobile apps exist solely to generate naked images of a person: the user uploads clothed photos of the target, and the app fabricates nude ones. Such synthetic images are called ‘deepfakes’.

These apps use artificial intelligence (AI) image diffusion models to generate IBSA content quickly, easily, cheaply and at scale. AI companies should take proactive measures to reduce the risk of their products and services being used to generate IBSA content.

Similarly, app stores must have safety policies, terms of service and reporting mechanisms to prevent the platform from being used to host nudify or similarly harmful types of apps.

Knowledge, education, care and support are necessary to sensitise the public to IBSA. South Korea, another country where IBSA is rampant, displays messages in public places, on public transport and the like against the non-consensual capture and distribution of nude and explicit images.

Progress has been made in educating children about sexual abuse, safe touch and informed consent. Expanding this education to cover consent in the taking and sharing of images, and encouraging early reporting, is equally essential for protecting both children and adults.

We have not named or linked to some of the apps, websites and other sources mentioned, in order to avoid publicising them.

(Rohini Lakshané is a technologist, interdisciplinary researcher and Wikimedian. Keerty Nakray is a gender, social policy and public health researcher and writer)

(Published 08 July 2024, 02:51 IST)