Technology | UK Mandates 48-Hour Takedown of Non-Consensual Images by Tech Firms
By Newzvia
Quick Summary
The UK government has introduced new laws requiring technology companies to remove non-consensual intimate images within 48 hours of being reported, under penalty of significant fines. This development aligns with a global push, including recent stringent measures in India, to enhance online safety.
UK Government Enforces 48-Hour Takedown Rule for Non-Consensual Intimate Images
The UK government has introduced new laws mandating that technology companies remove non-consensual intimate images within 48 hours of being reported. This amendment to the Crime and Policing Bill aims to bolster online safety, particularly for women and girls, with companies facing substantial fines for non-compliance.
Key Regulatory Changes
Under the new provisions, tech companies are legally required to act swiftly on reports of non-consensual intimate imagery. Failure to remove such content within the stipulated 48-hour timeframe could result in significant penalties, potentially reaching up to 10% of qualifying worldwide revenue. The move underscores the UK government's commitment to a safer online environment by placing greater accountability on digital platforms.
Implications for Tech Firms and Global Reach
The legislation is expected to have far-reaching implications for technology companies, regardless of where they are based, so long as their services are accessible to users in the UK. Because fines are calculated on worldwide revenue, the new regulations carry global weight. Companies with a global presence, including those with significant user bases and operations in India, will need to ensure their content moderation and reporting mechanisms comply with these stringent UK requirements.
India's Stance on Online Safety
In India, the issue of non-consensual intimate images (NCII) is addressed through various legal provisions, including Section 354C (Voyeurism) and Section 509 (Insulting modesty of a woman) of the Indian Penal Code (IPC), and Sections 66E and 67A of the Information Technology (IT) Act, 2000. However, these existing laws have faced criticism for their narrow definitions or potential for misinterpretation.
Significantly, the Indian government recently reinforced its commitment to online safety by notifying amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, in February 2026. These amendments introduce a framework specifically targeting AI-generated sexual abuse material and other illegal content. Crucially, the new Indian rules mandate an even stricter takedown timeline: non-consensual deepfakes or intimate imagery must be removed within two hours of being reported, a significant reduction from the previous 24-hour window. In these specific categories, India's response is therefore swifter than the UK's newly introduced 48-hour rule.
The UK's legislative actions, particularly under the broader Online Safety Act, are seen as a benchmark for nations like India as they develop their own AI governance guidelines and legal templates to prevent the misuse of generative AI.
Official Statement
This amendment to the Crime and Policing Bill is specifically designed to enhance online safety for women and girls. The UK government aims to ensure that victims of non-consensual intimate image sharing receive prompt action and protection from further harm.
Industry Response and Future Outlook
Specific industry reactions to this latest amendment were not immediately available. However, the broader UK Online Safety Act, under which such measures fall, has previously drawn concerns from major technology firms over its implications for user privacy and encryption. The Act's regulator, Ofcom, holds extensive investigatory and enforcement powers, including the ability to issue fines of up to 10% of global turnover. These regulatory shifts compel tech businesses operating in the UK to assess their services carefully for compliance.
Background
The introduction of these new laws comes amidst increasing global concern over online abuse and the rapid proliferation of non-consensual intimate imagery, often referred to as 'revenge porn' or image-based sexual abuse. Governments worldwide are grappling with how to hold technology platforms accountable for harmful content and safeguard users in the digital realm.
Key Takeaways
- The UK government now requires tech companies to remove non-consensual intimate images within 48 hours of being reported.
- Non-compliance can lead to fines of up to 10% of a company's worldwide revenue.
- This legislation is an amendment to the Crime and Policing Bill, targeting enhanced online safety for women and girls.
- India has its own legal framework for NCII, with recent (February 2026) amendments mandating a 2-hour takedown for non-consensual deepfakes and intimate imagery.
- The UK's Online Safety Act, under which these rules are implemented, influences global tech companies, including those serving the Indian market.
People Also Ask
What is the new UK law regarding non-consensual intimate images?
The UK government has introduced new laws requiring technology companies to remove non-consensual intimate images within 48 hours of being reported. This aims to enhance online safety for women and girls.
What are the penalties for tech companies failing to comply with the UK law?
Tech companies that fail to remove non-consensual intimate images within the 48-hour period could face significant fines, potentially reaching up to 10% of their qualifying worldwide revenue.
How does this UK law compare to regulations in India?
India has existing laws against non-consensual intimate images. Notably, recent amendments to India's IT Rules, notified in February 2026, mandate an even stricter 2-hour takedown timeline for non-consensual deepfakes and intimate imagery.
Why is the UK government introducing these new online safety measures?
The primary purpose of these new laws is to enhance online safety, particularly for women and girls, by holding tech companies more accountable for the swift removal of harmful non-consensual intimate content.