UK regulator Ofcom has published a discussion paper exploring the different tools and techniques that tech firms can use to help users identify deepfake AI-generated videos.
The paper explores the merits of four ‘attribution measures’: watermarking, provenance metadata, AI labels, and context annotations.
These four measures are designed to provide information about how AI-generated content has been created, and – in some cases – can indicate whether the content is accurate or misleading.
This comes as new Ofcom research reveals that 85% of adults support online platforms attaching AI labels to content, although only one in three (34%) have ever seen one. Deepfakes have been used to carry out financial scams, depict people in non-consensual sexual imagery and spread disinformation about politicians.
The discussion paper is a follow-up to Ofcom’s first Deepfake Defences paper, published last July.
The paper includes eight key takeaways to guide industry, government and researchers:
- Evidence shows that attribution measures, when deployed with care and proper testing, can help users engage with content more critically.
- Users should not be left to identify deepfakes on their own, and platforms should avoid placing the full burden on individuals to detect misleading content.
- Striking the right balance between simplicity and detail is crucial when communicating information about AI to users.
- Attribution measures need to accommodate content that is neither wholly real nor entirely synthetic, communicating how AI has been used to create content and not just whether it has been used.
- Attribution measures can be susceptible to removal and manipulation. Ofcom’s technical tests show that watermarks can often be stripped from content following basic edits (a simple illustration of this fragility follows the list).
- Greater standardisation across individual attribution measures could boost their efficacy and take-up.
- The pace of change means it would be unwise to make sweeping claims about attribution measures.
- Attribution measures should be used in combination with other interventions, such as AI classifiers and reporting mechanisms, to tackle the greatest range of deepfakes.
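To make the fragility point concrete, the sketch below embeds a naive least-significant-bit (LSB) watermark in an image and shows that a single lossy re-encode, one of the kinds of basic edit the paper refers to, is enough to destroy it. This is an illustrative toy, not Ofcom’s test methodology or any production watermarking scheme; the payload, stand-in image and helper functions are all invented for the example.

```python
import io

import numpy as np
from PIL import Image


def embed_lsb(img: Image.Image, bits: np.ndarray) -> Image.Image:
    """Hide a binary payload in the least significant bits of the red channel."""
    arr = np.array(img.convert("RGB"))
    red = arr[..., 0].flatten()
    red[: bits.size] = (red[: bits.size] & 0xFE) | bits  # overwrite the LSBs
    arr[..., 0] = red.reshape(arr[..., 0].shape)
    return Image.fromarray(arr)


def extract_lsb(img: Image.Image, n: int) -> np.ndarray:
    """Read back the first n least significant bits of the red channel."""
    arr = np.array(img.convert("RGB"))
    return arr[..., 0].flatten()[:n] & 1


rng = np.random.default_rng(0)
payload = rng.integers(0, 2, size=1024, dtype=np.uint8)       # 1024-bit watermark
original = Image.fromarray(
    rng.integers(0, 256, size=(128, 128, 3), dtype=np.uint8)  # stand-in image
)
marked = embed_lsb(original, payload)

# Lossless round trip: every watermark bit is recovered.
assert np.array_equal(extract_lsb(marked, payload.size), payload)

# A "basic edit": re-encode the image as JPEG, a lossy operation.
buf = io.BytesIO()
marked.save(buf, format="JPEG")
edited = Image.open(io.BytesIO(buf.getvalue()))

recovered = extract_lsb(edited, payload.size)
print(f"Watermark bits surviving re-encode: {(recovered == payload).mean():.0%}")
# Typically around 50%, i.e. no better than chance -- the watermark is gone.
```

Production watermarking schemes are considerably more robust than this toy, but the broader point from Ofcom’s tests stands: signals embedded in a file can be weakened or removed by downstream edits.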
Ofcom said the research will also inform its policy development and supervision of regulated services under the Online Safety Act.
BBC Group to deliver original content for YouTube
As viewer behaviour reaches a critical tipping point, the BBC Group has struck a deal to produce new programming for YouTube, in an effort to capture younger viewers.
Netflix switches to all-cash offer for Warner Bros Discovery
Netflix has revised its bid for Warner Bros Discovery's studio and streaming business to an all-cash offer.
IBC2026 opens call for technical papers
The call for technical papers is now open for the IBC2026 Conference, which will take place at RAI Amsterdam from 11 to 14 September.
Lucasfilm President Kathleen Kennedy steps down
Kathleen Kennedy, President of Lucasfilm, is stepping down after 14 years. She plans to transition back to full-time producing, including the studio’s upcoming feature films The Mandalorian and Grogu and Star Wars: Starfighter.
Paramount appoints Reemah Sakaan as President of 5
Paramount has appointed Reemah Sakaan as President of its UK public service broadcaster, 5.