Presented by
tomoro.ai
Generative AI at enterprise scale

Women in AI: Build Tools That Protect - The Deepfake Hackathon

Registration
Approval required: your registration is subject to host approval. To join the event, please register below.
About Event

AI is being weaponised against women.

The same tools enabling creative breakthroughs are being used to generate non-consensual synthetic intimate imagery that destroys lives. We're bringing together the people who will build the solution.

The Problem: There is still no free, accessible, end-to-end tool that tackles the problem head-on.

Generative AI has accelerated image manipulation into a new form of digital abuse. Non-consensual intimate imagery now includes AI-generated deepfakes — synthetically created media that is often impossible to distinguish from real content.

Victims face an impossible journey: navigating fragmented platforms, inconsistent enforcement, opaque processes — without any single tool to guide them. 

  • 96% of deepfakes online are non-consensual intimate imagery

  • 99% of those deepfakes depict women and girls

  • 0 free, accessible, end-to-end tools exist to prevent, detect, and remove synthetic NCII

This hackathon exists to change that.

We're bringing together developers, policy advocates, researchers, and platform representatives for a one-day sprint with a focused mission: design and prototype tools that protect women from synthetic NCII before, during, and after an incident. The event is open to anyone who wants to help solve the problem, with a £1,000 prize for the winning team.

Challenge Tracks: Three critical failure points to fix.

Track One: Prevention

How can we embed protections at the point of content creation or distribution? Explore content authenticity standards, provenance watermarking, consent-layer tools, and AI model safeguards that make it harder to generate NCII in the first place.

Example approaches

  • C2PA / Content Credentials 

  • Provenance watermarking 

  • Consent layers 

  • Decentralised registry
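
To make the provenance ideas above concrete, here is a minimal sketch of attaching and reading a provenance record at the point of image creation. It is a toy stand-in, not real C2PA: genuine Content Credentials use cryptographically signed manifests, whereas this sketch simply embeds a JSON record in a PNG text chunk using Pillow. The function names and the "provenance" key are illustrative, not any standard.

```python
# Toy provenance-tagging sketch (Python + Pillow). NOT real C2PA:
# Content Credentials use signed manifests; this just embeds a JSON
# record in a PNG text chunk to show "attach provenance at creation".
import json
import hashlib
from datetime import datetime, timezone

from PIL import Image
from PIL.PngImagePlugin import PngInfo

def tag_provenance(src_path: str, dst_path: str, generator: str) -> None:
    """Embed a provenance record in a PNG text chunk."""
    img = Image.open(src_path)
    record = {
        "generator": generator,  # who/what produced the image
        "created": datetime.now(timezone.utc).isoformat(),
        "pixel_sha256": hashlib.sha256(img.tobytes()).hexdigest(),
    }
    meta = PngInfo()
    meta.add_text("provenance", json.dumps(record))
    img.save(dst_path, pnginfo=meta)

def read_provenance(path: str) -> dict | None:
    """Return the embedded provenance record, or None if absent."""
    img = Image.open(path)
    raw = img.text.get("provenance") if hasattr(img, "text") else None
    return json.loads(raw) if raw else None
```

Plain metadata like this is trivially stripped on re-upload, which is exactly why the track also lists robust watermarking and registry approaches that can survive re-encoding.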

Track Two: Detection & Authentication

How can we build accessible tools that detect synthetic NCII across image, video, and audio? Solutions should consider real-time detection, cross-platform interoperability, and staying ahead of rapidly improving generation techniques.

Example approaches

  • Real-time detection API 

  • Browser extensions 

  • Perceptual hashing 

  • Red team simulator
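
As one concrete example from the list above, perceptual hashing lets a tool recognise a known image even after resizing or recompression. This sketch uses the open-source `imagehash` library (`pip install imagehash pillow`); the `max_distance` threshold of 10 is an illustrative assumption, not a tuned value.

```python
# Minimal perceptual-hash matching sketch. A survivor-supplied
# original is hashed once; candidate images found online are compared
# against it. A small Hamming distance suggests the same underlying
# image, even after resizing or recompression.
from PIL import Image
import imagehash

def build_fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a 64-bit perceptual hash (pHash) of an image."""
    return imagehash.phash(Image.open(path))

def is_probable_match(original: imagehash.ImageHash,
                      candidate_path: str,
                      max_distance: int = 10) -> bool:
    """Compare a candidate image against a stored fingerprint."""
    candidate = imagehash.phash(Image.open(candidate_path))
    # Subtracting two ImageHash objects yields the Hamming distance.
    return (original - candidate) <= max_distance
```

A privacy-preserving design point: only the hash needs to leave the survivor's device, never the image itself.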

Track Three: Victim Support & Accountability

How can we build survivor-centred tools that help victims track where their image appears, automate takedown requests, monitor removal, and document evidence for legal proceedings — reducing the emotional burden at the moment of crisis?

Example approaches

  • Survivor dashboard 

  • Batch takedown requests 

  • Platform scorecard 

  • AI first responder
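
To illustrate the batch-takedown idea above, here is a minimal, standard-library-only sketch of how one incident record could fan out into one documented request per hosting URL. The `Incident` dataclass, the report template, and the status values are all hypothetical; a real tool would target each platform's actual NCII reporting channel.

```python
# Illustrative batch-takedown sketch (standard library only). Platform
# targets and the report template are placeholders: the point is the
# shape of the workflow, where one incident record fans out into one
# timestamped, logged request per hosting URL.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Incident:
    case_id: str
    content_urls: list[str]          # where the imagery was found
    description: str
    requests: list[dict] = field(default_factory=list)

TEMPLATE = (
    "Takedown request {case_id}\n"
    "Reported URL: {url}\n"
    "Basis: non-consensual intimate imagery (synthetic).\n"
    "Details: {description}\n"
)

def build_takedown_requests(incident: Incident) -> list[dict]:
    """Generate one timestamped, logged request per reported URL."""
    for url in incident.content_urls:
        incident.requests.append({
            "url": url,
            "body": TEMPLATE.format(case_id=incident.case_id, url=url,
                                    description=incident.description),
            "submitted_at": datetime.now(timezone.utc).isoformat(),
            "status": "pending",     # later: submitted / removed / refused
        })
    return incident.requests
```

Keeping a timestamped status log per request also doubles as the evidence trail for legal proceedings mentioned in the track description.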

What a Great Submission Looks Like

  • Design considerations: Solutions should be open-source where possible, privacy-preserving by design, and survivor-centred (not re-traumatising); they should also consider the legal and ethical dimensions of working with sensitive content.

  • A functional prototype or proof-of-concept: Demonstrating prevention, detection, or victim support capabilities.

  • Dataset or evaluation results: Showing how the system performs against baseline models or existing solutions.

  • User interface or dashboard: (Where applicable) demonstrating how the tool can be used by platforms, moderators, or survivors.

  • Your solution’s real-world application: Teams are encouraged to think about interoperability and how their tool connects with existing infrastructure (platforms, law enforcement, support organisations).

Who Should Come

We're looking for engineers, researchers, policy advocates, platform representatives, designers, and organisers to build tools that protect women from synthetic NCII through prevention, detection, and victim support. This includes ML engineers, trust & safety leads, UX designers, and civil society representatives who can create survivor-centred, legally grounded solutions with real-world impact.

Agenda

One focused day with real outcomes.

9:00 AM ☕ Registration & Morning Arrival

Coffee, introductions, and team formation. Get set up and meet your collaborators.

9:45 AM 🎤 Opening Panel: The Scale of the Problem

Three expert speakers from policy, survivor advocacy, and platform trust & safety set the context. Why this matters, where the gaps are, and what we need to build.

10:30 AM 💻 Hacking Begins

Teams self-organise into tracks and begin building. Mentors circulate throughout. Resources, datasets, and APIs provided.

1:00 PM 🥗 Lunch & Mid-day Check-in

Food, refuelling, and a chance to present early progress for informal feedback.

1:30 PM ⚡ Afternoon Sprint

Heads-down building time. Final prototypes take shape. One-to-one mentoring available on request.

5:00 PM 🏁 Demos & Judging

Teams present their prototypes to a panel of judges from the tech, policy, and advocacy space. 5 minutes to demo, 3 minutes Q&A.

6:30 PM 🏆 Awards, Pizza & Networking

Winners announced, prizes awarded, and the evening closes with drinks, pizza, and the start of what we hope are lasting collaborations.

Location
Please register to see the exact location of this event.