Platforms and Deepfake Nudes: A Responsibility to Act
All Tech Is Human and Cornell Tech's Security, Trust, and Safety Initiative (SETS) will host an event on May 21, 2026, from 5:30 to 8:30pm on Platforms and Deepfake Nudes.
This gathering will bring together a mixture of Trust and Safety professionals, researchers, academics, students, civil society orgs, and more for a discussion about Platforms and Deepfake Nudes: A Responsibility to Act, and in particular the TAKE IT DOWN Act. The TAKE IT DOWN Act creates new federal tools to fight digital sexual abuse, and its scope and implementation are a subject of ongoing discussion.
Agenda
5:30 to 6:00pm: Arrivals
6:00 to 7:00pm: Panel Conversation
7:00 to 8:30pm: Networking with light refreshments and bites
Please Note: We anticipate receiving more interest than we can accommodate in the space. In order to foster a dynamic and balanced group, we’ll be selecting attendees based on a range of factors. Completing this form expresses your interest, but does not guarantee a spot. Thank you for your understanding!
Reach out to us at [email protected] with any questions.
===
PANEL
Malika Saada Saar (Senior Fellow, Human Rights Law and Tech, Brown University)
Rebecca Portnoff (Vice President of Data Science, Thorn)
Miranda Wei (Fellow, Princeton CITP)
Malika Saada Saar is a human rights lawyer, technology policy strategist, and global leader shaping how artificial intelligence is governed, deployed, and regulated to protect people and strengthen democratic institutions. Most recently, she served as Global Head of Human Rights at YouTube, where she led company-wide efforts to embed human rights principles into the design, development, and deployment of AI systems. In this role, she spearheaded initiatives on AI fairness, accountability, and safety, building global frameworks to safeguard digital rights and advance responsible innovation for billions of users worldwide.
Previously, Malika was Senior Counsel on Civil and Human Rights at Google, where she designed multi-stakeholder engagement strategies, implemented human rights safeguards across global supply chains, and built strategic partnerships with organizations including UN Women and the UN High Commissioner for Refugees. Earlier in her career, she founded and led Rights4Girls, a national advocacy organization that helped transform U.S. legal and policy responses to child sex trafficking, the criminalization of vulnerable girls, and systemic inequities in the justice system.
Today, Malika is a Senior Fellow at Brown University’s Watson School of International and Public Affairs, where she teaches and advises on the governance of AI, democracy, and the rule of law, with a particular focus on how emerging technologies reshape power, access, and opportunity.
Her leadership has been recognized widely: she was named one of Newsweek’s “150 Women Who Shake the World,” served on the Presidential Advisory Council on HIV/AIDS under President Barack Obama, and currently sits on the boards of the Watson School and the Peabody Awards. Malika holds a B.A. from Brown University, an M.A. in Education from Stanford University, and a J.D. from Georgetown University Law Center.
Dr. Rebecca Portnoff has dedicated her career to defending children from sexual abuse. She is currently Vice President of Data Science at Thorn, where she owns the strategy and vision for machine learning (ML)/AI across the organization. The ML/AI and algorithmic solutions her team builds have global impact and are used across hundreds of law enforcement agencies, hotlines, and technology companies. She acts as an ecosystem leader in addressing emerging threats against children through novel research and cross-industry collaborations, bridging the gap between child safety experts and technologists.
Rebecca brings with her over a decade of experience in ML/AI, child safety, and trauma-informed leadership. She holds a bachelor’s degree in Computer Science from Princeton University, where she also minored in vocal jazz, and a Ph.D. in Computer Science from UC Berkeley.
Miranda Wei studies online abuse and societal factors in sociotechnical safety, especially concerning social media, gender, and interpersonal relationships. Wei holds a Ph.D. from the Paul G. Allen School of Computer Science & Engineering at the University of Washington. In fall 2026, they will start as an assistant professor at École Polytechnique Fédérale de Lausanne (EPFL) in Lausanne, Switzerland.
The TAKE IT DOWN Act is a 2025 U.S. federal law that criminalizes the nonconsensual sharing of intimate images (NCII) and digital forgeries (deepfakes). It requires platforms to remove such content within 48 hours of a valid report and creates penalties for creating and distributing these images, particularly those involving minors or threats, with enforcement by the Federal Trade Commission (FTC). The law aims to protect victims, but faces debate over whether its broad notice-and-removal rules could impact privacy and encrypted services.
Key Provisions & Purpose:
Criminalizes NCII & Deepfakes: Makes publishing or threatening to publish nonconsensual intimate images (authentic or digitally altered) a federal crime.
Swift Takedown Mandate: Requires online platforms to remove reported NCII/forgeries within 48 hours.
FTC Enforcement: Grants the FTC broader power to regulate platform compliance.
Victim Protection: Aims to protect individuals, especially minors, from exploitation by deepfakes and unauthorized intimate content.
===
All Tech Is Human is a non-profit organization taking a whole-of-ecosystem approach to tackling thorny tech & society issues. Through our multistakeholder community-building, education, and career-related activities and resources, we aim to build a tech future aligned with the public interest.
Attend our in-person and virtual gatherings, join our Slack community of over 14k members, read our Responsible Tech Guide and issue-specific resources, take our Responsible AI courses, use our Responsible Tech Job Board, and more. See all of our links here. Be on the lookout for our Responsible Tech Summit on October 29th at The New York Times Center (in-person + livestream).
