

AI Safety Forecasting Hackathon
AI Safety Requires Ambition
The trajectory of AI development represents one of the most consequential questions for humanity's future. Understanding when and how transformative AI capabilities will emerge is critical for policy, safety research, and societal preparedness. Yet current forecasting methods struggle with unprecedented technological shifts, compounding uncertainties, and the challenge of predicting emergent capabilities.
In this hackathon, you can:
Build forecasting models and evaluation pipelines to anticipate AI capabilities and timelines
Create tools for scenario exploration, uncertainty quantification, and model benchmarking
Develop monitoring systems for key indicators of AI progress across research, industry, and policy
Write policy briefs and governance proposals grounded in forecasting insights
Explore new methodologies inspired by projects like AI 2027 and EpochAI's empirical forecasting work
Pursue other projects that advance the field of AI forecasting!
You will work in teams over one weekend and submit open-source forecasting models, benchmark suites, scenario analyses, policy briefs, or empirical studies that advance our understanding of AI development timelines and trajectories.
Joining the EPFL hub
This hackathon is international, but as the local hub on the EPFL campus, we'll additionally provide:
👨‍🏫 Mentorship for your projects
🍹 Drinks and snacks to fuel your hard work
Program
Friday, October 31st, 17:45: participant welcome
Friday, October 31st, 18:00: subject presentation + hackathon launch + keynote talk by Eli Lifland (AI Futures Project)
Sunday, November 2nd, 19:00: end of hackathon
What to expect
What you will work on
You will need to submit a project report, a link to a public GitHub repository with your code (recommended), a brief (3-5 minute) video demonstration of your solution (optional), and an appendix documenting any AI/LLM prompts used in your project, for reproducibility (optional).
Join us!
This hackathon is for anyone who is passionate about AI safety, forecasting, or global security. Whether you're an AI researcher, engineer, student, policy analyst, expert in another domain, or simply someone with a great idea, we invite you to be part of this journey. Together, we can advance our understanding of AI development timelines and trajectories.
Sign up
Once registered on this Luma event, please also sign up on the Apart Research hackathon page ("Sign up" here). This will later give each team access to $400 in cloud computing credits.
We will contact you at the email address used for your Luma registration to form teams.
No forecasting experience is required, as starter resources and mentors are provided! Only curiosity and the will to work hard over the 48 hours :)
More information about the hackathon here: https://www.apartresearch.com/event/ai-safety-hackathon
The hackathon will take place in room CO 015 on the EPFL campus; instructions on how to get there are here.