Cover Image for Recursive Language Models w/ Alex Zhang
Hosted By
3 Going

Recursive Language Models w/ Alex Zhang

Hosted by alphaXiv
Zoom
Registration
Welcome! To join the event, please register below.
About Event

🔬 AI4Science on alphaXiv
🗓 Friday January 23rd 2026 · 10AM PT
🎙 Featuring Alex Zhang
💬 Casual Talk + Open Discussion

🎥 Zoom: Upon Registration

Description: We study how large language models (LLMs) can process arbitrarily long prompts, viewed through the lens of inference-time scaling. We propose Recursive Language Models (RLMs), a general inference strategy that treats a long prompt as part of an external environment and lets the LLM programmatically examine, decompose, and recursively call itself over snippets of that prompt. We find that RLMs successfully handle inputs up to two orders of magnitude longer than the model's context window and, even for shorter prompts, dramatically outperform base LLMs and common long-context scaffolds across four diverse long-context tasks, at comparable (or lower) cost per query.
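To make the idea concrete before the talk, here is a minimal sketch of the recursive decomposition described above, assuming a simple length threshold and a stubbed model call. All names (`call_llm`, `CONTEXT_WINDOW`, `recursive_lm`) are hypothetical illustrations, not the paper's implementation; in the actual RLM the model itself decides how to examine and split the prompt, rather than following a fixed halving rule.

```python
def call_llm(prompt: str) -> str:
    # Stub standing in for a real LLM call; a real implementation
    # would query a model API here.
    return f"answer({len(prompt)} chars)"

# Assumed context budget (characters here, tokens in practice).
CONTEXT_WINDOW = 1000

def recursive_lm(prompt: str, question: str) -> str:
    """Answer `question` over `prompt`, recursing when the prompt is too long."""
    if len(prompt) <= CONTEXT_WINDOW:
        # Base case: the snippet fits, so make a direct call.
        return call_llm(f"{prompt}\n\nQ: {question}")
    # Decompose: split the oversized prompt and recurse over each snippet.
    mid = len(prompt) // 2
    left = recursive_lm(prompt[:mid], question)
    right = recursive_lm(prompt[mid:], question)
    # Combine the sub-answers with one final, short call.
    return call_llm(f"{left}\n{right}\n\nQ: {question}")
```

Because each recursion level halves the snippet, the input shrinks below the budget after logarithmically many splits, which is how inputs far beyond the context window remain tractable.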

Check out the full paper here!

Whether you’re working on the frontier of LLMs or just curious about AI4Science, we’d love to have you there.

Hosted by: alphaXiv x Intology
