Papers Club by Gathers proudly presents an event in partnership with IDEAS NCBR. Join GathersAI on December 21 at 5 pm CET for an exploration of “LongLLaMA — Extending Context in Large Language Models.”
Delve into the intricacies of LLMs and their capacity to incorporate new information, while addressing the distraction issue — a key challenge when extending context. Discover the Focused Transformer (FoT) and the role it plays in enabling longer effective context lengths.
Our moderator, Konrad Staniszewski — a Ph.D. student at IDEAS NCBR and the Doctoral School of Exact and Natural Sciences at the University of Warsaw — will guide the discussion.
See the Focused Transformer/LongLLaMA paper: https://arxiv.org/abs/2307.03170
Zoom: https://zoom.us/j/94362453338?pwd=YUhtNjFOdFFjZG9GcDZwdVlwMXJBUT09
Secure your free spot today!