For students and scientific institutions
21.12.2023 5:00 pm - 21.12.2023 6:00 pm
online
Join GathersAI on December 21 for "LongLLaMA: Extending Context in Large Language Models," presented by Papers Club in partnership with IDEAS NCBR. Explore the world of LLMs, discover the innovative Focused Transformer (FoT), and meet the expert moderator, Konrad Staniszewski.

Papers Club by Gathers proudly presents an event in partnership with IDEAS NCBR. Join GathersAI on December 21 at 5 pm CET for an exploration of “LongLLaMA — Extending Context in Large Language Models.”

Delve into the intricacies of LLMs and their capacity to incorporate new information while addressing the distraction issue, a key challenge when scaling context. Discover the innovative Focused Transformer (FoT) and its role in extending effective context length.
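For readers who want a concrete picture before the session, below is a minimal, illustrative sketch (not the authors' implementation) of the central FoT idea: letting attention reach beyond the local context by retrieving the k nearest (key, value) pairs from an external memory. The function names, dimensions, and retrieval size are assumptions made for illustration only; see the paper linked below for the actual method and training procedure.

```python
# Illustrative sketch of memory-augmented attention in the spirit of FoT:
# a query attends over its local keys/values plus the k nearest entries
# retrieved from an external (key, value) memory cache.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def memory_attention(q, local_k, local_v, mem_k, mem_v, k_neighbours=4):
    """Attend over local keys/values plus the k nearest memory entries per query."""
    d = q.shape[-1]
    outputs = []
    for qi in q:                                   # one query vector at a time
        # retrieve the k most similar memory keys (inner-product kNN)
        top = np.argsort(-(mem_k @ qi))[:k_neighbours]
        keys = np.concatenate([local_k, mem_k[top]], axis=0)
        values = np.concatenate([local_v, mem_v[top]], axis=0)
        # standard scaled dot-product attention over the extended key set
        attn = softmax(keys @ qi / np.sqrt(d))
        outputs.append(attn @ values)
    return np.stack(outputs)

# toy usage: 2 queries, 8 local tokens, 1024 cached memory tokens, dim 16
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 16))
local_k, local_v = rng.normal(size=(8, 16)), rng.normal(size=(8, 16))
mem_k, mem_v = rng.normal(size=(1024, 16)), rng.normal(size=(1024, 16))
print(memory_attention(q, local_k, local_v, mem_k, mem_v).shape)  # (2, 16)
```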

The distinguished moderator, Konrad Staniszewski, a Ph.D. student at IDEAS NCBR and the Doctoral School of Exact and Natural Sciences at the University of Warsaw, will guide us through this discussion.

See the Focused Transformer/LongLLaMA paper: https://arxiv.org/abs/2307.03170

Zoom: https://zoom.us/j/94362453338?pwd=YUhtNjFOdFFjZG9GcDZwdVlwMXJBUT09

Secure your free spot today!
