AI Reading Club - Where models store factual and linguistic knowledge
Education

Date

Thu, May 14

Time

16:00 - 17:00

Price

Free

Website

Visit
About the Event

Join us for an engaging session of the AI Reading Club where we delve into the fascinating paper titled Transformer Feed-Forward Layers Are Key-Value Memories (2020). 📖✨

👉 Paper: Read Here

This session shifts the focus from attention mechanisms to the role of feed-forward layers within the Transformer architecture. Discover how these layers can be interpreted as key-value memories, where each key detects a meaningful input pattern and its paired value shifts the layer's output accordingly.
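To make the interpretation concrete, here is a minimal NumPy sketch of the paper's view of a feed-forward layer as FF(x) = f(x·K^T)·V: the rows of K act as keys that fire on input patterns, and the rows of V act as the values those keys retrieve. Dimensions and random weights are purely illustrative, not taken from any trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

d, d_m = 8, 32                   # hidden size and number of "memories" (illustrative)
K = rng.normal(size=(d_m, d))    # keys: each row detects an input pattern
V = rng.normal(size=(d_m, d))    # values: each row encodes an output direction

def relu(z):
    return np.maximum(z, 0.0)

def ff_as_memory(x):
    # Memory coefficients: how strongly each key pattern fires on x.
    m = relu(K @ x)              # shape (d_m,)
    # Output: the coefficient-weighted sum of the value vectors.
    return V.T @ m               # shape (d,)

x = rng.normal(size=d)
y = ff_as_memory(x)
```

Note that this is exactly the standard two-matrix feed-forward block (W2 · relu(W1 · x)) with W1 read as keys and W2 as values; the paper's contribution is showing that, in trained Transformers, those keys and values are human-interpretable.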

What We Will Discuss:

  • What feed-forward layers storing knowledge implies for our understanding of model memory.
  • The validity of the key-value memory concept and potential limitations.
  • Insights gained from this paper in relation to BERT attention and the idea that “Attention is not Explanation.”

Session Format:

  • Overview: 10-15 minutes by the discussion lead.
  • Group Discussion: About 45 minutes of collaborative exploration.
  • Discussion Lead: TBD

Join Us:

Connect with us on Discord: Join Here

📍 Location: Register to see the address.
🎟️ Tickets: Get Your Ticket

NOTE: We cannot guarantee the accuracy of the information about this event. Please visit the event website to verify details such as the date, opening hours, prices, and venue.