AI Reading Club - Where models store factual and linguistic knowledge
Education

Date

Thu., May 14

Time

16:00 - 17:00

Price

Free

About the Event

Join us for an engaging session of the AI Reading Club, where we delve into the paper "Transformer Feed-Forward Layers Are Key-Value Memories" (Geva et al., 2020). 📖✨

👉 Paper: Read Here

This session shifts the focus from attention mechanisms to the integral role of feed-forward layers within the Transformer architecture. Discover how these layers can be interpreted as key-value memories that identify meaningful input patterns and correlate them with output information.
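As a taste of the discussion, the paper's interpretation can be sketched in a few lines: the first feed-forward projection matrix acts as a set of "keys" matched against the input, and the second projection holds the corresponding "values", which are summed according to how strongly each key fires. The sketch below uses random weights and NumPy purely for illustration; the dimensions and names are assumptions, not the paper's code.

```python
import numpy as np

d_model, d_ff = 8, 32          # illustrative sizes, not from the paper
rng = np.random.default_rng(0)

# Hypothetical weights: each column of W1 is a "key" pattern detector,
# each row of W2 is the "value" associated with that key.
W1 = rng.normal(size=(d_model, d_ff))   # keys
W2 = rng.normal(size=(d_ff, d_model))   # values

def ffn(x):
    # Memory coefficients: how strongly the input matches each key.
    m = np.maximum(x @ W1, 0.0)         # ReLU activation
    # Output: sum of values, weighted by the key-match coefficients.
    return m @ W2

x = rng.normal(size=(d_model,))
y = ffn(x)
assert y.shape == (d_model,)
```

The point of the framing is that each of the `d_ff` hidden units behaves like one memory slot: it detects an input pattern (its key) and, when triggered, adds its stored output distribution (its value) to the layer's result.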

What We Will Discuss:

  • The implications of feed-forward layers storing knowledge on our understanding of model memory.
  • The validity of the key-value memory concept and potential limitations.
  • Insights gained from this paper in relation to BERT attention and the idea that “Attention is not Explanation.”

Session Format:

  • Overview: 10-15 minutes by the discussion lead.
  • Group Discussion: About 45 minutes of collaborative exploration.
  • Discussion Lead: TBD

Join Us:

Connect with us on Discord: Join Here

📍 Location: Register to see the address.
🎟️ Tickets: Get Your Ticket

NOTE: We cannot guarantee the accuracy of the information provided about this event. Please visit the event website to verify details such as the date, times, prices, and venue.