NTR Webinar: Effects of Architecture and Training on Embedding Geometry and Feature Discriminability in BERT

NTR organizes and hosts scientific webinars on neural networks and invites speakers from all over the world to present their recent work at the webinars.

On October 12, Maksim Podkorytov (Florida State University, Tallahassee, Florida, USA) led a technical Zoom webinar on "Effects of Architecture and Training on Embedding Geometry and Feature Discriminability in BERT."

About the webinar: 

Bidirectional Encoder Representations from Transformers (BERT) [Devlin et al., 2019] has improved the state of the art on a number of NLP tasks by using an auto-encoding model that incorporates large bidirectional contexts. However, the mechanisms underlying BERT's effectiveness are not well understood.

In this work we investigated how the BERT architecture and its masked language modeling pre-training objective affect the geometry of its embeddings and the effectiveness of its features for classification tasks.
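The masked language modeling objective mentioned above corrupts an input sequence and asks the model to recover the hidden tokens from bidirectional context. Below is a minimal sketch of the standard BERT masking recipe (15% of tokens selected; of those, 80% replaced by [MASK], 10% by a random token, 10% left unchanged). The toy vocabulary and the function name `mask_tokens` are illustrative, not part of the paper:

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "cat", "dog", "sat", "on", "mat"]  # toy vocabulary for illustration

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style corruption: select ~mask_prob of tokens as prediction
    targets; of those, 80% become [MASK], 10% a random vocabulary token,
    10% stay unchanged. Returns the corrupted sequence and the targets."""
    rng = random.Random(seed)
    corrupted, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok  # the model must recover this original token
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK)        # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted.append(rng.choice(VOCAB))  # 10%: random token
            else:
                corrupted.append(tok)         # 10%: keep as-is
        else:
            corrupted.append(tok)
    return corrupted, targets

tokens = "the cat sat on the mat".split()
corrupted, targets = mask_tokens(tokens, mask_prob=0.5, seed=0)
print(corrupted, targets)
```

During pre-training, the model's loss is computed only at the target positions, which is what forces the encoder to build contextual embeddings from both left and right context.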

Materials: https://aclanthology.org/2021.naacl-main.403.pdf

Moderator and contact: NTR CEO Nick Mikhailovsky: nickm@ntrlab.com.
