NTR Webinar: Efficient Long-Range Transformers

NTR organizes and hosts scientific webinars on neural networks, inviting speakers from around the world to present their recent work.

On September 14, Valerii Likhosherstov (University of Cambridge, UK; researcher at Google UK) led a technical Zoom webinar on Efficient Long-Range Transformers.

About the webinar:

These days, Transformer architectures are showing state-of-the-art performance in many tasks, including natural language processing, computer vision, protein modelling and beyond. 

Difficulties arise, however, when applying Transformers to long sequences: the attention mechanism scales quadratically, O(L^2), in time and memory as the sequence length L grows. In this talk, we discussed a variety of recently proposed methods that reduce the time or memory complexity of Transformers to O(L) and even O(1).
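To make the scaling gap concrete, here is a minimal NumPy sketch (not from the webinar itself) contrasting standard softmax attention, which materializes an L x L score matrix, with a kernelized linear-attention variant in the spirit of the methods discussed. The feature map `phi` below is an illustrative choice, not the speaker's specific construction:

```python
import numpy as np

rng = np.random.default_rng(0)
L, d = 6, 4  # sequence length, head dimension
Q, K, V = (rng.standard_normal((L, d)) for _ in range(3))

# Standard softmax attention: builds an L x L matrix -> O(L^2) time and memory.
scores = Q @ K.T / np.sqrt(d)
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
quadratic_out = (weights / weights.sum(axis=1, keepdims=True)) @ V

def phi(X):
    # Illustrative positive feature map (elu(x) + 1); Performer-style methods
    # instead use random features to approximate the softmax kernel.
    return np.where(X > 0, X + 1.0, np.exp(X))

# Linear attention: replace exp(q.k) with phi(q).phi(k), then exploit
# associativity -- phi(Q) @ (phi(K).T @ V) never forms the L x L matrix,
# so the cost is O(L * d^2) instead of O(L^2 * d).
Qp, Kp = phi(Q), phi(K)
KV = Kp.T @ V                # d x d summary, independent of L
norm = Qp @ Kp.sum(axis=0)   # per-query normalizer
linear_out = (Qp @ KV) / norm[:, None]

print(quadratic_out.shape, linear_out.shape)  # both (L, d)
```

The key design point is the reordering of the matrix product: because `phi(K).T @ V` is a fixed d x d summary, the per-token cost no longer depends on L, which is what enables O(L) overall complexity.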

Webinar presentation.

Moderator and contact: NTR CEO Nick Mikhailovsky: nickm@ntrlab.com.
