NTR organizes and hosts scientific webinars on neural networks, inviting speakers from all over the world to present their recent work.
On June 29, Dmitry Tsarkov (Google, Zurich, Switzerland) led a technical Zoom webinar on "Measuring Compositional Generalization: A Comprehensive Method on Realistic Data."
About the webinar:
State-of-the-art machine learning methods exhibit limited compositional generalization. At the same time, there is a lack of realistic benchmarks that comprehensively measure this ability, which makes it challenging to find and evaluate improvements.
We introduced a novel method for systematically constructing such benchmarks by maximizing compound divergence while guaranteeing a small atom divergence between train and test sets, and we quantitatively compared this method to other approaches for creating compositional generalization benchmarks.
We presented a large and realistic natural language question answering dataset (CFQ) constructed according to this method and used it to analyze the compositional generalization ability of three machine learning architectures.
We found that they fail to generalize compositionally and that there is a surprisingly strong negative correlation between compound divergence and accuracy.
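For readers unfamiliar with the two divergence measures mentioned above: the ICLR 2020 paper defines atom and compound divergence via weighted Chernoff coefficients between the frequency distributions of atoms and compounds in the train and test sets. Below is a minimal, illustrative Python sketch of these measures, not the authors' implementation; the function names and the toy frequency counts in the usage example are made up for illustration.

```python
from collections import Counter

def chernoff_coefficient(p: Counter, q: Counter, alpha: float) -> float:
    """Weighted similarity C_alpha(P||Q) = sum_k p_k^alpha * q_k^(1-alpha)
    between two frequency distributions, normalized to probabilities."""
    p_total = sum(p.values()) or 1
    q_total = sum(q.values()) or 1
    keys = set(p) | set(q)
    return sum(
        (p[k] / p_total) ** alpha * (q[k] / q_total) ** (1 - alpha)
        for k in keys
    )

def atom_divergence(train_atoms: Counter, test_atoms: Counter) -> float:
    # alpha = 0.5 weighs train and test equally (Bhattacharyya coefficient);
    # the benchmark construction keeps this value small.
    return 1.0 - chernoff_coefficient(train_atoms, test_atoms, alpha=0.5)

def compound_divergence(train_compounds: Counter, test_compounds: Counter) -> float:
    # alpha = 0.1 mainly asks whether the train set covers the test compounds,
    # so reusing a train compound in the test set is only mildly penalized;
    # the benchmark construction maximizes this value.
    return 1.0 - chernoff_coefficient(train_compounds, test_compounds, alpha=0.1)

# Toy usage with hypothetical atom counts extracted from two splits:
train = Counter({"directed": 5, "produced": 5, "Spielberg": 4, "Nolan": 6})
test = Counter({"directed": 3, "produced": 3, "Spielberg": 3, "Nolan": 3})
print(atom_divergence(train, test))  # small value -> atoms are well covered
```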
Materials:
Measuring Compositional Generalization: A Comprehensive Method on Realistic Data. Keysers et al., ICLR 2020, https://arxiv.org/abs/1912.09713
*-CFQ: Analyzing the Scalability of Machine Learning on a Compositional Task. Tsarkov et al., ISWC 2021, https://arxiv.org/abs/2012.08266
Moderator and contact: NTR CEO Nick Mikhailovsky, nickm@ntrlab.com.