129 - Transformers and Hierarchical Structure, with Shunyu Yao
In this episode, we talk to Shunyu Yao about recent insights into how transformers can represent hierarchical structure in language. Bounded-depth hierarchical structure is thought to be a key feature of natural languages, motivating Shunyu and his coauthors to show that transformers can efficiently represent bounded-depth Dyck languages, which can be thought of as a formal model of the structure of natural languages. We went on to discuss some of the intuitive ideas that emerge from the proofs, connections to RNNs, and insights about positional encodings that may have practical implications. More broadly, we also touched on the role of formal languages and other theoretical tools in modern NLP.

Papers discussed in this episode:
- Self-Attention Networks Can Process Bounded Hierarchical Languages (https://arxiv.org/abs/2105.11115)
- Theoretical Limitations of Self-Attention in Neural Sequence Models (https://arxiv.org/abs/1906.06755)
- RNNs can generate bounded hierarchical languages with optimal memory (https://arxiv.org/abs/2010.07515)
- On the Practical Computational Power of Finite Precision RNNs for Language Recognition (https://arxiv.org/abs/1805.04908)

Shunyu Yao's webpage: https://ysymyth.github.io/

The hosts for this episode are William Merrill and Matt Gardner.
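To make the object of study concrete: the bounded-depth Dyck language Dyck-(k, D) consists of well-nested strings over k types of bracket pairs whose nesting depth never exceeds D. A minimal recognizer sketch is below; the function name, parameters, and stack-based depth check are illustrative assumptions, not code from the papers discussed.

```python
# Sketch of a recognizer for Dyck-(k, D): strings over k bracket pairs
# that are well nested and never exceed nesting depth D.

def is_bounded_dyck(s: str, pairs: dict, max_depth: int) -> bool:
    """Return True iff `s` is balanced over `pairs` with depth <= max_depth."""
    closers = set(pairs.values())
    stack = []
    for ch in s:
        if ch in pairs:                      # opening bracket
            stack.append(pairs[ch])          # remember the expected closer
            if len(stack) > max_depth:       # depth bound violated
                return False
        elif ch in closers:                  # closing bracket
            if not stack or stack.pop() != ch:
                return False                 # mismatched or unopened bracket
        else:
            return False                     # symbol outside the alphabet
    return not stack                         # every bracket must be closed

# Example: Dyck-(2, 3) over the pairs () and []
pairs = {"(": ")", "[": "]"}
assert is_bounded_dyck("([()])", pairs, max_depth=3)
assert not is_bounded_dyck("([()])", pairs, max_depth=2)  # depth 3 exceeds 2
assert not is_bounded_dyck("([)]", pairs, max_depth=3)    # crossing brackets
```

The depth bound is what makes the result interesting: unbounded Dyck languages need an unbounded stack, while the bounded-depth case is what the episode's main paper shows transformers can represent efficiently.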