Neural Sequence Models: A Formal Lens
Gail Weiss, Ph.D. Thesis Seminar
Wednesday, 19.10.2022, 16:00
Zoom Lecture: 99357013274 and Taub 601
Advisors: Prof. Eran Yahav and Prof. Yoav Goldberg
Neural sequence models (NSMs) - neural networks adapted specifically for processing input sequences - have emerged as powerful tools, with transformers and RNN variants currently the most popular architectures. But what is a trained network really doing? In this talk we will approach this question, starting from what a network *can* do and progressing to what a trained network *has* learned in practice. Specifically, we will begin by discussing the mechanisms that different RNN architectures can implement, and how these affect their ability to express different formal languages. We will then extend this discussion to transformers, for which we introduce RASP - a symbolic abstraction of their behaviour. Finally, we will discuss how a given trained RNN can be converted into a smaller, more interpretable model - and how this process uncovers cases where seemingly perfect networks have not learned their intended task!
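To give a flavour of the RASP abstraction mentioned above, here is a minimal, hedged sketch in Python of its two core primitives, `select` and `aggregate`, which abstract a transformer's attention pattern and attention-weighted averaging. The simplified semantics below (boolean selection, returning the single selected value per position) is an illustration only, not the reference implementation; in full RASP, quantities such as the sequence length are themselves computed from these primitives, and aggregation averages over all selected positions.

```python
def select(keys, queries, predicate):
    """Build a boolean selection matrix: one row per query position,
    one column per key position (an abstraction of an attention pattern)."""
    return [[predicate(k, q) for k in keys] for q in queries]

def aggregate(selection, values):
    """At each query position, collect the values at the selected key
    positions (an abstraction of attention-weighted averaging; here we
    assume exactly one position is selected per row, for simplicity)."""
    out = []
    for row in selection:
        picked = [v for v, sel in zip(values, row) if sel]
        out.append(picked[0])  # assumes exactly one selected position
    return out

# Example: reversing a sequence. Position i attends to position
# length - 1 - i, then reads off the token found there.
tokens = list("hello")
indices = list(range(len(tokens)))
length = len(tokens)

flip = select(indices, indices, lambda k, q: k == length - 1 - q)
print("".join(aggregate(flip, tokens)))  # prints "olleh"
```

This reversal program mirrors the style of symbolic sequence-to-sequence programs expressible in RASP, where the programmer reasons about attention patterns rather than weights.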