The Taub Faculty of Computer Science Events and Talks
Gail Weiss (Ph.D. Thesis Seminar)
Wednesday, 19.10.2022, 16:00
Advisors: Prof. Eran Yahav and Prof. Yoav Goldberg
Neural sequence models (NSMs), neural networks designed to process input sequences, have emerged as powerful tools, with transformers and RNN variants currently the most popular architectures. But what is a trained network really doing? In this talk we will approach this question, starting from what a network *can* do and progressing to what a trained network *has* learned in practice.
Specifically, we will begin by discussing the mechanisms that different RNN architectures can implement and how these affect their ability to express different formal languages. We will then extend this discussion to transformers, for which we introduce RASP, a symbolic abstraction of their behaviour. Finally, we will discuss how a given trained RNN can be converted to a smaller, more interpretable model, and how this process uncovers cases where seemingly perfect networks have not learned their intended task!
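
To give a flavour of the abstraction: RASP expresses transformer computations through a handful of sequence operations, notably select (building an attention pattern) and aggregate (routing values through it). The toy Python sketch below mimics these two primitives and the classic sequence-reversal program; the names mirror RASP's, but the implementation is an illustrative stand-in written for this announcement, not the RASP language itself.

    # Toy re-implementation of two RASP-style primitives over plain lists.
    def select(keys, queries, predicate):
        # Selection matrix: row q, column k is True when
        # predicate(keys[k], queries[q]) holds -- an attention pattern.
        return [[predicate(k, q) for k in keys] for q in queries]

    def aggregate(selection, values):
        # Route values through the selection: each output position
        # takes the (here unique) value its row selects.
        out = []
        for row in selection:
            picked = [v for v, chosen in zip(values, row) if chosen]
            out.append(picked[0] if picked else None)
        return out

    def reverse(tokens):
        # The classic RASP example: position i attends to
        # position length - i - 1, flipping the sequence.
        n = len(tokens)
        indices = list(range(n))
        flipped = [n - i - 1 for i in indices]
        return aggregate(select(indices, flipped, lambda k, q: k == q), tokens)

    print(reverse(list("hello")))  # ['o', 'l', 'l', 'e', 'h']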
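
The extraction story, in turn, rests on finding inputs where a small candidate automaton and the trained network disagree. The sketch below gestures at this idea with a plain random-testing loop over a black-box acceptor; rnn_accepts and the toy stand-ins are hypothetical placeholders, and the actual procedure presented in the talk is an L*-style learning algorithm that queries the network as its teacher.

    import random

    def disagreement(rnn_accepts, dfa_accepts, alphabet, max_len=12, trials=2000):
        # Hunt for a string on which the black-box network and the
        # extracted automaton disagree; return it, or None if none is found.
        rng = random.Random(0)
        for _ in range(trials):
            n = rng.randrange(max_len + 1)
            w = "".join(rng.choice(alphabet) for _ in range(n))
            if rnn_accepts(w) != dfa_accepts(w):
                return w
        return None

    # Toy stand-ins: a "network" that implements "even number of a's"
    # on short strings but slips on longer ones, vs. the exact rule.
    rnn = lambda w: (w.count("a") % 2 == 0) if len(w) < 10 else True
    dfa = lambda w: w.count("a") % 2 == 0
    print(disagreement(rnn, dfa, "ab"))  # a long string exposing the flaw

Such disagreements are exactly the cases the abstract alludes to, where a network that looks perfect on its test set turns out not to have learned the intended task.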