Guy Rosin, Ph.D. Thesis Seminar
Advisors: Dr. Kira Radinsky and Prof. Shaul Markovitch
Our world is constantly evolving, and so is the content on the web. Consequently, our languages, often said to mirror the world, are dynamic in nature.
However, most current language representations are static and cannot adapt to changes over time.
New words and semantic evolution have been shown to pose a crucial challenge in many Natural Language Processing and Information Retrieval tasks, leading to a significant performance drop for modern language models.
In this thesis, we create time-aware language models and representations and demonstrate their value for several real-world tasks.
First, we introduce a temporal relationship model that, given two words, identifies the time periods in which they relate to each other. Then, we focus on contextual word representations based on the transformer architecture. We propose models that use time as an additional context of a text, allowing us to create time-specific contextualized word representations.
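One simple way to supply time as an additional context, sketched below under the assumption that each text carries a timestamp, is to prepend a special time token to every input before it reaches the model; the token format (`<2019>`) and function names here are illustrative, not the thesis's exact design.

```python
# Hedged sketch: condition a language model on time by prepending a
# special time token to each text, so representations become
# time-specific. The "<year>" token format is an illustrative choice.

def add_time_token(text: str, year: int) -> str:
    """Prefix a text with a token encoding its time period."""
    return f"<{year}> {text}"

corpus = [
    ("apple released a new phone", 2019),
    ("we picked an apple from the tree", 1990),
]

# Each training example now carries its timestamp as extra context;
# a transformer trained on such inputs can produce different
# representations of "apple" for 1990 and 2019.
time_tagged = [add_time_token(text, year) for text, year in corpus]
print(time_tagged[0])  # → "<2019> apple released a new phone"
```

In practice the time token would be registered as a special vocabulary item of the tokenizer so it is never split into subwords.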
Finally, we study the relationship between language change and world events. To that end, we develop mechanisms to simultaneously embed words and events in the same vector space.
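A minimal sketch of one such mechanism, assuming event mentions in the corpus can be detected and linked: insert an event identifier token next to each mention, then train any standard word-embedding model on the combined corpus, so words and events end up in the same vector space. The `EVENT:` identifier scheme and helper below are illustrative, not the thesis's exact method.

```python
# Hedged sketch: embed words and events in one space by inserting an
# event pseudo-token wherever the event is mentioned, then training an
# ordinary word-embedding model on the augmented corpus.

def tag_event_mentions(tokens, event_spans):
    """Insert an event token after each mention span (start, end, event_id)."""
    out = list(tokens)
    # Process spans right-to-left so earlier insertion indices stay valid.
    for start, end, event_id in sorted(event_spans, reverse=True):
        out.insert(end, f"EVENT:{event_id}")
    return out

sentence = ["the", "uk", "voted", "to", "leave", "the", "eu"]
tagged = tag_event_mentions(sentence, [(2, 7, "brexit")])
print(tagged)  # → [..., "eu", "EVENT:brexit"]
```

Because the event token co-occurs with the words describing it, its learned vector lands near semantically related words, making word–event similarity directly measurable.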