Events and Talks at the Henry and Marilyn Taub Faculty of Computer Science
Yoav Goldberg - CS-Lecture
Thursday, 25.01.2018, 10:30
While deep learning methods in Natural Language Processing are arguably
overhyped, recurrent neural networks (RNNs), and in particular gated
recurrent networks such as the LSTM, have emerged as very capable learners
of sequential data. Thus, my group started using them everywhere. After
briefly explaining what they are and why they work, and giving a
bird's-eye overview of our work, I will describe a line of work in which
we use LSTM encoders in a multi-task learning scenario. In these cases,
we improve accuracy on a given task by supplementing the learning process
with a supervision signal from a related auxiliary task, using the LSTM
as a shared representation.
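
(A minimal sketch of the shared-encoder idea described above, assuming
PyTorch; the model sizes, the two task heads, and the 0.5 auxiliary-loss
weight are illustrative placeholders, not the speaker's actual setup.)

import torch
import torch.nn as nn

class SharedLSTMMultiTask(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128,
                 n_main_labels=5, n_aux_labels=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # One LSTM encoder shared by both tasks: gradients from the main
        # and the auxiliary loss both update its parameters.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Each task gets its own lightweight output head.
        self.main_head = nn.Linear(hidden_dim, n_main_labels)
        self.aux_head = nn.Linear(hidden_dim, n_aux_labels)

    def forward(self, tokens):
        states, _ = self.encoder(self.embed(tokens))
        # Per-token predictions for both tasks from the shared states.
        return self.main_head(states), self.aux_head(states)

model = SharedLSTMMultiTask()
optimizer = torch.optim.Adam(model.parameters())
loss_fn = nn.CrossEntropyLoss()

# Toy batch: 4 sequences of 10 token ids, with per-token labels per task.
tokens = torch.randint(0, 1000, (4, 10))
main_labels = torch.randint(0, 5, (4, 10))
aux_labels = torch.randint(0, 3, (4, 10))

main_logits, aux_logits = model(tokens)
# The auxiliary loss supplements the main objective; 0.5 is an arbitrary
# mixing weight chosen for this sketch.
loss = (loss_fn(main_logits.reshape(-1, 5), main_labels.reshape(-1))
        + 0.5 * loss_fn(aux_logits.reshape(-1, 3), aux_labels.reshape(-1)))
loss.backward()
optimizer.step()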
Short Bio:
==========
Yoav Goldberg has been working in natural language processing
for over a decade. He is a Senior Lecturer in the Computer Science
Department at Bar-Ilan University, Israel. Prior to that, he was a
researcher at Google Research, New York. He received his PhD in Computer
Science and Natural Language Processing from Ben Gurion University. He
regularly reviews for NLP and Machine Learning venues, and serves on the
editorial board of Computational Linguistics. He has published over 50
research papers and received best paper and outstanding paper awards at
major natural language processing conferences. His research interests
include machine learning for natural language, structured prediction,
syntactic parsing, processing of morphologically rich languages, and, in
the past two years, neural network models with a focus on recurrent
neural networks.