The Taub Faculty of Computer Science Events and Talks
Thursday, 03.01.2013, 12:30
We present a communication-efficient protocol for distributed online prediction tasks in which a group of learners, such as computers in a cloud or nodes in a sensor network, provides a real-time service based on local models of a high-frequency data stream (as in, e.g., online advertising, trade recommendations, or social content promotion). While synchronizing these local models can increase predictive performance, the required communication generates a cost. In contrast to earlier work, which synchronizes models according to a static schedule, we propose a dynamic protocol that aims to directly minimize communication and that can also be applied to track non-stationary concepts. By skipping model synchronization whenever the divergence of the local models is low, our approach can avoid communication entirely in relatively stable phases, while maintaining high responsiveness in dynamic learning phases.
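The core idea of divergence-based synchronization can be illustrated with a minimal sketch. This is not the speaker's actual protocol; the function names, the squared-norm divergence measure, and the simple averaging step are all illustrative assumptions:

```python
import numpy as np

def should_sync(local_model, reference_model, threshold):
    """Hypothetical local check: trigger communication only when the
    local model has drifted far from the last agreed reference model.
    Divergence is measured here as a squared Euclidean distance."""
    divergence = float(np.sum((local_model - reference_model) ** 2))
    return divergence > threshold

def synchronize(local_models):
    """Hypothetical synchronization operator: replace every local model
    with the average of all local models (the simplest joint model)."""
    joint = np.mean(local_models, axis=0)
    return [joint.copy() for _ in local_models]

# Toy round: two learners, one drifted past the threshold, one not.
reference = np.zeros(2)
drifted = np.array([1.0, 1.0])
stable = np.array([0.1, 0.0])
print(should_sync(drifted, reference, threshold=0.5))   # drift exceeds threshold
print(should_sync(stable, reference, threshold=0.5))    # drift below threshold
```

In a stable phase every learner's check returns False and no messages are exchanged; when the stream's concept shifts, local models drift apart, the checks fire, and a synchronization round restores a joint model.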