Events and Lectures at the Henry and Marilyn Taub Faculty of Computer Science
Christoph Hofmeister (Technical University of Munich (TUM), Munich, Germany)
Sunday, 28.05.2023, 14:30
This talk is about distributed machine learning in the presence of Byzantine errors. A main node performs gradient descent steps with the help of worker nodes, a limited number of which are controlled by an adversary. These malicious workers can return arbitrary data to the main node instead of the desired computation results. Prior work distributes the data with redundancy among the workers and uses error-correcting codes to detect and correct erroneous computation results. In this work, we propose a scheme that requires less redundancy, at the cost of a small number of gradient computations performed by the main node itself and a modest amount of additional communication. Beyond the new scheme, we provide lower bounds on the required communication and computation.
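The trade-off described above can be illustrated with a minimal sketch. This is not the scheme from the talk: the detection rule here is plain replication (each data partition is sent to two workers, and the main node recomputes a partition's gradient itself whenever the replicas disagree), and all names, sizes, and the least-squares setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: minimize ||Xw - y||^2 (illustrative setup).
n, d = 60, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true

def grad(Xp, yp, w):
    # Gradient contribution of one partition, normalized by the total
    # sample count n so that the partition gradients sum to the full one.
    return 2 * Xp.T @ (Xp @ w - yp) / n

num_partitions = 6
replication = 2        # each partition is held by two workers
byzantine_parts = {1}  # on this partition, one replica returns garbage

parts = np.array_split(np.arange(n), num_partitions)

def robust_step(w, lr=0.1):
    """One gradient step; recomputes any partition whose replicas disagree."""
    total = np.zeros(d)
    recomputed = 0
    for p, idx in enumerate(parts):
        honest_g = grad(X[idx], y[idx], w)
        # Replies from the `replication` workers holding this partition.
        replies = [honest_g.copy() for _ in range(replication)]
        if p in byzantine_parts:
            replies[0] = rng.normal(size=d)  # adversarial reply
        if all(np.allclose(r, replies[0]) for r in replies):
            # Replicas agree: accept the reported gradient.
            total += replies[0]
        else:
            # Disagreement detected: the main node recomputes locally.
            total += grad(X[idx], y[idx], w)
            recomputed += 1
    return w - lr * total, recomputed

w = np.zeros(d)
for _ in range(500):
    w, n_recomputed = robust_step(w)
# w converges to w_true despite the corrupted replies, at the cost of
# one local gradient recomputation per step (the disputed partition).
```

Plain replication needs a factor-2 redundancy just to detect a single lie; the point of the talk's coded approach is to get comparable guarantees with less redundancy, shifting a small part of the work back to the main node.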
Christoph Hofmeister received a B.Eng. in electrical engineering and information technology from the Munich University of Applied Sciences (HM) in 2019 and an M.Sc. in electrical engineering and information technology from the Technical University of Munich (TUM) in 2021, where he is currently pursuing a Ph.D. with the Coding and Cryptography Group, Institute of Communications Engineering, under the supervision of Prof. Wachter-Zeh. His research interests include information and coding theory and their applications, with a focus on coded computing.