Ben Finkelshtein, M.Sc. Thesis Seminar
Advisors: Prof. Alexander M. Bronstein and Dr. Chaim Baskin
Graph neural networks (GNNs) have shown broad applicability across a variety of domains. These domains, e.g., social networks and recommendation systems, are fertile ground for malicious users and adversarial behavior. In a series of works, we study the robustness of GNNs under several attack scenarios and present a simple rotation- and permutation-equivariant point-cloud GNN.
We show that GNNs are vulnerable both to strategic behavior by multiple users (i.e., Strategic Classification) and to the extremely limited (and thus quite realistic) scenario of a single-node adversarial attack. That is, an attacker can force the GNN to classify any target node as a chosen label by only slightly perturbing the features or the neighbor list of a single, arbitrary other node in the graph, even when the attacker cannot choose which node it controls.
Additionally, since equivariance to permutations and rigid motions is an important inductive bias for many 3D learning problems, we present a simple rotation- and permutation-equivariant point-cloud network that can approximate any such equivariant function.
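To make the equivariance property concrete, here is a minimal illustrative sketch (an assumption for exposition, not the architecture from the thesis) of a point-cloud layer that is equivariant to both rotations and permutations: mixing weights are derived from the Gram matrix X Xᵀ, which is invariant to rotations and conjugates under permutations, so the layer commutes with both group actions.

```python
import numpy as np

def equivariant_layer(X, alpha=0.5):
    """A rotation- and permutation-equivariant layer on a point cloud X (n x 3).

    Illustrative sketch only: the Gram matrix X @ X.T is unchanged by rotating X,
    so point-mixing weights built from it preserve equivariance.
    """
    G = np.tanh(X @ X.T)         # rotation-invariant pairwise weights
    return G @ X + alpha * X     # equivariant combination of the input points

# Sanity check: rotating or permuting the input commutes with the layer.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # random orthogonal matrix
P = np.eye(5)[rng.permutation(5)]                  # random permutation matrix

assert np.allclose(equivariant_layer(X @ Q), equivariant_layer(X) @ Q)
assert np.allclose(equivariant_layer(P @ X), P @ equivariant_layer(X))
```

The two assertions verify equivariance directly: applying a rotation Q or a permutation P before the layer gives the same result as applying it after.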