Locally Private Graph Neural Networks

Twitter Machine Learning Seminar, Remote

Abstract:

Graph Neural Networks (GNNs) have demonstrated superior performance in learning graph representations for a variety of downstream inference tasks. However, learning over graph data can raise privacy concerns when nodes represent people or human-related variables carrying personal information. Previous works have presented various techniques for privacy-preserving deep learning over non-relational data, such as images, audio, video, and text, but far less work addresses the privacy issues that arise when deep learning algorithms are applied to graphs. To fill this gap, in this paper we develop, for the first time, a privacy-preserving GNN learning algorithm with formal privacy guarantees based on Local Differential Privacy (LDP). We tackle the problem of node-level privacy, where graph nodes hold potentially sensitive features that must be kept private, yet could help an untrusted server learn richer node representations. Specifically, we propose an optimized LDP algorithm with an unbiased estimator, with which a central server can communicate with the graph nodes to privately collect their features and estimate the graph convolution layer of the GNN. To further reduce the effect of the injected noise, we propose a simple graph convolution layer based on multi-hop aggregation of the nodes’ features. Extensive experiments on real-world datasets demonstrate that our method maintains a favorable privacy-accuracy trade-off for privacy-preserving node classification.
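To give a flavor of the approach, here is a minimal sketch (not the paper's exact mechanism) of the two ingredients described above: a one-bit LDP perturbation of bounded node features with a server-side unbiased rescaling, followed by a multi-hop mean aggregation that averages out the injected noise before the GNN consumes the features. All function names, parameters, and the toy graph are illustrative assumptions.

```python
import numpy as np

def one_bit_perturb(x, alpha, beta, eps, rng):
    """Node-side: release one epsilon-LDP bit per feature value in [alpha, beta]."""
    p = 1.0 / (np.exp(eps) + 1.0) + \
        (x - alpha) / (beta - alpha) * (np.exp(eps) - 1.0) / (np.exp(eps) + 1.0)
    return rng.random(np.shape(x)) < p  # Bernoulli(p) bits

def unbiased_estimate(bit, alpha, beta, eps):
    """Server-side: rescale the received bit so that E[estimate] = x."""
    b = bit.astype(float)
    return alpha + (beta - alpha) * (b * (np.exp(eps) + 1.0) - 1.0) / (np.exp(eps) - 1.0)

def multi_hop_mean_aggregate(X_hat, adj, k):
    """Average the noisy estimates over k hops to reduce their variance."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1.0)
    H = X_hat
    for _ in range(k):
        H = adj @ H / deg  # row-normalized neighborhood mean
    return H

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(100, 16))             # private node features in [0, 1]
    adj = (rng.random((100, 100)) < 0.05).astype(float)    # toy random graph
    adj = np.maximum(adj, adj.T)                           # make it undirected
    bits = one_bit_perturb(X, 0.0, 1.0, eps=1.0, rng=rng)  # nodes report private bits
    X_hat = unbiased_estimate(bits, 0.0, 1.0, eps=1.0)     # server debiases the reports
    H = multi_hop_mean_aggregate(X_hat, adj, k=4)          # smoother inputs for the GNN
```

In this toy version, each reported bit satisfies epsilon-LDP per feature, the rescaling makes the server's feature estimates unbiased, and the k-hop averaging plays the role of the denoising aggregation layer described in the abstract.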

Download the slides