Distances and Shortest Paths With Differential Privacy
Adam Sealfon, Massachusetts Institute of Technology
Analyzing sensitive data brings with it the challenge of drawing useful conclusions from a database without revealing the underlying data. Differential privacy formalizes this requirement: it provides strong statistical guarantees about the privacy of individual data records while offering a framework through which an analyst can query a database and safely release approximate answers. On any pair of databases that differ in only a single entry, a differentially private algorithm must produce statistically close distributions of outputs. Differentially private algorithms consequently hide individual data records while still allowing the approximate release of aggregate functions of the data.
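The canonical way to satisfy this definition is the Laplace mechanism: add noise calibrated to how much one record can change the query's answer. The sketch below is an illustrative example, not part of the work described here; the function names and the counting query are assumptions for the sake of illustration.

```python
import math
import random

def laplace_noise(scale):
    # Sample from the Laplace(0, scale) distribution by inverse CDF.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def private_count(database, predicate, epsilon):
    # A counting query has sensitivity 1: changing a single record
    # changes the true count by at most 1. Adding Laplace noise with
    # scale 1/epsilon therefore yields epsilon-differential privacy:
    # on databases differing in one entry, the output distributions
    # are within a multiplicative e^epsilon factor of each other.
    true_count = sum(1 for record in database if predicate(record))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller values of epsilon give stronger privacy but noisier answers; the noise has standard deviation on the order of 1/epsilon, independent of the database size.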
We introduce a model for differentially private analysis of network-structured data in which the network topology is public but edge weights are sensitive and must not be leaked. Motivated in part by navigation in a road network with traffic, we study the problems of releasing shortest paths and distances in this model. We show that any differentially private algorithm producing an approximately shortest path must have additive error proportional to the number of edges, and we provide an algorithm achieving this bound. Furthermore, we show how to privately release all-pairs distances, and we provide specialized algorithms with substantial accuracy improvements for bounded-weight graphs and for trees.
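A natural baseline in this model is to perturb every edge weight with Laplace noise and then run an ordinary shortest-path computation on the noisy weights; by post-processing, anything derived from the noisy weights remains private, and the additive error on a path grows with the number of edges it uses. The sketch below illustrates this baseline under the assumption that neighboring weight vectors differ by at most 1 in l1 norm; it is not presented as the algorithms from the work itself, and all function names are illustrative.

```python
import heapq
import math
import random

def laplace_noise(scale):
    # Sample from the Laplace(0, scale) distribution by inverse CDF.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def private_shortest_path(n, edges, source, target, epsilon):
    # edges: list of (u, v, weight) for an undirected graph whose
    # topology is public but whose weights are sensitive.
    # Assumption: neighboring weight vectors differ by at most 1 in
    # l1 norm, so adding Laplace(1/epsilon) noise to each weight is
    # the Laplace mechanism on the weight vector, and the computed
    # path is private by post-processing.
    adj = {i: [] for i in range(n)}
    for u, v, w in edges:
        nw = w + laplace_noise(1.0 / epsilon)
        adj[u].append((v, nw))
        adj[v].append((u, nw))
    # Dijkstra on the noisy weights (this sketch assumes the noisy
    # weights stay nonnegative; a careful implementation would clamp
    # them or use an algorithm tolerating negative weights).
    dist = {source: 0.0}
    prev = {}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    # Reconstruct the source-to-target path.
    path = [target]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return path[::-1], dist[target]
```

Each edge on the returned path carries independent noise of magnitude roughly 1/epsilon, so a path with k edges accumulates additive error on the order of k/epsilon, consistent with the lower bound stated above.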