Differentially Private Data Structures

Differential privacy is one of the most prominent definitions of privacy, used not only in academic research but also in real-world applications such as Google's next-word prediction. The static data structures setting, where multiple queries must be answered in a differentially private manner over a fixed data set, is already well understood. The dynamic data structures setting, where updates to the data set arrive alongside queries over the current data, was introduced in 2010 by Dwork, Naor, Pitassi, and Rothblum under the name "differential privacy under continual observation." It has received substantial attention in recent years, largely due to its applications in private machine learning: using differentially private dynamic data structures in the training of neural networks protects the privacy of the training data while minimizing the loss of prediction accuracy.
I will survey the current state of research, outline the key algorithmic techniques, and highlight my own recent work on this topic. In particular, I will explain the currently most accurate algorithm for differentially private continual prefix sums, an essential subroutine in differentially private gradient descent.
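For readers unfamiliar with continual prefix sums: the classic baseline in this setting is the binary tree mechanism of Dwork, Naor, Pitassi, and Rothblum (2010) and Chan, Shi, and Song (2011), which releases all prefix sums of a bit stream with only polylogarithmic Laplace noise per answer. The sketch below is an illustrative implementation of that baseline, not of the more accurate algorithm the talk covers; the class name `TreeMechanism` and the simple per-level composition of the privacy budget are choices made here for exposition.

```python
import math
import random


class TreeMechanism:
    """Binary tree mechanism for eps-DP continual counting (illustrative sketch).

    Items x_1, x_2, ... in {0, 1} arrive one at a time; update(x_t) returns a
    noisy estimate of x_1 + ... + x_t. Each item touches at most `levels` tree
    nodes, so adding Laplace(levels / eps) noise per node gives eps-DP overall
    by basic composition (a deliberately simple budget split for this sketch).
    """

    def __init__(self, T, eps):
        self.levels = max(1, math.ceil(math.log2(T)) + 1)  # tree height for T items
        self.scale = self.levels / eps  # Laplace scale per node
        self.t = 0
        self.alpha = [0.0] * self.levels  # true partial sum stored at each level
        self.noisy = [0.0] * self.levels  # noisy value of each active node

    def _laplace(self):
        # Difference of two exponentials is Laplace(0, scale).
        lam = 1.0 / self.scale
        return random.expovariate(lam) - random.expovariate(lam)

    def update(self, x):
        """Ingest item x in {0, 1}; return a noisy prefix sum over items 1..t."""
        self.t += 1
        # The lowest unset bit of t tells us which tree node closes now.
        i = 0
        while (self.t >> i) & 1 == 0:
            i += 1
        # That node aggregates x plus all just-closed lower-level nodes.
        self.alpha[i] = x + sum(self.alpha[:i])
        for j in range(i):
            self.alpha[j] = 0.0
            self.noisy[j] = 0.0
        self.noisy[i] = self.alpha[i] + self._laplace()
        # The prefix sum is the sum of noisy nodes at the set bits of t,
        # so each answer combines at most `levels` noisy values.
        return sum(self.noisy[j] for j in range(self.levels)
                   if (self.t >> j) & 1)
```

Because every answer aggregates only O(log T) noisy nodes, the additive error is O(log^{1.5} T / eps) with high probability, compared with noise growing like T/eps if each prefix sum were perturbed independently with fresh budget.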
Speaker: Monika Henzinger, Institute of Science and Technology Austria (ISTA)
Monday, 01/27/25
Cost: Free
