Robust estimation of discrete distributions under local differential privacy
23.05.2022, 10:15
– Golm, Building 9, Room 2.22 (hybrid – Zoom)
Forschungsseminar Statistik (Research Seminar in Statistics)
Julien Chhor
Joint work with Flore Sentenac (CREST)
Although robust learning and local differential privacy are both widely studied fields of research, combining the two settings is an almost unexplored topic. We consider the problem of estimating a discrete distribution in total variation from \(n\) contaminated data batches under a local differential privacy constraint. A fraction \(1-\varepsilon\) of the batches contains \(k\) i.i.d. samples drawn from a discrete distribution \(p\) over \(d\) elements. To protect the users' privacy, each of the samples is privatized using an \(\alpha\)-locally differentially private mechanism.
The remaining \(\varepsilon n\) batches constitute an adversarial contamination. The minimax rate of estimation under contamination alone, with no privacy, is known up to a \(\sqrt{\log(1/\varepsilon)}\) factor. Under the privacy constraint alone, the minimax rate of estimation is also known. We characterize the minimax estimation rate under the two constraints combined, up to a \(\sqrt{\log(1/\varepsilon)}\) factor, and show that this rate is larger than the sum of the two separate rates. We provide a polynomial-time algorithm achieving this bound, as well as a matching information-theoretic lower bound.
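To make the setting concrete, the sketch below simulates the data-generating model described above: \(n\) batches of \(k\) samples from \(p\), an \(\varepsilon\) fraction of the batches replaced by adversarial data, and each sample privatized before release. The talk does not specify the privatization mechanism or the estimator; the \(d\)-ary randomized response mechanism and the naive plug-in estimate used here are only standard stand-ins for illustration, not the procedure from the paper.

```python
import numpy as np

# Illustrative sketch of the contaminated, locally private batch model.
# Assumptions (not from the talk): d-ary randomized response as the alpha-LDP
# mechanism, a point-mass contamination, and a naive (non-robust) estimator.

rng = np.random.default_rng(0)

d, n, k = 10, 200, 50          # support size, number of batches, samples per batch
alpha, eps = 1.0, 0.1          # privacy parameter, contamination fraction
p = rng.dirichlet(np.ones(d))  # true discrete distribution over d elements

def randomized_response(x, d, alpha, rng):
    """d-ary randomized response: keep x w.p. e^alpha / (e^alpha + d - 1),
    otherwise report a uniformly random other symbol; this is alpha-LDP."""
    keep_prob = np.exp(alpha) / (np.exp(alpha) + d - 1)
    if rng.random() < keep_prob:
        return x
    other = rng.integers(d - 1)
    return other if other < x else other + 1

batches = []
n_bad = int(eps * n)
for i in range(n):
    if i < n_bad:
        # Adversarial batch: a point mass, standing in for arbitrary contamination.
        raw = np.zeros(k, dtype=int)
    else:
        raw = rng.choice(d, size=k, p=p)   # honest batch: k i.i.d. samples from p
    batches.append([randomized_response(int(x), d, alpha, rng) for x in raw])

# Naive plug-in estimate: invert the randomized-response channel on the pooled
# empirical frequencies. Contaminated batches can bias this badly, which is what
# a robust procedure is designed to prevent.
counts = np.bincount(np.concatenate(batches), minlength=d) / (n * k)
keep = np.exp(alpha) / (np.exp(alpha) + d - 1)
q = (1 - keep) / (d - 1)
p_hat = (counts - q) / (keep - q)
print("TV distance of naive estimate:", 0.5 * np.abs(p_hat - p).sum())
```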
Zoom link on request