Differential Privacy is a criterion for judging whether a randomized algorithm operating over a database of individuals may be deemed to preserve privacy. In this work we apply the notion of Differential Privacy to reproducing kernel Hilbert spaces (RKHSs) of functions. As in the finite-dimensional Differential Privacy literature, we achieve privacy via noise addition, where the variance is calibrated to the "sensitivity" of the output. In our setting the noise in question is the sample path of a Gaussian process, and the sensitivity is measured in the RKHS norm rather than the Euclidean norm. We give examples of private versions of kernel density estimators and support vector machines.
This talk will be self-contained: it will not assume prior knowledge of differential privacy, stochastic processes, etc.
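As a rough illustration of the noise-addition scheme (a sketch, not the talk's exact construction), the following assumes a Gaussian (RBF) kernel, for which the RKHS sensitivity of a kernel density estimator under replacement of one data point is at most 2/n, and uses the standard Gaussian-mechanism scale factor sqrt(2 ln(1.25/δ))/ε. The function name `private_kde` and all parameter choices are hypothetical.

```python
import numpy as np

def rbf(x, y, h=1.0):
    # Gaussian (RBF) kernel; k(x, x) = 1, which gives the
    # sensitivity bound used below.
    return np.exp(-(x - y) ** 2 / (2 * h ** 2))

def private_kde(data, grid, eps=1.0, delta=1e-5, h=1.0, seed=None):
    """Release a kernel density estimate with Gaussian-process noise
    whose scale is calibrated to the RKHS sensitivity (a sketch)."""
    rng = np.random.default_rng(seed)
    n = len(data)
    # Non-private KDE evaluated on the grid.
    f = np.mean([rbf(grid, x, h) for x in data], axis=0)
    # Replacing one data point changes the KDE by at most 2/n
    # in RKHS norm, since ||k(., x)|| = sqrt(k(x, x)) = 1.
    sensitivity = 2.0 / n
    # Gaussian-mechanism calibration for (eps, delta)-privacy.
    scale = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / eps
    # Sample path of a mean-zero Gaussian process with covariance k,
    # evaluated on the grid (jitter added for numerical stability).
    K = rbf(grid[:, None], grid[None, :], h) + 1e-10 * np.eye(len(grid))
    noise = rng.multivariate_normal(np.zeros(len(grid)), K)
    return f + scale * noise
```

Here the privacy guarantee attaches to the whole released function, because the Gaussian-process noise and the sensitivity bound both live in the same RKHS.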