We revisit the fundamental problem of learning Axis-Aligned-Rectangles over a
finite grid $X^d \subseteq \mathbb{R}^d$ with differential privacy. Existing
results show that the sample complexity of this problem is at most
$\min\left\{ d \cdot \log|X| \;,\; d^{1.5} \cdot \left(\log^*|X|\right)^{1.5} \right\}$.
That is, existing constructions either require sample complexity that grows
linearly with $\log|X|$, or else sample complexity that grows super-linearly
with the dimension $d$. We present a novel algorithm that reduces the sample
complexity to only $\tilde{O}\left(d \cdot \left(\log^*|X|\right)^{1.5}\right)$,
attaining a dimensionality-optimal dependency without requiring the sample
complexity to grow with $\log|X|$. The technique used to attain this improvement
involves the deletion of “exposed” data-points on the go, in a fashion designed
to avoid the cost of the adaptive composition theorems. The core of this
technique may be of independent interest, introducing a new method for
constructing statistically-efficient private algorithms.
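To make the stated bounds concrete, here is a minimal numerical sketch (not from the paper) comparing the prior bound $\min\left\{ d \cdot \log|X| \;,\; d^{1.5} \cdot \left(\log^*|X|\right)^{1.5} \right\}$ against the new $\tilde{O}\left(d \cdot \left(\log^*|X|\right)^{1.5}\right)$ bound. Constants and the polylogarithmic factors hidden by the $\tilde{O}$ are suppressed, and the base-2 iterated logarithm and the example values of $d$ and $|X|$ are illustrative choices, not values from the paper.

```python
import math

def log_star(x: float, base: float = 2.0) -> int:
    """Iterated logarithm log*: how many times log must be applied
    before the value drops to at most 1."""
    count = 0
    while x > 1.0:
        x = math.log(x, base)
        count += 1
    return count

def prior_bound(d: int, grid_size: float) -> float:
    """Prior bound min{ d*log|X| , d^1.5 * (log*|X|)^1.5 }, constants dropped."""
    return min(d * math.log2(grid_size),
               d ** 1.5 * log_star(grid_size) ** 1.5)

def new_bound(d: int, grid_size: float) -> float:
    """New bound d * (log*|X|)^1.5, ignoring the O-tilde's polylog factors."""
    return d * log_star(grid_size) ** 1.5

# Hypothetical dimensions and grid sizes, chosen only for illustration.
for d, grid in [(10, 2.0 ** 32), (100, 2.0 ** 64), (1000, 2.0 ** 64)]:
    print(f"d={d:4d}, |X|=2^{math.log2(grid):.0f}: "
          f"prior ~ {prior_bound(d, grid):9.0f}, new ~ {new_bound(d, grid):7.0f}")
```

Since $\log^*|X| \le 5$ even for a grid of size $2^{65536}$, the new bound's dependence on $|X|$ is essentially constant, while its dependence on $d$ is linear rather than the prior $d^{1.5}$.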
