We investigate the contraction properties of locally differentially private mechanisms. More specifically, we derive tight upper bounds on the divergence between the output distributions PK and QK of an ε-LDP mechanism K in terms of a divergence between the corresponding input distributions P and Q, respectively. Our first main technical result presents a sharp upper bound on the χ²-divergence χ²(PK‖QK) in terms of χ²(P‖Q) and ε. We also show that the same result holds for a large family of divergences, including KL-divergence and squared Hellinger distance. The second main technical result gives an upper bound on χ²(PK‖QK) in terms of the total variation distance TV(P,Q) and ε. We then utilize these bounds to establish locally private versions of the van Trees inequality, Le Cam's, Assouad's, and the mutual information methods, which are powerful tools for bounding minimax estimation risks. These results are shown to lead to tighter privacy analyses than the state of the art in several statistical problems such as entropy and discrete distribution estimation, non-parametric density estimation, and hypothesis testing.
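As a minimal numerical illustration of the contraction phenomenon the abstract describes (not the paper's bound itself), the following sketch applies binary randomized response, a standard ε-LDP mechanism, to two input distributions and compares χ²(P‖Q) with χ²(PK‖QK); the distributions P, Q and the value ε = 1 are arbitrary choices for demonstration.

```python
import numpy as np

def chi2_divergence(p, q):
    """Chi-squared divergence: chi^2(P || Q) = sum_x (p(x) - q(x))^2 / q(x)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum((p - q) ** 2 / q)

def randomized_response_kernel(eps, k=2):
    """k-ary randomized response: report the true value with probability
    e^eps / (e^eps + k - 1), else one of the other values uniformly.
    This mechanism is eps-LDP since every likelihood ratio of output
    probabilities is bounded by e^eps."""
    stay = np.exp(eps) / (np.exp(eps) + k - 1)
    flip = 1.0 / (np.exp(eps) + k - 1)
    K = np.full((k, k), flip)
    np.fill_diagonal(K, stay)
    return K

eps = 1.0                       # illustrative privacy parameter
K = randomized_response_kernel(eps)
P = np.array([0.9, 0.1])        # illustrative input distributions
Q = np.array([0.2, 0.8])

# Output distributions PK and QK: input row vectors times the channel matrix.
PK, QK = P @ K, Q @ K

print(f"chi2(P  || Q )  = {chi2_divergence(P, Q):.4f}")    # ~3.0625
print(f"chi2(PK || QK)  = {chi2_divergence(PK, QK):.4f}")  # strictly smaller
```

Running this shows the post-mechanism divergence shrinking from about 3.06 to about 0.45, consistent with the contraction behavior the abstract quantifies; the paper's contribution is a sharp bound on how much such shrinkage is guaranteed as a function of ε.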