Detection of sparse signals arises in many modern applications such as signal processing, bioinformatics, finance, and disease surveillance. In many of these applications, however, the data may contain sensitive personal information that should be protected during the analysis. In this article, we consider the problem of (ε,δ)-differentially private detection of a general sparse mixture, with a focus on how privacy affects the detection power. By investigating a nonasymptotic upper bound on the sum of the error probabilities, we find that no (ε,δ)-differentially private test can detect the sparse signal if the privacy constraint is too strong or if the model parameters lie in the undetectable region of cai2014optimal. Moreover, we study the private clamped log-likelihood ratio test proposed in canonne2019structure and show that it achieves vanishing error probabilities under certain conditions on the model and privacy parameters. Then, for the case where the null distribution is standard normal, we propose an adaptive (ε,δ)-differentially private test, which achieves vanishing error probabilities throughout the detectable region of cai2014optimal when the privacy parameters satisfy certain sufficient conditions. Several numerical experiments are conducted to verify our theoretical results and illustrate the performance of the proposed test.
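For orientation, the display below records the two ingredients referenced above in their standard formulation; the calibration of the sparsity and signal strength is the usual one from the sparse-detection literature and is stated here only as an assumed reference point (the sparsity level is written $\pi_n$, and the signal-strength index $r$ and exponent $\beta$ are assumed notation, to avoid a clash with the privacy parameters $\epsilon$ and $\delta$). A randomized test $T$ is $(\epsilon,\delta)$-differentially private if, for all data sets $x, x'$ differing in a single observation and all measurable sets $S$,
\[
  \Pr\bigl(T(x) \in S\bigr) \;\le\; e^{\epsilon}\,\Pr\bigl(T(x') \in S\bigr) + \delta .
\]
The sparse mixture detection problem, with a general null $F_0$ and contamination $G_n$, is
\[
  H_0:\; X_1,\dots,X_n \overset{\mathrm{iid}}{\sim} F_0
  \qquad \text{versus} \qquad
  H_1:\; X_1,\dots,X_n \overset{\mathrm{iid}}{\sim} (1-\pi_n)\,F_0 + \pi_n\,G_n ,
  \qquad \pi_n = n^{-\beta},\ \beta \in \bigl(\tfrac12, 1\bigr),
\]
and in the standard normal special case one takes $F_0 = N(0,1)$ and $G_n = N(\mu_n,1)$ with $\mu_n = \sqrt{2r\log n}$, $r > 0$.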