A recent result shows that lossless compression of a single source $X^{n}$ is achievable with a strong locality property: any $X_{i}$ can be decoded from a constant number of compressed bits, with a probability of error that vanishes in $n$. By contrast, we show that for two separately encoded sources $(X^{n},Y^{n})$, lossless compression and strong locality are generally not simultaneously achievable. Specifically, we show that for the class of “confusable” sources, strong locality cannot be achieved whenever one of the sources is compressed below its entropy. Irrespective of $n$, there always exists at least one index $i$ for which the probability of incorrectly decoding $(X_{i},Y_{i})$ is lower bounded by $2^{-O(\mathtt{d})}$, where $\mathtt{d}$ denotes the worst-case (over indices) number of compressed bits accessed by the local decoder. Conversely, if the sources are not confusable, strong locality is possible even if one of the sources is compressed below its entropy. Our results extend to an arbitrary number of sources.
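To make the impossibility claim concrete, it can be sketched in symbols as follows; the rate symbols $R_X$, $R_Y$, the estimates $\hat{X}_i$, $\hat{Y}_i$, and the constant $c$ are illustrative notation introduced here, not taken from the statement above. If the sources are confusable and at least one is compressed below its entropy ($R_X < H(X)$ or $R_Y < H(Y)$), then for every $n$,
\[
\max_{1 \le i \le n} \Pr\!\big[(\hat{X}_{i},\hat{Y}_{i}) \neq (X_{i},Y_{i})\big] \;\ge\; 2^{-c\,\mathtt{d}}
\]
for some constant $c>0$, where $(\hat{X}_{i},\hat{Y}_{i})$ denotes the local decoder's estimate of $(X_{i},Y_{i})$ and $\mathtt{d}$ is its worst-case number of accessed compressed bits.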