Abstract—This paper presents a rate-distortion approach to
Markov graph learning. It provides lower bounds on the number
of samples required for any algorithm to learn the Markov graph
structure of a probability distribution, up to edit distance. We
first prove a general result for any probability distribution, and
then specialize it to Ising and Gaussian models. In particular,
for both Ising and Gaussian models on p variables with degree
at most $d$, we show that at least $\Omega\!\left(\left(d - \frac{s}{p}\right)\log p\right)$ samples are
required for any algorithm to learn the graph structure up to
edit distance s. Our bounds constitute a strong converse; i.e.,
we show that with any smaller number of samples, the probability
of error goes to 1 as the problem size increases. These results show
that substantial gains in sample complexity may not be possible
without paying a significant price in edit distance error.
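For concreteness, the strong converse can be rendered schematically as below; the constant $c$, the estimator notation $\widehat{G}_n$, and the edit-distance symbol $d_{\mathrm{edit}}$ are illustrative placeholders rather than the paper's own notation:
% Schematic only: c, \widehat{G}_n, and d_edit are assumed symbols,
% not taken from the paper's formal statement.
\[
  n \le c\left(d - \frac{s}{p}\right)\log p
  \;\Longrightarrow\;
  \Pr\!\left[d_{\mathrm{edit}}\big(\widehat{G}_n, G\big) > s\right] \to 1
  \quad \text{as } p \to \infty,
\]
where $G$ is the true Markov graph on $p$ variables with maximum degree $d$, and $\widehat{G}_n$ is any estimator formed from $n$ samples.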
Index Terms—Markov networks, graphical models, strong
converse, Ising model, Gaussian Markov models.