The variance and the standard deviation play a central role in probability and statistics. One reason for this might be that the variance of a sum of independent (and even of uncorrelated) square-integrable random variables is the sum of their variances; a generalization of this is the additivity of the covariance matrix for independent random vectors. We show that a converse of sorts is also true: if a ``dispersion measure'' $V$ of the form $V(X) = E\,f(X - EX)$, where $f : \mathbb{R}^n \to \mathbb{R}$ is even, i.e., $f(-x) = f(x)$ for all $x \in \mathbb{R}^n$, satisfies the additivity $V(X+Y) = V(X) + V(Y)$ for every two independent $\mathbb{R}^n$-valued random variables $X$ and $Y$ (such that all integrals involved exist), then necessarily $f(x) = x'Ax$ for some symmetric $n \times n$ matrix $A$, and so $V$ is a linear combination of the pairwise covariances of the components of $X$. For $n = 1$ it follows that $V$ is a multiple of the variance; thus the variance not only is a popular example of a dispersion measure that is additive for independent random variables, but can even be characterized by this property.
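The easy direction can be sketched in one display (assuming $f(x) = x'Ax$ with $A$ symmetric, $X$ and $Y$ independent with finite second moments, and writing $\tilde X = X - EX$, $\tilde Y = Y - EY$ for the centered vectors):
\begin{align*}
V(X+Y) &= E\bigl[(\tilde X + \tilde Y)'A(\tilde X + \tilde Y)\bigr]
        = E\bigl[\tilde X'A\tilde X\bigr] + 2\,E\bigl[\tilde X'A\tilde Y\bigr] + E\bigl[\tilde Y'A\tilde Y\bigr]\\
       &= V(X) + 2\,(E\tilde X)'A\,(E\tilde Y) + V(Y)
        = V(X) + V(Y),
\end{align*}
since independence lets the cross term factorize and both centered means vanish; the substance of the result is the converse, that additivity forces $f$ to be such a quadratic form.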