Principal Coordinates Analysis (PCoA, also known as metric Multidimensional Scaling, MDS) is a method for exploring and visualizing similarities or dissimilarities in data. It starts from a similarity or dissimilarity matrix (= distance matrix) and assigns each item a location in a low-dimensional space, e.g. as a 3D graphic.
Rationale
PCoA tries to find the main axes through a matrix. It is a kind of eigenanalysis (sometimes referred to as "singular value decomposition") and calculates a series of eigenvalues and eigenvectors. Each eigenvalue has a corresponding eigenvector, and there are as many eigenvectors and eigenvalues as there are rows in the initial matrix.
Eigenvalues are usually ranked from greatest to least; the first eigenvalue is often called the "dominant" or "leading" eigenvalue. Using the eigenvectors we can visualize the main axes through the initial distance matrix. Eigenvalues are also often called "latent values".
The result is a rotation of the data matrix: it does not change the positions of the points relative to each other; it only changes the coordinate system!
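The steps above (double-centering the squared distances, eigendecomposition, ranking the eigenvalues from greatest to least, and scaling the eigenvectors to get coordinates) can be sketched in a few lines of NumPy. This is a minimal illustration of classical PCoA, not a reference implementation; the function name `pcoa` and the small distance matrix are made up for the example.

```python
import numpy as np

def pcoa(D, n_axes=2):
    """Classical PCoA sketch: double-center the squared distance
    matrix, eigendecompose it, and scale the eigenvectors."""
    D = np.asarray(D, dtype=float)
    n = D.shape[0]
    # Gower double-centering: B = -1/2 * J D^2 J, with J = I - 11'/n
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    # Eigenanalysis of the symmetric centered matrix B
    eigvals, eigvecs = np.linalg.eigh(B)
    # Rank eigenvalues from greatest to least (the "dominant" first)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    # Coordinates: eigenvectors scaled by sqrt of the (non-negative)
    # eigenvalues; tiny negative values from round-off are clipped
    pos = eigvals[:n_axes].clip(min=0)
    return eigvecs[:, :n_axes] * np.sqrt(pos)

# Illustrative distance matrix for four items (here: four points
# spaced 1 apart along a line, so the distances are Euclidean)
D = np.array([[0.0, 1.0, 2.0, 3.0],
              [1.0, 0.0, 1.0, 2.0],
              [2.0, 1.0, 0.0, 1.0],
              [3.0, 2.0, 1.0, 0.0]])
coords = pcoa(D)
```

Because the result is only a rotation, the pairwise distances computed from `coords` reproduce the input matrix `D` (for a Euclidean input like this one), which is an easy way to sanity-check the decomposition.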
Interpretation
By using PCoA we can visualize individual and/or group differences. Individual differences can be used to identify outliers.
Note:
There is also a method called 'Principal Component Analysis' (PCA, sometimes also misleadingly abbreviated as 'PCoA') which is different from PCoA. PCA is used for similarities and PCoA for dissimilarities. However, all binary measures (Jaccard, Dice, etc.) are distance measures and, therefore, PCoA should be used. For details see the following box:
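For binary (presence/absence) data, the Jaccard measure mentioned above is typically used as a dissimilarity: one minus the ratio of shared features to features present in either item. A minimal sketch, assuming simple 0/1 vectors (the function name `jaccard_distance` is illustrative, not from any particular library):

```python
import numpy as np

def jaccard_distance(a, b):
    """Jaccard dissimilarity 1 - |A and B| / |A or B| for two
    binary presence/absence vectors a and b."""
    a, b = np.asarray(a, dtype=bool), np.asarray(b, dtype=bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        # Both items are empty: define the distance as 0
        return 0.0
    return 1.0 - np.logical_and(a, b).sum() / union

# Two items share 2 of the 3 features present in either,
# so the Jaccard distance is 1 - 2/3
d = jaccard_distance([1, 1, 0, 1], [1, 0, 0, 1])
```

A full matrix of such pairwise distances is exactly the kind of input PCoA expects.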