Finally, Hierarchical Clustering produces a set of nested clusters organized as
a hierarchical tree (dendrogram). Hierarchical Clustering does not have to assume
a particular number of clusters in advance; any desired number of clusters
can be obtained by cutting the dendrogram at the proper level. Hierarchical clusterings can
also sometimes correspond to meaningful taxonomies. Traditional hierarchical
algorithms use a similarity or distance matrix and merge or split one cluster at a time.
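To make the "build the hierarchy first, choose k afterwards" idea concrete, the following sketch uses SciPy's scipy.cluster.hierarchy module; the toy data, the single-linkage choice, and the particular values of k are illustrative assumptions, not taken from the text.

```python
# Minimal sketch: build a full hierarchy, then cut it at the desired level.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy 2-D points; any feature matrix works.
X = np.array([[1.0, 1.0], [1.2, 0.9], [5.0, 5.1],
              [5.2, 4.9], [9.0, 0.5], [9.1, 0.4]])

# Build the complete hierarchy (agglomerative, single linkage here).
# No number of clusters is assumed at this stage.
Z = linkage(X, method="single")

# Any desired number of clusters is obtained by cutting the tree:
labels_k2 = fcluster(Z, t=2, criterion="maxclust")
labels_k3 = fcluster(Z, t=3, criterion="maxclust")
print(labels_k2)
print(labels_k3)
```

The same linkage matrix Z can also be passed to scipy.cluster.hierarchy.dendrogram to visualize the nested structure.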
There are two main approaches to hierarchical clustering. In agglomerative hier-
archical clustering we start with the points as individual clusters and, at each step,
merge the closest pair of clusters until only one cluster (or k clusters) remains. In
divisive hierarchical clustering we start with one all-inclusive cluster and, at each
step, split a cluster until each cluster contains a single point (or there are k clusters).
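As a rough sketch of the agglomerative procedure just described, the following from-scratch loop repeatedly merges the closest pair of clusters until k clusters remain; the use of single linkage and the helper name agglomerative are assumptions for illustration, not a specific algorithm from the text.

```python
import numpy as np

def agglomerative(points, k):
    """Start with each point as its own cluster; repeatedly merge the
    closest pair (single linkage) until only k clusters remain."""
    clusters = [[i] for i in range(len(points))]  # singleton clusters
    # Pairwise Euclidean distances between all points.
    dist = np.linalg.norm(points[:, None] - points[None, :], axis=-1)

    while len(clusters) > k:
        # Find the pair of clusters with the smallest single-link distance.
        best = (None, None, np.inf)
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(dist[i, j] for i in clusters[a] for j in clusters[b])
                if d < best[2]:
                    best = (a, b, d)
        a, b, _ = best
        clusters[a].extend(clusters[b])  # merge the closest pair
        del clusters[b]
    return clusters

points = np.array([[1.0, 1.0], [1.2, 0.9], [5.0, 5.1], [5.2, 4.9]])
print(agglomerative(points, 2))  # e.g. [[0, 1], [2, 3]]
```

Running the loop all the way down to one cluster records the full merge sequence, which is exactly the nested structure a dendrogram depicts; the divisive approach would instead start from a single all-inclusive cluster and split.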