AIC and BIC Methods
If maximum likelihood is used to estimate parameters and the models are non-nested, then the Akaike information criterion (AIC) or the Bayesian information criterion (BIC) can be used to perform model comparisons. The two criteria are very similar in form but arise from very different assumptions. The AIC is derived from information theory and is designed to pick the model that produces a probability distribution with the smallest discrepancy from the true distribution, as measured by the Kullback–Leibler discrepancy (see Bozdogan, 2000). The BIC is derived from a large-sample asymptotic approximation to the full Bayesian model comparison (described later). They are both defined as follows. Suppose a model has k parameters,
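As a concrete sketch of the comparison, the standard textbook definitions are AIC = 2k − 2 ln L̂ and BIC = k ln n − 2 ln L̂, where L̂ is the maximized likelihood, k the number of free parameters, and n the sample size. The function names and the numbers in the example below are illustrative, not from the source:

```python
import math

def aic(log_likelihood, k):
    """Akaike information criterion: AIC = 2k - 2 ln L."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    """Bayesian information criterion: BIC = k ln n - 2 ln L."""
    return k * math.log(n) - 2 * log_likelihood

# Hypothetical example: two non-nested models fit by maximum
# likelihood to the same n = 100 observations.
# Model A: 3 parameters, maximized log-likelihood -210.5
# Model B: 5 parameters, maximized log-likelihood -207.0
print(aic(-210.5, 3), bic(-210.5, 3, 100))  # Model A
print(aic(-207.0, 5), bic(-207.0, 5, 100))  # Model B
```

Lower values are preferred under either criterion; because the BIC penalty grows with ln n, it penalizes the two extra parameters of Model B more heavily than the AIC does whenever n exceeds e² ≈ 7.4.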