To introduce Markov chains, let us first give some related definitions.
A stochastic process $\{X(t),\ t \in T\}$ is a collection of random variables, i.e.,
for each $t \in T$, $X(t)$ is a random variable. The index $t$ is often interpreted
as time, and we refer to $X(t)$ as the state of the process at time $t$. Moreover,
the set $S$ containing the values $X(t)$ for all $t \in T$ is called the state space.
The set $T$ is called the index set of the process. When $T$ is a countable
set, the stochastic process is said to be a discrete-time process. If $T$ is an interval of
the real line, the stochastic process is said to be a continuous-time process.
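As a standard illustration of these notions, the simple random walk, defined by $X_0 = 0$ and $X_n = \xi_1 + \cdots + \xi_n$ for independent steps $\xi_k$ with $P(\xi_k = 1) = P(\xi_k = -1) = 1/2$, is a discrete-time process with index set $T = \{0, 1, 2, \ldots\}$ and state space $S = \mathbb{Z}$, while a Poisson process $\{N(t),\ t \geq 0\}$, with $T = [0, \infty)$ and $S = \{0, 1, 2, \ldots\}$, is a continuous-time process.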
Definition 1.2.1. Let $(X_n)$ be a stochastic process that takes on a finite or
countable number of possible values. If $X_n = i$, the process is said to be
in state $i$ at time $n$. A stochastic process is said to have the Markov property if
the next state depends only on the current state and not on the past, i.e.,