A stochastic process is a process whose outcome we cannot know in advance, but which we can estimate based on the probabilities of different events occurring over time. A primary example of a stochastic process is the Markov chain (seen above). The essence of a Markov chain is that the next state depends only on the current state. This means that the probability of being in state X on step n+1, given that we know the states at all previous steps (1, 2, 3, ..., n), is the same as if we only knew the state at step n (the most recent one). In the above picture, the circles labeled Si represent the states, and the arrows represent the probabilities of moving between states. For example, the probability of going from state S1 to state S3 is .8, or 80%.
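To make this concrete, here is a minimal sketch of simulating such a chain. The 0.8 probability from S1 to S3 matches the figure; every other transition probability below is made up purely for illustration, since the full matrix is not given in the text.

```python
import random

# Hypothetical transition probabilities. P(S1 -> S3) = 0.8 comes from the
# figure; the remaining values are invented so each row sums to 1.
transitions = {
    "S1": {"S1": 0.1, "S2": 0.1, "S3": 0.8},
    "S2": {"S1": 0.5, "S2": 0.3, "S3": 0.2},
    "S3": {"S1": 0.4, "S2": 0.4, "S3": 0.2},
}

def next_state(state, rng):
    """Pick the next state using only the current state (the Markov property)."""
    states = list(transitions[state].keys())
    probs = list(transitions[state].values())
    return rng.choices(states, weights=probs, k=1)[0]

def walk(start, steps, seed=0):
    """Simulate `steps` transitions starting from `start`."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(steps):
        state = next_state(state, rng)
        path.append(state)
    return path

print(walk("S1", 5))
```

Note that `next_state` never looks at the path so far, only at the current state; that restriction is exactly what makes the simulation a Markov chain.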
A good example of a Markov chain is flipping a coin and adding 1 for every heads and 0 for every tails. If the total is 60 after 100 flips, we do not know what the total will be after the 101st flip, but we can estimate it based on probability, which makes this a stochastic process. Further, it does not matter what the total was after the 1st flip, the 20th flip, or the 99th flip; all we need to know is the total after the 100th flip in order to estimate the total after the 101st flip, which makes this a Markov chain.
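The coin-flip example can be checked with a quick simulation. This is a sketch under the assumption of a fair coin: starting from a total of 60, the expected total after one more flip should be 60.5, regardless of which flips produced the 60.

```python
import random

def estimate_next_total(current_total, trials=100_000, seed=1):
    """Estimate the expected total after one more fair coin flip.

    Only the current total is used -- not the history of flips that
    produced it -- which is the Markov property described above.
    """
    rng = random.Random(seed)
    outcomes = [current_total + rng.randint(0, 1) for _ in range(trials)]
    return sum(outcomes) / trials

# Starting from a total of 60 after 100 flips, the expected total after
# flip 101 is 60 + 0.5 for a fair coin.
print(estimate_next_total(60))
```

The function deliberately takes only `current_total` as input; passing it the full sequence of 100 flips would add no information, which is the point of the example.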