Consider a gambler who, at each play of the game, either wins $1 with probability p or loses $1 with probability 1-p. If we suppose that our gambler quits playing either when he goes broke or when he attains a fortune of $N, then the gambler's fortune is a Markov chain having transition probabilities

P(i, i+1) = p = 1 - P(i, i-1),  i = 1, 2, ..., N-1,
P(0, 0) = P(N, N) = 1.

That is, states 0 and N are absorbing: once the gambler is broke or has reached $N, play stops and the chain remains in that state forever.
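As a sketch of this chain, the following Python snippet simulates play until absorption at 0 or N and uses Monte Carlo to estimate the probability of reaching $N before going broke; the function names and parameters are illustrative, not from the text.

```python
import random

def play_until_absorbed(i, N, p, rng):
    # Run one realization of the gambler's-ruin chain starting from
    # fortune i: gain $1 with probability p, lose $1 with probability 1-p,
    # stopping when the fortune hits 0 (broke) or N (target attained).
    fortune = i
    while 0 < fortune < N:
        fortune += 1 if rng.random() < p else -1
    return fortune  # either 0 or N

def win_probability(i, N, p, trials=20000, seed=0):
    # Monte Carlo estimate of P(reach N before 0 | start at fortune i).
    rng = random.Random(seed)
    wins = sum(play_until_absorbed(i, N, p, rng) == N for _ in range(trials))
    return wins / trials
```

For p != 1/2 the estimate can be checked against the classical gambler's-ruin formula, P(win from i) = (1 - (q/p)^i) / (1 - (q/p)^N) with q = 1 - p; for example, with i = 3, N = 10, p = 0.55 the formula gives roughly 0.52, and the simulation should land near that value.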