Stochastic process

In order to provide a formal definition of a Markov chain, it is first necessary to specify what is meant by a set of random variables with a temporal ordering. Such a set is best represented by a stochastic process.

We define a stochastic process in discrete time and with discrete states using the following sequence of random variables:

X0, X1, X2, …, Xn, …

Here, each Xn is a discrete random variable with values in a set S = {s1, s2, …}, called the state space. Without loss of generality, suppose that S is a subset of the integers, Z. We will use the index n of Xn to denote the time at which the state evolves; the possible values of Xn are called states. The process starts in one of these states and moves successively from one state to another. Each move is called a step.
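
To make the definition concrete, the following minimal Python sketch generates such a sequence X0, X1, …, Xn with states in a subset of Z. The random-walk dynamics and the name simulate_process are illustrative assumptions, not taken from the text:

```python
import random

def simulate_process(n_steps, start=0):
    """Simulate a discrete-time, discrete-state stochastic process
    X0, X1, ..., Xn as a simple symmetric random walk on Z: at each
    step the state moves by +1 or -1 with equal probability."""
    states = [start]                      # X0: the initial state
    for _ in range(n_steps):
        move = random.choice([-1, 1])     # each move is one step
        states.append(states[-1] + move)  # the next state, Xn+1
    return states

# One possible realization of X0, ..., X10:
print(simulate_process(10))
```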

As time passes, the process can jump from one state to another. If the system is in state i at time step n and in state j ≠ i at time step n + 1, then we say that a transition has occurred.
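
As a small illustration (the helper count_transitions and the sample trajectory below are assumptions for this sketch, not from the text), transitions can be detected by comparing the states at consecutive time steps:

```python
def count_transitions(trajectory):
    """Count the time steps at which the process changed state,
    i.e. consecutive pairs (Xn, Xn+1) with Xn+1 != Xn."""
    return sum(1 for a, b in zip(trajectory, trajectory[1:]) if a != b)

path = [0, 0, 1, 1, 2, 1]       # a sample trajectory X0, ..., X5
print(count_transitions(path))  # 3 transitions: at n = 1, 3, and 4
```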