Markov Chains
Discrete Time and Continuous Time Markov Chains
Markov Processes are classified by two main parameters: time and state space. A Markov Process with a discrete state space (that is, a finite or countable number of states) is called a Markov Chain.
Discrete Time Markov Chain
A Markov Chain is called a Discrete Time Markov Chain (DTMC) when transitions occur only at discrete points in time. Our previous brand switching example is a Discrete Time Markov Chain, since the customer buys only once a month and chooses from a limited number of brands.
The mathematical representation is as follows:
P_{ij}(n) = P(X_{n+1} = j | X_n = i) = P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, X_{n-2} = i_{n-2}, …)
This equality expresses the Markov property: the next state depends only on the current state and is independent of the earlier history of the system.
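As a rough illustration of how a DTMC evolves step by step, here is a minimal Python sketch of a two-brand switching chain. The transition probabilities in the matrix P are assumed values chosen for demonstration, not figures from the earlier example.

import random

# states: 0 = Brand A, 1 = Brand B
# P[i][j] = assumed probability of moving from brand i to brand j in one month
P = [[0.8, 0.2],
     [0.3, 0.7]]

def simulate_dtmc(P, start, n_steps, rng=random):
    """Return a sample path of the chain over n_steps transitions."""
    path = [start]
    state = start
    for _ in range(n_steps):
        u = rng.random()
        cumulative = 0.0
        # pick the next state from the row of the current state
        for j, p in enumerate(P[state]):
            cumulative += p
            if u < cumulative:
                state = j
                break
        path.append(state)
    return path

print(simulate_dtmc(P, start=0, n_steps=12))  # one year of monthly purchases

Each step uses only the current row of P, which is exactly the Markov property stated above: the simulation never looks back at earlier states.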
Continuous Time Markov Chain
If the time parameter of the Markov Process is continuous, the Markov Chain is called a Continuous Time Markov Chain (CTMC) or a Continuous-Parameter Chain. The Poisson Process, with its independent increments property, is an example of such a Markov Process. The number of cars passing a traffic signal is another: a car can drive through the signal at any time (time is continuous), while the number of cars that have passed is a count (the state space is discrete).
The mathematical representation is slightly different, accounting for the continuous time parameter while keeping the discrete state space:
P_{ij}(s) = P(X(t + s) = j | X(t) = i)
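To make the traffic-signal example concrete, the following Python sketch simulates a Poisson Process by drawing exponential inter-arrival times. The rate of 2 cars per minute and the 5-minute horizon are assumed values for illustration only.

import random
import math

def simulate_poisson_process(rate, horizon, rng=random):
    """Return arrival times of a Poisson process with the given rate,
    up to time `horizon`, using exponential inter-arrival times."""
    t = 0.0
    arrivals = []
    while True:
        # exponential(rate) inter-arrival time; 1 - random() lies in (0, 1]
        t += -math.log(1.0 - rng.random()) / rate
        if t > horizon:
            break
        arrivals.append(t)
    return arrivals

times = simulate_poisson_process(rate=2.0, horizon=5.0)
print(f"{len(times)} cars passed in 5 minutes")

Arrivals can occur at any real-valued time, but the state (the running count of cars) only ever increases by whole numbers, matching the continuous-time, discrete-state description above.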
Brownian motion is an example of a Markov Process with both continuous time and a continuous state space.
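For contrast with the previous examples, here is a minimal sketch that approximates a standard Brownian path by summing independent Gaussian increments; the horizon and step count are assumed for illustration.

import random

def simulate_brownian_motion(T=1.0, n_steps=1000, rng=random):
    """Return (times, values) of a discretised Brownian path on [0, T]."""
    dt = T / n_steps
    times = [0.0]
    values = [0.0]
    for k in range(1, n_steps + 1):
        times.append(k * dt)
        # each increment is Normal(0, dt) and independent of the past,
        # so the path is Markovian with a continuous state space
        values.append(values[-1] + rng.gauss(0.0, dt ** 0.5))
    return times, values

t, w = simulate_brownian_motion()
print(w[-1])  # value of the path at time T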