Say {X(t), t≥0} is a continuous-time stochastic process taking values on the nonnegative integers. Then X(t) is a Markov chain if for all s,t≥0 and nonnegative integers i, j, x(u), 0≤u<s we have

\[P(X(t+s)=j\,|\,X(s)=i, X(u)=x(u),\ 0\le u<s)=P(X(t+s)=j\,|\,X(s)=i)\]
Say the process enters state i at time 0. Let \(T_i\) be the time the process stays in state i. Then by the Markov property

\[P(T_i>s+t\,|\,T_i>s)=P(T_i>t)\quad\text{for all } s,t\ge0\]
But this means that \(T_i\) is memoryless! Of course \(T_i\) is also non-negative and continuous, and therefore \(T_i\) has to have an exponential distribution.
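To see why the memoryless property forces an exponential distribution, write \(g(t)=P(T_i>t)\). Memorylessness says \(g(s+t)=g(s)g(t)\) for all \(s,t\ge0\), and the only nonincreasing, right-continuous solutions of this functional equation are \(g(t)=e^{-\lambda t}\) for some \(\lambda\ge0\), which is exactly the survival function of an \(Exp(\lambda)\) random variable.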
With this we have the following characterization of a continuous-time Markov chain:
So a continuous-time Markov chain is a process that moves from state to state in accordance with a discrete-time Markov chain (the embedded chain), but spends an exponentially distributed amount of time in each state before it jumps.
Say {N(t); t≥0} is a Poisson process with rate λ. Then {N(t); t≥0} is also a continuous-time Markov chain: it stays at i for an exponential (rate λ) time and then jumps to i+1.
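To make the characterization concrete, here is a minimal simulation sketch (Python, not part of the original notes): it draws an exponential holding time in the current state and then jumps according to the embedded discrete-time chain. The rates q and the transition matrix P below are made-up examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ctmc(q, P, x0, t_max):
    """Simulate a continuous-time Markov chain up to time t_max.

    q[i]  : rate of the exponential holding time in state i
    P[i,j]: embedded (discrete-time) transition probabilities, with P[i,i] = 0
    Returns the jump times and the visited states."""
    t, x = 0.0, x0
    times, states = [0.0], [x0]
    while True:
        t += rng.exponential(1 / q[x])      # exponential holding time in state x
        if t > t_max:
            break
        x = rng.choice(len(q), p=P[x])      # jump according to the embedded chain
        times.append(t)
        states.append(x)
    return times, states

# made-up 3-state example
q = np.array([1.0, 2.0, 3.0])
P = np.array([[0.0, 0.5, 0.5],
              [0.3, 0.0, 0.7],
              [0.6, 0.4, 0.0]])
times, states = simulate_ctmc(q, P, x0=0, t_max=10.0)
print(list(zip(np.round(times, 2), states)))
```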
Let's start by considering a finite state space continuous-time Markov chain, that is \(X(t)\in\{0,\dots,N\}\), say. Let

\[P_{ij}(t)=P(X(t)=j\,|\,X(0)=i)\]
Then the Markov property asserts that {X(t), t≥0} satisfies, for all \(s,t\ge0\) and all states i, j:

a) \(P_{ij}(t)\ge0\)

b) \(\sum_j P_{ij}(t)=1\)

c) \(P_{ij}(s+t)=\sum_k P_{ik}(s)P_{kj}(t)\)

d*) \(\lim_{h\rightarrow0}P_{ii}(h)=1\) and \(\lim_{h\rightarrow0}P_{ij}(h)=0\) for \(i\ne j\)
where c) is just the Chapman-Kolmogorov equation, which follows from the Markov property. d*) is not strictly a consequence of the Markov property but is usually a sensible additional condition.
Let \(P(t)=(P_{ij}(t))\) denote the matrix of transition probabilities at time t, so P(t) is a matrix whose entries are functions of t.
Now c) can be written as \(P(s+t)=P(s)P(t)\) for \(s,t\ge0\), and d*) as \(\lim_{h\rightarrow0}P(h)=I\).
The rates \(q_i\) (the rate of the exponential holding time in state i) and \(q_{ij}\) (the rate at which the process, when in state i, jumps to state j, so that \(q_i=\sum_{j\ne i}q_{ij}\)) give us a second way to describe a Markov chain, called the infinitesimal description: as \(h\rightarrow0\),

\[P(X(t+h)=j\,|\,X(t)=i)=q_{ij}h+o(h)\ \ (j\ne i),\qquad P(X(t+h)=i\,|\,X(t)=i)=1-q_ih+o(h)\]
Let A be the matrix with entries \(A_{ii}=-q_i\) and \(A_{ij}=q_{ij}\) for \(i\ne j\). A is called the infinitesimal matrix (or generator) of the chain; note that its rows sum to 0.
Remark
This way of describing a Markov process is more important than it might appear at first for two reasons:
It is often much easier to say what happens in a process during a very short period of time than to find the transition matrix.

This approach to Markov processes lends itself to generalizing the idea of a Markov process to more complicated state spaces via so-called operator semigroups. These are transformations on complicated, even infinite-dimensional, spaces that act very much like matrix multiplication.
Say {N(t); t≥0} is a Poisson process with rate λ. Then \(A_{i,i}=-\lambda\), \(A_{i,i+1}=\lambda\), and all other entries of A are 0.
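As a quick numerical sanity check of the infinitesimal description in this example (a sketch, not part of the notes; λ = 1 and h = 0.001 are arbitrary choices):

```python
import math

lam, h = 1.0, 1e-3
# For a Poisson process, P(N(h) = k) = exp(-lam*h) * (lam*h)^k / k!
p_stay = math.exp(-lam * h)              # P(i -> i in time h)
p_up   = math.exp(-lam * h) * lam * h    # P(i -> i+1 in time h)
print(p_stay, 1 - lam * h)   # both ~ 1 - lam*h, i.e. 1 + h*A[i,i]
print(p_up, lam * h)         # both ~ lam*h,     i.e. h*A[i,i+1]
```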
Say {X(t), t≥0} is a Markov chain with \(X(t)\in\{0,1\}\) and
We can compute the transition matrix as the matrix exponential \(P(t)=e^{tA}=\sum_{n\ge0}\frac{t^n}{n!}A^n\), so we need \(A^n\).
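The matrix for this example is not reproduced above, so here is a sketch with an assumed two-state generator (the rates 2 and 3 are made up), computing \(P(t)=e^{tA}\) numerically and checking properties c) and d*):

```python
import numpy as np
from scipy.linalg import expm

# assumed (made-up) two-state generator: leave state 0 at rate 2, leave state 1 at rate 3
A = np.array([[-2.0,  2.0],
              [ 3.0, -3.0]])

def P(t):
    """Transition matrix P(t) = exp(t*A)."""
    return expm(t * A)

s, t, h = 0.4, 0.7, 1e-4
print(np.allclose(P(s + t), P(s) @ P(t)))                 # semigroup property c)
print(np.allclose(P(h), np.eye(2) + h * A, atol=1e-6))    # P(h) ~ I + h*A for small h
```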
It is easy to find the stationary distribution (if it exists) of a continuous-time discrete-space Markov chain in terms of the infinitesimal matrix. If all states communicate, that is if \(P_{ij}(t)>0\) for all i, j and some t>0, then \(\lim_{t\rightarrow\infty}P_{ij}(t)=\pi_j>0\) exists. Note that \(q_i>0\) for all i (otherwise the chain would never leave i), and so we have \(\pi A=0\), or

\[\pi_jq_j=\sum_{i\ne j}\pi_iq_{ij},\qquad j=0,\dots,N\]
Say {N(t); t≥0} is again a Poisson process with rate λ, so \(A_{i,i}=-\lambda\) and \(A_{i,i+1}=\lambda\).
Here \(\pi A=0\) leads to \(-\pi_i\lambda+\pi_{i+1}\lambda=0\), or \(\pi_i=\pi_0\) for all i; but then \(\sum_i\pi_i=\infty\), so a stationary distribution does not exist. This is obvious because \(N(t)\rightarrow\infty\).
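Numerically this shows up as \(P_{0j}(t)=e^{-\lambda t}(\lambda t)^j/j!\rightarrow0\) for every fixed j; a tiny check (λ = 1 chosen arbitrarily):

```python
from math import exp, factorial

lam, j = 1.0, 5
for t in [1, 10, 100, 1000]:
    p = exp(-lam * t) * (lam * t) ** j / factorial(j)   # P(N(t) = j | N(0) = 0)
    print(t, p)
# the probabilities all tend to 0, so they cannot form a limiting distribution
```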
A company has a computer for its website. If the computer is down they can’t sell anything, so they have a backup, which takes over if the first computer is down. The operating computer fails after an exponentially distributed time (with rate μ). Repair times are also exponentially distributed (with rate λ). Let’s assume that μ is fixed but we have a choice of λ (by hiring more technicians). We want to make sure that in the long run at most 1% of the time both computers are down. How should we choose λ?
Let X(t) be the number of computers in operating condition at time t, so \(X(t)\in\{0,1,2\}\). Then X(t) is a Markov chain with infinitesimal matrix
What is the average “total downtime”, that is, the long-run fraction of time when neither computer is working? The system of equations for the stationary distribution is
and we see that the probability only depends on the ratio λ/μ. Set \(x=\lambda/\mu\).
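The matrix and the algebra are not written out above, so here is a sketch (not part of the notes) using the generator one gets under the usual assumptions for this model, namely that only the operating computer can fail and that only one computer is repaired at a time. It solves the balance equations \(\pi A=0\) numerically and searches for the smallest \(x=\lambda/\mu\) with \(\pi_0\le0.01\):

```python
import numpy as np

def stationary(A):
    """Solve pi A = 0 together with sum(pi) = 1 for a finite-state generator A."""
    n = A.shape[0]
    M = np.vstack([A.T, np.ones(n)])   # equations (pi A)_j = 0, plus normalization
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(M, b, rcond=None)
    return pi

def downtime(x):
    """Long-run fraction of time both computers are down, for x = lambda/mu.
    Assumed model (not from the notes): state = number of working computers,
    only the operating computer fails, one repair at a time; with mu = 1 the
    stationary distribution depends only on x."""
    lam, mu = x, 1.0
    A = np.array([[-lam,         lam,  0.0],
                  [  mu, -(lam + mu),  lam],
                  [ 0.0,          mu,  -mu]])
    return stationary(A)[0]            # pi_0 = probability that no computer works

xs = np.arange(1.0, 20.0, 0.01)
ok = np.array([downtime(x) for x in xs]) <= 0.01
print(xs[np.argmax(ok)])   # about 9.47; analytically pi_0 = 1/(1+x+x^2), so x >= (sqrt(397)-1)/2 ~ 9.46
```

So under these assumptions one would need λ to be at least roughly 9.5 times μ.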
Many webpages have a counter that keeps track of the number of people who have visited the site. We can model such a counter as a Markov chain called a "Pure Birth" process. At time 0 there have been 0 visitors. Say at time t there have been X(t)=n visitors. The counter stays at n for a time T that has an exponential distribution with rate λ and then jumps to n+1.

Of course this is also a Poisson process with rate λ.
Consider a system whose state at any time is the number of "people" in the system. Suppose that whenever there are n people in the system, (i) new arrivals enter the system at an exponential rate \(\lambda_n\) (a "birth"), and (ii) people leave the system at an exponential rate \(\mu_n\) (a "death").
Thus a birth and death process is a Markov chain with state space {0,1,2,...} and

1) \(q_{i,i+1}=\lambda_i\)

2) \(q_{i,i-1}=\mu_i\) (with \(\mu_0=0\))

3) \(q_i=\lambda_i+\mu_i\)

4) \(P_{i,i+1}=\dfrac{\lambda_i}{\lambda_i+\mu_i}=1-P_{i,i-1}\)

where 4) is because we go from i to i+1 if there is a birth before a death.
Let \(X\sim Exp(\lambda)\), \(Y\sim Exp(\mu)\) and \(X\perp Y\). Now

\[P(X<Y)=\int_0^\infty P(X<y)\,\mu e^{-\mu y}\,dy=\int_0^\infty\left(1-e^{-\lambda y}\right)\mu e^{-\mu y}\,dy=1-\frac{\mu}{\lambda+\mu}=\frac{\lambda}{\lambda+\mu}\]
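A quick Monte Carlo check of this probability (a sketch; the values λ = 2 and μ = 3 are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, mu, n = 2.0, 3.0, 10**6
X = rng.exponential(1 / lam, n)   # X ~ Exp(lam)
Y = rng.exponential(1 / mu, n)    # Y ~ Exp(mu)
print(np.mean(X < Y), lam / (lam + mu))   # both about 0.4
```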
Consider a population of m individuals that at time 0 consists of 1 "infected" and m-1 "susceptible" individuals (individuals that might get infected, maybe because they have not been immunized). Once infected, an individual remains so forever, and we suppose that in any time interval of length h any given infected person will cause, with probability αh+o(h), any given susceptible to become infected. If we let X(t) denote the number of infected people in the population at time t, {X(t), t≥0} is a pure birth process with
\[\lambda_n=(m-n)n\alpha,\qquad n=1,\dots,m-1\]
because if there are n infected people, each of the \(m-n\) uninfected ones becomes infected at rate \(n\alpha\), so the total rate of new infections is \((m-n)n\alpha\).
Let \(T_i\) be the time to go from i to i+1 infected, and let T be the time until the total population is infected. Then \(T=\sum_{i=1}^{m-1}T_i\), where the \(T_i\) are independent with \(T_i\sim Exp(\lambda_i)\), and so

\[E[T]=\sum_{i=1}^{m-1}\frac{1}{\lambda_i}=\frac{1}{\alpha}\sum_{i=1}^{m-1}\frac{1}{i(m-i)}\]
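A small sketch (not part of the notes) that evaluates this expectation and checks it by simulation; m = 10 and α = 0.1 are made-up values:

```python
import numpy as np

rng = np.random.default_rng(2)
m, alpha = 10, 0.1

# exact expected time until everyone is infected
ET = sum(1 / (alpha * i * (m - i)) for i in range(1, m))

# simulation: T is a sum of independent exponential stage times
rates = np.array([alpha * i * (m - i) for i in range(1, m)])
sims = rng.exponential(1 / rates, size=(100_000, m - 1)).sum(axis=1)
print(ET, sims.mean())   # the two values should be close
```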