A stochastic process \{N(t); t\ge 0\} is called a counting process if N(t) is the number of times an event has occurred up to time t.
Consider an ATM and let N(t) be the number of customers it has served by time t.
Because of the way it is defined, every counting process has the following properties:
Let {N(t);t≥0} be a counting process. Then
N(t) ≥ 0
N(t) is integer-valued
if s < t, then N(s) ≤ N(t)
for s < t, N(t)-N(s) is the number of events that occurred in the interval (s,t]
{N(t);t≥0} is said to have independent increments if the numbers of events that occur in disjoint time intervals are independent.
{N(t);t≥0} is said to have stationary increments if the distribution of the number of events that occur in any interval of time depends only on the length of the interval.
The process for our ATM probably has independent increments but not stationary increments. Why?
The most important example of a counting process is the Poisson process. To define it we need the following notation, called Landau’s o symbol:
Landau’s o symbol
a function f is said to be o(h) (“little o of h”) if
\lim_{h\rightarrow 0} \frac{f(h)}{h} = 0
In other words, f goes to 0 faster than the linear function.
f(h)=h^2 is o(h) but f(h)=h is not.
If f and g are o(h) and k is any constant, then f\pm kg is o(h) because
\lim_{h\rightarrow 0} \frac{f(h)\pm kg(h)}h = \lim_{h\rightarrow 0} \frac{f(h)}h\pm k\lim_{h\rightarrow 0} \frac{g(h)}h = 0\pm k\cdot 0= 0
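As a quick numerical illustration (the values of h are of course arbitrary), we can compute f(h)/h for shrinking h in R:
h <- 10^(-(1:6))   # a sequence of shrinking values of h
h^2/h              # f(h)=h^2: the ratio goes to 0, so h^2 is o(h)
h/h                # f(h)=h: the ratio stays at 1, so h is not o(h)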
A counting process \{N(t);t \ge 0\} is said to be a Poisson process with rate \lambda>0 if
N(0)=0
N(t) has stationary and independent increments
P(N(h)=1) = \lambda h + o(h)
P(N(h) \ge 2) = o(h)
Notice that this implies that during a short time interval the probability of exactly one event occurring is (approximately) proportional to the length of the interval, and the probability of two or more events occurring is negligible.
Let \{N(t);t \ge 0\} be a Poisson process, then
N(t+s)-N(s) \sim Pois(\lambda t) for all s \ge 0.
proof
Let p_n(t) = P(N(t)=n). Then
\begin{aligned} &p_0(t+h) =P(N(t+h)=0) =\\ &P(N(t)=0, N(t+h)-N(t)=0) = \\ &P(N(t)=0)P(N(t+h)-N(t)=0) = \text{ \{independent increments\}}\\ &p_0(t)P(N(h)=0)= \text{ \{stationary increments\}}\\ &p_0(t)\left[1-P(N(h)\ge 1)\right] = \\ &p_0(t)\left[1-P(N(h)= 1)-P(N(h) > 1)\right] = \\ &p_0(t)\left[1-(\lambda h+o(h))-o(h)\right] =\\ &p_0(t)\left[1-\lambda h+o(h)\right] =\text{ \{property of o(h)\}}\\ &p_0(t)-\lambda h p_0(t) +o(h) \end{aligned}
so
\begin{aligned} &\frac{p_0(t+h)-p_0(t)}{h} = -\lambda p_0(t) +\frac{o(h)}h \\ &\\ &p'_0(t) =\lim_{h\rightarrow 0} \frac{p_0(t+h)-p_0(t)}{h} =\\ &\lim_{h\rightarrow 0} \left[-\lambda p_0(t) +\frac{o(h)}h\right]= -\lambda p_0(t)\\ &\\ &\frac{p'_0(t)}{p_0(t)}=-\lambda\\ &\int \frac{p'_0(t)}{p_0(t)} dt = \int -\lambda dt = -\lambda t+C \\ &\int \frac{p'_0(t)}{p_0(t)} dt = \log p_0(t) +C\\ &\log p_0(t) = -\lambda t+C \\ &0=\log p_0(0) = -\lambda\cdot 0+C = C \text{ \{since } N(0)=0 \text{ implies } p_0(0)=1\}\\ &\log p_0(t) = -\lambda t \\ &p_0(t) = e^{-\lambda t} = P(X=0) \\ \end{aligned}
where X is a random variable with X\sim Pois(\lambda t).
The same basic idea works for the case p_n(t) as well to finish the proof.
Remark It is intuitively clear why the definition above should lead to the Poisson distribution. Take the interval (0,t] and subdivide it into k equal size intervals (0,t/k], (t/k, 2t/k], …, ((k-1)t/k, t]. The probability of 2 or more events in any one subinterval goes to 0 as k goes to \infty because
P(2\text{ or more events in some subinterval})\le \\ \sum_{i=1}^k P(2\text{ or more events in the }i^{th}\text{ subinterval})= k\,o(t/k) by Boole’s Inequality. Now
k\,o(t/k)=t\times o(t/k)/(t/k)\rightarrow 0 as k\rightarrow \infty.
Therefore N(t) will (with probability going to 1) just equal the number of subintervals in which an event occurs. However, by independent and stationary increments this number will have a binomial distribution with parameters k and p=\lambda t/k+o(t/k). Hence by the Poisson approximation to the binomial we see that N(t) will have a Poisson distribution with mean \lambda t.
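Here is a small numerical check of this remark in R; the values \lambda=2, t=1 and the number of subintervals k=1000 are an arbitrary choice:
lambda <- 2; t <- 1; k <- 1000       # k subintervals, each of length t/k
n <- 0:6
round(dbinom(n, k, lambda*t/k), 4)   # P(exactly n subintervals contain an event)
round(dpois(n, lambda*t), 4)         # Poisson probabilities, essentially identical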
Suppose telephone calls arrive at a call center according to a Poisson process at a rate of 5 per hour. What is the probability of no calls during the first 30 minutes?
P(N(1/2)=0)=\frac{(5\times 1/2)^0}{0!}e^{-5\times 1/2}=e^{-2.5}=0.0821
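or in R
round(dpois(0, 5*1/2), 4)
## [1] 0.0821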
What is the probability of 10 or more calls during the first hour?
P(N(1)\ge 10)=1-P(N(1)\le 9)=1-\sum_{k=0}^9 \frac{5^k}{k!}e^{-5}
round(1 - ppois(9, 5), 4)
## [1] 0.0318
Suppose that N points are uniformly distributed over the interval (0, N). Let X be the number of points in (0,1). We want to find the pdf of X if N is large.
Let’s try this directly first:
\begin{aligned} &P(X=0) =P(U_i>1;i=1,...,N) =\\ &P(U_1>1)^N= \left(1-\frac1N \right)^N\approx e^{-1} \end{aligned} where U_1,..,U_N denote the (independent) points. Next we need
\begin{aligned} &P(X=1) =\sum_{j=1}^N P(U_j<1;U_i>1 \text{ for } i\ne j) =\\ &\sum_{j=1}^N \frac1N \left(1-\frac1N \right)^{N-1} = \\ &\left(1-\frac1N \right)^{N-1} = \\ &\left(1-\frac1N \right)^{N}/(1-\frac1N) \approx \\ &e^{-1}/(1-\frac1N) = \frac{N}{e(N-1)}\approx e^{-1} \end{aligned} Now for P(X=2) we need two of the N U’s to be less than 1 and the others greater than 1, and this gets ugly fast.
Instead consider the following: let N(t) be the number of points in (0,t]; then for t small (relative to N) \{N(t); t \ge 0\} is approximately a Poisson process with some rate \lambda. Now
P(N(1)=0) = P(X=0) = e^{-1}
so \lambda =1 and so
P(X=n) = P(N(1)=n) = e^{-1}/n!
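Here is a small simulation sketch of this argument; the choices N=1000 points and B=10^4 runs are arbitrary:
set.seed(111)
N <- 1000; B <- 10000
x <- replicate(B, sum(runif(N, 0, N) < 1))   # number of points in (0,1) in each run
round(table(x)/B, 3)                         # simulated distribution of X
round(dpois(0:max(x), 1), 3)                 # Poisson(1) probabilities for comparison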
Let T_1 be the time when the first event occurs, T_2 the time from the first event until the second event etc. The sequence T_1, T_2, ... is called the sequence of interarrival times.
Let \{N(t);t \ge 0\} be a Poisson process, and \{T_i ;i \ge 1\} be the interarrival times. Then T_i \sim Exp(\lambda) and T_i \perp T_j for i\ne j
proof
Note that \{T_1>t\} is equivalent to {no events occurred in (0,t]}, and so
\begin{aligned} &P(T_1>t) =P(N(t)=0) = e^{-\lambda t} \\ &F_{T_1}(t) =1-P(T_1>t) = 1-e^{-\lambda t}\\ &f_{T_1}(t) = \lambda e^{-\lambda t}\\ \end{aligned}
and we see that T_1\sim Exp(\lambda). But
\begin{aligned} &P(T_2>t\vert T_1=s) = \\ &P(\text{0 events in }(s, s+t]\vert T_1=s) = \text{ \{independent increments\}}\\ &P(\text{0 events in }(s, s+t]) = \text{ \{stationary increments\}}\\ &P(\text{0 events in }(0, t]) = e^{-\lambda t} \end{aligned}
So we find that T_2\sim Exp(\lambda) and that T_1 \perp T_2. By induction it is clear that the sequence \{T_n ,n=1,2,..\} is an iid sequence of exponential r.v. with mean 1/\lambda.
Remark This result should not come as a surprise because the assumption of independent and stationary increments means that the process from any moment on is independent of all that occurred before and also has the same distribution as the process started at 0. In other words the process is memoryless, and we have previously shown that any continuous rv on (0, \infty) with the memoryless property has to have an exponential distribution.
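This also gives a simple way to simulate a Poisson process: generate the interarrival times as independent exponentials and add them up. A minimal sketch in R, with the arbitrary choices \lambda=2 and time horizon 10:
set.seed(222)
lambda <- 2; tmax <- 10
Ti <- rexp(1000, lambda)   # interarrival times T_i ~ Exp(lambda); 1000 is more than enough here
W <- cumsum(Ti)            # arrival times W_n = T_1 + ... + T_n
W <- W[W <= tmax]          # keep the arrivals in (0, tmax]
length(W)                  # N(tmax), which has a Pois(lambda*tmax) distribution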
Let W_n be the arrival time of the n^{th} event. (This is also often called the waiting time until the n^{th} event).
Let \{N(t);t \ge 0\} be a Poisson process, and \{W_n; n \ge 1\} be the waiting times. Then W_n \sim \Gamma(n,\lambda)
proof
Clearly W_n = \sum^n_{i=1} T_i, the sum of n independent Exp(\lambda) random variables, and so we find W_n \sim \Gamma(n,\lambda)
Up to yesterday a store has had 999,856 customers. They are planning to hold a little party when the 1,000,000th customer comes into the store. From experience they know that a customer arrives about every 4 minutes, and the store is open from 9am to 6pm. What is the probability that they will have the party today?
They will have the party if at least 144 customers come into the store today. Let’s assume that the customers arrive according to a Poisson process with rate \lambda=1/4 per minute (one customer every 4 minutes on average); then we want the probability P(W_{144} < 9\times 60).
Now W_{144} \sim \Gamma (144,1/4)
and so
round(pgamma(9*60, 144, 1/4), 4)
## [1] 0.2302
Here is another proof of the last theorem. We use the fact that the n^{th} event occurs at or before time t if and only if the number of events occurring by time t is at least n. So
N(t) \ge n \text{ iff } W_n \le t
This is a very useful equivalence, and it holds much more generally than just for the Poisson process. Using it we find
\begin{aligned}
&F_{W_n}(t) = P(W_n\le t) = P(N(t)\ge n) =\\
&\sum_{k=n}^\infty P(N(t)=k) = \sum_{k=n}^\infty \frac{(\lambda t)^k}{k!}e^{-\lambda t}
\end{aligned}
\begin{aligned}
&f_{W_n}(t) = \frac{dF_{W_n}(t)}{dt}=\\
&\frac{d}{dt} \sum_{k=n}^\infty \frac{(\lambda t)^k}{k!}e^{-\lambda t} = \\
&\sum_{k=n}^\infty \frac{\lambda^k}{k!}\frac{d}{dt}\left[t^ke^{-\lambda t} \right]= \\
&\sum_{k=n}^\infty \frac{\lambda^k}{k!}\left[kt^{k-1}e^{-\lambda t}-\lambda t^ke^{-\lambda t} \right]= \\
&\sum_{k=n}^\infty \left[\lambda\frac{(\lambda t)^{k-1}}{(k-1)!}-\lambda \frac{(\lambda t)^{k}}{k!} \right]e^{-\lambda t}= \\
&\lambda e^{-\lambda t}\sum_{k=n}^\infty \left[\frac{(\lambda t)^{k-1}}{(k-1)!}- \frac{(\lambda t)^{k}}{k!} \right]= \\
&\lambda e^{-\lambda t} \left[\sum_{k=n}^\infty\frac{(\lambda t)^{k-1}}{(k-1)!}-\sum_{k=n}^\infty \frac{(\lambda t)^{k}}{k!} \right]= \\
&\lambda e^{-\lambda t} \left[\sum_{j=n-1}^\infty\frac{(\lambda t)^{j}}{j!}-\sum_{k=n}^\infty \frac{(\lambda t)^{k}}{k!} \right]= \\
&\lambda e^{-\lambda t} \left[\frac{(\lambda t)^{n-1}}{(n-1)!}+\sum_{j=n}^\infty\frac{(\lambda t)^{j}}{j!}-\sum_{k=n}^\infty \frac{(\lambda t)^{k}}{k!} \right]= \\
&\lambda e^{-\lambda t}\frac{(\lambda t)^{n-1}}{(n-1)!}=\\
&\frac{\lambda^n}{\Gamma(n)}t^{n-1}e^{-\lambda t}
\end{aligned}
and so W_n\sim \Gamma(n, \lambda)
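As an illustration, the party example above can be redone with this equivalence: W_{144}\le 540 if and only if N(540)\ge 144, and N(540)\sim Pois(540/4), so
round(1 - ppois(143, 540/4), 4)
## [1] 0.2302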
Say that on any given day hundreds of cars pass through a certain intersection. Any one of them has a (hopefully small) probability of having an accident at that intersection. Let N(t) be the number of accidents in the first t days; is N(t) a Poisson process?
There are two problems with the assumptions of the Poisson process here:
different days might have different numbers of cars going through (weekdays vs. weekends?)
the probability of having an accident is probably very different for different cars.
The first problem might be handled by considering a different time-scale (accidents per week?), the second problem actually is not a problem at all:
Let Z_1, Z_2, ... be independent Bernoulli rv’s with P(Z_i =1)=p_i. Let S_n =Z_1 +...+Z_n and \lambda=p_1 +...+p_n. Then
\vert P(S_n=k)-\frac{\lambda^k}{k!}e^{-\lambda}\vert\le\sum\limits_{i=1}^{n}p_i^2
proof omitted
In the “classic” case where p_1 =..=p_n =p=\lambda/n we have
\vert P(S_n=k)-\frac{\lambda^k}{k!}e^{-\lambda}\vert\le\frac{\lambda^2}{n}
and we see that this theorem not only gives us reason to think that the Poisson approximation works in the example above, it also provides a useful estimate of the error in the Poisson approximation to the Binomial.
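As a quick numerical check of the bound in R (the choices n=100 and p=0.05 are arbitrary):
n <- 100; p <- 0.05; lambda <- n*p
k <- 0:25
max(abs(dbinom(k, n, p) - dpois(k, lambda)))   # largest pointwise difference, well below the bound
lambda^2/n                                     # the bound from the theorem, here 0.25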
We have seen previously that if
U_1 ,...,U_n \sim U[0,1]
and independent, then the order statistic \left(U_{(1)},...,U_{(n)}\right) has joint density
f(u_{(1)},..,u_{(n)})=n!; 0<u_{(1)}< ..<u_{(n)}<1
Clearly if U_1 ,..,U_n \sim U[0,t] and independent, then \left(U_{(1)},...,U_{(n)}\right) has joint density
f(u_{(1)},..,u_{(n)})=\frac{n!}{t^n}; 0<u_{(1)}< ..<u_{(n)}<t
Let W_1, W_2,... be the arrival times of a Poisson process with rate \lambda. Then
f_{(W_1,..,W_n)|N(t)=n}(w_1,..,w_n)=\frac{n!}{t^n};0<w_1<..<w_n<t
In other words, conditional on the total number of arrivals, the arrival times have the same distribution as the order statistic of independent uniforms on (0,t).
proof omitted
Say people arrive at a store according to a Poisson process with rate 7 per hour. What is the expected arrival time of the first customer, given that after one hour 10 customers have arrived at the store?
More generally say \{N(t),t \ge 0\} is a Poisson process with rate \lambda. We want to find the mean time of the first event, given that N(1)=n, n \ge 1, that is we want E[W_1|N(1)=n].
According to the theorem
\begin{aligned} &\left\{W_1,...,W_n|N(1)=n\right\} =^d \left(U_{(1)},...,U_{(n)}\right)\\ &U_i\sim U[0,1] \\ &P(W_1>t|N(1)=n) = \\ &P(W_1>t,...,W_n>t|N(1)=n) = \\ &P(U_{(1)}>t,...,U_{(n)}>t) = \\ &P(U_1>t,...,U_n>t) = \\ &P(U_1>t)^n=\\ & \left(\int_t^1 ds \right)^n=(1-t)^n\\ \end{aligned} and so
E[W_1|N(1)=n] = \int_0^1 (1-t)^n dt=-\frac{(1-t)^{n+1}}{n+1}|_0^1=\frac1{n+1} where we used theorem (1.7.9).
In the case of the store we find the expected arrival time of the first customer to be 1/11 hours, or about 5.45 minutes. Note that this answer does not depend on the rate of 7 per hour!
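A quick simulation check for the store, using the order-statistic characterization (10^4 runs, an arbitrary choice):
set.seed(333)
round(60*mean(replicate(10000, min(runif(10)))), 2)   # simulated expected arrival time in minutes
round(60/11, 2)                                       # theoretical value, about 5.45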
Customers arrive at a store according to a Poisson process of rate \lambda. Each customer pays $1 on arrival, and we want to evaluate the expected value of the total sum collected during (0,t], discounted back to time 0.
If the discount (inflation) rate is \beta, then $1 paid at time W_k is worth \exp\{-\beta W_k\} dollars when discounted back to time 0. If a total of N(t) people arrived at the store by time t, the expected total discounted amount is given by
M=E \left[\sum_{k=1}^{N(t)} \exp\{-\beta W_k\}\right]
Now using the formula for conditional expectations we find
\begin{aligned} &M=E \left[\sum_{k=1}^{N(t)} \exp\{-\beta W_k\}\right] = \\ &E \left\{ E \left[\sum_{k=1}^{N(t)} \exp\{-\beta W_k\}\vert N(t)\right]\right\}=\\ &\sum_{n=1}^\infty E \left[\sum_{k=1}^{N(t)} \exp\{-\beta W_k\}\vert N(t)=n\right]P \left(N(t)=n \right) \end{aligned} Now \begin{aligned} &E \left[\sum_{k=1}^{N(t)} \exp\{-\beta W_k\}\vert N(t)=n\right] = \\ &E \left[\sum_{k=1}^{n} \exp\{-\beta W_k\}|N(t)=n\right] = \\ &E\left[\sum_{k=1}^{n} \exp\{-\beta U_{(k)}\}\right] = \\ &E\left[\sum_{k=1}^{n} \exp\{-\beta U_{k}\}\right] = \\ &nE\left[\exp\{-\beta U_{1}\}\right] = \\ &n\int_0^t e^{-\beta x}\frac1t dx=\\ &-\frac{n}{\beta t}e^{-\beta x}|_0^t=\frac{n}{\beta t}(1-e^{-\beta t}) \end{aligned} and finally
\begin{aligned} &M = \sum_{n=1}^\infty E \left[\sum_{k=1}^{N(t)} \exp\{-\beta W_k\}\vert N(t)=n\right]P \left(N(t)=n \right)=\\ &\sum_{n=1}^\infty \frac{n}{\beta t}(1-e^{-\beta t})P \left(N(t)=n \right) = \\ &\frac{1}{\beta t}(1-e^{-\beta t})\sum_{n=1}^\infty nP \left(N(t)=n \right) = \\ &\frac{1}{\beta t}(1-e^{-\beta t})E[N(t)]=\\ &\frac{1}{\beta t}(1-e^{-\beta t})\lambda t=\\ &\frac{\lambda}{\beta}(1-e^{-\beta t}) \end{aligned}
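Here is a simulation sketch of this result; the values \lambda=2, \beta=0.1 and t=10 are arbitrary:
set.seed(444)
lambda <- 2; beta <- 0.1; tmax <- 10
one.run <- function() {
  W <- cumsum(rexp(500, lambda))   # arrival times of the Poisson process
  sum(exp(-beta*W[W <= tmax]))     # total discounted payments collected in (0, tmax]
}
round(mean(replicate(10000, one.run())), 2)   # simulated value of M
round(lambda/beta*(1 - exp(-beta*tmax)), 2)   # theoretical value lambda/beta*(1-exp(-beta*t))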
Let N_1(t), N_2(t) be independent Poisson processes with rates \lambda and \mu, respectively. Then
N(t)=N_1(t)+N_2(t) is a Poisson process with rate \lambda+\mu
N_1(t)|N_1(t)+N_2(t)=n \sim Bin(n,\frac{\lambda}{\lambda+\mu})
proof
The conditions of a Poisson process are easily verified.
\begin{aligned} &P(N_1(t)=n|N_1(t)+N_2(t)=m+n) = \\ &\frac{P(N_1(t)=n,N_1(t)+N_2(t)=m+n)}{P(N_1(t)+N_2(t)=m+n)} = \\ &\frac{P(N_1(t)=n,N_2(t)=m)}{P(N(t)=m+n)} = \\ &\frac{P(N_1(t)=n)P(N_2(t)=m)}{P(N(t)=m+n)} = \\ &\frac{\frac{(\lambda t)^n}{n!}e^{-\lambda t}\frac{(\mu t)^{m}}{m!}e^{-\mu t}}{\frac{((\lambda+\mu) t)^{m+n}}{(m+n)!}e^{-(\lambda+\mu) t}} =\\ &\frac{(m+n)!}{n!m!}\left(\frac{\lambda}{\lambda+\mu}\right)^n\left(\frac{\mu}{\lambda+\mu}\right)^m = \\ &{{n+m}\choose{n}}\left(\frac{\lambda}{\lambda+\mu}\right)^n\left(1-\frac{\lambda}{\lambda+\mu}\right)^m \end{aligned}
Consider a Poisson process \{N(t); t \ge 0\} with rate \lambda, and suppose each time an event occurs it is classified as either type I or type II, with probabilities p and q=1-p, respectively, independently of everything else. Let N_1(t) and N_2(t) be the respective numbers of type I and type II arrivals by time t. Then \{N_1 (t); t \ge 0\} and \{N_2 (t); t \ge 0\} are both Poisson processes with respective rates p\lambda and (1-p)\lambda. Furthermore the two processes are independent.
proof
Using the law of total probability we find
\begin{aligned} &P(N_1(t)=n,N_2(t)=m) = \\ &\sum_{k=0}^\infty P(N_1(t)=n,N_2(t)=m|N(t)=k)P(N(t)=k) = \\ &P(N_1(t)=n,N_2(t)=m|N(t)=n+m)P(N(t)=n+m) = \\ &{{n+m}\choose{m}} p^nq^m \frac{(\lambda t)^{n+m}}{(n+m)!}e^{-\lambda t} = \\ &\frac{(n+m)!}{n!m!} p^nq^m \frac{(\lambda t)^{n}(\lambda t)^{m}}{(n+m)!}e^{-\lambda (p+q)t} = \\ &\frac{1}{n!m!} p^nq^m (\lambda t)^{n}(\lambda t)^{m}e^{-\lambda pt}e^{-\lambda qt} = \\ &\left( \frac{(\lambda pt)^{n}}{n!}e^{-\lambda pt}\right)\left( \frac{(\lambda qt)^{m}}{m!}e^{-\lambda qt}\right)=\\ &P(N_1(t)=n)P(N_2(t)=m) \end{aligned} where the middle equality holds because, given N(t)=n+m, each of the n+m events is independently of type I with probability p, so the number of type I events has a Bin(n+m,p) distribution.
The theorem easily generalizes to r types.
Customers arrive at a store according to a Poisson process with rate of 2 per hour. Each customer is a “Buyer” with probability 0.3 or a “Window-Shopper” with probability q=0.7. What is the probability of at least 1 sale during a 2 hour period?
P(\text{at least 1 sale}) =
P(N_1 (2) \ge 1) =
1-P(N_1 (2)=0) = 1-e^{-0.3\times 2\times 2} = 1-e^{-1.2} \approx 0.70
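or in R
round(1 - dpois(0, 0.3*2*2), 4)
## [1] 0.6988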
Coupon Collector Problem
There are m different coupons. Each time a person collects a coupon it is, independently of those previously obtained, of type j with probability p_j. Let N denote the number of coupons one needs in order to have a complete collection of at least one of each type. We want to find E[N].
Let N_j be the number of coupons needed until we have one of type j, then
N=\max\{N_j; 1\le j\le m\}
It is easy to see that N_j \sim G(p_j ), but they are not independent and so finding the distribution of their maximum is very difficult.
Let’s assume that coupons are collected according to a Poisson process with rate 1, and say an event is of type j if the coupon collected was of type j. If we let N_j(t) denote the number of type j coupons collected by time t, then it follows that \{N_j (t); t \ge 0\} are independent Poisson processes with rates p_j.
Let W_j denote the time of the first event of type j, and let W=\max\{ W_j; 1 \le j \le m\} be the time when we have all the coupons. Now the W_j are the waiting times until the first event of independent Poisson processes, so they are also the first interarrival times and therefore have exponential distributions and are independent, so
\begin{aligned} &P(W<t) = P \left( \max\{ W_j; 1 \le j \le m\}<t\right)=\\ &\\ &P(W_1<t,...,W_m<t) = \prod_{j=1}^m P(W_j<t) = \prod_{j=1}^m \left(1-e^{-p_j t} \right)\\ &\\ &E[W] =\int_0^\infty P(W>t)dt = \int_0^\infty \left[1- \prod_{j=1}^m \left(1-e^{-p_j t} \right)\right]dt \end{aligned}
Now let T_i be the i^{th} interarrival time, that is, the time between collecting the (i-1)^{st} and the i^{th} coupon. Then W= \sum_{i=1}^N T_i, where the T_i \sim Exp(1) are independent of each other and of N, so
E[W|N]=E[\sum T_i |N]=NE[T_1 |N]=N
so
E[W]=E\{E[W|N]\}=E[N]
For example, say all coupons appear equally often, that is p_1 =...=p_m =p=1/m, then
\begin{aligned} &E[N] =E[W]= \int_0^\infty \left[1- \prod_{j=1}^m \left(1-e^{-t/m} \right)\right]dt=\\ &\int_0^\infty \left[1- \left(1-e^{-t/m} \right)^m\right]dt = \\ &m\int_0^\infty \left[1- \left(1-e^{-x} \right)^m\right]dx = \\ &m\int_0^\infty \left[1- \sum_{k=0}^m {{m}\choose{k}}(-e^{-x})^{m-k} \right]dx = \\ &m\int_0^\infty \sum_{k=0}^{m-1} {{m}\choose{k}}(-1)^{m-k+1}e^{-(m-k)x} dx = \\ &m \sum_{k=0}^{m-1} {{m}\choose{k}}(-1)^{m-k+1}\int_0^\infty e^{-(m-k)x}dx = \\ &m \sum_{k=0}^{m-1} {{m}\choose{k}}(-1)^{m-k+1} \frac1{m-k} = \\ &\sum_{k=0}^{m-1} (-1)^{m-k+1}{{m}\choose{k}}\frac{m}{m-k} \end{aligned} where we substituted x=t/m in the third line. Let’s calculate these numbers for some values of m:
m | E[N] |
---|---|
2 | 3.0 |
3 | 5.5 |
4 | 8.3 |
5 | 11.4 |
6 | 14.7 |
7 | 18.1 |
8 | 21.7 |
9 | 25.5 |
10 | 29.3 |
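The table can be reproduced in R from the formula derived above, and checked against the classical identity E[N]=m(1+1/2+...+1/m):
EN <- function(m) {
  k <- 0:(m - 1)
  sum((-1)^(m - k + 1)*choose(m, k)*m/(m - k))   # the alternating-sum formula from above
}
round(sapply(2:10, EN), 1)
round(sapply(2:10, function(m) m*sum(1/(1:m))), 1)   # same values via the harmonic sum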
If \{N_i (t);t \ge 0\}, i=1,..,k, denotes the number of type i events occurring in (0,t], where an event of the Poisson process occurring at time s is, independently of everything else, classified as type i with probability P_i(s), then
E[N_i(t)]=\lambda \int_0^t P_i(s)ds
proof
Extend the theorem on type I and type II events (the marked Poisson process) to the case where the type probabilities depend on the time of the event.
Spread of Covid
One of the difficulties in tracking the number of Covid-infected people is the incubation time: an infected person does not show any symptoms for a number of days but is already capable of infecting others.
Let us suppose that individuals contract Covid according to a Poisson process with some unknown rate \lambda. Suppose that the incubation time until symptoms appear is a rv with cdf G, which is known, and suppose incubation times are independent. Let N_1(t) be the number of individuals that have shown symptoms by time t, and let N_2(t) be the number that have contracted Covid by time t but not yet shown symptoms. An individual that contracts Covid at time s will show symptoms by time t with probability G(t-s), so it follows from the above theorem that
\begin{aligned} &E[N_1(t)]=\lambda \int_0^t G(t-s)ds = \lambda \int_0^t G(x)dx\\ &E[N_2(t)]=\lambda \int_0^t \left[1-G(t-s)\right]ds = \\ &\lambda t - \lambda \int_0^t G(x)dx=\lambda t-E[N_1(t)] \end{aligned}
Say we know that the number of individuals with symptoms at time t is n_1; then
E[N_1(t)]= \lambda \int_0^t G(x)dx\approx n_1
or \lambda\approx n_1/\int_0^t G(x)dx, and so
n_2\approx \frac{n_1t}{\int_0^t G(x)dx}-n_1
Say G is exponential with rate \mu, then G(x)=1-e^{-\mu x} and
\begin{aligned} &\int_0^t G(x)dx = \\ &\int_0^t \left(1-e^{-\mu x}\right)dx = \\ &x+\frac1{\mu}e^{-\mu x}\Big|_0^t = \\ &t+\frac1{\mu}e^{-\mu t}-\frac1\mu=t+(e^{-\mu t}-1)/\mu \\ &n_2\approx\frac{n_1 t}{t+(e^{-\mu t}-1)/\mu}-n_1 \end{aligned}
For example, if t=5 days, \mu=0.3 and n_1 =100,000, then n_2 is about
round(100000*5/(5+(exp(-0.3*5)-1)/0.3) - 100000)
## [1] 107432