Poisson Distribution

Objectives:

Understand the concept of a discrete random variable taking countably infinitely many values.

Understand the Poisson distribution as the limiting form of the binomial distribution.

Understand the specific situations in which these models are used.

Compute probabilities for Poisson and geometric distributions, and learn the interrelations among the different probability distributions.

Poisson Distribution

We are now familiar with the concept of a discrete probability distribution taking finitely many values. In this section, we deal with a discrete random variable defined on a countably infinite sample space. The theory of probability distributions is of immense use for studying events occurring in day-to-day life. Some typical probability distributions are widely applicable in real-life situations and areas of research. Among these, the Poisson distribution plays an important role.

We come across a number of situations where the chance of occurrence of an event in a short time interval is very small, although there are infinitely many opportunities for it to occur. The number of occurrences of such an event follows a Poisson distribution.

For example:

(a) Number of defective items found in a good lot of large size.

(b) Number of times a system of components fails.

(c) Number of persons standing in a queue.

(d) Number of deaths due to snake-bite in a certain village.

(e) Number of persons affected by radiation, residing near a nuclear power plant.

The examples stated above give an idea of the applications of the Poisson distribution. The Poisson distribution is applicable in statistical quality control, especially in constructing control charts and acceptance sampling plans.

It is also applicable in the theory of reliability.

It is found to be useful in the theory of queues.

Definition of Poisson distribution:

A discrete random variable X taking values 0, 1, 2, ... is said to follow a Poisson distribution with parameter m if its probability mass function (p.m.f.) is given by

P[X = x] = e^{-m} m^x / x!;  x = 0, 1, 2, ...;  m > 0
         = 0, otherwise.

Note:

1. "X follows a Poisson distribution with parameter m" is written symbolically as X → P(m). Its p.m.f. is denoted by P(x).

2. We require the following results repeatedly in the derivations that follow:

e^m = 1 + m + m²/2! + m³/3! + ...

−log_e(1 − a) = a + a²/2 + a³/3 + ...,  if |a| < 1

3. We verify that P(x) is a p.m.f.:

(i) P(x) ≥ 0 for all x, since e^{-m} > 0 and m^x/x! ≥ 0.

(ii) Σ_{x=0}^{∞} P(x) = Σ_{x=0}^{∞} e^{-m} m^x / x!
= e^{-m} Σ_{x=0}^{∞} m^x / x!
= e^{-m} (1 + m + m²/2! + ...)
= e^{-m} e^m = 1

Mean and variance:

Mean = μ₁' = E(X) = Σ_{x=0}^{∞} x P(x)
= Σ_{x=1}^{∞} x e^{-m} m^x / x!   (since the term corresponding to x = 0 is zero)
= m e^{-m} Σ_{x=1}^{∞} m^{x−1} / (x−1)!
= m e^{-m} e^m = m

∴ E(X) = m

μ₂' = E(X²) = Σ_{x=0}^{∞} x² P(x)
= Σ_{x=0}^{∞} [x(x−1) + x] e^{-m} m^x / x!
= Σ_{x=0}^{∞} x(x−1) e^{-m} m^x / x! + Σ_{x=0}^{∞} x e^{-m} m^x / x!
= e^{-m} m² Σ_{x=2}^{∞} m^{x−2} / (x−2)! + E(X)
= e^{-m} m² e^m + m = m² + m

μ₂ = Var(X) = E(X²) − [E(X)]² = m² + m − m² = m

∴ Var(X) = m

Note: Thus, the mean and variance of Poisson distribution are equal and each is equal to the parameter of the distribution.

Moment Generating Function (M.G.F.) of P(m)

Suppose X → P(m). Then

M_X(t) = E(e^{tX}) = Σ_{x=0}^{∞} e^{tx} P(x)
= Σ_{x=0}^{∞} e^{tx} e^{-m} m^x / x!
= e^{-m} Σ_{x=0}^{∞} (m e^t)^x / x!
= e^{-m} e^{m e^t}

∴ M_X(t) = e^{m(e^t − 1)}

Deduction of raw moments:

The first four raw moments can be obtained by expanding the m.g.f. M_X(t) in powers of t as follows:

M_X(t) = e^{m(e^t − 1)} = e^a,  where a = m(t + t²/2! + t³/3! + ...)

= 1 + a + a²/2! + a³/3! + a⁴/4! + ...

Substituting the value of a, we have

M_X(t) = 1 + mt + (m + m²) t²/2! + (m + 3m² + m³) t³/3! + (m + 7m² + 6m³ + m⁴) t⁴/4! + ...

∴ μ₁' = coefficient of t = m

μ₂' = coefficient of t²/2! = m + m²

μ₃' = coefficient of t³/3! = m + 3m² + m³

μ₄' = coefficient of t⁴/4! = m + 7m² + 6m³ + m⁴

Central moments of P(m):

The expressions for the first four central moments are derived as follows:

μ₁ = 0

μ₂ = μ₂' − (μ₁')² = m + m² − m² = m

μ₃ = μ₃' − 3μ₂'μ₁' + 2(μ₁')³
= (m³ + 3m² + m) − 3(m² + m)m + 2m³ = m

μ₄ = μ₄' − 4μ₃'μ₁' + 6μ₂'(μ₁')² − 3(μ₁')⁴
= (m⁴ + 6m³ + 7m² + m) − 4(m³ + 3m² + m)m + 6(m² + m)m² − 3m⁴
= 3m² + m

Cumulant generating function:

K_X(t) = log_e M_X(t) = log_e [e^{m(e^t − 1)}] = m(e^t − 1)
= m(t + t²/2! + t³/3! + ...)

Now, k_r = r-th cumulant = coefficient of t^r/r! = m, for all r,

i.e. k₁ = k₂ = k₃ = ... = m.

Thus, we have proved that all cumulants of the Poisson distribution are equal, the common value being the parameter m. This is regarded as a characteristic property of the Poisson distribution: a Poisson distribution satisfies this property and, conversely, any discrete probability distribution satisfying this property must be a Poisson distribution.

The central moments using the first four cumulants are obtained as follows:

μ₁' = k₁ = m

μ₂ = k₂ = m

μ₃ = k₃ = m

μ₄ = k₄ + 3k₂² = m + 3m²

Note: Obtaining central moments of a Poisson r.v. using cumulants is the easiest method.

Coefficients of skewness and kurtosis:

β₁ = μ₃²/μ₂³ = m²/m³ = 1/m

γ₁ = μ₃/μ₂^{3/2} = m/m^{3/2} = 1/√m

Since μ₃ = m > 0, the Poisson distribution is positively skewed.

β₂ = μ₄/μ₂² = (3m² + m)/m² = 3 + 1/m

γ₂ = β₂ − 3 = 1/m > 0

Hence, the Poisson distribution is leptokurtic. Further, as m → ∞, β₁ → 0, γ₁ → 0 and β₂ → 3. Thus, for large m, the distribution becomes symmetric and mesokurtic.

Additive Property of Poisson Distribution

Statement: If X₁ and X₂ are two independent Poisson variates with parameters m₁ and m₂ respectively, then X₁ + X₂ follows a Poisson distribution with parameter m₁ + m₂.

Proof: X₁ is a Poisson variate with parameter m₁, so M_{X₁}(t) = e^{m₁(e^t − 1)}. Similarly, M_{X₂}(t) = e^{m₂(e^t − 1)}.

Note that, since X₁ and X₂ are independent,

M_{X₁+X₂}(t) = M_{X₁}(t) · M_{X₂}(t)
= e^{m₁(e^t − 1)} · e^{m₂(e^t − 1)}
= e^{(m₁ + m₂)(e^t − 1)}
= M.G.F. of Poisson(m₁ + m₂).

Therefore, by the uniqueness property we conclude that X₁ + X₂ has a Poisson distribution with parameter m₁ + m₂.
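The additive property can also be seen by convolving the two p.m.f.s: P[X₁ + X₂ = n] = Σ_k P[X₁ = k] P[X₂ = n − k] should equal the Poisson(m₁ + m₂) p.m.f. A sketch (parameter values are illustrative):

```python
from math import exp, factorial

def poisson_pmf(x, m):
    return exp(-m) * m**x / factorial(x)

# Convolution of Poisson(m1) and Poisson(m2) at the point n.
m1, m2, n = 1.5, 2.5, 4
conv = sum(poisson_pmf(k, m1) * poisson_pmf(n - k, m2) for k in range(n + 1))
print(conv)                     # P[X1 + X2 = 4] by convolution
print(poisson_pmf(n, m1 + m2))  # Poisson(m1 + m2) p.m.f.; the values agree
```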

Generalization of additive property:

Let X₁, X₂, ..., X_i, ..., X_k be k independent variables following Poisson distributions with parameters m₁, m₂, ..., m_i, ..., m_k respectively. Then X₁ + X₂ + ... + X_k = Σ_{i=1}^{k} X_i follows a Poisson distribution with parameter Σ_{i=1}^{k} m_i.

Proof:

M_{X₁+X₂+...+X_k}(t) = M_{X₁}(t) · M_{X₂}(t) ··· M_{X_k}(t)   [since X₁, X₂, ..., X_k are independent variables]
= e^{m₁(e^t − 1)} · e^{m₂(e^t − 1)} ··· e^{m_k(e^t − 1)}
= e^{(m₁ + m₂ + ... + m_k)(e^t − 1)}

This is the M.G.F. of a Poisson distribution with parameter (m₁ + m₂ + ... + m_k). Thus, Σ_{i=1}^{k} X_i has a Poisson distribution with parameter Σ_{i=1}^{k} m_i.

Remark: If X₁ and X₂ are independent Poisson variates, one may be curious about the distribution of X₁ − X₂. It is worth noting that X₁ − X₂ does not follow a Poisson distribution. This is obvious because X₁ − X₂ can take negative integer values, whereas a Poisson variate never takes negative values.

Result: If X₁ and X₂ are two independent Poisson variates with parameters m₁ and m₂ respectively, then the conditional distribution of X₁ given X₁ + X₂ = n is binomial with parameters n and m₁/(m₁ + m₂), n being a non-negative integer.

Proof: Since X₁ and X₂ are independent Poisson variates, X₁ + X₂ has a Poisson distribution with parameter m₁ + m₂.

∴ P[X₁ + X₂ = n] = e^{−(m₁ + m₂)} (m₁ + m₂)^n / n!

Now,

P[X₁ = k | X₁ + X₂ = n] = P[X₁ = k, X₁ + X₂ = n] / P[X₁ + X₂ = n]   [since P(A|B) = P(A ∩ B)/P(B), P(B) ≠ 0]
= P[X₁ = k, X₂ = n − k] / P[X₁ + X₂ = n]
= P[X₁ = k] P[X₂ = n − k] / P[X₁ + X₂ = n]   [since X₁ and X₂ are independent]
= [e^{−m₁} m₁^k / k!] · [e^{−m₂} m₂^{n−k} / (n − k)!] · n! / [e^{−(m₁ + m₂)} (m₁ + m₂)^n]
= [n! / (k!(n − k)!)] · m₁^k m₂^{n−k} / (m₁ + m₂)^n
= C(n, k) [m₁/(m₁ + m₂)]^k [1 − m₁/(m₁ + m₂)]^{n−k};  k = 0, 1, ..., n
= p.m.f. of B(n, m₁/(m₁ + m₂))

Note: E(X₁ | X₁ + X₂ = n) = n m₁/(m₁ + m₂),  Var(X₁ | X₁ + X₂ = n) = n m₁ m₂ / (m₁ + m₂)².
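The conditional-binomial result can be checked by computing P[X₁ = k | X₁ + X₂ = n] directly and comparing it with the B(n, m₁/(m₁ + m₂)) p.m.f.; a sketch with the illustrative values m₁ = 2, m₂ = 6, n = 10:

```python
from math import exp, factorial, comb

def poisson_pmf(x, m):
    return exp(-m) * m**x / factorial(x)

m1, m2, n = 2.0, 6.0, 10
p = m1 / (m1 + m2)  # binomial success probability
for k in range(n + 1):
    # P[X1 = k | X1 + X2 = n] via independence and the additive property
    cond = (poisson_pmf(k, m1) * poisson_pmf(n - k, m2)
            / poisson_pmf(n, m1 + m2))
    binom = comb(n, k) * p**k * (1 - p)**(n - k)  # B(n, p) p.m.f.
    assert abs(cond - binom) < 1e-12
print("conditional distribution matches B(n, m1/(m1+m2))")
```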

Example 1.3: Suppose X_i → P(m_i = 1.2 i) for i = 1, 2, 3, 4 and the X_i are independent of each other. If S = Σ_{i=1}^{4} X_i, find (i) P(S = 12), (ii) P(S > 3), (iii) P(S − 2 < 2).

Solution: By the additive property, since X_i → Poisson(m_i = 1.2 i) and the X_i are independent r.v.s,

S = Σ_{i=1}^{4} X_i → Poisson(m = Σ_{i=1}^{4} m_i) = Poisson(m = 1.2 × (1 + 2 + 3 + 4))

i.e. S → Poisson(m = 1.2 × 10 = 12)

(i) P[S = 12] = e^{−12} (12)^{12} / 12! = 0.11437

(ii) P(S > 3) = 1 − P(S ≤ 3)
= 1 − [P(S = 0) + P(S = 1) + P(S = 2) + P(S = 3)]
= 1 − [e^{−12} 12⁰/0! + e^{−12} 12¹/1! + e^{−12} 12²/2! + e^{−12} 12³/3!]
= 1 − [0.000006 + 0.000074 + 0.000442 + 0.001770]
= 0.997708

(iii) P[S − 2 < 2] = P[S < 4] = P[S ≤ 3]
= 0.000006 + 0.000074 + 0.000442 + 0.001770
= 0.002292
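The probabilities in Example 1.3 can be recomputed directly rather than read from tables; a sketch:

```python
from math import exp, factorial

def poisson_pmf(x, m):
    return exp(-m) * m**x / factorial(x)

m = 12.0  # m = 1.2 * (1 + 2 + 3 + 4)
p12 = poisson_pmf(12, m)                          # (i)  P(S = 12)
p_le3 = sum(poisson_pmf(x, m) for x in range(4))  # P(S <= 3)
print(p12)        # (i)   ~0.1144
print(1 - p_le3)  # (ii)  P(S > 3) ~0.997708
print(p_le3)      # (iii) P(S - 2 < 2) = P(S <= 3) ~0.002292
```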

Example 1.4: If X and Y are independent Poisson random variables with means 2 and 4 respectively, find

(i) P[X + Y ≤ 1]

(ii) P[3(X + Y) ≥ 9]

Solution: Let Z = X + Y → P(m = 2 + 4 = 6).

(i) P[Z ≤ 1] = P(Z = 0) + P(Z = 1) = e^{−6} + 6e^{−6} = 0.017351

(ii) P[3Z ≥ 9] = P(Z ≥ 3) = 1 − P(Z < 3)
= 1 − [P(Z = 0) + P(Z = 1) + P(Z = 2)] = 0.93803
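Example 1.4 can be verified the same way; a sketch:

```python
from math import exp, factorial

def poisson_pmf(x, m):
    return exp(-m) * m**x / factorial(x)

m = 6.0  # Z = X + Y -> Poisson(2 + 4)
p_i = poisson_pmf(0, m) + poisson_pmf(1, m)          # (i)  P(Z <= 1)
p_ii = 1 - sum(poisson_pmf(x, m) for x in range(3))  # (ii) P(Z >= 3)
print(p_i)   # ~0.017351
print(p_ii)  # ~0.938031
```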

Recurrence Relation between Probabilities of Poisson Distribution

The relation between P[X = x] and P[X = x + 1] is called the recurrence relation between probabilities. If X is a Poisson variate with parameter m, we establish this relation as follows:

P[X = x] = e^{−m} m^x / x!  and  P[X = x + 1] = e^{−m} m^{x+1} / (x + 1)!

∴ P[X = x + 1] / P[X = x] = [e^{−m} m^{x+1} / (x + 1)!] / [e^{−m} m^x / x!] = m / (x + 1)

∴ P[X = x + 1] = [m / (x + 1)] P[X = x];  x = 0, 1, 2, ...

Note: Using the above recurrence relation one can find P[X = x + 1] from P[X = x], provided the value of m is known (or estimated).

If P[X = 0] = e^{−m} is calculated separately, then using the recurrence relation we can find the remaining probabilities as follows:

Putting x = 0, P[X = 1] = m P[X = 0] = m e^{−m}

Putting x = 1, P[X = 2] = (m/2) P[X = 1]

Continuing in this manner, P[X = 3], P[X = 4], etc. can be calculated. This is used in fitting a Poisson distribution to given data.
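The recurrence gives a cheap way to tabulate Poisson probabilities without recomputing factorials, as is done when fitting the distribution; a sketch (`poisson_probs` is an illustrative helper name):

```python
from math import exp

def poisson_probs(m, n_max):
    """P[X = 0], ..., P[X = n_max] for Poisson(m) via the recurrence
    P[X = x + 1] = m / (x + 1) * P[X = x], starting from P[X = 0] = e^{-m}."""
    probs = [exp(-m)]
    for x in range(n_max):
        probs.append(m / (x + 1) * probs[-1])
    return probs

print(poisson_probs(2.0, 4))  # P[X = 0..4] for m = 2
```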

Poisson Distribution as a Limiting form of Binomial Distribution

Suppose X is a binomial random variable with parameters n and p. If n is large and p is small, then computation of probabilities using the p.m.f. of the binomial distribution is very difficult. For instance, when n = 50, p = 0.02, P[X = 10] = C(50, 10) (0.02)^{10} (0.98)^{40}. Calculation of such probabilities is laborious.

In such situations, the French mathematician Poisson showed that, under certain conditions, the p.m.f. of the binomial distribution can be approximated by the p.m.f. of the Poisson distribution.

Result: Suppose a random variable X has a binomial distribution with parameters n and p. If n → ∞ and p → 0 in such a way that np = m (> 0) remains constant, then the p.m.f. of the binomial distribution, C(n, x) p^x q^{n−x}, approaches the p.m.f. of the Poisson distribution with parameter m, which is P(x) = e^{−m} m^x / x!.

Proof:

P(x) = C(n, x) p^x q^{n−x},  q = 1 − p;  x = 0, 1, 2, ..., n
= [n(n − 1)(n − 2) ... (n − x + 1) / x!] p^x (1 − p)^{n−x}
= [(1 − 1/n)(1 − 2/n) ... (1 − (x − 1)/n) / x!] (np)^x (1 − p)^{n−x}

Taking the limit as n → ∞, with p = m/n so that (np)^x = m^x:

lim_{n→∞} (1 − k/n) = 1 for each fixed k = 1, 2, ..., x − 1,

lim_{n→∞} (1 − m/n)^{n−x} = e^{−m}

∴ lim_{n→∞} P(x) = e^{−m} m^x / x!
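The limiting result can be illustrated numerically: hold m = np fixed and let n grow, and the binomial p.m.f. approaches the Poisson p.m.f. A sketch with the illustrative values m = 2, x = 3:

```python
from math import exp, factorial, comb

def binom_pmf(x, n, p):
    """Binomial(n, p) p.m.f.: C(n, x) p^x (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(x, m):
    return exp(-m) * m**x / factorial(x)

m, x = 2.0, 3
for n in (10, 100, 10000):
    # Keep np = m constant while n grows; the two p.m.f.s converge.
    print(n, binom_pmf(x, n, m / n), poisson_pmf(x, m))
```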

Real-life situations of Poisson distribution: The Poisson distribution is used in situations where the event under consideration is a rare event in a short interval of time. Some real-life situations are:

1. Number of twin births occurring in a hospital during a month.

2. Number of patients suffering from a bad reaction to an injection.

3. Number of suicides in a city in a year.

4. Number of traffic accidents on Pune-Mumbai Express highway.

5. Number of printing mistakes on a page of a book.

6. Number of α-particles emitted from a radioactive source.

7. Number of blood donors of group AB.





























