Math: Probability: Distribution: Can you derive the Erlang distribution C^k.e^(-C.T).T^(k-1)/(k-1)!?

Feb 19th, 2006 10:19
Knud van Eeden,


----------------------------------------------------------------------
--- Knud van Eeden --- 25 December 2004 - 10:53 pm -------------------
Math: Probability: Distribution: Can you derive the Erlang 
distribution C^k.e^(-C.T).T^(k-1)/(k-1)!?
---
Suppose that the occurrence of a certain type of event in time
follows a Poisson distribution.
In other words, if k is the total number of events
(e.g. an agent receiving 5 telephone calls within an
hour),
then you can calculate its probability by using the
Poisson distribution formula:
    k        -L
   L    .   e                                                      (1)
 = ------------
        k!
Now you can write L as follows:
Given is (see the description of the Poisson distribution formula)
for the average rate L:
 L = N . p                                                         (2)
Let
 dt = T / N                                                        (3)
(where T is your total (time) interval, which you divide into N small
 subintervals, so each subinterval dt has a length of T divided by N,
 thus T/N).
From (3) it follows that you can find N by dividing your total
interval by the length of the subinterval:
 N = T / dt                                                        (4)
Given is that the probability p is proportional to the subinterval
length. E.g. if you have an average rate of 60 calls per hour, with a
certain probability, then you have a proportionally smaller
probability if you look at an interval 60 times smaller, thus at an
interval of 1 minute. So you get
 p = constant . dt
or, abbreviating the constant to C,
 p = C . dt                                                         (5)
Putting (4) and (5) in (2) gives:
 L = (T / dt) . (C . dt)                                           (6)
worked out (eliminating the subinterval dt again) gives
 L = C . T                                                         (7)
Putting (7) in (1) gives the Poisson distribution formula expressed in
the total time interval length instead.
          k     -(C . T)
   (C . T)  .  e                                                   (8)
 = ---------------------
             k!
For example, if you have 60 calls per hour, then your total time
interval T is equal to 1 hour and the constant C will be equal to 60.
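As a quick numerical illustration of (8), here is a minimal Python
sketch (C = 60 calls per hour is taken from the example above; the
choice of k = 5 calls in a 5 minute interval is only an assumed
illustration):

import math

def poisson_prob(k, C, T):
    # formula (8): probability of exactly k events in an interval of
    # length T, for a Poisson process with average rate C per unit time
    L = C * T
    return L**k * math.exp(-L) / math.factorial(k)

# e.g. the probability of exactly 5 calls in 5 minutes at 60 calls per hour
print(poisson_prob(k=5, C=60, T=5/60))  # prints about 0.1755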
---
---
Now the Erlang distribution follows from, for example, the question:
how much time passes until (but not including) the agent's
5th call?
---
The method used to find this is to first use the cumulative Poisson
probability function, and then to find the corresponding probability
function via its derivative.
That will lead to the Erlang formula.
---
OK, introduce a variable T which represents this (time) interval until
the agent receives his 5th call, or in general his kth call.
---
Now before he receives his 5th call, he has received his 0th call,
1st call, 2nd call, 3rd call and 4th call.
Thus in general, before receiving his kth call he has received his
0th, 1st, 2nd, 3rd, ..., (k - 1)th call.
---
You split your time interval until the 5th call up into the
time between the
(0th and 1st call) +
(1st and 2nd call) +
(2nd and 3rd call) +
(3rd and 4th call) +
(4th and 5th call)
---
The probability that at most (k - 1) calls have occurred by time T
can be found via the cumulative Poisson distribution, or thus the sum
(you add the probabilities of 0, 1, ..., (k - 1) calls in the
interval together, so):
         0     -(C . T)          1     -(C . T)
  (C . T)  .  e           (C . T)  .  e
  --------------------- + --------------------- + ...              (9)
       0!                      1!
               (k-1) -(C . T)
        (C . T)  .  e
  ... + ---------------------
             (k - 1)!
or thus written otherwise, more compactly, as (here the upper limit N
denotes the number of the call you are waiting for, e.g. N = 5, and
the k inside the sum is only a summation index; at the end the result
is written in terms of k again):
                                  k     -(C . T)
                           (C . T)  .  e
 = SUM( k = 0 to N-1 ) ( ------------------------- )              (10)
                                   k!
Now, as usual, similar to e.g. finding the 'not' situation in the
binomial distribution, you use the complement of this, or thus the
subtraction from 1, to get the expression you are looking for.
Thus the cumulative probability that an agent has received his
5th call by time T is equal to 1 minus the probability that he has
received his 4th or lower call.                                   (11)
Combining (11) and (10) gives thus for this probability in general:
                                      k     -(C . T)
                               (C . T)  .  e
 1 - ( SUM( k = 0 to N-1 ) ( ------------------------- ) )        (12)
                                       k!
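A minimal Python sketch of (12) (again assuming, purely as an
illustration, an agent receiving C = 60 calls per hour who waits for
his k = 5th call):

import math

def prob_kth_call_by(T, k, C):
    # formula (12): probability that the kth call has arrived by time T,
    # i.e. 1 minus the cumulative Poisson sum over 0 .. k-1 calls
    L = C * T
    return 1 - sum(L**j * math.exp(-L) / math.factorial(j) for j in range(k))

# probability that the 5th call has arrived within 5 minutes at 60 calls per hour
print(prob_kth_call_by(T=5/60, k=5, C=60))  # prints about 0.56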
As you are interested in the probability (density) function itself,
you determine the derivative of (12) with respect to T.
                                       k     -(C . T)        '
                                (C . T)  .  e
 ( 1 - ( SUM( k = 0 to N-1 ) ( ------------------------- ) ) )
                                        k!
 =
                                       k     -(C . T)        '
                                (C . T)  .  e
   0 - ( SUM( k = 0 to N-1 ) ( ------------------------- ) )
                                        k!
That gives, using the product rule (u . v)' = u' . v + u . v'
for each term of the above sum (where C and k are considered
constants for this differentiation):
               k    -(C . T)
        (C . T)  . e
      -------------------
                k!
gives for each term:
            (k - 1)        -(C . T)          k         -(C . T)
 k . (C . T)        . C . e         + (C . T)  . -C . e           (13)
 --------------------------------------------------------------
                             k!
by summing all the terms, you get:
                          (k-1)    -(C.T)      k    -(C.T)
                   k.(C.T)     .C.e      -(C.T) .C.e              (14)
- SUM(k=0 to N-1) (---------------------------------------)
                                     k!
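As a quick check of this product-rule step, a small sympy sketch that
differentiates one term of the sum and compares it with (13) (the
value k = 3 is an arbitrary fixed choice, purely for illustration):

import sympy as sp

C, T = sp.symbols('C T', positive=True)
k = sp.Integer(3)  # one arbitrary fixed term of the sum, for illustration

# one term of the sum in (12)
term = (C*T)**k * sp.exp(-C*T) / sp.factorial(k)

# the claimed derivative of that term, formula (13)
claimed = (k*(C*T)**(k-1)*C*sp.exp(-C*T)
           - (C*T)**k*C*sp.exp(-C*T)) / sp.factorial(k)

print(sp.simplify(sp.diff(term, T) - claimed))  # prints 0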
                               -(C . T)
Now put the common factor C . e         in front of the sum
                                   k          (k-1)
     -(C.T)                   (C.T)  - k.(C.T)
= C.e      . SUM(k=0 to N-1) (---------------------)              (15)
                                       k!
You can further work this out and eliminate this sum, by writing
out (15) as follows (write the sum as 2 sums):
                                   k
     -(C.T)                   (C.T)
= C.e      . SUM(k=0 to N-1) (------)
                                 k!
                                        (k-1)
      -(C.T)                    k.(C.T)
 - C.e      . SUM(k=0 to N-1) (-------------)                      (16)
                                     k!
You can simplify the second term by using the fact that
k / k! equals by definition k / ( k . (k-1) . (k-2) . ... 2 . 1 )
or thus there remains, after dividing numerator and denominator
both by k,
1 / ((k-1) . (k-2) . ... 2 . 1)
which equals by definition
1 / (k-1)!                                                        (17)
To avoid ending up with 1 / (0 - 1)! when you choose k=0 in
the sum, you split the second term of (16) into 2 parts (basically
removing the k=0 term, as this term equals 0 anyhow):
                 +--- this term equals 0
                 |
                 V
   -(C.T)          (k-1)                              (k-1)
C.e          0.(C.T)                           k.(C.T)
         . (------------  +  SUM(k=1 to N-1) (-------------))     (18)
                0!                                 k!
Putting (17) in (18), and the end result of this back in (16), leaves
thus:
                                  k                        (k-1)
    -(C.T)                   (C.T)                    (C.T)
=C.e      .(SUM(k=0 to N-1) (------)-SUM(k=1 to N-1) (----------)) (19)
                               k!                       (k-1)!
Now if you look carefully at this expression, you see that these are
2 sums with the same general term which are subtracted from each
other. You split off the highest term of the first sum (the term with
index N-1; since N is the number of the call you are waiting for,
which the final formula calls k, this term is written below as
(C.T)^(k-1) / (k-1)!), shift the index of the remaining first sum,
and are left with 2 sums which are equal to each other and
subtracted from each other. Thus 0, thus eliminated.
                 (k-1)
    -(C.T)  (C.T)
=C.e      .(------)    +
             (k-1)!
                                  k                        (k-1)
    -(C.T)                   (C.T)                    (C.T)
 C.e      .(SUM(k=0 to N-2) (------)-SUM(k=1 to N-1) (----------)) (20)
                               k!                       (k-1)!
                 (k-1)
    -(C.T)  (C.T)
=C.e      .(------)    +
             (k-1)!
                                  (k-1)                    (k-1)
    -(C.T)                   (C.T)                    (C.T)
 C.e      .(SUM(k=1 to N-1) (------)-SUM(k=1 to N-1) (----------)) (21)
                             (k-1)!                       (k-1)!
                 (k-1)
    -(C.T)  (C.T)
=C.e      .(------)    +
             (k-1)!
    -(C.T)
 C.e      .( 0 )                                                   (22)
                 (k-1)
    -(C.T)  (C.T)
=C.e      .(------)
            (k-1)!
Splitting (C.T)^(k-1) into C^(k-1) . T^(k-1) and adding the exponents
of C gives finally:
   k    -(C.T)    (k-1)
= C  . e       . T                                                 (23)
  ---------------------
         (k-1)!
And this function is the Erlang probability function.
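To double-check the whole derivation, here is a small sympy sketch:
it differentiates the cumulative expression (12) with respect to T and
compares the result with the Erlang formula (23) (the choice k = 5 is
only illustrative):

import sympy as sp

C, T = sp.symbols('C T', positive=True)
k = 5  # illustrative: waiting for the 5th call

# (12): probability that the kth call has arrived by time T
cumulative = 1 - sum((C*T)**j * sp.exp(-C*T) / sp.factorial(j)
                     for j in range(k))

# (23): the Erlang probability (density) function
erlang = C**k * sp.exp(-C*T) * T**(k-1) / sp.factorial(k-1)

# the difference simplifies to 0, so (23) is indeed the derivative of (12)
print(sp.simplify(sp.diff(cumulative, T) - erlang))  # prints 0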
---
---
Note:
This Erlang probability function is a special case of the Gamma
probability function.
Because you can write (this follows from the definition of Gamma( k ),
which is basically a generalization of the factorial to non-integer
numbers):
(k-1)!
as
Gamma( k )                                                         (24)
if k is an integer greater than 0.
Putting (24) in (23) gives:
   k    -(C.T)    (k-1)
= C  . e       . T                                                 (25)
  ---------------------
       Gamma( k )
which is the definition of the Gamma probability function.
From this follows that you can use examples using the Gamma
probability function to further understand the Erlang
probability function.
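As an illustration of this correspondence, a minimal sketch comparing
formula (23) with the Gamma density as implemented in scipy
(scipy.stats.gamma uses a shape/scale parametrization, so shape a = k
and scale = 1/C; the values C = 2, k = 3, T = 0.7 are arbitrary):

import math
from scipy.stats import gamma

C, k, T = 2.0, 3, 0.7  # arbitrary illustrative values

erlang = C**k * math.exp(-C*T) * T**(k-1) / math.factorial(k-1)  # formula (23)
gamma_pdf = gamma.pdf(T, a=k, scale=1/C)                         # Gamma density

print(erlang, gamma_pdf)  # both print about 0.4833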
---
---
Note:
The exponential probability function is a special case of this Erlang
probability function (and thus a special case of the gamma probability
function).
Because if you choose k=1 and put this in (23), you can write:
   1    -(C.T)    (1-1)
= C  . e       . T
  ---------------------
       (1 - 1)!
   1    -(C.T)    0
= C  . e       . T
  ---------------------
          0!
        -(C.T)
= C  . e
and this is the definition for the exponential probability function.
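A quick numerical check of this special case (a sketch; C = 2 and
T = 0.7 are arbitrary values, and scipy.stats.expon uses scale = 1/C):

import math
from scipy.stats import expon

C, T = 2.0, 0.7  # arbitrary illustrative values

erlang_k1 = C * math.exp(-C*T)       # formula (23) with k = 1
expon_pdf = expon.pdf(T, scale=1/C)  # exponential density with rate C

print(erlang_k1, expon_pdf)  # both print about 0.4932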
---
---
Example:
If an agent receives 16 calls per day on average,
what is the time until his 5th call?
Thus C = 16
and T = 1 (day)
and k = 5
Filling this in in (23) gives
    5    -(16 . 1)    (5 - 1)
= 16  . e          . 1
  ----------------------------
          (5 - 1)!
= 16^5 * exp(-16) / 4!
= 0.00491673681
(Note on the meaning of this value: formula (23) is a probability
 density, not a probability or a time, so evaluating it at T = 1 gives
 the density of the waiting time of the 5th call at T = 1 day, not the
 time until the 5th call. To get a probability you have to integrate
 (23) over a time interval, see the note further below. The value is
 small here because with C = 16 calls per day the 5th call typically
 arrives well before the end of the day.)
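Following the note further below about integrating (23) over an
interval, here is a sketch for this example; the interval from 0 to
1 day is an assumed choice, so the result is the probability that the
5th call arrives within the first day:

import math
from scipy.integrate import quad

C, k = 16.0, 5  # 16 calls per day on average, waiting for the 5th call

def erlang_pdf(T):
    # formula (23): Erlang probability density of the waiting time (in days)
    return C**k * math.exp(-C*T) * T**(k-1) / math.factorial(k-1)

# probability that the 5th call arrives within the first day (T from 0 to 1)
prob, _ = quad(erlang_pdf, 0, 1)
print(prob)  # prints about 0.9996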
---
---
Example:
Thus, for example, suppose an agent receives 16 calls per day on
average, thus 16 calls per 8 hours, thus 2 calls per hour.
What is the time until his 2nd call in the first hour?
Thus C = 2
and T = 1 (hour)
and k = 2
Filling this in in (23) gives
    2    -(2 . 1)     (2 - 1)
=  2  . e          . 1
  ----------------------------
          (2 - 1)!
= ( 2^2 * exp(-2) * 1^1 ) / 1!
= 0.541341133
(again, this is the value of the probability density at T = 1 hour,
not a waiting time; to get a probability you integrate the density
over a time interval, as in the note below)
---
Note:
As the two examples above show, to get a probability you should
integrate this formula (23) between the begin and end of your time
interval.
You see an example of this below, using the Gamma probability
function (note that the Erlang probability function is a special case
of the Gamma probability function).
Thus you should calculate
  T=T2
  -
 |       k    -(C.T)    (k-1)
 |      C  . e       . T                                           (26)
 |     ---------------------  . dT
-              (k-1)!
T=T1
---
Using this example:
Example:
[book: source: Hastings, p. 139]
Suppose that cars pass an expressway according to a Poisson process
with a rate of 10 per minute. Find the probability that the second car
passes the exit between 5 and 8 seconds.
Since 5 seconds is 1/12 minute and 8 seconds is 2/15 minute, and
the time T2 of passage of the second car has the Gamma( 2, 10 )
distribution,
and given:
C = 10
k = 2
you get, after filling this in in (26):
 P[ 1/12 <= T2 <= 2/15 ] =
  T=2/15
  -
 |       2    -(10.T)   (2-1)
 |     10  . e       . T                                           (27)
 |     ---------------------  . dT
-              (2-1)!
T=1/12
which after working out gives:
= 0.18
In other words, the probability is about 1 in 5 that the second car
passes the exit between 5 and 8 seconds.
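A sketch checking this value numerically (C = 10 cars per minute and
k = 2 as given above):

import math
from scipy.integrate import quad

C, k = 10.0, 2  # 10 cars per minute, waiting for the 2nd car

def erlang_pdf(T):
    # formula (23): Erlang (Gamma) density of the waiting time, T in minutes
    return C**k * math.exp(-C*T) * T**(k-1) / math.factorial(k-1)

# probability that the 2nd car passes between 5 and 8 seconds (1/12 to 2/15 minute)
prob, _ = quad(erlang_pdf, 1/12, 2/15)
print(prob)  # prints about 0.18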
---
---
Book: see also:
---
[book: Hastings, Kevin J. - Probability and Statistics - ISBN
0-201-59278-9 - p. 136 'Gamma density']
---
[book: Montgomery, Douglas C. / Runger, George C. / Hubele, Norma F. -
Engineering Statistics - publisher: John Wiley - ISBN 0-471-17026-7 -
p. 54 'Random variables: you may use the Poisson distribution function
with L = C . t (where C is a constant)' / p. 152 'Poisson random
process']
---
[book: Neeleman, D. / Bolhuis, J. van - Statistics and Probability -
Free University - p. 118 'Erlang distribution']
---
[book: Weisstein, Eric W. - CRC Concise Encyclopedia of Mathematics -
ISBN 0-8493-9640-9 - p. 694 'Gamma distribution']
---
---
Internet: see also:
---
Math: Probability: Distribution: Link: Overview: Can you give an 
overview of links?
http://www.faqts.com/knowledge_base/view.phtml/aid/32917/fid/815
---
Erlang distribution
http://mathworld.wolfram.com/ErlangDistribution.html
---
Gamma distribution (see (5) 'The corresponding probability function P
(x) of waiting times until the hth Poisson event is then obtained by 
differentiating D(x)')
http://mathworld.wolfram.com/GammaDistribution.html
---
Erlang distribution
http://en.wikipedia.org/wiki/Erlang_distribution
---
Math: Traffic engineering: Can you give overview of links about the 
distribution of telephone calls?
http://www.faqts.com/knowledge_base/view.phtml/aid/39736/fid/815
----------------------------------------------------------------------