
SHOT-NOISE PROCESSES AND DISTRIBUTIONS

Many statistical distributions appear as marginal distributions for shot-noise processes. Here such distributions are called shot-noise distributions. For instance, very simple shot-noise processes produce stable, gamma, and negative binomial distributions.

SHOT-NOISE PROCESSES

Assume that an electron that arrives at the anode in a vacuum tube at time u generates a current of magnitude h(t − u) at time t in the circuit. The response function h vanishes here for physical reasons on (−∞, 0). Under additivity, the total current is given by

X(t) = Σk h(t − Tk),   (1)

where {Tk, −∞ < k < ∞} denote the successive arrival times of the electrons. These times are random and are most often assumed to stem from a Poisson (point) process∗ of constant intensity λ on (−∞, ∞). This realistic assumption makes the analysis of the stochastic process∗ X(t), t ∈ R, simple. The process X(t) is the classical shot-noise process that has a very rich literature; cf., e.g., ref. 17, p. 423, and ref. 18, p. 150. Early works are those by Campbell [5] and Schottky [21]. A thorough mathematical study was made by Rice [20]. The process is strictly stationary with (Campbell's theorem)

E[X(t)] = λ ∫ h(u) du

and

cov[X(t), X(t − τ)] = λ ∫ h(u) h(u − τ) du.

For λ large, (X(t1), . . . , X(tm)) has approximately a multivariate normal distribution.
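As a quick numerical illustration (added here, not part of the original entry), the following Python sketch simulates the classical process (1) for an assumed exponential response h(u) = e^{−u}, u ≥ 0, and checks the mean given by Campbell's theorem; all parameter values are chosen only for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the text)
lam = 5.0                        # intensity of the Poisson process
T_max = 200.0                    # simulate arrivals on (0, T_max)
arrivals = rng.uniform(0.0, T_max, size=rng.poisson(lam * T_max))

def h(u):
    # Exponential response, vanishing on (-inf, 0)
    return np.where(u >= 0.0, np.exp(-np.maximum(u, 0.0)), 0.0)

# Evaluate X(t) = sum_k h(t - T_k) on a grid, skipping the start-up transient
t_grid = np.linspace(50.0, T_max, 500)
X = np.array([h(t - arrivals).sum() for t in t_grid])

# Campbell's theorem: E[X(t)] = lam * (integral of h) = lam * 1 here
print("empirical mean:", X.mean(), "  Campbell mean:", lam * 1.0)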
There are many stochastic processes appearing in different applied fields (astronomy, biology, hydrology∗, insurance mathematics, queuing theory∗, etc.) that can be considered as shot-noise processes or simple generalizations thereof. The response is often random, in which case

X(t) = Σk h(t − Tk)Vk,   (2)

where the Vk's are i.i.d. random variables independent of the Poisson process. More generally, one may have

X(t) = Σk Z(t, Tk),   (3)

where {Z(·, u), u ∈ R} is a family of independent stochastic processes. The interpretation is that Z(t, u) represents the random effect (noise, response) that a possible event (shot, impulse) at u will have at t. Here t and u may also be points in, e.g., R^n (or R^m and R^n, respectively) and the {Tk} may denote the points in a spatial Poisson process of constant intensity. Of course, Z(t, u) may vanish for t and u outside certain regions. The process X(t) is stationary if the distribution of (Z(t1, u), . . . , Z(tm, u)) depends only on (t1 − u, . . . , tm − u). This happens in particular when Z(t, u) = h(t − u, V(u)), where {V(u), u ∈ R} is a family of i.i.d. random variables or vectors.

Some more or less well-known examples are presented below; Example 3 may be new.

Example 1. Insurance Mathematics and Waterflows. Let {Tk} be the (Poisson) claim epochs and {Vk} the corresponding i.i.d. claim sizes on an insurance company. The discounted value at time t, X(t), of all claims after t may then be represented by (2), where h(u) = e^{ρu} 1_{(−∞,0]}(u). Here ρ > 0 and 1_A(·) denotes the indicator function of a set A. (The interest is assumed constant and inflation neglected.)

With h(u) = Ce^{−ρu} 1_{[0,∞)}(u) (C is a positive constant) and Vk the amount of rain in a rainstorm at Tk, (2) has been used as an approximate model for a streamflow, X(t), at time t; see ref. 25. In this case X(t), t ∈ R, is Markovian and X(t), t ∈ Z, is an autoregressive∗ process of order 1. By formula (4) below for the characteristic function∗ of X(t) it can be shown that if the Vk's are exponentially distributed, then X(t) is gamma∗ distributed (and any value of the shape parameter is possible). This result was perhaps first noticed by Bartlett [1].
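The gamma result is easy to check by simulation. The Python sketch below (an added illustration; all parameter values are assumptions) simulates X(t) = Σk Vk C e^{−ρ(t−Tk)} with standard exponential Vk and compares the sample with a gamma distribution of shape λ/ρ and scale C.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Assumed illustration parameters (not from the text)
lam, rho, C = 2.0, 0.5, 1.0
t_obs = 100.0                                # observation time; arrivals start at 0

samples = []
for _ in range(5000):
    n = rng.poisson(lam * t_obs)
    T = rng.uniform(0.0, t_obs, size=n)      # Poisson rainstorm (claim) epochs
    V = rng.exponential(1.0, size=n)         # exponential marks V_k with mean 1
    samples.append(np.sum(V * C * np.exp(-rho * (t_obs - T))))

# The marginal should be (very nearly) gamma with shape lam/rho and scale C
print(stats.kstest(samples, "gamma", args=(lam / rho, 0.0, C)))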
Example 2. Busy Lines and Thickness of Yarn. Calls to a telephone exchange with infinitely many lines arrive according to a Poisson process with intensity λ. Let X(t) be the number of busy lines at time t and let L(u) stand for the stochastic length of a call arriving at u; the L(u)'s are assumed to be i.i.d. random variables. Then X(t) is given by (3) with Z(t, u) = 1 if L(u) > t − u, and Z(t, u) = 0 otherwise. By simple considerations or from (4) below it follows that X(t) is Poisson distributed with mean λE[L(u)].

If, instead, L(u) denotes the length of a fibre with left endpoint at position u in a thread of yarn, then X(t) = Σk Z(t, Tk) can be interpreted as the number of fibres at position t, i.e., the thickness of the thread at t; cf., e.g., ref. 8, pp. 366–368.
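The Poisson result can be verified numerically; the Python sketch below (an added illustration, with the intensity and the call-length law assumed, call lengths uniform on (0, 2) so that E[L] = 1) counts the calls still in progress at a fixed time.

import numpy as np

rng = np.random.default_rng(2)

# Assumed illustration parameters
lam, t_obs = 3.0, 50.0
counts = []
for _ in range(4000):
    n = rng.poisson(lam * t_obs)
    T = rng.uniform(0.0, t_obs, size=n)      # call arrival epochs on (0, t_obs)
    L = rng.uniform(0.0, 2.0, size=n)        # call lengths, E[L] = 1
    counts.append(np.sum(L > t_obs - T))     # calls still busy at t_obs

# Poisson(lam * E[L]): mean and variance should both be close to lam * 1
print("mean:", np.mean(counts), " variance:", np.var(counts), " lam*E[L]:", lam * 1.0)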
Let X(t) be given by (3). Then
The process X(t), t ∈ R, is Markovian if the
Lj (u)’s are exponentially distributed. If, more-
ϕX(t) (s) = E[exp{isX(t)}]
over, N(u) is geometrically distributed, then   
X(t) has a negative binomial distribution∗
= exp λ (E[exp{isZ(t, u)}] − 1)du
(and any value of the shape parameter is pos-
sible). This result seems to be little known; (4)
cf. ref. 4.
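The negative binomial claim can also be checked by simulation. The Python sketch below (an added illustration; the parameter values and the moment-matching step are assumptions, not from the text) compares the empirical distribution of the number in service with a negative binomial fitted by its first two moments.

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Assumed illustration parameters: Poisson group arrivals, geometric group
# sizes, exponential individual service times
lam, p_geom, mu, t_obs = 2.0, 0.4, 1.0, 40.0
counts = []
for _ in range(10000):
    T = rng.uniform(0.0, t_obs, size=rng.poisson(lam * t_obs))   # group epochs
    total = 0
    for u in T:
        N = rng.geometric(p_geom)                    # persons in the group
        L = rng.exponential(1.0 / mu, size=N)        # their service times
        total += np.sum(L > t_obs - u)               # still in service at t_obs
    counts.append(total)

m, v = np.mean(counts), np.var(counts)
r, p = m * m / (v - m), m / v                        # moment-matched NB(r, p)
emp = [np.mean(np.array(counts) == k) for k in range(5)]
print("empirical pmf :", np.round(emp, 3))
print("matched NB pmf:", np.round(stats.nbinom.pmf(np.arange(5), r, p), 3))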
Example 4. Gravitation Force and Traffic Noise. Point masses (stars) are assumed to be distributed in R^3 according to a spatial Poisson process of constant intensity. Let V(u) denote the mass of a star at u; the V(u)'s are assumed to be i.i.d. random variables. Then X(t) = Σk Z(t, Tk), where

Z(t, u) = C(u1 − t1) ‖u − t‖^{−3} V(u)

is the total gravitation force in the t1 direction on a unit mass at t. (The sum is not absolutely convergent and must be interpreted suitably.) It was shown by the astronomer Holtsmark [14] that X(t) has a symmetric stable distribution∗ of index 3/2. A simple derivation is given in ref. 9, pp. 173–174. See also refs. 7 and 12. If the space is R^n and the attractive force is proportional to ‖u − t‖^{−β}, β > n/2, the index will be n/β. Unsymmetric stable distributions for X(t) appear if the intensity of the Poisson process is permitted to be direction dependent viewed from t (t fixed).

The case n = 1, β = 2, leads to an application to traffic noise. Consider a long straight highway with cars with random velocities and positions (at any moment) according to a Poisson point process. Then, under very weak assumptions, the intensity (effect) of the traffic noise at an arbitrary point on the road will follow at any time a positive stable distribution with index 1/2. The case when the noise is registered at a distance from the road is treated in ref. 24. See also, e.g., ref. 16.
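As an illustration of the traffic-noise case (added here; the road length, the intensity, and the use of a fitted scale are assumptions), the received intensity at a point can be simulated in Python as a sum of 1/distance^2 contributions from cars at Poisson positions and compared with a Lévy law, i.e., the positive stable distribution of index 1/2.

import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Assumed illustration parameters
lam, W = 1.0, 500.0                      # car intensity; road modelled as (-W, W)
samples = []
for _ in range(5000):
    n = rng.poisson(lam * 2 * W)
    x = rng.uniform(-W, W, size=n)       # car positions; observer at the origin
    samples.append(np.sum(1.0 / x**2))   # total received intensity at the origin

# Positive stable of index 1/2 is the Levy distribution; fit its scale and test
loc, scale = stats.levy.fit(samples, floc=0.0)
print(stats.kstest(samples, "levy", args=(loc, scale)))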

For an application of shot-noise processes in photographic science, see ref. 13. The papers of refs. 6, 11, 15, and 19 contain further applications or serve as guides to the more recent literature.

SHOT-NOISE DISTRIBUTIONS

Let X(t) be given by (3). Then

ϕ_{X(t)}(s) = E[exp{isX(t)}] = exp{λ ∫ (E[exp{isZ(t, u)}] − 1) du}   (4)

and the multivariate characteristic function of (X(t1), . . . , X(tm)) admits an analogous expression from which Campbell's theorem can be derived; cf. ref. 18, pp. 152–155. Formula (4) is a consequence of the fact that the distribution of X(t) equals the limit distribution of Σj Z(t, j/n)δ_{jn} as n → ∞, where δ_{jn} = 1 if there is an event in ((j − 1)/n, j/n] and 0 otherwise; the terms of the sum are independent. It is easy to generalize (4) to be valid also when the points {Tk} stem from a nonstationary compound spatial Poisson process.

Formula (4) shows that a shot-noise distribution, i.e., the marginal distribution of a shot-noise process X(t), is infinitely divisible∗. This result follows immediately also from the fact that, for any n, a Poisson (point) process of intensity λ can be considered to be the superposition of n independent Poisson (point) processes of intensity λ/n.
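As a worked special case (added here as an illustration, not part of the original entry), formula (4) yields the gamma result of Example 1. Take Z(t, u) = V(u)e^{−ρ(t−u)} for u ≤ t and 0 otherwise, with the V(u)'s standard exponential. Then E[exp{isV e^{−ρv}}] = 1/(1 − ise^{−ρv}), and (4) gives

ϕ_{X(t)}(s) = exp{λ ∫_0^∞ (1/(1 − ise^{−ρv}) − 1) dv} = exp{−(λ/ρ) log(1 − is)} = (1 − is)^{−λ/ρ}

(substitute w = e^{−ρv} in the integral), which is the characteristic function of a gamma distribution with shape λ/ρ and scale 1; any value of the shape parameter is obtained by varying λ/ρ.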
A converse result is that any infinitely divisible distribution without a normal component appears as the marginal distribution of

lim_{T→∞} [ Σ_{|Tk − t| < T} h(t − Tk) − b(T) ]

for appropriate functions h and b. Stable distributions with index α, 0 < α < 2, are obtained by choosing h(u) = C sgn(u)|u|^{−1/α}, where the constant C may have different values for u > 0 and u < 0. An infinitely divisible distribution on [0, ∞), with left extremity 0 and with characteristic function

ϕ(s) = exp{ ∫_{(0,∞)} (e^{isy} − 1) N(dy) },

is the distribution of X(t) = Σk h(t − Tk) if h is nonnegative with h(u) = 0 for u < 0 and related to the Lévy measure N(dy) on (0, ∞) by

∫_{(x,∞)} N(dy) = λ µ{u; h(u) > x},   x > 0,

where µ is Lebesgue measure; see, e.g., ref. 3. The sum giving X(t) will converge almost surely. To obtain the gamma distribution, one may choose h to be proportional to the inverse of the function x ↦ ∫_{(x,∞)} y^{−1} e^{−y} dy and a suitable λ. However, there is a more explicit representation of the gamma distribution as a shot-noise distribution; cf. Example 1.
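For a concrete instance of this relation (an added illustration, not in the original entry), take h(u) = e^{−ρu} 1_{[0,∞)}(u). Then µ{u; h(u) > x} = ρ^{−1} log(1/x) for 0 < x < 1 and 0 for x ≥ 1, so N(dy) has the density n(y) = λ/(ρy) on (0, 1). Here yn(y) = λ/ρ is nonincreasing, in agreement with statement (a) below (the unmarked sum corresponds to taking the Vk's degenerate at 1).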
For the case X(t) = Σk h(t − Tk)Vk, with h nonnegative and 0 on (−∞, 0), some interesting subclasses of the infinitely divisible distributions on [0, ∞) are obtained as classes of shot-noise distributions by restricting the form of h and/or the possible distributions of the nonnegative i.i.d. random variables Vk, as noticed in ref. 3.

(a) If h(u) = Ce^{−ρu}, then X(t) has a self-decomposable distribution∗ (class L distribution), which means that the Lévy measure has a density n(y) such that yn(y) is nonincreasing. See, e.g., ref. 9, pp. 588–590, for a description of the self-decomposable distributions as limit distributions for normed sums of independent random variables. Any self-decomposable distribution with lim_{y↓0} yn(y) < ∞ is a shot-noise distribution for an appropriate distribution of the Vk's.

(b) If h(u) = ce^{−ρu} and the distribution of the Vk's is a mixture of exponential distributions∗, then the shot-noise distribution is a generalized gamma convolution, i.e., a distribution with characteristic function of the form

ϕ(s) = exp{ ias + ∫ log(1/(1 − is/y)) U(dy) },

where a ≥ 0 and U(dy) is a nonnegative measure on (0, ∞). Moreover, any generalized gamma convolution with a = 0 and ∫ U(dy) < ∞ is possible as a shot-noise distribution. The class of generalized gamma convolutions, introduced by Thorin [23], can be described as the class of limits of finite convolutions of gamma distributions. It contains several of the continuous standard distributions on (0, ∞), in particular positive stable distributions and distributions with densities of the form

f(x) = Cx^{β−1} Π_{j=1}^{N} (1 + cj x)^{−γj},

where the parameters are nonnegative, and limits of such distributions; see ref. 2.

(c) When there is no restriction on h but the Vk's are exponentially distributed, the possible shot-noise distributions are precisely the T2 distributions∗ with left extremity 0. The T2 distributions are defined as the limits of finite convolutions of mixtures of exponential distributions. Every generalized gamma convolution is in the T2 class.

The distributions encountered in (a)–(c) above have discrete analogs on Z+ that are marginal distributions for shot-noise processes of the kind considered in Example 3 with suitable restrictions; e.g., if the Lj(u)'s are exponentially distributed, then X(t) has a discrete self-decomposable distribution. These distributions were introduced in ref. 22.

Finally, an interesting shot-noise representation of Ferguson's [10] Dirichlet process is worth mentioning. A Dirichlet process∗ with parameter space Ω, equipped with a σ-field A, is a stochastic process X(A), A ∈ A, such that, for every partition A1, . . . , Am of Ω, (X(A1), . . . , X(Am)) has a Dirichlet distribution∗ D(α(A1), . . . , α(Am)), where α is a finite positive measure on A. Then, as shown in ref. 10, X(A) may be represented as Y(A)/Y(Ω), with

Y(A) = Σ_{k=1}^∞ h(Tk) 1_A(Wk),

where h is the inverse of the function x ↦ ∫_{(x,∞)} y^{−1} e^{−y} dy and the Wk's are i.i.d. random variables with values in Ω and probability distribution α(dω)/α(Ω). The intensity λ of the Poisson process on (0, ∞) should equal α(Ω). For disjoint A1, . . . , Am, the variables Y(A1), . . . , Y(Am) are independent and gamma distributed. A somewhat more explicit representation is obtained by changing Y(A) to

Y(A) = Σ_{k=1}^∞ exp{−Tk} Vk 1_A(Wk),

where the Vk's are independent and exponentially distributed with mean 1; cf. Example 1.
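As a small illustration of the second representation (added here; the finite partition, the values of α, and the truncation of the Poisson process are assumptions chosen for the example), the Python sketch below generates Dirichlet-distributed vectors over a three-cell partition and checks one marginal against the corresponding beta distribution.

import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Assumed finite partition with measure alpha on three cells
alpha = np.array([1.0, 2.0, 3.0])
a_tot, T_max = alpha.sum(), 30.0                 # truncate the Poisson process at T_max

first_coord = []
for _ in range(3000):
    n = rng.poisson(a_tot * T_max)
    T = rng.uniform(0.0, T_max, size=n)          # Poisson points on (0, T_max)
    V = rng.exponential(1.0, size=n)             # exponential marks with mean 1
    W = rng.choice(len(alpha), size=n, p=alpha / a_tot)   # cell labels W_k
    Y = np.array([np.sum(np.exp(-T[W == i]) * V[W == i]) for i in range(len(alpha))])
    first_coord.append(Y[0] / Y.sum())

# The first coordinate of the normalised vector should be Beta(alpha_1, a_tot - alpha_1)
print(stats.kstest(first_coord, "beta", args=(alpha[0], a_tot - alpha[0])))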
REFERENCES

1. Bartlett, M. S. (1957). J. R. Statist. Soc. B, 19, 220–221.
2. Bondesson, L. (1979). Ann. Prob., 7, 965–979.
3. Bondesson, L. (1982). Adv. Appl. Prob., 14, 858–869.
4. Boswell, M. T. and Patil, G. P. (1970). Random Counts in Scientific Work, Vol. I, G. P. Patil, ed. Pennsylvania State University Press, University Park, PA, pp. 3–22.
5. Campbell, N. R. (1909). Proc. Camb. Philos. Soc. Math. Phys. Sci., 15, 117–136, 310–328.
6. Chamayou, J. M. F. (1978). Stoch. Processes Appl., 6, 305–316.
7. Chandrasekhar, S. (1954). Selected Papers on Noise and Stochastic Processes, N. Wax, ed. Dover, New York, pp. 3–91.
8. Cox, D. R. and Miller, H. D. (1965). The Theory of Stochastic Processes. Methuen, London, England.
9. Feller, W. (1971). An Introduction to Probability Theory and Its Applications, 2nd ed., Vol. II. Wiley, New York.
10. Ferguson, T. (1973). Ann. Statist., 1, 209–230.
11. Gilchrist, J. H. and Thomas, J. B. (1975). Adv. Appl. Prob., 7, 527–541.
12. Good, I. J. (1961). J. R. Statist. Soc. B, 23, 180–183.
13. Hamilton, J. F., Lawton, W. H., and Trabka, E. A. (1972). Stochastic Point Processes: Statistical Analysis, Theory and Applications, P. A. W. Lewis, ed. Wiley, New York, pp. 818–867.
14. Holtsmark, J. (1919). Annalen Physik, 58, 577–630.
15. Lawrance, A. J. and Kottegoda, N. T. (1977). J. R. Statist. Soc. A, 140, 1–31.
16. Marcus, A. H. (1975). Adv. Appl. Prob., 7, 593–606.
17. Moran, P. A. P. (1968). An Introduction to Probability Theory. Clarendon, Oxford, England. (Emphasizes the connection between shot-noise processes and infinite divisibility.)
18. Parzen, E. (1962). Stochastic Processes. Holden-Day, San Francisco, CA. (A fairly elementary systematic treatment of shot-noise processes, called filtered Poisson processes in the most general versions; recommended reading.)
19. Rice, J. (1977). Adv. Appl. Prob., 9, 553–565.
20. Rice, S. O. (1954). Selected Papers on Noise and Stochastic Processes, N. Wax, ed. Dover, New York, pp. 133–294.
21. Schottky, W. (1918). Annalen Physik, 57, 541–567.
22. Steutel, F. W. and van Harn, K. (1979). Ann. Prob., 7, 93–99.
23. Thorin, O. (1977). Scand. Actuarial J., 60, 31–40.
24. Weiss, G. H. (1970). Transport. Res., 4, 229–233.
25. Weiss, G. H. (1977). Water Resour. Res., 13, 101–108.

See also INFINITE DIVISIBILITY; POINT PROCESS, STATIONARY; POISSON PROCESSES; STABLE DISTRIBUTIONS; and STOCHASTIC PROCESSES, POINT.

L. BONDESSON
