The graph of a probability mass function. All the values of this function must be
non-negative and sum up to 1.
In probability and statistics, a probability mass function is a function that gives
the probability that a discrete random variable is exactly equal to some value.[1]
Sometimes it is also known as the discrete density function. The probability mass
function is often the primary means of defining a discrete probability
distribution, and such functions exist for either scalar or multivariate random
variables whose domain is discrete.
The value of the random variable having the largest probability mass is called the
mode.
Formal definition
A probability mass function is the probability distribution of a discrete random variable, and provides the possible values and their associated probabilities. It is the function $p : \mathbb{R} \to [0,1]$ defined by

$p_X(x) = P(X = x)$ for $-\infty < x < \infty$,

where $P$ is a probability measure.
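As a minimal sketch of this definition, the pmf of a fair six-sided die can be represented as a mapping from outcomes to probabilities (the variable names here are illustrative, not from the text):

```python
from fractions import Fraction

# Sketch: the pmf of a fair six-sided die, represented as a dict
# mapping each outcome x to p_X(x) = P(X = x).
die_pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# A valid pmf is non-negative and its values sum to 1.
assert all(p >= 0 for p in die_pmf.values())
assert sum(die_pmf.values()) == 1

# p_X(x) = 0 for any x outside the image of X.
def p_X(x):
    return die_pmf.get(x, Fraction(0))
```

Using exact rationals avoids floating-point error when checking that the masses sum to 1.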
When there is a natural order among the potential outcomes $x$, it may be convenient to assign numerical values to them (or $n$-tuples in the case of a discrete multivariate random variable) and to consider also values not in the image of $X$. That is, $f_X$ may be defined for all real numbers, with $f_X(x) = 0$ for all $x \notin X(S)$, as shown in the figure.
The image of $X$ has a countable subset on which the total probability mass $\sum_x f_X(x)$ is one. Consequently, the probability mass function is zero for all but a countable number of values of $x$.
The discontinuity of probability mass functions is related to the fact that the cumulative distribution function of a discrete random variable is also discontinuous. If $X$ is a discrete random variable, then $P(X = x) = 1$ means that the event $(X = x)$ is certain (it occurs in 100% of trials); conversely, $P(X = x) = 0$ means that the event $(X = x)$ is impossible. This statement does not hold for a continuous random variable $X$, for which $P(X = x) = 0$ for every possible $x$: informally, a continuous random variable has uncountably many possible values, so the probability that it takes any single particular value $x$ is zero. Discretization is the process of converting a continuous random variable into a discrete one.
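One common way to discretize, sketched below under our own choice of binning (rounding to the nearest integer, an assumption, not a method prescribed by the text), is to map samples of a continuous variable into discrete bins and form the empirical pmf:

```python
import random

# Hedged sketch: discretize a continuous random variable by binning.
# Samples from a standard normal are mapped to their nearest integer,
# yielding a discrete variable with an empirical pmf.
random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(10_000)]
binned = [round(s) for s in samples]

# Empirical pmf: relative frequency of each integer bin.
pmf = {k: binned.count(k) / len(binned) for k in sorted(set(binned))}

# The empirical masses are non-negative and sum to 1, as a pmf must.
assert all(p >= 0 for p in pmf.values())
assert abs(sum(pmf.values()) - 1.0) < 1e-9
```

The choice of bin width trades resolution against the number of samples per bin.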
Examples
Main articles: Bernoulli distribution, Binomial distribution, and Geometric
distribution
Finite
Three major distributions of this kind are the Bernoulli distribution, the binomial distribution, and the geometric distribution.
The probability mass function of a fair die. All the numbers on the die have an
equal chance of appearing on top when the die stops rolling.
An example of the binomial distribution is the probability of getting exactly one 6 when a fair die is rolled three times.
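That example can be checked numerically with the binomial pmf $p_X(k) = \binom{n}{k} p^k (1-p)^{n-k}$ (the function name below is ours, not from the text):

```python
from math import comb

# Sketch of the binomial pmf: p_X(k) = C(n, k) * p**k * (1-p)**(n-k).
def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability of exactly one 6 in three rolls of a fair die:
prob = binom_pmf(1, 3, 1/6)  # 3 * (1/6) * (5/6)**2 = 75/216
```

Summing the pmf over all possible counts $k = 0, \dots, 3$ recovers a total mass of 1.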
The geometric distribution describes the number of trials needed to get one success. Its probability mass function is $p_X(k) = (1-p)^{k-1} p$.
An example is tossing a coin until the first head appears.
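The geometric pmf from the formula above can be sketched directly; the coin example corresponds to $p = 1/2$:

```python
# Sketch of the geometric pmf: p_X(k) = (1-p)**(k-1) * p,
# the probability that the first success occurs on trial k.
def geometric_pmf(k, p):
    return (1 - p)**(k - 1) * p

# Fair coin (p = 1/2): probability the first head appears on toss k.
first_toss = geometric_pmf(1, 0.5)   # 0.5
third_toss = geometric_pmf(3, 0.5)   # 0.125
```

Although the support is infinite, a partial sum over many trials already comes arbitrarily close to total mass 1.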
Other distributions that can be modeled using a probability mass function are the
categorical distribution (also known as the generalized Bernoulli distribution) and
the multinomial distribution.
If a discrete distribution has two or more categories, exactly one of which occurs in a single trial (draw), it is a categorical distribution, whether or not the categories have a natural ordering.
An example of a multivariate discrete distribution, and of its probability mass
function, is provided by the multinomial distribution. Here the multiple random
variables are the numbers of successes in each of the categories after a given
number of trials, and each non-zero probability mass gives the probability of a
certain combination of numbers of successes in the various categories.
Infinite
The following exponentially declining distribution is an example of a distribution
with an infinite number of possible outcomes—all the positive integers:
$\Pr(X = i) = \frac{1}{2^{i}} \qquad \text{for } i = 1, 2, 3, \dots$
Despite the infinite number of possible outcomes, the total probability mass is 1/2
+ 1/4 + 1/8 + ⋯ = 1, satisfying the unit total probability requirement for a
probability distribution.
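The convergence of that series can be verified with exact arithmetic: the sum of the first $n$ terms is $1 - 1/2^n$, which approaches 1 (the helper function below is an illustration of ours):

```python
from fractions import Fraction

# Sketch: Pr(X = i) = 1 / 2**i for i = 1, 2, 3, ...
# Exact partial sums show the total mass approaching 1:
# the sum of the first n terms is 1 - 1/2**n.
def partial_sum(n):
    return sum(Fraction(1, 2**i) for i in range(1, n + 1))
```

For example, the first three terms give $1/2 + 1/4 + 1/8 = 7/8$.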
Multivariate case
Main article: Joint probability distribution
Two or more discrete random variables have a joint probability mass function, which
gives the probability of each possible combination of realizations for the random
variables.
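A minimal sketch of a joint pmf, assuming two independent fair dice as the example (a choice of ours, not from the text):

```python
from fractions import Fraction
from itertools import product

# Hypothetical sketch: the joint pmf of two independent fair dice (X, Y)
# assigns mass 1/36 to each of the 36 combinations of realizations.
joint = {(x, y): Fraction(1, 36) for x, y in product(range(1, 7), repeat=2)}

# All combinations together carry total probability mass 1.
assert sum(joint.values()) == 1

# A marginal pmf is recovered by summing the joint mass over the
# other variable's values.
marginal_x = {x: sum(joint[(x, y)] for y in range(1, 7)) for x in range(1, 7)}
```

Each marginal mass comes out to $6 \times 1/36 = 1/6$, the pmf of a single fair die.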
References
Stewart, William J. (2011). Probability, Markov Chains, Queues, and Simulation: The Mathematical Basis of Performance Modeling. Princeton University Press. p. 105. ISBN 978-1-4008-3281-1.
Dekking, Michel (2005). A Modern Introduction to Probability and Statistics: Understanding Why and How. London: Springer. ISBN 978-1-85233-896-1. OCLC 262680588.
Rao, Singiresu S. (1996). Engineering Optimization: Theory and Practice (3rd ed.). New York: Wiley. ISBN 0-471-55034-5. OCLC 62080932.
Further reading
Johnson, N. L.; Kotz, S.; Kemp, A. (1993). Univariate Discrete Distributions (2nd
ed.). Wiley. p. 36. ISBN 0-471-54897-9.
This page was last edited on 30 August 2021, at 11:59 (UTC).