
# In mathematics

The mathematical theory of probability arose from attempts to formulate mathematical descriptions of chance events, originally in the context of gambling, but later in connection with physics. Statistics is used to infer the underlying probability distribution of a collection of empirical observations. For the purposes of simulation, it is necessary to have a large supply of random numbers, or means to generate them on demand.
Algorithmic information theory studies, among other topics, what constitutes a random sequence. The central idea is that a string of bits is random if and only if it is shorter than any computer program that can produce that string (Kolmogorov randomness); this means that random strings are those that cannot be compressed. Pioneers of this field include Andrey Kolmogorov and his student Per Martin-Löf, Ray Solomonoff, and Gregory Chaitin. For the notion of an infinite sequence, one normally uses Per Martin-Löf's definition: an infinite sequence is random if and only if it withstands all recursively enumerable null sets. Other notions of random sequences include (but are not limited to) recursive randomness and Schnorr randomness, which are based on recursively computable martingales. Yongge Wang [11] showed that these randomness notions are generally different.
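The compressibility idea above can be made concrete. Kolmogorov complexity itself is uncomputable, but the length of a string's output under a real compressor such as zlib gives a computable upper bound on it — a minimal sketch, using `os.urandom` as a stand-in source of incompressible bytes:

```python
import os
import zlib

def compressed_len(data: bytes) -> int:
    """Length of the zlib-compressed form: an upper bound on how
    concisely the data can be described by this compressor."""
    return len(zlib.compress(data, level=9))

# A highly patterned string compresses to a tiny fraction of its size;
# bytes drawn from the OS entropy source barely compress at all.
patterned = b"01" * 5000          # 10,000 bytes with an obvious pattern
random_ish = os.urandom(10_000)   # 10,000 effectively random bytes

print(compressed_len(patterned))   # far below 10,000
print(compressed_len(random_ish))  # close to (or slightly above) 10,000
```

A small compressed length proves a string is non-random; the converse does not hold, since a compressor may simply fail to find structure that is present.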
Randomness occurs in numbers such as log(2) and pi. The decimal digits of pi constitute an infinite sequence and "never repeat in a cyclical fashion." Numbers like pi are also considered likely to be normal, which means their digits are random in a certain statistical sense.
Pi certainly seems to behave this way. In the first six billion decimal places of pi, each of the digits from 0 through 9 shows up about six hundred million times. Yet such results, conceivably accidental, do not prove normality even in base 10, much less normality in other number bases.[12]
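The digit-frequency tally described above is easy to reproduce in miniature. The sketch below hardcodes the first 50 decimal digits of pi (the full six-billion-digit count obviously requires a serious computation); normality in base 10 would mean each digit's share tends toward 10% as more digits are taken:

```python
from collections import Counter

# First 50 decimal digits of pi after the "3.", hardcoded for illustration.
PI_DIGITS = "14159265358979323846264338327950288419716939937510"

counts = Counter(PI_DIGITS)
for digit in "0123456789":
    share = counts[digit] / len(PI_DIGITS)
    print(f"{digit}: {counts[digit]:2d} occurrences ({share:.0%})")
```

Over only 50 digits the shares scatter widely (here from 4% up to 16%), which illustrates the article's caveat: finite counts, however suggestive, cannot prove normality.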