Benford's law
Benford's Law, also called the First-Digit Law, refers to the frequency distribution
of digits in many (but not all) real-life sources of data. In this distribution, 1 occurs
as the leading digit about 30% of the time, while larger digits occur in that
position less frequently: 9 as the first digit less than 5% of the time. Benford's Law
also concerns the expected distribution for digits beyond the first, which
approach a uniform distribution.
This result has been found to apply to a wide variety of data sets, including
electricity bills, street addresses, stock prices, population numbers, death rates,
lengths of rivers, physical and mathematical constants, and processes described
by power laws (which are very common in nature). It tends to be most accurate
when values are distributed across multiple orders of magnitude.
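This tendency is easy to check numerically. The sketch below (in Python; the choice of the powers of 2 is mine, as one convenient sequence that spans many orders of magnitude) tallies leading digits and compares them with Benford's predicted frequencies:

```python
import math
from collections import Counter

# First digits of the powers of 2 -- a sequence spanning many
# orders of magnitude, so Benford's Law should fit it well.
first_digits = [int(str(2**n)[0]) for n in range(1, 1001)]
observed = Counter(first_digits)

for d in range(1, 10):
    expected = math.log10(1 + 1 / d)   # Benford's predicted frequency
    actual = observed[d] / 1000
    print(f"digit {d}: expected {expected:.3f}, observed {actual:.3f}")
```

The leading digit 1 turns up close to the predicted 30% of the time, and 9 under 5%.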
The graph here shows Benford's Law for base 10. There is a generalization of the
law to numbers expressed in other bases (for example, base 16), and also a
generalization from leading 1 digit to leading n digits.
The discovery of Benford's Law goes back to 1881, when the American
astronomer Simon Newcomb noticed that in logarithm tables (used at that time
to perform calculations) the earlier pages (which contained numbers that
started with 1) were much more worn than the other pages. Newcomb's
published result is the first known instance of this observation and includes a
distribution on the second digit, as well. Newcomb proposed a law stating that the
probability of a single digit N being the first digit of a number was equal to
log(N + 1) − log(N).
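Newcomb's rule can be written out directly in a few lines of Python (a minimal sketch; the function name is mine):

```python
import math

# Newcomb's formula: P(d) = log10(d + 1) - log10(d) = log10(1 + 1/d)
def benford_probability(d):
    """Probability that the digit d (1-9) appears as the leading digit."""
    return math.log10(1 + 1 / d)

probs = {d: benford_probability(d) for d in range(1, 10)}
# Digit 1 comes out near 0.301, digit 9 near 0.046,
# and the nine probabilities sum to exactly 1.
```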
The phenomenon was again noted in 1938 by the physicist Frank Benford, who
tested it on data from 20 different domains and received credit for the discovery. His data set
included the surface areas of 335 rivers, the sizes of 3259 US populations,
104 physical constants, 1800 molecular weights, 5000 entries from a
mathematical handbook, 308 numbers contained in an issue of Reader's Digest,
the street addresses of the first 342 persons listed in American Men of
Science and 418 death rates. The total number of observations used in the
paper was 20,229. This discovery was later named after Benford (making it an
example of Stigler's Law).
Fourier Series
A Fourier series is a way to represent a wave-like function as a combination of
simple sine waves. More formally, it decomposes any periodic function or periodic
signal into the sum of a (possibly infinite) set of simple oscillating functions,
namely sines and cosines (or, equivalently, complex exponentials). The discrete-time
Fourier transform is a periodic function, often defined in terms of a Fourier series, and
the Z-transform reduces to a Fourier series for the important case |z| = 1. The Fourier
series is also central to the original proof of the Nyquist-Shannon sampling theorem.
The study of Fourier series is a branch of Fourier analysis.
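As a concrete illustration of decomposing a signal into sines, the sketch below computes partial sums of the standard Fourier series for a square wave (the function name and term count are mine):

```python
import math

# Square-wave Fourier series: (4/pi) * sum over odd k of sin(k*t)/k.
# Truncating after n_terms terms gives a partial sum that converges
# to +1 on (0, pi) and -1 on (-pi, 0) as n_terms grows.
def square_wave_partial_sum(t, n_terms=50):
    return (4 / math.pi) * sum(
        math.sin((2 * m + 1) * t) / (2 * m + 1) for m in range(n_terms)
    )

# Near the middle of the positive half-period the sum is already close to 1.
value = square_wave_partial_sum(math.pi / 2, n_terms=200)
```

Adding more terms sharpens the corners of the approximated square wave, which is the sense in which the series "builds" a non-sinusoidal wave out of sines.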
Fourier Transform
This is the formula for the Discrete Fourier Transform (DFT), which converts
sampled signals (like a digital sound recording) into the frequency domain. It's the
mathematical engine behind much of the technology you use today, including
MP3 files, file compression, and even how your old AM radio stays in tune.
The daunting formula involves imaginary numbers and complex summations, but
Stuart's idea is simple. Imagine an enormous speaker, mounted on a pole,
playing a repeating sound. The speaker is so large that you can see the cone move
back and forth with the sound. Mark a point on the cone, and now rotate the
pole. Trace the point from a bird's-eye view: if the resulting squiggly curve
is off-center, then a frequency corresponding to the pole's rotational
frequency is present in the sound.
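The rotating-pole picture maps directly onto the DFT: each frequency bin k wraps the signal around a circle at rotation rate k, and the magnitude of the complex sum measures how far off-center the traced curve is. A minimal sketch (naive O(N^2) implementation; names and the test signal are mine):

```python
import cmath
import math

def dft(samples):
    """Naive Discrete Fourier Transform of a list of real samples."""
    N = len(samples)
    return [
        # Multiplying by exp(-2*pi*i*k*n/N) "rotates the pole" at rate k;
        # the sum is the center of mass of the wrapped-up signal.
        sum(samples[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
        for k in range(N)
    ]

# A sine that repeats 3 times per window: its energy lands in bin 3,
# i.e. the curve is off-center only at that rotation rate.
N = 64
signal = [math.sin(2 * math.pi * 3 * n / N) for n in range(N)]
spectrum = [abs(x) for x in dft(signal)]
```

For rotation rates that don't match a frequency in the sound, the wrapped curve stays centered and the corresponding bin is near zero; bin 3 stands out as the dominant one.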