Theoretical Computer Science Cheat Sheet
$\pi \approx 3.14159$, $e \approx 2.71828$, $\gamma \approx 0.57721$, $\phi = \frac{1+\sqrt{5}}{2} \approx 1.61803$, $\hat\phi = \frac{1-\sqrt{5}}{2} \approx -0.61803$.
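These constants are easy to sanity-check numerically; a short Python sketch (variable names are my own):

```python
import math

phi = (1 + math.sqrt(5)) / 2      # golden ratio, ~1.61803
phi_hat = (1 - math.sqrt(5)) / 2  # its conjugate, ~-0.61803

# phi and phi_hat are the two roots of x^2 - x - 1 = 0,
# so their sum is 1 and their product is -1.
print(round(phi, 5), round(phi_hat, 5))
print(phi + phi_hat, phi * phi_hat)
```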
i     2^i             p_i
1     2               2
2     4               3
3     8               5
4     16              7
5     32              11
6     64              13
7     128             17
8     256             19
9     512             23
10    1,024           29
11    2,048           31
12    4,096           37
13    8,192           41
14    16,384          43
15    32,768          47
16    65,536          53
17    131,072         59
18    262,144         61
19    524,288         67
20    1,048,576       71
21    2,097,152       73
22    4,194,304       79
23    8,388,608       83
24    16,777,216      89
25    33,554,432      97
26    67,108,864      101
27    134,217,728     103
28    268,435,456     107
29    536,870,912     109
30    1,073,741,824   113
31    2,147,483,648   127
32    4,294,967,296   131
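The powers-of-two column is trivial to regenerate, and the prime column can be checked with a basic sieve of Eratosthenes (a sketch; the helper name is my own):

```python
# Regenerate the 2^i / i-th-prime table with a simple sieve.
def primes_up_to(n):
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            for j in range(i * i, n + 1, i):
                sieve[j] = False
    return [i for i, is_p in enumerate(sieve) if is_p]

primes = primes_up_to(200)  # more than enough for the first 32 primes
for i in range(1, 11):
    print(i, 2 ** i, primes[i - 1])
```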
Bernoulli numbers ($B_i = 0$ for odd $i \neq 1$): $B_0 = 1$, $B_1 = -\frac{1}{2}$, $B_2 = \frac{1}{6}$, $B_4 = -\frac{1}{30}$, $B_6 = \frac{1}{42}$, $B_8 = -\frac{1}{30}$, $B_{10} = \frac{5}{66}$.
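These values follow from the standard recurrence $\sum_{j=0}^{m-1} \binom{m+1}{j} B_j = -(m+1)B_m$, which a few lines of exact rational arithmetic can confirm (a sketch using the same $B_1 = -\frac{1}{2}$ convention as above):

```python
from fractions import Fraction
from math import comb

def bernoulli(m):
    # B_0 = 1; B_n = -1/(n+1) * sum_{j<n} C(n+1, j) B_j
    B = [Fraction(1)]
    for n in range(1, m + 1):
        s = sum(comb(n + 1, j) * B[j] for j in range(n))
        B.append(-s / (n + 1))
    return B

B = bernoulli(10)
print([str(B[k]) for k in (1, 2, 4, 6, 8, 10)])
# ['-1/2', '1/6', '-1/30', '1/42', '-1/30', '5/66']
```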
Change of base, quadratic formula: $\log_b x = \frac{\log_a x}{\log_a b}$, \quad $x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$.
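Both identities translate directly to code; a quick numeric sketch (function names are my own, and the quadratic helper assumes real roots):

```python
import math

def log_base(b, x, a=math.e):
    # Change of base: log_b(x) = log_a(x) / log_a(b)
    return math.log(x, a) / math.log(b, a)

def quadratic_roots(a, b, c):
    # Roots of a*x^2 + b*x + c = 0, assuming b^2 - 4ac >= 0
    d = math.sqrt(b * b - 4 * a * c)
    return ((-b + d) / (2 * a), (-b - d) / (2 * a))

print(log_base(2, 1024))          # 10.0
print(quadratic_roots(1, -5, 6))  # (3.0, 2.0)
```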
Euler's number $e$:
$e = 1 + \frac{1}{2} + \frac{1}{6} + \frac{1}{24} + \frac{1}{120} + \cdots$
$\lim_{n \to \infty} \left(1 + \frac{x}{n}\right)^n = e^x$.
$\left(1 + \frac{1}{n}\right)^n < e < \left(1 + \frac{1}{n}\right)^{n+1}$.
$\left(1 + \frac{1}{n}\right)^n = e - \frac{e}{2n} + \frac{11e}{24n^2} - O\left(\frac{1}{n^3}\right)$.
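A numeric sanity check of the series and the bracketing bound (the series terms are $1/k!$, so the partial sum converges very fast):

```python
import math

# Partial sum of 1 + 1/2 + 1/6 + 1/24 + ... = sum of 1/k! approaches e,
# and (1 + 1/n)^n brackets e from below, (1 + 1/n)^(n+1) from above.
series = sum(1 / math.factorial(k) for k in range(20))
n = 1000
lower = (1 + 1 / n) ** n
upper = (1 + 1 / n) ** (n + 1)
print(series)                  # ~ 2.718281828...
print(lower < math.e < upper)  # True
```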
Harmonic numbers: $1, \frac{3}{2}, \frac{11}{6}, \frac{25}{12}, \frac{137}{60}, \frac{49}{20}, \frac{363}{140}, \frac{761}{280}, \frac{7129}{2520}, \ldots$
$\ln n < H_n < \ln n + 1$, \quad $H_n = \ln n + \gamma + O\left(\frac{1}{n}\right)$.
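A quick check of both the bound and the $\ln n + \gamma$ asymptotic (the residual $H_n - \ln n - \gamma$ is about $\frac{1}{2n}$):

```python
import math

GAMMA = 0.57721566490153286  # Euler-Mascheroni constant

def harmonic(n):
    # H_n = 1 + 1/2 + ... + 1/n
    return sum(1 / k for k in range(1, n + 1))

n = 10_000
H = harmonic(n)
print(math.log(n) < H < math.log(n) + 1)  # True
print(abs(H - (math.log(n) + GAMMA)))     # small: roughly 1/(2n)
```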
Factorial, Stirling's approximation: $1, 2, 6, 24, 120, 720, 5040, 40320, 362880, \ldots$
$n! = \sqrt{2\pi n}\left(\frac{n}{e}\right)^n \left(1 + \Theta\left(\frac{1}{n}\right)\right)$.
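The $\Theta(1/n)$ correction is visible numerically: the ratio $n! / \left(\sqrt{2\pi n}\,(n/e)^n\right)$ is roughly $1 + \frac{1}{12n}$. A sketch:

```python
import math

def stirling(n):
    # Leading term of Stirling's approximation: sqrt(2*pi*n) * (n/e)^n
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (1, 5, 10, 20):
    ratio = math.factorial(n) / stirling(n)
    print(n, ratio)  # ratio is about 1 + 1/(12n)
```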
Ackermann's function and inverse:
$a(i,j) = \begin{cases} 2^j & i = 1 \\ a(i-1,\, 2) & j = 1 \\ a(i-1,\, a(i,\, j-1)) & i, j \geq 2 \end{cases}$ \qquad $\alpha(i) = \min\{\, j \mid a(j,j) \geq i \,\}$.
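The three cases transcribe directly to a recursive function, though only tiny arguments are feasible given how fast it grows (already $a(2,j)$ is a tower of exponentials in $j$). A sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def a(i, j):
    # Direct transcription of the three cases of the definition above.
    if i == 1:
        return 2 ** j
    if j == 1:
        return a(i - 1, 2)
    return a(i - 1, a(i, j - 1))

print(a(1, 3), a(2, 1), a(2, 2), a(2, 3), a(3, 1))
# 8 4 16 65536 16
```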
General Probability

Continuous distributions: If $\Pr[a < X < b] = \int_a^b p(x)\,dx$, then $p$ is the probability density function of $X$. If $\Pr[X < a] = P(a)$, then $P$ is the distribution function of $X$. If $P$ and $p$ both exist then $P(a) = \int_{-\infty}^{a} p(x)\,dx$.
Expectation: If $X$ is discrete, $\mathrm{E}[g(X)] = \sum_x g(x)\Pr[X = x]$. If $X$ is continuous, then $\mathrm{E}[g(X)] = \int_{-\infty}^{\infty} g(x)p(x)\,dx = \int_{-\infty}^{\infty} g(x)\,dP(x)$.
Variance, standard deviation: $\mathrm{VAR}[X] = \mathrm{E}[X^2] - \mathrm{E}[X]^2$, \quad $\sigma = \sqrt{\mathrm{VAR}[X]}$.
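A minimal check of the expectation and variance formulas on a fair six-sided die (the die is my illustrative example, not from the sheet):

```python
# A fair die as a discrete X with Pr[X = x] = 1/6 for x in 1..6.
die = {x: 1 / 6 for x in range(1, 7)}

def expect(g, dist):
    # E[g(X)] = sum_x g(x) * Pr[X = x]
    return sum(g(x) * p for x, p in dist.items())

mean = expect(lambda x: x, die)                  # 3.5
var = expect(lambda x: x * x, die) - mean ** 2   # E[X^2] - E[X]^2 = 35/12
sigma = var ** 0.5
print(mean, var, sigma)
```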
For events $A$ and $B$:
$\Pr[A \vee B] = \Pr[A] + \Pr[B] - \Pr[A \wedge B]$
$\Pr[A \wedge B] = \Pr[A] \cdot \Pr[B]$, \quad iff $A$ and $B$ are independent.
$\Pr[A \mid B] = \frac{\Pr[A \wedge B]}{\Pr[B]}$
For random variables $X$ and $Y$:
$\mathrm{E}[X \cdot Y] = \mathrm{E}[X] \cdot \mathrm{E}[Y]$, \quad if $X$ and $Y$ are independent.
$\mathrm{E}[X + Y] = \mathrm{E}[X] + \mathrm{E}[Y]$, \quad $\mathrm{E}[cX] = c\,\mathrm{E}[X]$.
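Both the product rule for independent variables and linearity of expectation can be verified by exhaustive enumeration over two independent fair dice (an illustrative example of mine):

```python
from itertools import product

# All 36 equally likely outcomes of two independent fair dice (X, Y).
outcomes = list(product(range(1, 7), repeat=2))
pr = 1 / len(outcomes)

EX = sum(x for x, _ in outcomes) * pr
EY = sum(y for _, y in outcomes) * pr
EXY = sum(x * y for x, y in outcomes) * pr    # E[X*Y]
EXpY = sum(x + y for x, y in outcomes) * pr   # E[X+Y]

print(EXY, EX * EY)   # equal: independence gives the product rule
print(EXpY, EX + EY)  # equal: linearity (holds even without independence)
```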
Bayes' theorem: $\Pr[A_i \mid B] = \frac{\Pr[B \mid A_i]\,\Pr[A_i]}{\sum_{j=1}^{n} \Pr[A_j]\,\Pr[B \mid A_j]}$.
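As a worked instance, take the classic diagnostic-test setup (all numbers here are illustrative assumptions): $A_1$ = "has condition", $A_2$ = "does not", $B$ = "test positive".

```python
prior = {"A1": 0.01, "A2": 0.99}       # Pr[A_i]
likelihood = {"A1": 0.95, "A2": 0.05}  # Pr[B | A_i]

def bayes(i, prior, likelihood):
    # Pr[A_i | B] = Pr[B | A_i] Pr[A_i] / sum_j Pr[A_j] Pr[B | A_j]
    total = sum(prior[j] * likelihood[j] for j in prior)
    return likelihood[i] * prior[i] / total

print(bayes("A1", prior, likelihood))  # ~ 0.161: rare condition, so most
                                       # positives are still false positives
```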
Inclusion-exclusion:
$\Pr\left[\bigvee_{i=1}^{n} X_i\right] = \sum_{i=1}^{n} \Pr[X_i] + \sum_{k=2}^{n} (-1)^{k+1} \sum_{i_1 < \cdots < i_k} \Pr\left[\bigwedge_{j=1}^{k} X_{i_j}\right]$.
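For independent events the intersection probabilities are just products, so the alternating sum can be checked against the direct formula $1 - \prod_i (1 - p_i)$ (probabilities below are illustrative):

```python
from itertools import combinations

p = [0.1, 0.2, 0.3, 0.4]  # Pr[X_i] for independent events
n = len(p)

union = 0.0
for k in range(1, n + 1):
    sign = (-1) ** (k + 1)
    for idx in combinations(range(n), k):
        inter = 1.0
        for i in idx:
            inter *= p[i]  # independence: Pr of the k-way intersection
        union += sign * inter

direct = 1.0
for pi in p:
    direct *= 1 - pi
direct = 1 - direct

print(union, direct)  # both ~ 0.6976
```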
Moment inequalities: $\Pr\big[X \geq \lambda\,\mathrm{E}[X]\big] \leq \frac{1}{\lambda}$, \quad $\Pr\big[|X - \mathrm{E}[X]| \geq \lambda \cdot \sigma\big] \leq \frac{1}{\lambda^2}$.
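These are Markov's and Chebyshev's inequalities; both can be checked exactly on a small discrete distribution, again using a fair die as an illustrative example:

```python
die = [(x, 1 / 6) for x in range(1, 7)]
EX = sum(x * p for x, p in die)                        # 3.5
variance = sum(x * x * p for x, p in die) - EX ** 2    # 35/12
sigma = variance ** 0.5

lam = 1.5
markov_lhs = sum(p for x, p in die if x >= lam * EX)
cheb_lhs = sum(p for x, p in die if abs(x - EX) >= lam * sigma)

print(markov_lhs, "<=", 1 / lam)        # Markov bound holds
print(cheb_lhs, "<=", 1 / lam ** 2)     # Chebyshev bound holds
```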
Geometric distribution: $\Pr[X = k] = pq^{k-1}$, \quad $q = 1 - p$, \quad $\mathrm{E}[X] = \sum_{k=1}^{\infty} kpq^{k-1} = \frac{1}{p}$.
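The infinite sum converges quickly, so a truncated partial sum already matches $1/p$ to high precision (p chosen arbitrarily for illustration):

```python
# Partial sums of E[X] = sum_{k>=1} k * p * q^(k-1) converge to 1/p.
p = 0.25
q = 1 - p
partial = sum(k * p * q ** (k - 1) for k in range(1, 200))
print(partial)  # ~ 4.0 = 1/p
```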
Binomial distribution: $\Pr[X = k] = \binom{n}{k} p^k q^{n-k}$, \quad $q = 1 - p$, \quad $\mathrm{E}[X] = \sum_{k=1}^{n} k\binom{n}{k} p^k q^{n-k} = np$.
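Since the sum is finite, $\mathrm{E}[X] = np$ can be verified directly (n and p below are arbitrary illustrative values):

```python
from math import comb

n, p = 20, 0.3
q = 1 - p
# E[X] = sum_{k=1}^{n} k * C(n,k) * p^k * q^(n-k)
mean = sum(k * comb(n, k) * p ** k * q ** (n - k) for k in range(1, n + 1))
print(mean, n * p)  # equal up to float rounding
```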
Poisson distribution: $\Pr[X = k] = \frac{e^{-\lambda}\lambda^k}{k!}$, \quad $\mathrm{E}[X] = \lambda$.
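A truncated sum over the pmf confirms that it totals 1 and has mean $\lambda$ (the truncation point and $\lambda$ are arbitrary illustrative choices):

```python
import math

lam, K = 3.0, 100  # truncate the infinite sum at K terms
pmf = [math.exp(-lam) * lam ** k / math.factorial(k) for k in range(K)]
total = sum(pmf)
mean = sum(k * p for k, p in enumerate(pmf))
print(total, mean)  # ~ 1.0 and ~ 3.0
```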
Normal (Gaussian) distribution: $p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-\mu)^2 / 2\sigma^2}$, \quad $\mathrm{E}[X] = \mu$.
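A crude Riemann sum over the density shows it integrates to 1 and that $\int x\,p(x)\,dx = \mu$ (parameters and step size are illustrative assumptions):

```python
import math

mu, sigma = 2.0, 1.5

def p(x):
    # Gaussian density with mean mu and standard deviation sigma
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

h = 0.001  # step size; integrate over mu +/- 8 sigma, where the tails are negligible
xs = [mu - 8 * sigma + k * h for k in range(int(16 * sigma / h))]
mass = sum(p(x) for x in xs) * h
mean = sum(x * p(x) for x in xs) * h
print(mass, mean)  # ~ 1.0 and ~ mu
```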
The "coupon collector": We are given a random coupon each day, and there are $n$ different types of coupons. The distribution of coupons is uniform. The expected number of days that pass before we collect all $n$ types is $nH_n$.
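The $nH_n$ formula is easy to check against a simulation (trial counts and the seed are my own illustrative choices; the simulated mean should land close to $nH_n$):

```python
import random

def harmonic(n):
    return sum(1 / k for k in range(1, n + 1))

def collect(n, rng):
    # Draw uniform coupons until all n types have been seen; return day count.
    seen, days = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        days += 1
    return days

n = 20
expected = n * harmonic(n)  # ~ 71.95 for n = 20
rng = random.Random(0)      # fixed seed for reproducibility
sim = sum(collect(n, rng) for _ in range(2000)) / 2000
print(expected, sim)
```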
Pascal's Triangle
1
1 1
1 2 1
1 3 3 1
1 4 6 4 1
1 5 10 10 5 1
1 6 15 20 15 6 1
1 7 21 35 35 21 7 1
1 8 28 56 70 56 28 8 1
1 9 36 84 126 126 84 36 9 1
1 10 45 120 210 252 210 120 45 10 1