
Generation of Quasi-Uniform Distributions and

Verification of the Tightness of the Inner Bound

Dauood Saleem

Roll No. D15048


School of Computing and Electrical Engineering
Indian Institute of Technology Mandi
Himachal Pradesh, India
Advisor
Dr. Satyajit Thakor

October 16, 2020

Outline

• Introduction

• Background

• Research Problem

• Work Done and Target Set for Last Semester

• Planned work for the Next Semester

• References

Introduction

• Let [n] ≜ {1, . . . , n} and let X1, . . . , Xn be n jointly distributed discrete random variables with joint distribution pX1,...,Xn(x1, . . . , xn).

• For a random vector X[n] = (Xi, i ∈ [n]), taking values from the alphabet X[n] = ∏_{i∈[n]} Xi, its pmf vector is

p = [p(x[n]) : x[n] ∈ X[n]]

where the x[n]'s are n-tuples.

• For any α ⊆ [n], the entropy of Xα = {Xi, i ∈ α} is defined as follows.

H(Xα) = − ∑_{xα} pXα(xα) log pXα(xα)

• Hn ≜ R^(2^n − 1) is called the entropy space.
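These definitions translate directly into code. Below is a minimal Python sketch (my own illustration, not part of the presented work; the dict-based pmf representation and the name `entropy_vector` are assumptions) that computes H(Xα) for every non-empty α ⊆ [n]:

```python
from itertools import combinations
from math import log2

def entropy_vector(pmf, n):
    """Return {alpha: H(X_alpha)} for all non-empty alpha subsets of {1,...,n}.

    pmf maps n-tuples x_[n] to probabilities p(x_[n]).
    """
    hs = {}
    for r in range(1, n + 1):
        for alpha in combinations(range(1, n + 1), r):
            # Marginalize the joint pmf onto the coordinates in alpha.
            marginal = {}
            for x, p in pmf.items():
                key = tuple(x[i - 1] for i in alpha)
                marginal[key] = marginal.get(key, 0.0) + p
            hs[alpha] = sum(-p * log2(p) for p in marginal.values() if p > 0)
    return hs

# Example: X1 = X2, uniform on {0, 1}.
print(entropy_vector({(0, 0): 0.5, (1, 1): 0.5}, 2))
# {(1,): 1.0, (2,): 1.0, (1, 2): 1.0}
```

Here the entropy vector is [1 1 1]: the joint entropy equals each marginal entropy because the two variables are identical.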

Introduction

• The set of all entropy vectors is the entropy region.1

Γn* ≜ {h ∈ Hn : h is an entropy vector}

• For n = 2, the entropy region has been fully characterized.

• For n = 3, the closure of the entropy region has been fully characterized.2

• For n ≥ 4, only partial characterizations of the entropy region are known.

1
R. W. Yeung, “A framework for linear information inequalities,” IEEE
Trans. Inform. Theory, vol. 43, pp. 1924–1934, Nov. 1997.
2
Z. Zhang and R. W. Yeung, “A non-Shannon-type conditional inequality of
information quantities,” IEEE Trans. Inform. Theory, vol. 43, pp. 1982–1986,
Nov. 1997.
Background
• X[n] is a quasi-uniform random vector if for all ∅ ≠ α ⊆ [n], the random vector Xα is uniformly distributed over its support Sα, where

Sα ≜ {xα : Pr{Xα = xα} > 0}

Thus for a quasi-uniform random vector X[n], we have

Pr{Xα = xα} = 1/|Sα| if xα ∈ Sα, and 0 otherwise.

• A given vector h ∈ Hn is a quasi-uniform entropy vector if there exists a quasi-uniform random vector X[n] whose entropy vector equals h. Consider the region

Ψn ≜ {h ∈ Hn : h is quasi-uniform}
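The definition gives a direct membership test: marginalize onto every non-empty α and check uniformity over the support. A small sketch (a hypothetical helper of my own, assuming pmfs as dicts from n-tuples to probabilities, with 0-based coordinates):

```python
from itertools import combinations

def is_quasi_uniform(pmf, n, tol=1e-9):
    """Check that every marginal X_alpha is uniform over its support S_alpha."""
    for r in range(1, n + 1):
        for alpha in combinations(range(n), r):
            # Marginal of X_alpha.
            marginal = {}
            for x, p in pmf.items():
                key = tuple(x[i] for i in alpha)
                marginal[key] = marginal.get(key, 0.0) + p
            support = [p for p in marginal.values() if p > tol]
            # Uniform over support: each positive mass equals 1/|S_alpha|.
            if any(abs(p - 1.0 / len(support)) > tol for p in support):
                return False
    return True

print(is_quasi_uniform({(0, 0): 0.5, (1, 1): 0.5}, 2))                 # True
print(is_quasi_uniform({(0, 0): 0.5, (0, 1): 0.25, (1, 1): 0.25}, 2))  # False
```

The second example fails because the marginal of X1 puts masses 0.75 and 0.25, which is not uniform over its support.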

Research Problem

• The focus of this research work is to study quasi-uniform distributions.

• To find all quasi-uniform distributions for a given number of random variables with fixed or variable alphabet size.

• To classify the quasi-uniform distributions.

• To check the tightness of known inner bounds via explicit construction of quasi-uniform distributions on different faces.

Work Done and Target Set for Last Semester

Algorithm 1 Generate all quasi-uniform distributions over a given alphabet X[n] = ∏_{i=1}^{n} Xi, with parameter d ∈ {2, . . . , |X[n]|}.

Require: d, X[n]
1: i ← 1, p ← 0_{|X[n]|×1}, m ← |X[n]|
2: function MakeQUD2(i, d, m, p, X[n])
3:   if m = 1 then
4:     p(|X[n]|) ← 1 − ∑_{x=1}^{|X[n]|−1} p(x)
5:     if p(xα) ∈ {0, cα} for some constant cα ∈ (0, 1), ∀xα ∈ Xα, and ∑_{xα} p(xα) = 1, ∀ ∅ ≠ α ⊊ [n] then
6:       return p
7:     end if
8:   else
9:     v ← [0, 1/d]
10:    for j = 1 : 1 : 2 do
11:      p(i) ← v(j)
12:      if 0 ≤ 1 − ∑_{a=1}^{|X[n]|} p(a) ≤ (|X[n]| − i)/d then
13:        return MakeQUD2(i + 1, d, m − 1, p, X[n])
14:      end if
15:    end for
16:  end if
17: end function
18: return p
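For small alphabets the same output can be obtained by brute force, which is useful for sanity-checking the algorithm: a quasi-uniform distribution whose joint support has size d places mass 1/d on each of d cells, so one can enumerate all d-subsets of X[n] and keep those whose marginals are uniform on their supports. A self-contained sketch of this check (my own independent verification strategy, not the recursive algorithm above; all names are illustrative):

```python
from itertools import combinations, product

def uniform_marginals(pmf, n, tol=1e-9):
    # True iff every marginal X_alpha is uniform over its support.
    for r in range(1, n + 1):
        for alpha in combinations(range(n), r):
            marg = {}
            for x, p in pmf.items():
                key = tuple(x[i] for i in alpha)
                marg[key] = marg.get(key, 0.0) + p
            pos = [p for p in marg.values() if p > tol]
            if any(abs(p - 1.0 / len(pos)) > tol for p in pos):
                return False
    return True

def make_qud_bruteforce(alphabets, d):
    cells = list(product(*alphabets))        # the alphabet X_[n]
    found = []
    for support in combinations(cells, d):   # candidate joint supports
        pmf = {x: 1.0 / d for x in support}  # mass 1/d on each support cell
        if uniform_marginals(pmf, len(alphabets)):
            found.append(pmf)
    return found

# Two binary random variables, joint support size d = 2.
print(len(make_qud_bruteforce([(0, 1), (0, 1)], 2)))  # 6
```

The count 6 agrees with the d = 2 row of Table 1.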

Work Done and Target Set for Last Semester

Theorem 1. Algorithm 1 generates all quasi-uniform distributions over a given alphabet X[n] = ∏_{i=1}^{n} Xi such that X[n] is uniformly distributed over a support of size d.

Work Done and Target Set for Last Semester

Table 1: Quasi-uniform distributions for 2 random variables with alphabet size 2.

d     Quasi-uniform distributions                   Entropy vectors [h1 h2 h12]
d=1   [0 0 0 1], [0 0 1 0], [0 1 0 0], [1 0 0 0]    [0 0 0]
d=2   [0 0 0.5 0.5], [0.5 0.5 0 0],                 [0 1 1], [1 0 1], [1 1 1]
      [0 0.5 0 0.5], [0.5 0 0.5 0],
      [0 0.5 0.5 0], [0.5 0 0 0.5]
d=4   [0.25 0.25 0.25 0.25]                         [1 1 2]
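The entropy-vector column of Table 1 can be verified numerically; a quick sketch (my own check, with pmfs written as dicts over pairs (x1, x2)):

```python
from math import log2

def h(pmf):
    """Return (H(X1), H(X2), H(X1,X2)) for a joint pmf on pairs."""
    def ent(marg):
        return sum(-p * log2(p) for p in marg.values() if p > 0)
    m1, m2 = {}, {}
    for (x1, x2), p in pmf.items():
        m1[x1] = m1.get(x1, 0.0) + p
        m2[x2] = m2.get(x2, 0.0) + p
    return ent(m1), ent(m2), ent(pmf)

print(h({(0, 0): 1.0}))                                    # (0.0, 0.0, 0.0)
print(h({(0, 1): 0.5, (1, 0): 0.5}))                       # (1.0, 1.0, 1.0)
print(h({x: 0.25 for x in [(0, 0), (0, 1), (1, 0), (1, 1)]}))  # (1.0, 1.0, 2.0)
```

These match the d = 1, d = 2, and d = 4 rows respectively.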

Work Done and Target Set for Last Semester
The total number of elemental inequalities m for n random variables is given by

m = n + C(n, 2) · 2^(n−2)

For n = 2, the number of elemental inequalities is m = 2 + 1 = 3:

H(X1|X2) ≥ 0 (1)
H(X2|X1) ≥ 0 (2)
I(X1; X2) ≥ 0 (3)

For 2 random variables, the vector h has the elements

{h : h = [h1 h2 h12]^T} (4)

The half-spaces corresponding to inequalities (1)–(3) are given as follows.

{h : h12 − h2 ≥ 0} (5)
{h : h12 − h1 ≥ 0} (6)
{h : h1 + h2 − h12 ≥ 0} (7)
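The count m grows quickly with n; a one-line numerical check of the formula above (illustrative only):

```python
from math import comb

def num_elemental(n):
    # m = n + C(n, 2) * 2^(n-2) elemental inequalities for n random variables
    return n + comb(n, 2) * 2 ** (n - 2)

print([num_elemental(n) for n in (2, 3, 4)])  # [3, 9, 28]
```

The values m = 3 for n = 2 and m = 9 for n = 3 are the ones used in these slides.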
Work Done and Target Set for Last Semester
The supporting hyperplanes corresponding to the half-spaces (5)–(7) are as follows.

{h : h12 − h2 = 0} (8)
{h : h12 − h1 = 0} (9)
{h : h12 − h1 − h2 = 0} (10)

From hyperplane (8), we get,

{h : h = [h1 h2 h2 ]T , h1 , h2 ∈ R+ } (11)

From hyperplane (9), we get,

{h : h = [h1 h2 h1 ]T , h1 , h2 ∈ R+ } (12)

From hyperplane (10), we get,

{h : h = [h1 h2 h1 + h2 ]T , h1 , h2 ∈ R+ } (13)
Work Done and Target Set for Last Semester
Taking the intersection of (11) and (12): h1 = h12 = h2, and we get

{h : h = [h1 h1 h1]^T, h1 ∈ R+} (14)
{h : h = h1[1 1 1]^T, h1 ∈ R+} (15)

Here we can conclude that X1 and X2 are the same random variable.

Taking the intersection of (11) and (13): h2 = h12 = h1 + h2 implies h1 = 0, and we get

{h : h = h2[0 1 1]^T, h2 ∈ R+} (16)

Here we can conclude that X1 is a degenerate random variable.

Work Done and Target Set for Last Semester

Taking the intersection of (12) and (13): h1 = h12 = h1 + h2 implies h2 = 0, and we get

{h : h = h1[1 0 1]^T, h1 ∈ R+} (17)

Here we can conclude that X2 is a degenerate random variable.

Taking the intersection of (11), (12) and (13): h1 = h2 = h12 = h1 + h2 implies h1 = h2 = 0, and we get

{h : h = [0 0 0]^T} (18)

i.e., only the origin, where both random variables are degenerate.
Work Done and Target Set for Last Semester
For n = 3, the total number of elemental inequalities is m = 9 and the total number of extreme rays is 8. For 3 random variables, the vector h has the elements

{h : h = [h1 h2 h3 h12 h13 h23 h123]^T}

One of the 8 extreme rays has the form

R1230 = r1230 [1 1 1 2 2 2 2]^T, r1230 ∈ R+

Using basic properties, we can find the entropic values of r1230. On this ray the random variables are pairwise independent and each is determined by the other two; for example, let X1 and X2 be independent and uniform over a support of size k, and let X3 = X1 + X2 mod k. Then

H(X1) = H(X2) = H(X3) = log k = r1230

R1230 = log k [1 1 1 2 2 2 2]^T, k a positive integer

Since the support size k can only be a positive integer, r1230 takes only the discrete values log k. Hence R1230 contains non-entropic points. A face formed by convex combinations involving R1230 contains non-entropic points as well as entropic points.
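A numerical check that [1 1 1 2 2 2 2]^T is entropic at r = log 2, using the parity construction (my choice of construction: X1, X2 i.i.d. uniform bits and X3 = X1 XOR X2):

```python
from itertools import combinations, product
from math import log2

# Joint pmf of (X1, X2, X3) with X3 = X1 XOR X2.
pmf = {}
for x1, x2 in product((0, 1), repeat=2):
    pmf[(x1, x2, x1 ^ x2)] = 0.25

def H(alpha):
    """Entropy of the marginal X_alpha (alpha is a tuple of 0-based indices)."""
    marg = {}
    for x, p in pmf.items():
        key = tuple(x[i] for i in alpha)
        marg[key] = marg.get(key, 0.0) + p
    return sum(-p * log2(p) for p in marg.values() if p > 0)

vec = [H(a) for r in (1, 2, 3) for a in combinations(range(3), r)]
print(vec)  # [1.0, 1.0, 1.0, 2.0, 2.0, 2.0, 2.0]
```

The entropy vector is exactly log 2 · [1 1 1 2 2 2 2]^T (in bits), as claimed for a point on this ray.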
Work Done and Target Set for Last Semester

• To check the looseness of the existing inner bounds for the


quasi-uniform distributions for the entropy region in certain
faces, e.g., one of the face Conv (R12 , R23 , R1230 ).
• It is equivalent to show the existing inner bounds on the face
Conv (R12 , R23 , R1230 ) are not tight and there exists a
quasi-uniform distribution such that the corresponding entropy
vector lies strictly inside the face Conv (R12 , R23 , R1230 ) and
strictly outside the inner bound.

Planned work for the Next Semester

• To check the looseness of known inner bounds via explicit construction of quasi-uniform distributions on certain faces, e.g., the face Conv(R12, R23, R1230).

References I

R. W. Yeung, Information theory and network coding. Springer Science & Business Media, 2008.

R. Dougherty, C. Freiling, and K. Zeger, “Networks, matroids, and non-Shannon information inequalities,”
IEEE Transactions on Information Theory, vol. 53, no. 6, pp. 1949–1969, 2007.

B. Hassibi and S. Shadbakht, “Normalized entropy vectors, network information theory and convex
optimization,” in 2007 IEEE Information Theory Workshop on Information Theory for Wireless Networks.
IEEE, 2007, pp. 1–5.

X. Yan, R. W. Yeung, and Z. Zhang, “The capacity region for multi-source multi-sink network coding,” in
2007 IEEE International Symposium on Information Theory. IEEE, 2007, pp. 116–120.

Z. Zhang and R. W. Yeung, “A non-Shannon-type conditional inequality of information quantities,” IEEE
Transactions on Information Theory, vol. 43, no. 6, pp. 1982–1986, 1997.

Q. Chen and R. W. Yeung, “Characterizing the entropy function region via extreme rays,” in 2012 IEEE
Information Theory Workshop. IEEE, 2012, pp. 272–276.

D. Fong, S. Shadbakht, and B. Hassibi, “On the entropy region and the Ingleton inequality,” Mathematical
Theory of Networks and Systems (MTNS), 2008.

Thank you
