
**About the Mutual (Conditional) Information**

Renato Renner and Ueli Maurer¹
Institute of Theoretical Computer Science, ETH Zurich, CH-8092 Zurich, Switzerland
{renner,maurer}@inf.ethz.ch

Abstract - We give a necessary, sufficient, and easily verifiable criterion for the conditional probability distribution P_{Z|XY} (where X, Y and Z are arbitrary random variables) such that I(X;Y) ≥ I(X;Y|Z) holds for any distribution P_{XY}. Furthermore, the result is generalized to the case where Z is specified by a conditional probability distribution depending on more than two random variables.
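As a concrete illustration of the question the abstract raises (this sketch is ours, not from the paper; all function names are assumptions), conditioning on a channel output can strictly *increase* mutual information: take X and Y to be independent uniform bits and the deterministic channel Z = X ⊕ Y. Then I(X;Y) = 0 but I(X;Y|Z) = 1 bit.

```python
# Minimal illustration (ours): conditioning on Z = X xor Y increases I(X;Y).
# X, Y are independent uniform bits, so the joint P_{XYZ} puts mass 1/4 on
# each triple (x, y, x xor y).
from itertools import product
from math import log2

p_xyz = {}
for x, y in product((0, 1), repeat=2):
    p_xyz[(x, y, x ^ y)] = 0.25

def marginal(p, keep):
    """Marginalize a dict {(x, y, z): prob} onto the index positions in `keep`."""
    out = {}
    for k, v in p.items():
        key = tuple(k[i] for i in keep)
        out[key] = out.get(key, 0.0) + v
    return out

def mutual_information(p_xyz):
    """I(X;Y) computed from a joint distribution over (x, y, z)."""
    p_xy = marginal(p_xyz, (0, 1))
    p_x = marginal(p_xyz, (0,))
    p_y = marginal(p_xyz, (1,))
    return sum(v * log2(v / (p_x[(x,)] * p_y[(y,)]))
               for (x, y), v in p_xy.items() if v > 0)

def conditional_mutual_information(p_xyz):
    """I(X;Y|Z) = sum_z P(z) * I(X;Y | Z=z)."""
    p_z = marginal(p_xyz, (2,))
    total = 0.0
    for (z,), pz in p_z.items():
        cond = {k: v / pz for k, v in p_xyz.items() if k[2] == z}
        total += pz * mutual_information(cond)
    return total

print(mutual_information(p_xyz))              # 0.0
print(conditional_mutual_information(p_xyz))  # 1.0
```

Given Z, the bits X and Y determine each other, which is exactly why a criterion for when conditioning can never increase I(X;Y) is non-trivial.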

I. INTRODUCTION

The mutual information I(X;Y) between two random variables X and Y is one of the basic measures in information theory [1]. It can be interpreted as the amount of information that X gives about Y (or vice versa). In general, additional information, i.e., conditioning on an additional random variable Z, can either increase or decrease this mutual information. Without loss of generality, Z can be seen as the output of a channel C with input (X, Y), being fully specified by the conditional probability distribution P_{Z|XY}. It is thus a natural question whether for a fixed P_{Z|XY} (i.e., a fixed channel C with input (X, Y) and output Z) conditioning on Z can possibly (i.e., for some distribution P_{XY}) increase the mutual information between X and Y. In the following, we give an answer to this question as well as to a generalization thereof. Our result has different applications, e.g., in the context of information-theoretic secret-key agreement.

II. DEFINITIONS AND RESULTS

Let the function p : (z, x, y) ↦ p(z|x, y) be a conditional probability distribution (i.e., p(·|x, y) is a probability distribution for each pair (x, y)).

**Definition 1.** The conditional probability distribution p is called correlation free if for all random variables X, Y and Z with P_{Z|XY} = p (and arbitrary distribution P_{XY})

I(X;Y) ≥ I(X;Y|Z).

In view of this definition, our main goal can be formulated as finding a simple criterion for p to be correlation free.

**Definition 2.** The conditional probability distribution p is called multiplicative if it is the product of two functions r and s depending only on (z, x) and (z, y), respectively, i.e.,

p(z|x, y) = r(z, x) · s(z, y)

for all x, y and z.

Our main theorem states that the multiplicative property is exactly the criterion needed.

**Theorem 3.** The conditional probability distribution p is correlation free if and only if it is multiplicative.

In fact, it is easy to decide whether a given conditional probability distribution p is multiplicative. The following lemma shows that one only has to check the conditional independence of a certain pair of random variables.

**Lemma 4.** The conditional probability distribution p is multiplicative if and only if I(X;Y|Z) = 0 for two independent and uniformly distributed random variables X, Y (i.e., P_{XY} is constant), and Z with P_{Z|XY} = p.

Combining Theorem 3 and Lemma 4, our result can be formulated as follows: Let C be a fixed channel, taking as input a pair of random variables. If and only if the mutual information of a uniformly distributed input pair (X, Y) does not increase when conditioning on the channel output Z (i.e., it equals 0), then the mutual information of any arbitrary input pair (X, Y) does not increase when conditioning on the output Z:

0 = I(X;Y) ≥ I(X;Y|Z)  ⟺  ∀ P_{XY} : I(X;Y) ≥ I(X;Y|Z).

III. GENERALIZATIONS

The conditional probability distribution p considered in the previous section corresponds to a channel taking two random variables as input. However, our result can be extended to conditional probability distributions of the form p : (z, x_1, …, x_n) ↦ p(z|x_1, …, x_n) for n ∈ ℕ. The generalization of Definition 1 and Definition 2 is straightforward.

**Definition 5.** The conditional probability distribution p is called correlation free if

I(X_1 ⋯ X_{i−1} X_{i+1} ⋯ X_n ; X_i) ≥ I(X_1 ⋯ X_{i−1} X_{i+1} ⋯ X_n ; X_i | Z)  for i = 1, …, n,

for any choice of random variables X_1, …, X_n and Z with P_{Z|X_1⋯X_n} = p.

**Definition 6.** The conditional probability distribution p is called multiplicative if it can be written as a product

p(z|x_1, …, x_n) = ∏_{i=1}^{n} r_i(z, x_i)

for appropriate functions r_1, …, r_n.

It turns out that Theorem 3 still holds for these extended definitions.

¹This work was supported by the Swiss National Science Foundation (SNF).
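Lemma 4 yields an easily implementable decision procedure: feed the channel independent uniform inputs and check whether I(X;Y|Z) vanishes. The following sketch is ours, not from the paper; the function names and the array representation p[z][x][y] = p(z|x, y) are assumptions.

```python
# Decision procedure suggested by Lemma 4 (sketch, ours): p is multiplicative
# -- and hence, by Theorem 3, correlation free -- iff I(X;Y|Z) = 0 when X, Y
# are independent and uniform and Z is distributed according to p.
from math import log2

def cond_mutual_info_uniform(p):
    """I(X;Y|Z) for uniform independent X, Y and P_{Z|XY} = p.

    p[z][x][y] = p(z|x,y); the alphabets are the array dimensions.
    """
    nz, nx, ny = len(p), len(p[0]), len(p[0][0])
    total = 0.0
    for z in range(nz):
        # Joint P(x, y, z) = p(z|x,y) / (nx * ny) restricted to this z.
        joint = [[p[z][x][y] / (nx * ny) for y in range(ny)] for x in range(nx)]
        pz = sum(map(sum, joint))
        if pz == 0:
            continue
        cond = [[joint[x][y] / pz for y in range(ny)] for x in range(nx)]
        px = [sum(cond[x]) for x in range(nx)]
        py = [sum(cond[x][y] for x in range(nx)) for y in range(ny)]
        total += pz * sum(cond[x][y] * log2(cond[x][y] / (px[x] * py[y]))
                          for x in range(nx) for y in range(ny)
                          if cond[x][y] > 0)
    return total

def is_correlation_free(p, tol=1e-12):
    return cond_mutual_info_uniform(p) < tol

# Z = X (the channel ignores Y): p(z|x,y) = r(z,x) with s == 1, so multiplicative.
copy_x = [[[1.0 if z == x else 0.0 for y in range(2)]
           for x in range(2)] for z in range(2)]
# Z = X xor Y: not multiplicative.
xor = [[[1.0 if z == (x ^ y) else 0.0 for y in range(2)]
        for x in range(2)] for z in range(2)]

print(is_correlation_free(copy_x))  # True
print(is_correlation_free(xor))     # False
```

Note that by Lemma 4 a single input distribution (the uniform one) suffices; no search over P_{XY} is needed.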

REFERENCES

[1] C. E. Shannon, "A mathematical theory of communication," Bell System Technical Journal, vol. 27, pt. I, pp. 379–423; pt. II, pp. 623–656, 1948.

0-7803-7501-7/02/$17.00 © 2002 IEEE.

