
Universal recoverability in quantum information theory

Mark M. Wilde
Hearne Institute for Theoretical Physics,
Department of Physics and Astronomy,
Center for Computation and Technology,
Louisiana State University,
Baton Rouge, Louisiana, USA
mwilde@lsu.edu
Based on arXiv:1505.04661 and arXiv:1509.07127 (with Junge, Renner, Sutter, Winter)

NICT, December 11, 2015, Koganei, Tokyo 184-8795, Japan


Main message

Entropy inequalities established in the 1970s are a mathematical
consequence of the postulates of quantum physics

They are helpful in determining the ultimate limits on many physical
processes (communication, thermodynamics, uncertainty relations)

Many of these entropy inequalities are equivalent to each other, so we
can say that together they constitute a fundamental law of quantum
information theory

There has been recent interest in refining these inequalities, trying to
understand how well one can attempt to reverse an irreversible
physical process

We discuss progress in this direction


Background entropies
Umegaki relative entropy [Ume62]
The quantum relative entropy is a measure of dissimilarity between two
quantum states. It is defined for a state ρ and a positive semi-definite operator σ as

    D(ρ‖σ) ≡ Tr{ρ[log ρ - log σ]}

whenever supp(ρ) ⊆ supp(σ), and it is equal to +∞ otherwise.
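For readers who want to experiment, here is a minimal numerical sketch (my own
illustration, not part of the talk) of this definition for full-rank states; the
function names are arbitrary choices.

```python
# Minimal sketch (not from the talk) of the Umegaki relative entropy
# D(rho||sigma) = Tr{rho [log rho - log sigma]} for full-rank states, in bits.
import numpy as np
from scipy.linalg import logm

def random_state(d, seed=None):
    """Return a random full-rank d x d density matrix."""
    rng = np.random.default_rng(seed)
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

def relative_entropy(rho, sigma):
    """D(rho||sigma) in bits; assumes supp(rho) is contained in supp(sigma)."""
    return np.trace(rho @ (logm(rho) - logm(sigma))).real / np.log(2)

rho, sigma = random_state(4, seed=1), random_state(4, seed=2)
print(relative_entropy(rho, sigma))   # nonnegative (Klein's inequality)
print(relative_entropy(rho, rho))     # approximately 0
```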

Physical interpretation with quantum Stein's lemma [HP91, NO00]

Given are n quantum systems, all of which are prepared in either the state
ρ or the state σ. With a constraint of ε ∈ (0, 1) on the Type I error of
misidentifying ρ, the optimal error exponent for the Type II error of
misidentifying σ is D(ρ‖σ).


Background entropies
Relative entropy as mother entropy
Many important entropies can be written in terms of relative entropy:
    H(A) ≡ -D(ρ_A ‖ I_A)                                        (entropy)

    H(A|B) ≡ -D(ρ_AB ‖ I_A ⊗ ρ_B)                               (conditional entropy)

    I(A;B) ≡ D(ρ_AB ‖ ρ_A ⊗ ρ_B)                                (mutual information)

    I(A;B|C) ≡ D(ρ_ABC ‖ exp{log ρ_AC + log ρ_BC - log ρ_C})    (cond. MI)

Equivalences
    H(A|B) = H(AB) - H(B)
    I(A;B) = H(A) + H(B) - H(AB)
    I(A;B) = H(B) - H(B|A)
    I(A;B|C) = H(AC) + H(BC) - H(ABC) - H(C)
    I(A;B|C) = H(B|C) - H(B|AC)
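As a quick sanity check (my own illustration, not from the talk), the following sketch
computes these entropies for a random three-qubit state by partial tracing and confirms
that the two expressions for I(A;B|C) above agree and are nonnegative.

```python
# Illustrative check (not from the talk) that the two formulas for I(A;B|C)
# agree on a random three-qubit state; helper names are my own.
import numpy as np

def partial_trace(rho, dims, keep):
    """Trace out the subsystems not listed in `keep` (dims = local dimensions)."""
    n = len(dims)
    rho = rho.reshape(dims + dims)
    removed = 0
    for i in sorted(set(range(n)) - set(keep), reverse=True):
        rho = np.trace(rho, axis1=i, axis2=i + n - removed)
        removed += 1
    d = int(np.prod([dims[i] for i in keep]))
    return rho.reshape(d, d)

def entropy(rho):
    """Von Neumann entropy in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

rng = np.random.default_rng(0)
G = rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8))
rho_ABC = G @ G.conj().T
rho_ABC /= np.trace(rho_ABC).real
dims = [2, 2, 2]                  # subsystems A=0, B=1, C=2

H = lambda keep: entropy(partial_trace(rho_ABC, dims, keep))
cmi_1 = H([0, 2]) + H([1, 2]) - H([0, 1, 2]) - H([2])       # H(AC)+H(BC)-H(ABC)-H(C)
cmi_2 = (H([1, 2]) - H([2])) - (H([0, 1, 2]) - H([0, 2]))   # H(B|C)-H(B|AC)
print(cmi_1, cmi_2)               # equal, and nonnegative by strong subadditivity
```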

Fundamental law of quantum information theory

Monotonicity of quantum relative entropy [Lin75, Uhl77]


Let ρ be a state, let σ be positive semi-definite, and let N be a quantum
channel. Then

    D(ρ‖σ) ≥ D(N(ρ)‖N(σ))

Distinguishability does not increase under a physical process
Characterizes a fundamental irreversibility in any physical process
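A small numerical illustration of this monotonicity (my own sketch, using a qubit
depolarizing channel as an arbitrary example of a physical process):

```python
# Numerical illustration (my own, not from the talk) of the data-processing
# inequality D(rho||sigma) >= D(N(rho)||N(sigma)) for a depolarizing channel.
import numpy as np
from scipy.linalg import logm

def relative_entropy(rho, sigma):
    """D(rho||sigma) in bits, assuming full-rank sigma."""
    return np.trace(rho @ (logm(rho) - logm(sigma))).real / np.log(2)

def depolarize(rho, p):
    """Qubit depolarizing channel: mix rho with the maximally mixed state."""
    return (1 - p) * rho + p * np.eye(2) / 2

def random_qubit_state(seed):
    rng = np.random.default_rng(seed)
    G = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

rho, sigma, p = random_qubit_state(1), random_qubit_state(2), 0.3
print(relative_entropy(rho, sigma))                                # before the channel
print(relative_entropy(depolarize(rho, p), depolarize(sigma, p)))  # smaller afterwards
```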

Proof approaches
Lieb concavity theorem [L73]
relative modular operator method (see, e.g., [NP04])
quantum Stein's lemma [BS03]


Strong subadditivity
Strong subadditivity [LR73]
Let ρ_ABC be a tripartite quantum state. Then

    I(A;B|C) ≥ 0

Equivalent statements (by definition)


Entropy sum of two individual systems is larger than entropy sum of
their union and intersection:

    H(AC) + H(BC) ≥ H(ABC) + H(C)

Conditional entropy does not decrease under the loss of system A:

    H(B|C) ≥ H(B|AC)


Equality conditions [Pet86, Pet88]


When does equality in monotonicity of relative entropy hold?
D(ρ‖σ) = D(N(ρ)‖N(σ)) if and only if there exists a recovery map P_{σ,N} such that

    ρ = (P_{σ,N} ∘ N)(ρ),        σ = (P_{σ,N} ∘ N)(σ)

This Petz recovery map has the following explicit form [HJPW04]:

    P_{σ,N}(·) ≡ σ^{1/2} N†((N(σ))^{-1/2} (·) (N(σ))^{-1/2}) σ^{1/2}
Classical case: Distributions p_X and q_X and a channel N(y|x). Then
the Petz recovery map P(x|y) is given by the Bayes theorem:

    P(x|y) q_Y(y) = N(y|x) q_X(x),    where q_Y(y) ≡ Σ_x N(y|x) q_X(x)
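A sketch of the classical case (my own illustration, with a randomly chosen channel and
input distribution): Bayesian inversion recovers q_X from q_Y exactly.

```python
# Illustration (my own, not from the talk): classically, the Petz recovery map is
# Bayesian inversion, P(x|y) = N(y|x) q_X(x) / q_Y(y), and it maps the output
# distribution q_Y back to the input distribution q_X exactly.
import numpy as np

rng = np.random.default_rng(0)
nx, ny = 3, 4
q_X = rng.random(nx)
q_X /= q_X.sum()                              # input distribution q_X(x)
N = rng.random((ny, nx))
N /= N.sum(axis=0)                            # channel N(y|x); each column sums to 1

q_Y = N @ q_X                                 # q_Y(y) = sum_x N(y|x) q_X(x)
P = (N * q_X[None, :]).T / q_Y[None, :]       # Bayes theorem: P(x|y)

print(np.allclose(P.sum(axis=0), 1.0))        # each P(.|y) is a distribution
print(np.allclose(P @ q_Y, q_X))              # perfect recovery of q_X
```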

More on Petz recovery map


Linear, completely positive by inspection, and trace non-increasing
because

    Tr{P_{σ,N}(ω)} = Tr{σ^{1/2} N†((N(σ))^{-1/2} ω (N(σ))^{-1/2}) σ^{1/2}}
                   = Tr{σ N†((N(σ))^{-1/2} ω (N(σ))^{-1/2})}
                   = Tr{N(σ) (N(σ))^{-1/2} ω (N(σ))^{-1/2}}
                   ≤ Tr{ω}

The Petz recovery map perfectly recovers σ from N(σ):

    P_{σ,N}(N(σ)) = σ^{1/2} N†((N(σ))^{-1/2} N(σ) (N(σ))^{-1/2}) σ^{1/2}
                  = σ^{1/2} N†(I) σ^{1/2}
                  = σ
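The following sketch (my own, assuming the channel is specified by Kraus operators, here
an amplitude-damping channel chosen purely as an example) checks both of these
properties numerically.

```python
# Numerical sketch (my own, not from the talk) of the Petz recovery map for a
# channel given by Kraus operators, checking trace non-increase on an arbitrary
# positive semi-definite input and perfect recovery of sigma from N(sigma).
import numpy as np
from scipy.linalg import sqrtm, fractional_matrix_power

gamma = 0.4                                   # example: amplitude-damping channel
K = [np.array([[1, 0], [0, np.sqrt(1 - gamma)]]),
     np.array([[0, np.sqrt(gamma)], [0, 0]])]
channel = lambda X: sum(Ki @ X @ Ki.conj().T for Ki in K)
adjoint = lambda X: sum(Ki.conj().T @ X @ Ki for Ki in K)       # N^dagger

def petz(sigma, omega):
    """P_{sigma,N}(omega) = sigma^{1/2} N^dag(N(sigma)^{-1/2} omega N(sigma)^{-1/2}) sigma^{1/2}."""
    s_half = sqrtm(sigma)
    ns_inv_half = fractional_matrix_power(channel(sigma), -0.5)
    return s_half @ adjoint(ns_inv_half @ omega @ ns_inv_half) @ s_half

rng = np.random.default_rng(0)
G = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
sigma = G @ G.conj().T
sigma /= np.trace(sigma).real
G = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
omega = G @ G.conj().T

print(np.trace(petz(sigma, omega)).real <= np.trace(omega).real + 1e-9)  # trace non-increasing
print(np.allclose(petz(sigma, channel(sigma)), sigma))                   # recovers sigma exactly
```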


Petz recovery map for strong subadditivity

Strong subadditivity is a special case of monotonicity of relative
entropy with ρ = ρ_ABC, σ = ρ_AC ⊗ I_B, and N = Tr_A.

Then N(σ) = ρ_C ⊗ I_B and the Petz recovery map is

    P_{C→AC}(ω_C) = ρ_AC^{1/2} (ρ_C^{-1/2} ω_C ρ_C^{-1/2} ⊗ I_A) ρ_AC^{1/2}

Interpretation: If system A is lost but H(B|C) = H(B|AC), then
one can recover the full state on ABC by performing the Petz
recovery map on system C of ρ_BC, i.e.,

    ρ_ABC = P_{C→AC}(ρ_BC)
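To illustrate this interpretation (my own sketch, not from the talk), the following
verifies exact recovery for a simple quantum Markov state of the form
Σ_c p(c) ρ_A^c ⊗ ρ_B^c ⊗ |c⟩⟨c|_C, for which H(B|C) = H(B|AC); the tensor-manipulation
helpers are my own.

```python
# Sketch (my own, not from the talk) verifying exact Petz recovery for a simple
# quantum Markov state rho_ABC = sum_c p(c) rho_A^c (x) rho_B^c (x) |c><c|_C.
# System ordering is A (x) B (x) C throughout.
import numpy as np
from scipy.linalg import sqrtm, fractional_matrix_power

def partial_trace(rho, dims, keep):
    """Trace out the subsystems not listed in `keep`."""
    n = len(dims)
    rho = rho.reshape(dims + dims)
    removed = 0
    for i in sorted(set(range(n)) - set(keep), reverse=True):
        rho = np.trace(rho, axis1=i, axis2=i + n - removed)
        removed += 1
    d = int(np.prod([dims[i] for i in keep]))
    return rho.reshape(d, d)

def permute_systems(X, dims, perm):
    """Reorder tensor factors; perm[k] = old position of the factor placed at k."""
    n = len(dims)
    X = X.reshape(dims + dims).transpose(perm + [p + n for p in perm])
    d = int(np.prod(dims))
    return X.reshape(d, d)

def random_state(d, rng):
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(0)
dims = [2, 2, 2]
p = [0.3, 0.7]
rho_ABC = sum(p[c] * np.kron(np.kron(random_state(2, rng), random_state(2, rng)),
                             np.outer(np.eye(2)[c], np.eye(2)[c])) for c in range(2))

rho_AC = partial_trace(rho_ABC, dims, [0, 2])     # ordering A (x) C
rho_C = partial_trace(rho_ABC, dims, [2])
rho_BC = partial_trace(rho_ABC, dims, [1, 2])     # ordering B (x) C

# Petz map P_{C->AC}(w) = M (I_A (x) w) M^dag with M = rho_AC^{1/2} (I_A (x) rho_C^{-1/2});
# apply it to the C factor of rho_BC and compare with rho_ABC (reordered as B, A, C).
M = sqrtm(rho_AC) @ np.kron(np.eye(2), fractional_matrix_power(rho_C, -0.5))
embedded = permute_systems(np.kron(np.eye(2), rho_BC), dims, [1, 0, 2])   # B (x) A (x) C
recovered = np.kron(np.eye(2), M) @ embedded @ np.kron(np.eye(2), M).conj().T

print(np.allclose(recovered, permute_systems(rho_ABC, dims, [1, 0, 2])))  # True
```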


Approximate case

The approximate case would be useful for applications

Approximate case for monotonicity of relative entropy


What can we say when D(ρ‖σ) - D(N(ρ)‖N(σ)) = ε?
Does there exist a CPTP map R that recovers σ perfectly from N(σ)
while recovering ρ from N(ρ) approximately? [WL12]

Approximate case for strong subadditivity


What can we say when H(B|C) - H(B|AC) = ε?
Is ρ_ABC approximately recoverable from ρ_BC by performing a
recovery map on system C alone? [WL12]


Other measures of similarity for quantum states

Trace distance

The trace distance between ρ and σ is ‖ρ - σ‖_1, where ‖A‖_1 ≡ Tr{√(A†A)}.

Has a one-shot operational interpretation as the bias in the success probability
when distinguishing ρ and σ with an optimal quantum measurement.

Fidelity [Uhl76]

The fidelity between ρ and σ is F(ρ, σ) ≡ ‖√ρ √σ‖_1^2. Has a one-shot
operational interpretation as the probability with which a purification of
ρ could pass a test for being a purification of σ.
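A minimal numerical sketch (my own) of these two measures:

```python
# Minimal sketch (my own, not from the talk) of trace distance and Uhlmann fidelity.
import numpy as np
from scipy.linalg import sqrtm

def trace_norm(A):
    return np.sum(np.linalg.svd(A, compute_uv=False))    # sum of singular values

def fidelity(rho, sigma):
    return trace_norm(sqrtm(rho) @ sqrtm(sigma)) ** 2

def random_state(d, seed):
    rng = np.random.default_rng(seed)
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

rho, sigma = random_state(3, 1), random_state(3, 2)
print(trace_norm(rho - sigma))   # 0 iff rho == sigma
print(fidelity(rho, sigma))      # in [0, 1]; equals 1 iff rho == sigma
print(fidelity(rho, rho))        # approximately 1
```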


Breakthrough result of [FR14]


Remainder term for strong subadditivity [FR14]
There exist unitary channels U_C and V_AC such that

    H(B|C) - H(B|AC) ≥ -log F(ρ_ABC, (V_AC ∘ P_{C→AC} ∘ U_C)(ρ_BC))

Nothing is known from [FR14] about these unitaries! However, one can conclude
that I(A;B|C) is small iff ρ_ABC is approximately recoverable from system
C alone after the loss of system A.

Remainder term for monotonicity of relative entropy [BLW14]


There exist unitary channels U and V such that

    D(ρ‖σ) - D(N(ρ)‖N(σ)) ≥ -log F(ρ, (V ∘ P_{σ,N} ∘ U)(N(ρ)))

Again, nothing is known from [BLW14] about U and V.


New result of [Wil15, JSRWW15]


Recoverability Theorem
Let ρ and σ satisfy supp(ρ) ⊆ supp(σ) and let N be a channel. Then

    D(ρ‖σ) - D(N(ρ)‖N(σ)) ≥ -∫ dt p(t) log F(ρ, P^{t/2}_{σ,N}(N(ρ))),

where p(t) is a distribution and P^t_{σ,N} is a rotated Petz recovery map:

    P^t_{σ,N}(·) ≡ (U_{σ,t} ∘ P_{σ,N} ∘ U_{N(σ),-t})(·),

P_{σ,N} is the Petz recovery map, and U_{σ,t} and U_{N(σ),-t} are defined from

    U_{ω,t}(·) ≡ ω^{it} (·) ω^{-it},

with ω a positive semi-definite operator.

Two tools for proof


Rényi generalization of a relative entropy difference and the
Stein-Hirschman operator interpolation theorem

Universal Recovery

Universal Recoverability Corollary


Let ρ and σ satisfy supp(ρ) ⊆ supp(σ) and let N be a channel. Then

    D(ρ‖σ) - D(N(ρ)‖N(σ)) ≥ -log F(ρ, R_{σ,N}(N(ρ))),

where

    R_{σ,N} ≡ ∫ dt p(t) P^{t/2}_{σ,N}

(follows from concavity of the logarithm and of the fidelity)
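The following sketch (my own construction, not code from the papers) checks this
corollary numerically for an amplitude-damping qubit channel: it builds R_{σ,N} by
discretizing the integral over the rotated Petz maps against
p(t) = (π/2)(cosh(πt) + 1)^{-1}, the density shown on the next slide.

```python
# Numerical sketch (my own construction, not code from the papers) checking
# D(rho||sigma) - D(N(rho)||N(sigma)) >= -log F(rho, R_{sigma,N}(N(rho)))
# for an amplitude-damping qubit channel. R is built by discretizing the
# integral over rotated Petz maps against p(t) = (pi/2) / (cosh(pi t) + 1).
import numpy as np
from scipy.linalg import logm, sqrtm

def mpow(A, z):
    """A^z for Hermitian positive definite A and complex z (via eigendecomposition)."""
    evals, vecs = np.linalg.eigh(A)
    return (vecs * np.power(evals.astype(complex), z)) @ vecs.conj().T

def rel_ent(rho, sigma):
    return np.trace(rho @ (logm(rho) - logm(sigma))).real / np.log(2)

def fidelity(rho, sigma):
    return np.sum(np.linalg.svd(sqrtm(rho) @ sqrtm(sigma), compute_uv=False)) ** 2

gamma = 0.3
K = [np.array([[1, 0], [0, np.sqrt(1 - gamma)]]),
     np.array([[0, np.sqrt(gamma)], [0, 0]])]
channel = lambda X: sum(Ki @ X @ Ki.conj().T for Ki in K)
adjoint = lambda X: sum(Ki.conj().T @ X @ Ki for Ki in K)

def rotated_petz(sigma, omega, t):
    """P^{t/2}_{sigma,N}(omega) = sigma^{(1+it)/2} N^dag(N(sigma)^{-(1+it)/2} omega
       N(sigma)^{-(1-it)/2}) sigma^{(1-it)/2} (the rotated-Petz composition expanded)."""
    Ns = channel(sigma)
    inner = mpow(Ns, -(1 + 1j * t) / 2) @ omega @ mpow(Ns, -(1 - 1j * t) / 2)
    return mpow(sigma, (1 + 1j * t) / 2) @ adjoint(inner) @ mpow(sigma, (1 - 1j * t) / 2)

def random_state(d, seed):
    rng = np.random.default_rng(seed)
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

rho, sigma = random_state(2, 1), random_state(2, 2)
Nrho = channel(rho)

# discretize R_{sigma,N} = int dt p(t) P^{t/2}; p(t) decays like exp(-pi |t|)
ts = np.linspace(-12, 12, 4001)
weights = (np.pi / 2) / (np.cosh(np.pi * ts) + 1) * (ts[1] - ts[0])
recovered = sum(w * rotated_petz(sigma, Nrho, t) for w, t in zip(weights, ts))
recovered = (recovered + recovered.conj().T) / 2       # clean up numerical noise

lhs = rel_ent(rho, sigma) - rel_ent(Nrho, channel(sigma))
rhs = -np.log2(fidelity(rho, recovered))
print(lhs, rhs, lhs >= rhs)                            # the bound holds
```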


Universal Distribution
[Plot of the probability density p(t) versus t]

Figure: This plot depicts the probability density p(t) := (π/2)(cosh(πt) + 1)^{-1} as a
function of t ∈ R. We see that it is peaked around t = 0, which corresponds to
the Petz recovery map.
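A quick check (my own) that this p(t) is a normalized probability density peaked at t = 0:

```python
# Quick check (my own) that p(t) = (pi/2) / (cosh(pi t) + 1) integrates to 1
# and is peaked at t = 0 with rapid (exponential) decay.
import numpy as np
from scipy.integrate import quad

p = lambda t: (np.pi / 2) / (np.cosh(np.pi * t) + 1)
print(quad(p, -np.inf, np.inf)[0])   # 1.0
print(p(0.0), p(2.0))                # maximum at t = 0, tiny already at t = 2
```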


Rényi generalizations of a relative entropy difference


Definition from [BSW14, SBW14]



    Δ̃_α(ρ, σ, N) ≡ (2/α') log ‖ ([N(ρ)]^{-α'/2} [N(σ)]^{α'/2} ⊗ I_E) U_{S→BE} σ^{-α'/2} ρ^{1/2} ‖_{2α},

where α ∈ (0, 1) ∪ (1, ∞), α' ≡ (α - 1)/α, and U_{S→BE} is an isometric
extension of N.

Important properties
    lim_{α→1} Δ̃_α(ρ, σ, N) = D(ρ‖σ) - D(N(ρ)‖N(σ)).

    Δ̃_{1/2}(ρ, σ, N) = -log F(ρ, P_{σ,N}(N(ρ))).
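The α = 1/2 property can be checked numerically; the sketch below (my own, using an
amplitude-damping qubit channel and the isometric extension U = Σ_i K_i ⊗ |i⟩_E built
from its Kraus operators) confirms it.

```python
# Sketch (my own, not from the papers) checking the alpha = 1/2 property above
# for a qubit channel, using the isometric extension U = sum_i K_i (x) |i>_E.
import numpy as np
from scipy.linalg import sqrtm, fractional_matrix_power

gamma = 0.25                                      # example: amplitude-damping channel
K = [np.array([[1, 0], [0, np.sqrt(1 - gamma)]]),
     np.array([[0, np.sqrt(gamma)], [0, 0]])]
channel = lambda X: sum(Ki @ X @ Ki.conj().T for Ki in K)
adjoint = lambda X: sum(Ki.conj().T @ X @ Ki for Ki in K)
U = sum(np.kron(Ki, np.eye(len(K))[:, [i]]) for i, Ki in enumerate(K))  # isometry S -> B (x) E

def random_state(d, seed):
    rng = np.random.default_rng(seed)
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho).real

def trace_norm(A):
    return np.sum(np.linalg.svd(A, compute_uv=False))

rho, sigma = random_state(2, 3), random_state(2, 4)
Nrho, Nsigma = channel(rho), channel(sigma)

# alpha = 1/2 means alpha' = -1, so the operator inside the norm becomes
# ([N(rho)]^{1/2} [N(sigma)]^{-1/2} (x) I_E) U sigma^{1/2} rho^{1/2}, measured in trace norm.
X = (np.kron(sqrtm(Nrho) @ fractional_matrix_power(Nsigma, -0.5), np.eye(len(K)))
     @ U @ sqrtm(sigma) @ sqrtm(rho))
delta_half = -2 * np.log2(trace_norm(X))

petz_out = sqrtm(sigma) @ adjoint(fractional_matrix_power(Nsigma, -0.5) @ Nrho
                                  @ fractional_matrix_power(Nsigma, -0.5)) @ sqrtm(sigma)
fid = trace_norm(sqrtm(rho) @ sqrtm(petz_out)) ** 2
print(delta_half, -np.log2(fid))                  # the two values agree
```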


Stein-Hirschman operator interpolation theorem (setup)

Let S ≡ {z ∈ C : 0 < Re{z} < 1}, and let L(H) be the space of bounded
linear operators acting on H. Let G be an operator-valued function taking values
in L(H), bounded on the closure of S, holomorphic on S, and continuous on the
boundary ∂S. Let θ ∈ (0, 1) and define p_θ by

    1/p_θ = (1 - θ)/p_0 + θ/p_1,

where p_0, p_1 ∈ [1, ∞].


Stein-Hirschman operator interpolation theorem


(statement)
Then the following bound holds:

    log ‖G(θ)‖_{p_θ} ≤ ∫ dt ( α_θ(t) log[ ‖G(it)‖_{p_0}^{1-θ} ] + β_θ(t) log[ ‖G(1 + it)‖_{p_1}^{θ} ] ),

where

    α_θ(t) ≡ sin(πθ) / (2(1 - θ)[cosh(πt) - cos(πθ)]),

    β_θ(t) ≡ sin(πθ) / (2θ[cosh(πt) + cos(πθ)]),

    lim_{θ↘0} β_θ(t) = p(t).
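A quick numerical check (my own) that α_θ and β_θ are normalized probability densities
and that β_θ tends to p(t) as θ ↘ 0:

```python
# Quick numerical check (my own) of the normalization of the kernels and of the
# stated limit beta_theta -> p as theta -> 0.
import numpy as np
from scipy.integrate import quad

def alpha(theta, t):
    return np.sin(np.pi * theta) / (2 * (1 - theta) * (np.cosh(np.pi * t) - np.cos(np.pi * theta)))

def beta(theta, t):
    return np.sin(np.pi * theta) / (2 * theta * (np.cosh(np.pi * t) + np.cos(np.pi * theta)))

p = lambda t: (np.pi / 2) / (np.cosh(np.pi * t) + 1)

theta = 0.3
print(quad(lambda t: alpha(theta, t), -np.inf, np.inf)[0])   # 1.0
print(quad(lambda t: beta(theta, t), -np.inf, np.inf)[0])    # 1.0
print(beta(1e-6, 1.0), p(1.0))                               # nearly equal for small theta
```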


Proof of Recoverability Theorem


Tune parameters


Pick G(z) ≡ ([N(ρ)]^{z/2} [N(σ)]^{-z/2} ⊗ I_E) U σ^{z/2} ρ^{1/2},
p_0 = 2, p_1 = 1, θ ∈ (0, 1), so that p_θ = 2/(1 + θ).

Evaluate norms

    ‖G(it)‖_2 = ‖(N(ρ)^{it/2} N(σ)^{-it/2} ⊗ I_E) U σ^{it/2} ρ^{1/2}‖_2 = ‖ρ^{1/2}‖_2 = 1,

    ‖G(1 + it)‖_1 = [F(ρ, P^{t/2}_{σ,N}(N(ρ)))]^{1/2}.


Proof of Recoverability Theorem (ctd.)

Apply the Stein-Hirschman theorem

    log ‖([N(ρ)]^{θ/2} [N(σ)]^{-θ/2} ⊗ I_E) U σ^{θ/2} ρ^{1/2}‖_{2/(1+θ)}
        ≤ ∫ dt β_θ(t) log [F(ρ, (P^{t/2}_{σ,N} ∘ N)(ρ))]^{θ/2}.

Final step
Apply a minus sign, multiply both sides by 2/θ, and take the limit as
θ ↘ 0 to conclude.


SSA refinement as a special case


Let ρ_ABC be a density operator acting on a finite-dimensional Hilbert
space H_A ⊗ H_B ⊗ H_C. Then the following inequality holds:

    I(A;B|C) ≥ -∫ dt p(t) log F(ρ_ABC, P^{t/2}_{C→AC}(ρ_BC)),

where P^t_{C→AC} is the following rotated Petz recovery map:

    P^t_{C→AC}(·) ≡ (U_{ρ_AC,t} ∘ P_{C→AC} ∘ U_{ρ_C,-t})(·),

the Petz recovery map P_{C→AC} is defined as

    P_{C→AC}(·) ≡ ρ_AC^{1/2} (ρ_C^{-1/2} (·) ρ_C^{-1/2} ⊗ I_A) ρ_AC^{1/2},

and the partial isometric maps U_{ρ_AC,t} and U_{ρ_C,-t} are defined as before.

Conclusions

The result of [FR14] already had a number of important implications
in quantum information theory.

The new result in [Wil15, JSRWW15] applies to relative entropy
differences, has a brief proof, and yields a universal recovery map
(depending only on σ and N).

It is still conjectured that the recovery map can be the Petz recovery
map alone (not a rotated Petz map).


References I
[BLW14] Mario Berta, Marius Lemm, and Mark M. Wilde. Monotonicity of quantum
relative entropy and recoverability. December 2014. arXiv:1412.4067.
[BS03] Igor Bjelaković and Rainer Siegmund-Schultze. Quantum Stein's lemma
revisited, inequalities for quantum entropies, and a concavity theorem of Lieb. July
2003. arXiv:quant-ph/0307170.
[BSW14] Mario Berta, Kaushik Seshadreesan, and Mark M. Wilde. Rényi generalizations
of the conditional quantum mutual information. March 2014. arXiv:1403.6102.
[FR14] Omar Fawzi and Renato Renner. Quantum conditional mutual information and
approximate Markov chains. October 2014. arXiv:1410.0664.
[HJPW04] Patrick Hayden, Richard Jozsa, Dénes Petz, and Andreas Winter. Structure
of states which satisfy strong subadditivity of quantum entropy with equality.
Communications in Mathematical Physics, 246(2):359–374, April 2004.
arXiv:quant-ph/0304007.


References II
[HP91] Fumio Hiai and Dénes Petz. The proper formula for relative entropy and its
asymptotics in quantum probability. Communications in Mathematical Physics,
143(1):99–114, December 1991.
[JSRWW15] Marius Junge, David Sutter, Renato Renner, Mark M. Wilde, and Andreas
Winter. Universal recovery from a decrease of quantum relative entropy. September
2015. arXiv:1509.07127.
[Lin75] Göran Lindblad. Completely positive maps and entropy inequalities.
Communications in Mathematical Physics, 40(2):147–151, June 1975.
[L73] Elliott H. Lieb. Convex trace functions and the Wigner-Yanase-Dyson
conjecture. Advances in Mathematics, 11(3):267–288, December 1973.
[LR73] Elliott H. Lieb and Mary Beth Ruskai. Proof of the strong subadditivity of
quantum-mechanical entropy. Journal of Mathematical Physics, 14(12):1938–1941,
December 1973.
[LW14] Ke Li and Andreas Winter. Squashed entanglement, k-extendibility, quantum
Markov chains, and recovery maps. October 2014. arXiv:1410.4184.

References III
[NO00] Hiroshi Nagaoka and Tomohiro Ogawa. Strong converse and Stein's lemma in
quantum hypothesis testing. IEEE Transactions on Information Theory,
46(7):2428–2433, November 2000. arXiv:quant-ph/9906090.
[NP04] Michael A. Nielsen and Dénes Petz. A simple proof of the strong subadditivity
inequality. arXiv:quant-ph/0408130.
[Pet86] Dénes Petz. Sufficient subalgebras and the relative entropy of states of a von
Neumann algebra. Communications in Mathematical Physics, 105(1):123–131,
March 1986.
[Pet88] Dénes Petz. Sufficiency of channels over von Neumann algebras. Quarterly
Journal of Mathematics, 39(1):97–108, 1988.
[SBW14] Kaushik P. Seshadreesan, Mario Berta, and Mark M. Wilde. Rényi squashed
entanglement, discord, and relative entropy differences. October 2014.
arXiv:1410.1443.
[SFR15] David Sutter, Omar Fawzi, and Renato Renner. Universal recovery map for
approximate Markov chains. April 2015. arXiv:1504.07251.

References IV

[Uhl76] Armin Uhlmann. The transition probability in the state space of a *-algebra.
Reports on Mathematical Physics, 9(2):273–279, 1976.
[Uhl77] Armin Uhlmann. Relative entropy and the Wigner-Yanase-Dyson-Lieb concavity
in an interpolation theory. Communications in Mathematical Physics, 54(1):21–32,
1977.
[Ume62] Hisaharu Umegaki. Conditional expectations in an operator algebra IV (entropy
and information). Kodai Mathematical Seminar Reports, 14(2):59–85, 1962.
[Wil15] Mark M. Wilde. Recoverability in quantum information theory. May 2015.
arXiv:1505.04661.
[WL12] Andreas Winter and Ke Li. A stronger subadditivity relation?
http://www.maths.bris.ac.uk/csajw/stronger_subadditivity.pdf, 2012.

