
Optimized quantum f-divergences

Mark M. Wilde

Hearne Institute for Theoretical Physics,
Department of Physics and Astronomy,
Center for Computation and Technology,
Louisiana State University,
Baton Rouge, Louisiana, USA
mwilde@lsu.edu

Based on arXiv:1710.10252

American Mathematical Society Special Session on
“Mathematical Perspectives in Quantum Information Theory”



Main message

Quantum f-divergences have appeared in various areas of quantum information, including quantum error correction, state discrimination, entropy inequalities, etc.

Examples include the quantum relative entropy and the Petz–Rényi relative entropy

It was previously not known how to write the quantum fidelity or the sandwiched Rényi relative entropy as a quantum f-divergence

The optimized f-divergence proposed here represents a general way of doing so, and one of the main contributions here is that the optimized f-divergence obeys a data processing inequality

Benefit: a unified approach for proving the data processing inequality for both the Petz–Rényi and sandwiched Rényi relative entropies, for all of the parameter values for which it is known to hold



Background — quantum mechanics

Quantum states
The state of a quantum system is specified by a positive semidefinite
operator with trace equal to one, usually denoted by ρ, σ, τ, etc.

Quantum channels
Any physical process can be written as a quantum channel.
Mathematically, a quantum channel is a linear, completely positive, trace
preserving map, thus taking an input quantum state to an output
quantum state. Quantum channels are usually denoted by N, M, P, etc.

Isometric extensions of quantum channels


Every quantum channel has an isometric extension: There exists an
isometry taking an input density operator to the tensor-product Hilbert
space of output and environment. Channel is realized by applying isometry
and discarding environment
Motivation

Important to have metrics or measures for distinguishing quantum states:

1 Used to judge the performance of quantum computations or protocols

2 They lead to measures of information or correlation



Quantum relative entropy

Quantum relative entropy [Ume62]


The quantum relative entropy is a measure of dissimilarity between two
quantum states. Defined for state ρ and positive semi-definite σ as

D(ρ‖σ) ≡ Tr{ρ[log ρ − log σ]}

whenever supp(ρ) ⊆ supp(σ) and +∞ otherwise
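As a concrete illustration (mine, not from the slides), the following Python sketch evaluates this formula for a pair of qubit operators by eigendecomposition; the helper name rel_entropy, the example states, and the use of the natural logarithm are choices made here.

```python
# Minimal sketch: D(rho||sigma) = Tr{rho [log rho - log sigma]} for qubit examples.
# Natural logarithm; assumes supp(rho) ⊆ supp(sigma) (full-rank inputs here).
import numpy as np

def rel_entropy(rho, sigma):
    er, vr = np.linalg.eigh(rho)
    es, vs = np.linalg.eigh(sigma)
    log_rho = vr @ np.diag(np.log(er)) @ vr.conj().T
    log_sigma = vs @ np.diag(np.log(es)) @ vs.conj().T
    return np.real(np.trace(rho @ (log_rho - log_sigma)))

rho = np.array([[0.75, 0.25], [0.25, 0.25]])    # a valid qubit state
sigma = np.eye(2) / 2                           # maximally mixed state
print(rel_entropy(rho, sigma))                  # nonnegative
print(rel_entropy(sigma, sigma))                # ~ 0
```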



Fundamental law of quantum information theory

Data processing inequality [Lin75, Uhl77]


Let ρ be a state, let σ be positive semi-definite, and let N be a quantum
channel. Then
D(ρ‖σ) ≥ D(N(ρ)‖N(σ))
“Distinguishability does not increase under a physical process”
Characterizes a fundamental irreversibility in any physical process
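A small numerical sanity check of the inequality (mine, not from the talk), using a completely dephasing qubit channel as N; the channel choice, the random-state generator, and the tolerance are illustrative.

```python
# Minimal sketch: D(rho||sigma) >= D(N(rho)||N(sigma)) for the dephasing channel
# N(rho) = diag(rho), checked on random full-rank qubit states (natural log).
import numpy as np

def rel_entropy(rho, sigma):
    er, vr = np.linalg.eigh(rho)
    es, vs = np.linalg.eigh(sigma)
    return np.real(np.trace(rho @ (vr @ np.diag(np.log(er)) @ vr.conj().T
                                   - vs @ np.diag(np.log(es)) @ vs.conj().T)))

def dephase(rho):
    return np.diag(np.diag(rho).real)    # keep only diagonal entries: a quantum channel

rng = np.random.default_rng(0)
def random_state(d=2):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

rho, sigma = random_state(), random_state()
assert rel_entropy(rho, sigma) >= rel_entropy(dephase(rho), dephase(sigma)) - 1e-10
print("data processing holds on this example")
```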

Applications
1 Implies optimality of communication rates in QIT
2 Can prove instances of the 2nd law of thermodynamics
3 Can establish entropic quantum uncertainty relations



Quantum relative entropy (ctd.)

Quantum relative entropy is a parent quantity, from which we get entropy, conditional entropy, and mutual information

Operational meaning in quantum hypothesis testing as the optimal asymptotic Type II error exponent [HP91, ON00]

Classical reduction is

D(p‖q) ≡ ∑_x p(x) log[p(x)/q(x)]

for a probability distribution p and a positive measure q
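A two-line numerical illustration of the classical formula (illustrative only; natural logarithm assumed):

```python
# Minimal sketch: classical relative entropy D(p||q) = sum_x p(x) log[p(x)/q(x)].
import numpy as np

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.25, 0.25, 0.5])
print(np.sum(p * np.log(p / q)))   # >= 0 since q is also a probability distribution here
```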



Petz–Rényi relative entropy

There is interest in generalizing quantum relative entropy:

Petz–Rényi relative entropy [Pet85, Pet86]


Given a state ρ, a positive semi-definite operator σ, and α ∈ (0, 1) ∪ (1, ∞)

D_α(ρ‖σ) ≡ [α − 1]^{−1} log Tr{ρ^α σ^{1−α}}

Properties
lim_{α→1} D_α(ρ‖σ) = D(ρ‖σ)
Operationally relevant when α ∈ (0, 1) [Nag06, Hay07]
Obeys data processing for α ∈ (0, 1) ∪ (1, 2] [Pet85, Pet86]:

D_α(ρ‖σ) ≥ D_α(N(ρ)‖N(σ))
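The following sketch (not from the slides) evaluates D_α from the trace formula with SciPy's fractional matrix powers and illustrates the α → 1 limit; the function names and the natural logarithm are my choices.

```python
# Minimal sketch: D_alpha(rho||sigma) = log Tr{rho^alpha sigma^(1-alpha)} / (alpha - 1),
# with a check that it approaches D(rho||sigma) as alpha -> 1 (natural log).
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow, logm

def petz_renyi(rho, sigma, alpha):
    q = np.real(np.trace(mpow(rho, alpha) @ mpow(sigma, 1 - alpha)))
    return np.log(q) / (alpha - 1)

rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.6, -0.1], [-0.1, 0.4]])
for alpha in (0.5, 0.9, 0.99, 1.01, 1.5, 2.0):
    print(alpha, petz_renyi(rho, sigma, alpha))
print("alpha -> 1:", np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))
```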



Sandwiched Rényi relative entropy
Another generalization of quantum relative entropy is

Sandwiched Rényi relative entropy [MLDS+ 13, WWY14]


Given a state ρ, a positive semi-definite operator σ, and α ∈ (0, 1) ∪ (1, ∞)
D̃_α(ρ‖σ) ≡ [α − 1]^{−1} log Tr{(σ^{(1−α)/2α} ρ σ^{(1−α)/2α})^α}
          = α[α − 1]^{−1} log ‖σ^{(1−α)/2α} ρ σ^{(1−α)/2α}‖_α
          = α[α − 1]^{−1} log ‖ρ^{1/2} σ^{(1−α)/α} ρ^{1/2}‖_α
Properties
lim_{α→1} D̃_α(ρ‖σ) = D(ρ‖σ)
Operationally relevant when α ∈ (1, ∞) [MO15]
Obeys data processing for α ∈ [1/2, 1) ∪ (1, ∞] [FL13]: D̃_α(ρ‖σ) ≥ D̃_α(N(ρ)‖N(σ))
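A short numerical sketch of the first line of the definition (illustrative only; the name sandwiched_renyi and the natural logarithm are assumptions made here):

```python
# Minimal sketch: the sandwiched Renyi relative entropy straight from its definition.
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def sandwiched_renyi(rho, sigma, alpha):
    s = mpow(sigma, (1 - alpha) / (2 * alpha))
    inner = s @ rho @ s                      # sigma^((1-a)/2a) rho sigma^((1-a)/2a)
    return np.log(np.real(np.trace(mpow(inner, alpha)))) / (alpha - 1)

rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.6, -0.1], [-0.1, 0.4]])
for alpha in (0.5, 0.9, 1.1, 2.0):
    print(alpha, sandwiched_renyi(rho, sigma, alpha))
```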



Classical reduction of quantum Rényi relative entropies

Both the Petz- and sandwiched Rényi relative entropies reduce to the
following for the classical case:
D_α(p‖q) = [α − 1]^{−1} log ∑_x p(x)^α q(x)^{1−α}
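As a quick check (mine, not from the slides), for diagonal ρ and σ the two quantum formulas and the classical one coincide numerically; all names and the example values are illustrative.

```python
# Minimal sketch: for commuting (diagonal) rho and sigma, the Petz and sandwiched
# quantities both collapse to the classical Renyi formula above.
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

p, q, alpha = np.array([0.5, 0.3, 0.2]), np.array([0.2, 0.3, 0.5]), 1.7
classical = np.log(np.sum(p**alpha * q**(1 - alpha))) / (alpha - 1)

rho, sigma = np.diag(p), np.diag(q)
petz = np.log(np.trace(mpow(rho, alpha) @ mpow(sigma, 1 - alpha)).real) / (alpha - 1)
s = mpow(sigma, (1 - alpha) / (2 * alpha))
sandwiched = np.log(np.trace(mpow(s @ rho @ s, alpha)).real) / (alpha - 1)
print(classical, petz, sandwiched)   # all three agree
```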



Classical and quantum f-divergences

Classical f-divergence [Csi67, AS66, Mor63]

The f-divergence is another generalization of classical relative entropy:

D_f(p‖q) ≡ ∑_x p(x) f(q(x)/p(x))

If f is convex, then D_f obeys data processing

Quantum f-divergence [Pet85, Pet86]

Petz defined a quantum generalization of the f-divergence:

D_f(ρ‖σ) ≡ ⟨ϕ^ρ|_{SŜ} f(ρ_S^{−1} ⊗ σ_Ŝ^T)|ϕ^ρ⟩_{SŜ}

where |ϕ^ρ⟩_{SŜ} ≡ (ρ_S^{1/2} ⊗ I_Ŝ)|Γ⟩_{SŜ} and |Γ⟩_{SŜ} ≡ ∑_i |i⟩_S ⊗ |i⟩_Ŝ
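The sketch below (not from the talk) builds |ϕ^ρ⟩ and f(ρ^{−1} ⊗ σ^T) explicitly for qubit examples and confirms that f(x) = −log x reproduces the quantum relative entropy; the helpers herm_func and f_divergence are illustrative names, and the natural logarithm is assumed.

```python
# Minimal sketch: Petz's f-divergence evaluated directly from |phi^rho> and
# f(rho^{-1} ⊗ sigma^T), checked against D(rho||sigma) for f(x) = -log x.
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow, logm

def herm_func(A, f):                      # apply f to the eigenvalues of Hermitian A
    w, v = np.linalg.eigh(A)
    return v @ np.diag(f(w)) @ v.conj().T

def f_divergence(rho, sigma, f):
    d = rho.shape[0]
    gamma = np.eye(d).reshape(d * d)                   # |Gamma> = sum_i |i>|i>
    phi = np.kron(mpow(rho, 0.5), np.eye(d)) @ gamma   # |phi^rho>
    arg = np.kron(np.linalg.inv(rho), sigma.T)         # rho^{-1} ⊗ sigma^T
    return np.real(phi.conj() @ herm_func(arg, f) @ phi)

rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.6, -0.1], [-0.1, 0.4]])
print(f_divergence(rho, sigma, lambda x: -np.log(x)))       # f-divergence with f = -log
print(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))   # D(rho||sigma): should match
```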



Special cases of quantum f-divergence
Quantum relative entropy
We can use the following identities

(Z_S ⊗ I_Ŝ)|Γ⟩_{SŜ} = (I_S ⊗ Z_Ŝ^T)|Γ⟩_{SŜ}
⟨Γ|_{SŜ}(Z_S ⊗ I_Ŝ)|Γ⟩_{SŜ} = Tr{Z_S}

and pick f(x) = − log x to find that

D_f(ρ‖σ) = D(ρ‖σ)

Petz–Rényi quasi-entropy [Pet85, Pet86]


Pick f(x) = sgn(α − 1)x^{1−α} to get

D_f(ρ‖σ) = sgn(α − 1) Tr{ρ^α σ^{1−α}},

related to the Petz–Rényi relative entropies
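A numerical confirmation of this special case (mine, not from the slides), reusing the same vector construction as above; all names and the example α are illustrative.

```python
# Minimal sketch: f(x) = sgn(alpha-1) x^(1-alpha) in the vector formula reproduces
# sgn(alpha-1) Tr{rho^alpha sigma^(1-alpha)}.
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def herm_func(A, f):
    w, v = np.linalg.eigh(A)
    return v @ np.diag(f(w)) @ v.conj().T

rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.6, -0.1], [-0.1, 0.4]])
alpha, d = 1.5, 2
gamma = np.eye(d).reshape(d * d)
phi = np.kron(mpow(rho, 0.5), np.eye(d)) @ gamma
arg = np.kron(np.linalg.inv(rho), sigma.T)
lhs = np.real(phi.conj() @ herm_func(arg, lambda x: np.sign(alpha - 1) * x**(1 - alpha)) @ phi)
rhs = np.sign(alpha - 1) * np.real(np.trace(mpow(rho, alpha) @ mpow(sigma, 1 - alpha)))
print(lhs, rhs)   # agree
```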


Data processing for quantum f-divergence

Reminder: function f is operator anti-monotone if

A ≤ B ⇒ f(B) ≤ f(A) for Hermitian A and B

Petz proved that if f is operator anti-monotone on (0, ∞) (and thus operator convex), then D_f obeys data processing [Pet85, Pet86], i.e.,

D_f(ρ‖σ) ≥ D_f(N(ρ)‖N(σ))

Method relies on the operator Jensen inequality [HP03]:

f(V†XV) ≤ V†f(X)V

for operator convex f, an isometry V, and a Hermitian operator X.
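The following sketch (not from the slides) checks the operator Jensen inequality on random inputs; the choice f(x) = −log x, the QR-based isometry, and the helper herm_func are assumptions made here.

```python
# Minimal sketch: f(V† X V) <= V† f(X) V for operator convex f(x) = -log x,
# a random isometry V, and a random positive definite X.
import numpy as np

def herm_func(A, f):
    w, v = np.linalg.eigh(A)
    return v @ np.diag(f(w)) @ v.conj().T

rng = np.random.default_rng(1)
g = rng.normal(size=(4, 2)) + 1j * rng.normal(size=(4, 2))
V, _ = np.linalg.qr(g)                              # V† V = I (isometry)
m = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
X = m @ m.conj().T + 0.1 * np.eye(4)                # positive definite

f = lambda x: -np.log(x)
gap = V.conj().T @ herm_func(X, f) @ V - herm_func(V.conj().T @ X @ V, f)
print(np.linalg.eigvalsh(gap))                      # all >= 0 (up to numerics)
```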



Petz’s proof of data processing (setup)

Consider simple case of partial trace:

D_f(ρ_{AB}‖σ_{AB}) ≥ D_f(ρ_A‖σ_A)

Pick isometry V to be
V_{AÂ→AÂBB̂}|ψ⟩_{AÂ} = ρ_{AB}^{1/2}(ρ_A^{−1/2} ⊗ I_Â)|ψ⟩_{AÂ} ⊗ |Γ⟩_{BB̂}

Then the following equality holds:

V†(ρ_{AB}^{−1} ⊗ σ_{ÂB̂}^T)V = ρ_A^{−1} ⊗ σ_Â^T
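The sketch below (mine, not from the slides) builds this V explicitly for two qubits, with the output systems ordered as A ⊗ B ⊗ Â ⊗ B̂ for convenience, and verifies both V†V = I and the stated identity; the embedding E and all other names are illustrative.

```python
# Minimal sketch: construct V for the partial-trace channel on two qubits and
# verify V† V = I and V†(rho_AB^{-1} ⊗ sigma_AB^T)V = rho_A^{-1} ⊗ sigma_A^T.
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

rng = np.random.default_rng(2)
def random_state(d):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def partial_trace_B(X, dA=2, dB=2):
    return np.trace(X.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

dA = dB = 2
rho_AB, sigma_AB = random_state(dA * dB), random_state(dA * dB)
rho_A, sigma_A = partial_trace_B(rho_AB), partial_trace_B(sigma_AB)

# Embedding E: |a>_A |â>_Â  ->  sum_b |a>_A |b>_B |â>_Â |b>_B̂
E = sum(np.kron(np.kron(np.kron(np.eye(dA), eb.reshape(-1, 1)), np.eye(dA)),
                eb.reshape(-1, 1)) for eb in np.eye(dB))
V = (np.kron(mpow(rho_AB, 0.5), np.eye(dA * dB)) @ E
     @ np.kron(mpow(rho_A, -0.5), np.eye(dA)))

print(np.allclose(V.conj().T @ V, np.eye(dA * dA)))          # isometry
lhs = V.conj().T @ np.kron(np.linalg.inv(rho_AB), sigma_AB.T) @ V
rhs = np.kron(np.linalg.inv(rho_A), sigma_A.T)
print(np.allclose(lhs, rhs))                                  # identity holds
```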



Petz’s proof for data processing

Proof:

D_f(ρ_{AB}‖σ_{AB}) = ⟨ϕ^{ρ_{AB}}| f(ρ_{AB}^{−1} ⊗ σ_{ÂB̂}^T) |ϕ^{ρ_{AB}}⟩
                  = ⟨ϕ^{ρ_A}| V† f(ρ_{AB}^{−1} ⊗ σ_{ÂB̂}^T) V |ϕ^{ρ_A}⟩
                  ≥ ⟨ϕ^{ρ_A}| f(V†[ρ_{AB}^{−1} ⊗ σ_{ÂB̂}^T]V) |ϕ^{ρ_A}⟩
                  = ⟨ϕ^{ρ_A}| f(ρ_A^{−1} ⊗ σ_Â^T) |ϕ^{ρ_A}⟩
                  = D_f(ρ_A‖σ_A)

The second line uses V|ϕ^{ρ_A}⟩ = |ϕ^{ρ_{AB}}⟩ (by the choice of V); the inequality follows from the operator Jensen inequality

This gives data processing for the Petz–Rényi relative entropies, using operator anti-monotonicity of sgn(α − 1)x^{1−α} for α ∈ (0, 1) ∪ (1, 2].
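As a numerical spot check (not from the talk), here is the resulting monotonicity of the quasi-entropy under partial trace at α = 1.5; the random-state generator and tolerance are illustrative.

```python
# Minimal sketch: monotonicity of sgn(alpha-1) Tr{rho^alpha sigma^(1-alpha)} under
# the partial-trace channel, checked at alpha = 1.5 on random two-qubit states.
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

rng = np.random.default_rng(3)
def random_state(d):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def partial_trace_B(X, dA=2, dB=2):
    return np.trace(X.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

def quasi_entropy(rho, sigma, alpha):
    return np.sign(alpha - 1) * np.trace(mpow(rho, alpha) @ mpow(sigma, 1 - alpha)).real

alpha = 1.5
rho_AB, sigma_AB = random_state(4), random_state(4)
lhs = quasi_entropy(rho_AB, sigma_AB, alpha)
rhs = quasi_entropy(partial_trace_B(rho_AB), partial_trace_B(sigma_AB), alpha)
print(lhs >= rhs - 1e-12)   # data processing: True
```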



Optimized quantum f-divergence

Optimized f-divergence [Wil17]

Inspired by Petz, define the optimized f-divergence as

D̃_f(ρ‖σ) ≡ sup_{τ>0, Tr{τ}≤1} ⟨ϕ^ρ|_{SŜ} f(τ_S^{−1} ⊗ σ_Ŝ^T)|ϕ^ρ⟩_{SŜ}

Why? This definition still obeys data processing for operator anti-monotone f and captures other quantities previously not known to be f-divergences, including fidelity and sandwiched Rényi relative entropy
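To make the definition concrete, the sketch below (mine, not a method from the paper) evaluates the objective for randomly sampled candidates τ with Tr{τ} = 1 and keeps the best value, giving a crude lower bound on the supremum; a serious computation would optimize over τ properly, and all names are illustrative.

```python
# Minimal sketch: random-search lower bound on the optimized f-divergence,
# i.e. the best sampled value of <phi^rho| f(tau^{-1} ⊗ sigma^T) |phi^rho>.
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def herm_func(A, f):
    w, v = np.linalg.eigh(A)
    return v @ np.diag(f(w)) @ v.conj().T

def objective(rho, sigma, tau, f):
    d = rho.shape[0]
    phi = np.kron(mpow(rho, 0.5), np.eye(d)) @ np.eye(d).reshape(d * d)
    return np.real(phi.conj() @ herm_func(np.kron(np.linalg.inv(tau), sigma.T), f) @ phi)

rng = np.random.default_rng(4)
rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.6, -0.1], [-0.1, 0.4]])
alpha = 2.0
f = lambda x: np.sign(alpha - 1) * x**((1 - alpha) / alpha)

best = -np.inf
for _ in range(2000):
    g = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    tau = g @ g.conj().T
    tau = tau / np.trace(tau).real                  # candidates with Tr{tau} = 1
    best = max(best, objective(rho, sigma, tau, f))
print(best)   # lower-bounds the supremum defining the optimized f-divergence
```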



Sandwiched Rényi as optimized f-divergence

From Hölder duality, we have that [MLDS+ 13]

‖ρ^{1/2} σ^{(1−α)/α} ρ^{1/2}‖_α = sup_{τ>0, Tr{τ}≤1} Tr{ρ^{1/2} σ^{(1−α)/α} ρ^{1/2} τ^{(α−1)/α}}

for α > 1, and [MLDS+ 13]

‖ρ^{1/2} σ^{(1−α)/α} ρ^{1/2}‖_α = inf_{τ>0, Tr{τ}≤1} Tr{ρ^{1/2} σ^{(1−α)/α} ρ^{1/2} τ^{(α−1)/α}}

for α ∈ (0, 1). Then rewrite

sgn(α − 1) Tr{ρ^{1/2} σ^{(1−α)/α} ρ^{1/2} τ^{(α−1)/α}} = ⟨ϕ^ρ|_{SŜ} f(τ_S^{−1} ⊗ σ_Ŝ^T)|ϕ^ρ⟩_{SŜ}

for f(x) = sgn(α − 1)x^{(1−α)/α}. This is operator anti-monotone for α ∈ [1/2, 1) ∪ (1, ∞]
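For α > 1 the supremum is attained at τ = X^α / Tr{X^α} with X = ρ^{1/2} σ^{(1−α)/α} ρ^{1/2} (a standard Hölder-duality choice). The sketch below (mine, not from the slides) checks this numerically; the names and the example α are illustrative.

```python
# Minimal sketch: the Holder-duality optimizer tau* = X^alpha / Tr{X^alpha} attains
# ||X||_alpha, so the optimized f-divergence with f(x) = sgn(alpha-1) x^{(1-alpha)/alpha}
# recovers the sandwiched quantity (checked here for alpha = 2).
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.6, -0.1], [-0.1, 0.4]])
alpha = 2.0

X = mpow(rho, 0.5) @ mpow(sigma, (1 - alpha) / alpha) @ mpow(rho, 0.5)
alpha_norm = np.trace(mpow(X, alpha)).real ** (1 / alpha)        # ||X||_alpha

tau_star = mpow(X, alpha) / np.trace(mpow(X, alpha)).real        # Tr{tau*} = 1
attained = np.trace(X @ mpow(tau_star, (alpha - 1) / alpha)).real
print(alpha_norm, attained)                                       # agree for alpha > 1
```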



Data processing for optimized f-divergence [Wil17]

Data processing holds for f operator anti-monotone:

D̃_f(ρ_{AB}‖σ_{AB}) ≥ D̃_f(ρ_A‖σ_A)

The proof is only slightly different from Petz's. Let ω_A be an arbitrary density operator, and set τ_{AB} to be the following one:

τ_{AB} = ρ_{AB}^{1/2} ρ_A^{−1/2} ω_A ρ_A^{−1/2} ρ_{AB}^{1/2}

We have the following identity:

V†(τ_{AB}^{−1} ⊗ σ_{ÂB̂}^T)V = ω_A^{−1} ⊗ σ_Â^T
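The sketch below (not from the slides) verifies this identity numerically for two qubits, reusing the same explicit V and output ordering A ⊗ B ⊗ Â ⊗ B̂ as in the earlier sketch; all names are illustrative.

```python
# Minimal sketch: with tau_AB built from an arbitrary omega_A, check
# V†(tau_AB^{-1} ⊗ sigma_AB^T)V = omega_A^{-1} ⊗ sigma_A^T on two qubits.
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

rng = np.random.default_rng(5)
def random_state(d):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def partial_trace_B(X, dA=2, dB=2):
    return np.trace(X.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

dA = dB = 2
rho_AB, sigma_AB, omega_A = random_state(4), random_state(4), random_state(2)
rho_A, sigma_A = partial_trace_B(rho_AB), partial_trace_B(sigma_AB)

E = sum(np.kron(np.kron(np.kron(np.eye(dA), eb.reshape(-1, 1)), np.eye(dA)),
                eb.reshape(-1, 1)) for eb in np.eye(dB))
V = (np.kron(mpow(rho_AB, 0.5), np.eye(4)) @ E
     @ np.kron(mpow(rho_A, -0.5), np.eye(dA)))

tau_AB = (mpow(rho_AB, 0.5)
          @ np.kron(mpow(rho_A, -0.5) @ omega_A @ mpow(rho_A, -0.5), np.eye(dB))
          @ mpow(rho_AB, 0.5))
lhs = V.conj().T @ np.kron(np.linalg.inv(tau_AB), sigma_AB.T) @ V
rhs = np.kron(np.linalg.inv(omega_A), sigma_A.T)
print(np.allclose(lhs, rhs))   # True
```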



Data processing for optimized f-divergence (ctd.) [Wil17]

Proof:
⟨ϕ^{ρ_{AB}}| f(τ_{AB}^{−1} ⊗ σ_{ÂB̂}^T) |ϕ^{ρ_{AB}}⟩ = ⟨ϕ^{ρ_A}| V† f(τ_{AB}^{−1} ⊗ σ_{ÂB̂}^T) V |ϕ^{ρ_A}⟩
                                               ≥ ⟨ϕ^{ρ_A}| f(V†[τ_{AB}^{−1} ⊗ σ_{ÂB̂}^T]V) |ϕ^{ρ_A}⟩
                                               = ⟨ϕ^{ρ_A}| f(ω_A^{−1} ⊗ σ_Â^T) |ϕ^{ρ_A}⟩

Take a supremum over τ on the left-hand side to conclude that, for every density operator ω_A > 0 with Tr{ω_A} = 1,

D̃_f(ρ_{AB}‖σ_{AB}) ≥ ⟨ϕ^{ρ_A}| f(ω_A^{−1} ⊗ σ_Â^T)|ϕ^{ρ_A}⟩

This holds for an arbitrary density operator ω_A, and so

D̃_f(ρ_{AB}‖σ_{AB}) ≥ D̃_f(ρ_A‖σ_A)
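As a closing numerical spot check (mine, not from the talk), here is the resulting data processing inequality for the sandwiched Rényi relative entropy under partial trace, at several values of α; the random-state generator and tolerance are illustrative.

```python
# Minimal sketch: sandwiched Renyi data processing under partial trace,
# checked for several alpha in [1/2, 1) ∪ (1, ∞) on random two-qubit states.
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

rng = np.random.default_rng(6)
def random_state(d):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def partial_trace_B(X, dA=2, dB=2):
    return np.trace(X.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

def sandwiched(rho, sigma, alpha):
    s = mpow(sigma, (1 - alpha) / (2 * alpha))
    return np.log(np.trace(mpow(s @ rho @ s, alpha)).real) / (alpha - 1)

rho_AB, sigma_AB = random_state(4), random_state(4)
rho_A, sigma_A = partial_trace_B(rho_AB), partial_trace_B(sigma_AB)
for alpha in (0.5, 0.75, 1.5, 3.0):
    assert sandwiched(rho_AB, sigma_AB, alpha) >= sandwiched(rho_A, sigma_A, alpha) - 1e-10
print("data processing verified on this example")
```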



Conclusion

It was previously not known how to write the quantum fidelity or the sandwiched Rényi relative entropy as a quantum f-divergence

The optimized f-divergence proposed here represents a general way of doing so, and one of the main contributions here is that the optimized f-divergence obeys a data processing inequality

Benefit: a unified approach, based on the operator Jensen inequality, for proving the data processing inequality for both the Petz–Rényi and sandwiched Rényi relative entropies, for all of the parameter values for which it is known to hold



References I

[AS66] S. M. Ali and S. D. Silvey. A general class of coefficients of divergence of one distribution from another. Journal of the Royal Statistical Society, Series B (Methodological), 28(1):131–142, 1966.

[Csi67] Imre Csiszár. Information-type measures of difference of probability distributions and indirect observations. Studia Scientiarum Mathematicarum Hungarica, 2:299–318, 1967.

[FL13] Rupert L. Frank and Elliott H. Lieb. Monotonicity of a relative Rényi entropy. Journal of Mathematical Physics, 54(12):122201, December 2013. arXiv:1306.5358.
[Hay07] Masahito Hayashi. Error exponent in asymmetric quantum hypothesis
testing and its application to classical-quantum channel coding. Physical
Review A, 76(6):062301, December 2007. arXiv:quant-ph/0611013.

[HP91] Fumio Hiai and Dénes Petz. The proper formula for relative entropy and its
asymptotics in quantum probability. Communications in Mathematical
Physics, 143(1):99–114, December 1991.



References II

[HP03] Frank Hansen and Gert K. Pedersen. Jensen’s operator inequality. Bulletin
of the London Mathematical Society, 35(4):553–564, July 2003.
arXiv:math/0204049.

[Lin75] Göran Lindblad. Completely positive maps and entropy inequalities. Communications in Mathematical Physics, 40(2):147–151, June 1975.

[MLDS+ 13] Martin Müller-Lennert, Frédéric Dupuis, Oleg Szehr, Serge Fehr, and
Marco Tomamichel. On quantum Rényi entropies: a new definition and
some properties. Journal of Mathematical Physics, 54(12):122203,
December 2013. arXiv:1306.3142.
[MO15] Milán Mosonyi and Tomohiro Ogawa. Quantum hypothesis testing and the
operational interpretation of the quantum Rényi relative entropies.
Communications in Mathematical Physics, 334(3):1617–1648, March 2015.
arXiv:1309.3228.
[Mor63] Tetsuzo Morimoto. Markov processes and the h-theorem. Journal of the
Physical Society of Japan, 18(3):328–331, 1963.



References III

[Nag06] Hiroshi Nagaoka. The converse part of the theorem for quantum Hoeffding
bound. November 2006. arXiv:quant-ph/0611289.

[ON00] Tomohiro Ogawa and Hiroshi Nagaoka. Strong converse and Stein’s lemma
in quantum hypothesis testing. IEEE Transactions on Information Theory,
46(7):2428–2433, November 2000. arXiv:quant-ph/9906090.

[Pet85] Dénes Petz. Quasi-entropies for states of a von Neumann algebra. Publ.
RIMS, Kyoto University, 21:787–800, 1985.

[Pet86] Dénes Petz. Quasi-entropies for finite quantum systems. Reports in Mathematical Physics, 23:57–65, 1986.

[Uhl77] Armin Uhlmann. Relative entropy and the Wigner-Yanase-Dyson-Lieb concavity in an interpolation theory. Communications in Mathematical Physics, 54(1):21–32, 1977.

[Ume62] Hisaharu Umegaki. Conditional expectations in an operator algebra IV (entropy and information). Kodai Mathematical Seminar Reports, 14(2):59–85, 1962.



References IV

[Wil17] Mark M. Wilde. Optimized quantum f-divergences and data processing. October 2017. arXiv:1710.10252.
[WWY14] Mark M. Wilde, Andreas Winter, and Dong Yang. Strong converse for the
classical capacity of entanglement-breaking and Hadamard channels via a
sandwiched Rényi relative entropy. Communications in Mathematical
Physics, 331(2):593–622, October 2014. arXiv:1306.1586.

