Quantum channels from subfactors (PowerPoint presentation by Pieter Naaijkens)


SLIDE 1

Quantum channels from subfactors

Pieter Naaijkens, Universidad Complutense de Madrid, 21 March 2019

This work was funded by the ERC (grant agreement No 648913)

SLIDE 2

Three cultures

Thanks to Reinhard Werner for this characterisation.

Type Iₙ: everything finite dimensional (no infinite resources)
Type I∞: separable Hilbert space (e.g. a quantum particle on a line)
Type III: focus on the algebra of observables (particularly useful with infinitely many degrees of freedom)

SLIDE 3

Quantum information

use quantum systems to communicate
main question: how much information can I transmit?
will consider infinite systems here…
…described by subfactors
channel capacity is given by the Jones index

SLIDE 4

Outline

Classical information theory
Subfactors and QI

SLIDE 5

Classical information theory

SLIDE 6

Image source: Alfred Eisenstaedt/The LIFE Picture Collection

Information theory

Alice wants to communicate with Bob using a noisy channel. How much information can Alice send to Bob per use of the channel?

SLIDE 7

Setup

Alice (input space) → channel → Bob (output space)

How well can Bob recover the messages sent by Alice (small error allowed)?

SLIDE 8

Shannon entropy

Def: the Shannon entropy H(X) = −∑_x p(x) log p(x) is a measure for the information content of X.
Coding: represent tuples in Xⁿ by roughly 2^{nH(X)} codewords (asymptotically, the error goes to zero!)
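As a concrete illustration (not from the slides), the entropy can be computed directly from a probability vector; the distributions below are hypothetical examples.

```python
import numpy as np

def shannon_entropy(p):
    """H(X) = -sum_x p(x) log2 p(x), in bits; zero-probability terms contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# a fair coin carries exactly 1 bit per toss
print(shannon_entropy([0.5, 0.5]))   # -> 1.0
# a biased coin carries less (about 0.469 bits)
print(shannon_entropy([0.9, 0.1]))
```

By the source coding theorem, nH(X) bits per n-tuple suffice asymptotically, which is the "2^{nH(X)} codewords" statement above.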

SLIDE 9

Relative entropy

Compare two probability distributions P, Q: D(P‖Q) = ∑_x P(x) log(P(x)/Q(x)). Vanishes iff P = Q, otherwise positive.
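A minimal sketch of this definition (the convention that D is infinite when the support of P is not contained in that of Q is standard):

```python
import numpy as np

def relative_entropy(p, q):
    """D(P||Q) = sum_x P(x) log2(P(x)/Q(x)); infinite if supp(P) not in supp(Q)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    if np.any(q[mask] == 0):
        return float('inf')
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# vanishes iff P = Q, otherwise positive
print(relative_entropy([0.5, 0.5], [0.5, 0.5]))  # -> 0.0
print(relative_entropy([0.9, 0.1], [0.5, 0.5]))  # positive
```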

SLIDE 10

Mutual information

H(X|Y) measures the `information’ lost due to noise; here the conditional entropy is defined as H(X|Y) = H(X,Y) − H(Y). Some algebra gives: I(X;Y) = H(X) − H(X|Y) = H(X) + H(Y) − H(X,Y).
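The identity I(X;Y) = H(X) + H(Y) − H(X,Y) can be evaluated directly from a joint distribution; the two joint distributions below are hypothetical extremes.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of any (possibly multi-dimensional) distribution."""
    p = np.asarray(p, float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with marginals taken from the joint p(x,y)."""
    joint = np.asarray(joint, float)
    return entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0)) - entropy(joint)

# X = Y = fair bit: one full bit is shared
perfect = np.array([[0.5, 0.0], [0.0, 0.5]])
# X, Y independent fair bits: nothing is shared
indep = np.full((2, 2), 0.25)
print(mutual_information(perfect))  # -> 1.0
print(mutual_information(indep))    # -> 0.0
```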

SLIDE 11

Channel capacities

What is the maximum amount of information we can send through the channel? Def: the Shannon capacity of the channel is defined as C = max_{p(x)} I(X;Y), the maximum over input distributions.
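For the binary symmetric channel this maximisation can be done by brute force over input distributions, and the result matches the known closed form C = 1 − h₂(p); the grid search below is just an illustrative sketch.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_capacity(flip):
    """Capacity of a binary symmetric channel with flip probability `flip`,
    by maximising I(X;Y) = H(Y) - H(Y|X) over input distributions p(x).
    Closed form: 1 - h2(flip)."""
    best = 0.0
    for px in np.linspace(0.0, 1.0, 1001):
        py1 = px * (1 - flip) + (1 - px) * flip  # P(Y=1)
        hy_given_x = h2(flip)                    # same for either input symbol
        best = max(best, h2(py1) - hy_given_x)
    return best

print(bsc_capacity(0.1))  # matches 1 - h2(0.1), about 0.531 bits per use
```

The maximum is attained at the uniform input distribution, which the grid contains exactly.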

SLIDE 12

Operational approach

N messages → encode → n uses of the channel → decode. The figure of merit is the maximum error probability over all messages, optimised over all possible encodings.

SLIDE 13

Coding theorem

Def: R is called an achievable rate if messages can be transmitted at rate R (N ≈ 2^{nR}) with error probability tending to zero as n → ∞. The supremum of all achievable R is called the capacity C.

SLIDE 14

Quantum information

SLIDE 15

Quantum information

work mainly in the Heisenberg picture

observables modelled by a von Neumann algebra M

consider normal states on M; channels are normal unital CP maps; the classical relative entropy is replaced by the Araki relative entropy

SLIDE 16

Araki relative entropy

Let φ, ψ be faithful normal states on M, with vector representatives ξ_φ, ξ_ψ in a standard form.
Def: the relative modular operator Δ_{φ,ψ} is obtained from the polar decomposition of the closure of the map a ξ_ψ ↦ a* ξ_φ (a ∈ M).
Def: the Araki relative entropy is S(ψ‖φ) = −⟨ξ_ψ, log Δ_{φ,ψ} ξ_ψ⟩ (with one common sign convention); in finite dimensions this reduces to Tr ρ_ψ (log ρ_ψ − log ρ_φ).
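In finite dimensions the relative modular operator can be written down explicitly on Hilbert–Schmidt space, and Araki's formula reduces to the familiar trace expression. A sketch, assuming the standard identification Δ(x) = σ x ρ⁻¹ with cyclic vector ρ^{1/2}; the two density matrices are hypothetical example data.

```python
import numpy as np
from scipy.linalg import inv, logm, sqrtm

# two faithful (full-rank) qubit states
rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.6, 0.0], [0.0, 0.4]])

# finite-dimensional formula: S(rho || sigma) = Tr rho (log rho - log sigma)
S_trace = np.trace(rho @ (logm(rho) - logm(sigma))).real

# Araki's formula on Hilbert-Schmidt space: Delta(x) = sigma x rho^{-1},
# cyclic vector xi = rho^{1/2}, and S = -<xi, log(Delta) xi>.
# Using the identity vec_c(A X B) = (B^T kron A) vec_c(X) (column-major vec):
Delta = np.kron(inv(rho).T, sigma)
xi = sqrtm(rho).flatten(order='F')
S_araki = -np.vdot(xi, logm(Delta) @ xi).real

print(S_trace, S_araki)  # the two expressions agree
```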

SLIDE 17

Distinguishing states

Alice prepares a mixed state ρ_x with probability p_x and sends it to Bob. Can Bob recover the label x?

SLIDE 18

Holevo χ quantity

In general Bob cannot recover x exactly: his accessible information is bounded by the Holevo quantity χ({p_x}, {ρ_x}) = S(∑_x p_x ρ_x) − ∑_x p_x S(ρ_x), a generalisation of the Shannon mutual information.
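A quick numerical check of the bound (the two-state ensemble below is a hypothetical example): for non-orthogonal pure states, χ is strictly less than one bit, so the label cannot be recovered perfectly.

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy S(rho) = -Tr rho log2 rho, in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def holevo_chi(ps, rhos):
    """chi = S(sum_x p_x rho_x) - sum_x p_x S(rho_x)."""
    avg = sum(p * r for p, r in zip(ps, rhos))
    return vn_entropy(avg) - sum(p * vn_entropy(r) for p, r in zip(ps, rhos))

# ensemble of the non-orthogonal pure states |0> and |+>
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])
rplus = np.array([[0.5, 0.5], [0.5, 0.5]])
chi = holevo_chi([0.5, 0.5], [rho0, rplus])
print(chi)  # strictly between 0 and 1: about 0.6 bits accessible
```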

SLIDE 19

Infinite systems

Suppose M is an infinite factor, say of Type III, and ω a faithful normal state. In Type III there is no trace, hence no density matrix whose entropy we could take.

Better to compare algebras!

SLIDE 20

Comparing algebras

Want to compare M and N, with N ⊂ M a subfactor (Shirokov & Holevo, arXiv:1608.02203)

SLIDE 21

A quantum channel

For a finite index inclusion N ⊂ M, say irreducible, the conditional expectation E: M → N is a quantum channel; it describes the restriction of operations from M to N
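A finite-dimensional toy model of this mechanism (not a subfactor, since the diagonal algebra is not a factor, but it shows the idea): the conditional expectation from M₂(ℂ) onto its diagonal subalgebra is the pinching map, a unital completely positive map, i.e. a channel in the Heisenberg picture.

```python
import numpy as np

def cond_exp(x):
    """Conditional expectation M_2(C) -> diagonal subalgebra (pinching)."""
    return np.diag(np.diag(x))

I2 = np.eye(2)
x = np.array([[1.0, 2.0], [3.0, 4.0]])

# unital: E(1) = 1
assert np.allclose(cond_exp(I2), I2)
# idempotent, and the identity on the subalgebra
assert np.allclose(cond_exp(cond_exp(x)), cond_exp(x))
# completely positive: the Choi matrix sum_ij E(e_ij) kron e_ij is PSD
choi = sum(np.kron(cond_exp(np.outer(I2[:, i], I2[:, j])),
                   np.outer(I2[:, i], I2[:, j]))
           for i in range(2) for j in range(2))
assert np.all(np.linalg.eigvalsh(choi) >= -1e-12)
print("E is a unital CP projection onto the diagonal subalgebra")
```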

SLIDE 22

Jones index and entropy

Hiai's entropy formula: the maximal entropy attached to the inclusion N ⊂ M equals log [M : N]; this gives an information-theoretic interpretation to the Jones index

Hiai, J. Operator Theory, ’90; J. Math. Soc. Japan, ‘91

SLIDE 23

Quantum wiretapping

[Diagram: Alice sends to Bob, while an eavesdropper Eve has access to part of the channel]

SLIDE 24

Theorem (Devetak, Cai/Winter/Yeung). The optimal (private) rate of a wiretapping channel is given by

lim_{n→∞} (1/n) max_{{p_x, ρ_x}} [ χ({p_x}, {Φ_B^{⊗n}(ρ_x)}) − χ({p_x}, {Φ_E^{⊗n}(ρ_x)}) ]
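The expression inside the limit can be evaluated for a single channel use (n = 1) and a fixed ensemble, which gives a lower bound on the rate. A toy sketch, with hypothetical channels chosen so the answer is obvious: Bob receives the state unchanged while Eve sees only the fully dephased state.

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def chi(ps, rhos):
    """Holevo quantity of an ensemble {p_x, rho_x}."""
    avg = sum(p * r for p, r in zip(ps, rhos))
    return vn_entropy(avg) - sum(p * vn_entropy(r) for p, r in zip(ps, rhos))

# toy wiretap channel: Bob gets everything, Eve only the diagonal
phi_B = lambda r: r
phi_E = lambda r: np.diag(np.diag(r))

# signal ensemble: |+> and |-> with equal probability
plus  = np.array([[0.5,  0.5], [ 0.5, 0.5]])
minus = np.array([[0.5, -0.5], [-0.5, 0.5]])
ps, rhos = [0.5, 0.5], [plus, minus]

private_info = chi(ps, [phi_B(r) for r in rhos]) - chi(ps, [phi_E(r) for r in rhos])
print(private_info)  # close to 1.0: one private bit per use
```

Eve's dephased copies of |±⟩ are both maximally mixed, so χ_E = 0 and the whole bit Alice encodes in the ± basis stays private.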

SLIDE 25

A conjecture

The Jones index [M : N] of a subfactor N ⊂ M gives the classical capacity of the wiretapping channel that restricts operations from M to N.

L. Fiedler, PN, T.J. Osborne, New J. Phys. 19:023039 (2017)

PN, Contemp. Math. 717, pp. 257-279 (2018), arXiv:1704.05562

SLIDE 26

Some remarks

use the entropy formula by Hiai together with properties of the index
averaging drops out: single-letter formula
a coding theorem is still missing