
The .pdf file can be found here

An external link to the paper is here.

The DOI is 10.1038/nphys1734.

See the review article in Nature for more information. For the more mathematically inclined, see the supplementary information. Also see information on the BB84 protocol here.

We start with the uncertainty relation we know and love:

$\Delta R \, \Delta S \ge \frac{1}{2} \left| \langle \psi | [R, S] | \psi \rangle \right|$

There are two problems with this. (1) Remember how you always have to compute commutators acting on $|\psi\rangle$? Unless we have something pretty like $[x, p] = i\hbar$, the right-hand side will depend on our choice of $|\psi\rangle$. If $|\psi\rangle$ happens to be an eigenstate of one of the operators, then both sides of the inequality are zero and there is no lower uncertainty bound for the other operator. (See the first page of the PRL, Maassen & Uffink.) (2) Have you tried logarithms? That is, we would like to recast it in terms of entropy and information theory.
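To see problem (1) concretely, here is a minimal sketch (my own, not from the paper) checking that both sides of the Robertson relation vanish when $|\psi\rangle$ is an eigenstate of one of the operators. The helper names `matmul` and `expect` are made up for illustration:

```python
# Sketch: for A = sigma_z, B = sigma_x and psi = |0> (an eigenstate of A),
# both sides of  dA*dB >= (1/2)|<psi|[A,B]|psi>|  are zero.

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expect(M, psi):
    # <psi|M|psi> for a 2-component state vector psi
    Mpsi = [sum(M[i][j] * psi[j] for j in range(2)) for i in range(2)]
    return sum(psi[i].conjugate() * Mpsi[i] for i in range(2))

sz = [[1, 0], [0, -1]]
sx = [[0, 1], [1, 0]]
psi = [1, 0]  # |0>, an eigenstate of sigma_z

# variance of A in state psi; exactly 0 for an eigenstate of A
var_z = expect(matmul(sz, sz), psi) - expect(sz, psi) ** 2

# commutator [sigma_z, sigma_x]
ZX = matmul(sz, sx)
XZ = matmul(sx, sz)
comm = [[ZX[i][j] - XZ[i][j] for j in range(2)] for i in range(2)]
rhs = 0.5 * abs(expect(comm, psi))

print(var_z, rhs)  # both 0: the relation gives no information here
```

So the bound is trivially satisfied and tells us nothing about the spread of the other observable.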

It's not completely trivial to transform things to logarithms (hence the PRL linked to above), but the result obtained in the late '80s was Eq. (1) in this week's paper:

$H(R) + H(S) \ge \log_2 \frac{1}{c}$

where $c = \max_{j,k} |\langle \psi_j | \phi_k \rangle|^2$, the largest overlap between eigenstates, is a measure of the complementarity of $R$ and $S$.
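As a numerical sanity check (my own sketch, not from the paper): for qubit measurements in the $\sigma_z$ and $\sigma_x$ bases the overlap constant is $c = 1/2$, so the entropic relation demands at least one bit of total uncertainty for any state. The helper name `H` and the test angle are arbitrary choices:

```python
# Sketch: check H(R) + H(S) >= log2(1/c) for qubit measurements
# R = sigma_z, S = sigma_x, where c = max |<r_j|s_k>|^2 = 1/2.
from math import log2, cos, sin, pi

def H(probs):
    # Shannon entropy in bits
    return -sum(p * log2(p) for p in probs if p > 0)

theta = pi / 7          # arbitrary test state |psi> = cos(t)|0> + sin(t)|1>
a, b = cos(theta), sin(theta)

p_z = [a**2, b**2]                      # outcome probs in the sigma_z basis
p_x = [(a + b)**2 / 2, (a - b)**2 / 2]  # outcome probs in the sigma_x basis

c = 0.5
print(H(p_z) + H(p_x) >= log2(1 / c))  # True: at least 1 bit in total
```

Unlike the Robertson bound, the right-hand side here depends only on the measurement bases, not on the state.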

What is the problem with this one? The authors claim that with a “quantum memory”, one can magically make the left-hand side zero. (See the answer to Joe's question below.) They propose adding another term to the right-hand side which accounts for entanglement. Their version is given in Eq. (2):

$H(R|B) + H(S|B) \ge \log_2 \frac{1}{c} + H(A|B)$

They give the following examples in the text. The key to deciphering these is to remember that the $H(A|B)$ term will be negative if the particles are entangled, positive otherwise.

- Maximal entanglement yields $H(A|B) = -\log_2 d$, which (for whatever reason) the $\log_2 \frac{1}{c}$ term can never exceed, so the right-hand side drops to zero or below.
- If the particle is not entangled at all, we have $H(A|B) \ge 0$. Since measurements without a quantum memory will be no less uncertain than those with one (i.e. $H(R) \ge H(R|B)$), we recover Eq. (1).
- If there is no quantum memory, we just get something even stronger than Eq. (1), since $H(A) > 0$ for mixed states:

$H(R) + H(S) \ge \log_2 \frac{1}{c} + H(A)$
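Numerically, the entanglement bookkeeping uses the conditional entropy $H(A|B) = S(\rho_{AB}) - S(\rho_B)$. A tiny sketch (my own, assuming a Bell pair; the function name `S` is made up) showing it going negative:

```python
# Sketch: H(A|B) = S(rho_AB) - S(rho_B) for a maximally entangled Bell pair.
# The joint state is pure (S = 0) while each reduced state is maximally
# mixed (S = 1 bit), so the conditional entropy is negative.
from math import log2

def S(eigvals):
    # von Neumann entropy (in bits) from density-matrix eigenvalues
    return -sum(l * log2(l) for l in eigvals if l > 0)

S_AB = S([1.0])        # pure Bell state: single eigenvalue 1
S_B  = S([0.5, 0.5])   # reduced state is I/2
H_A_given_B = S_AB - S_B
print(H_A_given_B)  # -1.0: negative conditional entropy signals entanglement
```

Classically, conditioning can never make entropy negative, so $H(A|B) < 0$ is a distinctly quantum signature.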

- The *Shannon entropy* of a probability distribution for possible outcomes indexed by $x$ is given by: $H(X) = -\sum_x p(x) \log_2 p(x)$
- The *von Neumann entropy* is just the quantum analog, making use of the density matrix: $S(\rho) = -\mathrm{Tr}(\rho \log_2 \rho)$
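A quick sketch of both definitions in code (the function names `shannon` and `von_neumann` are my own; entropies are in bits, i.e. base-2 logs, matching the paper's convention):

```python
# Sketch of both entropy definitions, in bits.
from math import log2

def shannon(p):
    # H(X) = -sum_x p(x) log2 p(x); terms with p(x) = 0 contribute nothing
    return -sum(px * log2(px) for px in p if px > 0)

def von_neumann(eigvals):
    # S(rho) = -Tr(rho log2 rho) = -sum_i l_i log2 l_i over eigenvalues of rho
    return -sum(l * log2(l) for l in eigvals if l > 0)

print(shannon([0.5, 0.5]))       # 1.0 bit for a fair coin
print(von_neumann([0.5, 0.5]))   # 1.0 bit for the maximally mixed qubit I/2
```

For a density matrix that is diagonal in the measurement basis, the two definitions coincide, which is why the entropic relations smoothly generalize the classical ones.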

Brownies to all who come prepared. (I can't guarantee that they will be as glorious as those of the Joe.) –

Q: What is the difference between a quantum and classical memory?

A: A classical memory would simply be knowledge of the density matrix. i.e. I send a particle to Alice and I remember how I made it. If she tells me which measurement she made on it, then my uncertainty as to her result is described by the traditional uncertainty relation. –

A quantum memory invokes “spooky” action at a distance. Meaning, I entangle my particle EPR-style before I send it to Alice. Then, after she tells me which measurement she performs, I can perform the same type of measurement and get a result which is perfectly (anti-)correlated with hers. –
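A toy sampler (my own illustration, not a quantum simulation) of that perfect same-basis anti-correlation for a singlet pair. Note that the perfect correlations alone can be mimicked classically, as this sketch does; distinguishing quantum from classical requires comparing multiple bases:

```python
# Toy model: for a singlet pair measured in the SAME basis, Alice's outcome
# is uniformly random and Bob's is always the opposite.
import random

def measure_singlet_same_basis(rng):
    alice = rng.randint(0, 1)   # Alice's outcome: uniformly random bit
    bob = 1 - alice             # Bob's same-basis outcome: perfectly opposite
    return alice, bob

rng = random.Random(0)
pairs = [measure_singlet_same_basis(rng) for _ in range(1000)]
print(all(a != b for a, b in pairs))  # True: perfect anti-correlation
```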

Qs: What do they mean by “quantum memory,” or specifically, what is $B$ in Eq. (2)? Is it a state? What is the “dimension of the particle” $d$ (in the last paragraph of the first page)? –

$B$ is a state in the sense that when we have a quantum memory, we have to treat the entangled state as a whole. One might think of a wave packet that is somehow distantly entangled to its evil twin brother far away. The $[x, p]$ commutation relations would still hold for each particle individually. However, when we talk about information and entropy, we can't forget about the evil twin. So their notation $H(R|B)$ says, “Given the existence of $B$, what's my entropy?”

The original uncertainty principle applies to each dimension individually as long as they are not coupled (i.e. $[x_i, p_j] = i\hbar\,\delta_{ij}$). When we switch to talking about information and entropy, we have to consider all dimensions involved. Presumably, this is to satisfy some theoretical possibility of information moving between dimensions. $d$ is just the number of degrees of freedom. –

----

Back to papers

groups/journal_club/2010fall/berta_the_uncertainty_principle.txt · Last modified: 2010/09/20 14:45 by geppert