
Possible Duplicate:
Information conservation during quantum measurement

I asked a version of the following question previously on Physics.stackexchange, where it didn't get a lot of attention. I thought it might be worth asking it here in case this community has anything to add in the few days before the site is shut down. I hope it's OK to do that.

I'm interested in $\psi$-epistemic interpretations of quantum mechanics, which see the wavefunction (or density matrix) of a system as representing our limited knowledge about some underlying physical state, rather than the physical state itself. The collapse of the wavefunction then becomes very closely analogous to the "collapse" of a classical probability distribution when some new information becomes available.

But there seems to be a slight problem, in that quantum measurement doesn't seem to obey the conservation of (microscopic) information in quite the way it should. If we don't measure a system then its unitary dynamics conserve von Neumann entropy, just as Hamiltonian evolution of a classical system conserves Shannon entropy. In the classical case this can be interpreted as "no process can create or destroy information on the microscopic level." But if I take a spin-$\frac{1}{2}$ particle and make a $\sigma_x$ measurement (measure the spin in the $x$ direction), then make a $\sigma_y$ measurement, then another $\sigma_x$ one, then $\sigma_y$, and so on for $n$ measurements, then I will end up with a string of completely random bits, of length $n$. These bits of information seem to have appeared from nowhere.
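To make the setup concrete, here is a quick numerical sketch of the alternating-measurement experiment (the helper names are mine, not standard). After a $\sigma_x$ measurement the state is a $\sigma_x$ eigenstate, which is an equal-weight superposition of the $\sigma_y$ eigenstates, so every subsequent measurement is a fair coin flip:

```python
import numpy as np

rng = np.random.default_rng(0)

# Columns are normalized eigenvectors of sigma_x and sigma_y (outcomes +1/-1).
x_basis = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
y_basis = np.array([[1, 1], [1j, -1j]]) / np.sqrt(2)

def measure(state, basis):
    """Projective measurement: return (outcome index, collapsed state)."""
    probs = np.abs(basis.conj().T @ state) ** 2
    k = rng.choice(2, p=probs)
    return k, basis[:, k]

state = np.array([1.0, 0.0])  # start in sigma_z "up"; any pure state works
bits = []
for i in range(16):
    basis = x_basis if i % 2 == 0 else y_basis
    k, state = measure(state, basis)
    bits.append(int(k))

print(bits)  # a string of independent fair coin flips
```

Each outcome here has probability exactly $\frac{1}{2}$ regardless of the history, which is the sense in which the $n$ bits "appear from nowhere".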

It's clear that there are some interpretations for which this isn't a problem. In particular, for the Everett interpretation the Universe just ends up in a superposition of $2^n$ states, each containing an observer looking at a different output string. But what I'm really interested in is whether this "extra" information can be explained in a $\psi$-epistemic interpretation.

Further details

The following is an expanded version of the above, in an attempt to make it clearer. I want to start by talking about the classical case, because only then can I make it clear where the analogy seems to break down. Let's consider a classical system that can take on one of $N$ discrete states (microstates), where I use $N$ to avoid a clash with the $n$ measurements above. Since I don't initially know which state the system is in, I model the system with a probability distribution.

The system evolves over time. We model this by taking the vector $p$ of probabilities and multiplying it by a matrix $T$ at each time step, i.e. $p_{t+1} = Tp_t$. The discrete analogue of Hamiltonian dynamics turns out to be the assumption that $T$ is a permutation matrix, i.e. it has exactly one 1 in each row and column, and all its other entries are 0. (Note that permutation matrices are a subset of unitary matrices.) It turns out that, under this assumption, the Gibbs entropy (aka Shannon entropy) $H(p)$ does not change over time.
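This conservation is easy to check numerically. A minimal sketch (function and variable names are mine): a permutation matrix merely relabels the states, so $H(p) = -\sum_i p_i \log_2 p_i$ is the same at every time step:

```python
import numpy as np

def shannon_entropy(p):
    """H(p) in bits; zero-probability states contribute nothing."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A 3-state permutation matrix: exactly one 1 in each row and column.
T = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)

p0 = np.array([0.5, 0.3, 0.2])
p = p0.copy()
for _ in range(5):
    p = T @ p  # p_{t+1} = T p_t
    # Entropy is exactly conserved under permutation dynamics.
    assert np.isclose(shannon_entropy(p), shannon_entropy(p0))

print(shannon_entropy(p))
```

A general stochastic matrix $T$ would not conserve $H(p)$; the permutation assumption is what makes the dynamics reversible, mirroring Hamiltonian flow.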

(It's also worth mentioning, as an aside, that instead of representing $p$ as a vector, I could choose to represent it as a diagonal matrix $P$, with $P_{ii}=p_i$. It then looks a lot like the density matrix formalism, with $P$ playing the role of $\rho$ and $T$ being equivalent to unitary evolution.)
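The aside can be made concrete in a couple of lines (again a sketch with my own notation): the diagonal matrix $P = \mathrm{diag}(p)$ evolves as $P \to TPT^{\top}$, mirroring $\rho \to U\rho U^{\dagger}$, and its diagonal tracks $p \to Tp$ exactly:

```python
import numpy as np

T = np.array([[0, 1], [1, 0]], dtype=float)  # 2-state swap permutation
p = np.array([0.7, 0.3])
P = np.diag(p)  # diagonal "density matrix" with P_ii = p_i

P_next = T @ P @ T.T  # conjugation, the analogue of U rho U^dagger
print(np.diag(P_next))  # [0.3, 0.7], i.e. the same as T @ p
```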

Now let's say I make a measurement of the system. We'll assume that I don't disturb the system when I do this. For example, let's say the system has two states, and that initially I have no idea which of them the system is in, so $p=(\frac{1}{2},\frac{1}{2})$. After my measurement I know what state the system is in, so $p$ will become either $(1,0)$ or $(0,1)$ with equal probability. I have gained one bit of information about the system, and $H(p)$ has reduced by one bit. In the classical case these will always be equal, unless the system interacts with some other system whose state I don't precisely know (such as, for example, a heat bath).
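The classical bookkeeping in this example can be written out explicitly (a sketch, with my own helper name): one bit of information gained, one bit of entropy lost, and the two always balance:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_before = [0.5, 0.5]   # no idea which of the two states we're in
p_after = [1.0, 0.0]    # whichever outcome we see, H drops to 0

info_gained = H(p_before)                 # expected surprise of the outcome
entropy_drop = H(p_before) - H(p_after)

print(info_gained, entropy_drop)  # both equal 1.0
```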

If entropy just represents a lack of information about a system then of course it should decrease when we get some information. But there's a problem when we try to interpret the von Neumann entropy in the same way. In the experiment described in the first part of this question, I'm getting one bit of information with every measurement, but the von Neumann entropy is remaining constant (at zero) instead of decreasing by one bit each time. In the classical case, "the total information I have gained about the system" + "uncertainty I have about the system" is constant, whereas in the quantum case it can increase. This is disturbing, and I suppose what I really want to know is whether there has been anything written about $\psi$-epistemic interpretations in which this "extra" information is accounted for somehow (e.g. perhaps it could come from thermal degrees of freedom in the measuring apparatus), or in which it can be shown that something other than the von Neumann entropy plays a role analogous to the Gibbs entropy.
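The disanalogy can also be checked directly (a sketch; the function name is mine). After a $\sigma_x$ measurement the state is pure, so its von Neumann entropy $S(\rho) = -\mathrm{Tr}(\rho \log_2 \rho)$ is exactly zero, and yet the next $\sigma_y$ measurement is still a fair coin flip — one bit gained, with no compensating entropy decrease:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) in bits, via the eigenvalues of the density matrix."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerically-zero eigenvalues
    return -np.sum(evals * np.log2(evals)) + 0.0  # +0.0 normalizes -0.0

x_plus = np.array([1, 1]) / np.sqrt(2)     # sigma_x eigenstate after a measurement
rho = np.outer(x_plus, x_plus.conj())      # pure-state density matrix

print(von_neumann_entropy(rho))            # 0 (to numerical precision)

# Yet the sigma_y outcome probabilities are still 50/50:
y_plus = np.array([1, 1j]) / np.sqrt(2)
p_plus = np.abs(np.vdot(y_plus, x_plus)) ** 2
print(p_plus)                              # 0.5
```

So the quantum ledger reads: $n$ bits of measurement record acquired, zero change in $S(\rho)$ — which is exactly the imbalance the question is about.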

N. Virgo
  • Any chance of an explanation for the downvote? – N. Virgo May 01 '12 at 11:49
  • I didn't like that 1) the same question was copied from one website where it may have been marginally OK to another one and that you didn't seem to look for an answer to a question but selectively for a confirmation of biases at Physics SE; 2) any-letter-ontic or epistemic interpretations - and the very idea that one may discuss dozens of "interpretations" of QM is a philosophers' misconception, not a proper science: up to differences in equivalent wording, QM only has one interpretation, like every physical theory; – Luboš Motl May 01 '12 at 12:35
  • you are trying to hide a completely trivial point - your dissatisfaction with the fact that according to QM, the results of measurements are random - into an excessively complicated, contrived, and would-be clever terminology. For those reasons, I don't believe that the question will lead to anything valuable such as insightful answers and that's why I downvoted it. Did it help? The measurement is an event during which we're learning about some observables, so of course that the wave function or density matrix discontinuously changes and all of their functions do. What's the problem? – Luboš Motl May 01 '12 at 12:38
  • IMHO it's a bit silly to downvote on the basis that you disagree with the field of research that I'm asking about - but I appreciate the explanation, thanks. – N. Virgo May 01 '12 at 12:41
  • Dear @Nathaniel, I don't think it's silly at all. Imagine that someone posted a question on this server that even you consider to be a bad science – e.g. a question about the phlogiston, i.e. heat taking the form of a new mysterious liquid. Would you agree that it's OK to downvote questions on phlogiston? Or creationism or perpetual motion machines or other things? For me, the case of these not-so-hidden-variables is fully analogous so I downvote it in the same way. At most, your comments show that some obviously invalid remarks about QM (attempting to deny randomness) are indeed invalid. – Luboš Motl May 01 '12 at 12:45
  • I forgot to say that quite generally, you seem to have ignored the important answers you have gotten at Physics SE. It seems to me that the people who wrote these answers were kind of wasting their time. You were explained that the measurement changes the density matrix etc. so the von Neumann entropy is discontinuously changing during the measurement. It's totally clear and it answers all these questions. Here, you have repeated the very same question, pretending that you haven't heard anything. I think that you are just trying to force the people to confirm your misconceptions. – Luboš Motl May 01 '12 at 12:50
  • Also, please note that the problem isn't the discontinuous change in the density matrix. That happens in the classical version too. The difference is that in the classical version the entropy decreases by one bit for every bit of information we learn, whereas in the quantum version it may decrease by less than this, or not decrease at all. – N. Virgo May 01 '12 at 12:51
  • BTW this discontinuity may occur in classical physics as well. I throw dice. The probability distribution tells me that each number 1,2,3,4,5,6 is equally likely - some distribution on the phase space is a generalization. But when I see 4, the uncertainty drops to zero. The phase space probability distribution is also changing discontinuously when we perceive a result. – Luboš Motl May 01 '12 at 12:51
  • The counting of the information only looks simple in the classical case because you are not treating the information and/or entropy carried by the macroscopic apparatus or observer carefully enough. In any sensible classical limit, the entropy of warm macroscopic objects is really infinite (as a number of bits). There's no reason why this accounting should be "clean"; statistical physics is a clever way to discuss statistical or "likely" properties of systems with many configurations; it is something else than the fundamental laws of physics. – Luboš Motl May 01 '12 at 12:56
  • And whether a system should be described by a pure state - whose von Neumann entropy is zero - or as a density matrix - whose von Neumann entropy is nonzero - isn't an objectively answerable question. The state vector or density matrix describe the state of observer's knowledge about the physical system, not a "classical field" storing some objectively agreed upon classical observables. So anything that an observer does with the statistics of the density matrix etc. is "subjective", too. At any rate, why you want to encapsulate these discussions in a fringe psi-epistemic theory is beyond me. – Luboš Motl May 01 '12 at 12:59
  • Even in the continuous classical limit, the "accounting" can be made clean by talking about Kullback-Leibler divergences instead of entropies. I avoided talking about that in the question because in quantum mechanics it's common enough to talk about systems (such as qubits) whose classical analogues have only a finite number of states. – N. Virgo May 01 '12 at 13:00
  • I know and agree with all the technical points you're making. The reason I'm interested in putting these things in the context of $\psi$-epistemic theories is because of what you said: "statistical physics ... is something else than the fundamental laws of physics." But if quantum mechanics is the fundamental laws of physics then why is it so closely analogous to statistical mechanics? Perhaps it's because QM isn't the fundamental laws of physics either. This idea might be wrong, but there are plenty of recent papers on it. – N. Virgo May 01 '12 at 13:08
  • Quantum mechanics is a fundamental layer of the laws of physics. Much like classical statistical physics, it uses the concept of probability in an essential way, but unlike classical statistical physics, there is no "non-statistical or deterministic classical physics" that underlies it (quantum mechanics): QM is not classical. Both in classical stat. mech. and QM, one may discuss the "information" or "entropy" but the precise way how and when to include small changes of information such as 1 bit depend on conventions. The growth of entropy after many steps is however convention-independent. – Luboš Motl May 01 '12 at 15:13
  • "This idea might be wrong, but there are plenty of recent papers on it." - I would phrase the very same two pieces of the information in exactly the opposite way. "People may write lots of papers about it but it's wrong." This subtle difference in phrasing these two simple pieces of information reflects our main difference. I primarily care what is right and what is wrong about scientific statements; you don't care about that too much - the priority is that one may write papers because the paper is ready to suffer anything. – Luboš Motl May 01 '12 at 15:17
  • Of course I care about what's right and wrong in scientific statements. I just happen to disagree with you. If you want that to change, perhaps you could elaborate on the reasoning behind your position? I'm not aware of any argument that says $\psi$ cannot be epistemic, unless you also demand that the underlying physics be both local and forward-causal. (The point about papers was just an attempt to claim that this is a relevant question for this site, since the subject is, demonstrably, a current research topic in theoretical physics.) – N. Virgo May 01 '12 at 15:59
  • May be interesting: http://mathoverflow.net/questions/95537/psi-epistemic-theories-in-3-or-more-dimensions – Alex 'qubeat' May 04 '12 at 11:42