Uncertainty Principle – Heisenberg’s Uncertainty Principle in Different Frames of Reference

Tags: bose-einstein-condensate, heisenberg-uncertainty-principle, low-temperature-physics, reference-frames, wavefunction

Heisenberg's uncertainty principle

\begin{align}\Delta x \Delta p \geq \frac{\hbar}{2},\end{align}

states that two canonically conjugate variables can't be measured simultaneously with arbitrary precision.
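As a numerical illustration of the bound (a sketch in units with $\hbar = 1$, using a Gaussian wavepacket, which is known to saturate the inequality), one can compute $\Delta x$ from $|\psi(x)|^2$ and $\Delta p$ from the Fourier transform of $\psi$:

```python
import numpy as np

# Numerical sanity check (in units with hbar = 1): a Gaussian wavepacket
# saturates the lower bound, Delta_x * Delta_p = hbar / 2.
hbar = 1.0
sigma = 0.7                       # width of the Gaussian (arbitrary choice)
N, L = 4096, 40.0                 # grid points and box length, L >> sigma
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

# Normalized Gaussian wavefunction centered at x = 0
psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-(x**2) / (4 * sigma**2))

# Position uncertainty from the probability density |psi(x)|^2
prob_x = np.abs(psi) ** 2
mean_x = np.sum(x * prob_x) * dx
delta_x = np.sqrt(np.sum((x - mean_x) ** 2 * prob_x) * dx)

# Momentum-space wavefunction via FFT; p = hbar * k
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
dk = 2 * np.pi / L
phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)
prob_p = np.abs(phi) ** 2
p = hbar * k
mean_p = np.sum(p * prob_p) * dk
delta_p = np.sqrt(np.sum((p - mean_p) ** 2 * prob_p) * dk)

print(delta_x * delta_p)  # ~0.5, i.e. hbar / 2
```

Making $\sigma$ smaller shrinks $\Delta x$ but widens the momentum distribution accordingly, so the product stays pinned at $\hbar/2$.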

But velocity and momentum are quantities that depend on the frame of reference of the measurement apparatus. Without resorting to relativity, speaking only of "classical" quantum mechanics, to which frame of reference does the measurement of $p$ refer?

I'll provide a concrete example: Bose-Einstein condensates (BECs). I heard this from Professor Claude Cohen-Tannoudji himself in a lecture about BECs: as the temperature of the gas decreases, the momentum of the atoms decreases, and thus the uncertainty in the momentum also decreases. Due to the uncertainty principle, the uncertainty in the position has to increase. Atoms grow bigger as they are cooled down.

So, in an atomic BEC the gas temperature is so low, and the atoms get so large, that their atomic wavefunctions start to overlap and begin to oscillate in phase. The condensate then behaves as if it were a single huge atom, its wavefunctions oscillating coherently, allowing for applications such as atom lasers (see also here).

My question is: if someone in the lab where the BEC experiment is being performed sees an enlarged atom because its momentum is very small, does an observer travelling at constant speed $v_\textrm{obs}$ relative to the lab's frame of reference also see the same enlarged atom? For the latter observer, isn't the momentum of the atom larger than the one the former observer sees?

EDIT: I asked a related question on Matter Modeling: In which theoretical framework does the size of an atom depend on the temperature of the gas (Bose-Einstein condensates)?

EDIT2: To provide further insight into the question, I quote an excerpt from this article in Nature by James Anglin and Wolfgang Ketterle:

As long as the atoms’ de Broglie wavelength $\lambda_{\textrm{dB}} = \hbar / (2 M k_{\textrm{B}} T)^{1/2}$ is small compared to the spacing between atoms, one can describe their motion with classical trajectories. ($\lambda_{\textrm{dB}}$ is the position uncertainty associated with the thermal momentum distribution, and increases with decreasing temperature $T$ and atomic mass $M$.) Quantum degeneracy begins when $\lambda_{\textrm{dB}}$ and the interatomic distance become comparable. The atomic wave packets overlap, and the gas starts to become a ‘quantum soup’ of indistinguishable particles. (emphasis mine)
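The quoted criterion can be put into numbers with a short sketch. The figures below are assumptions for illustration only: Rb-87 atoms at a trap density of $10^{19}\,\mathrm{m^{-3}}$, and the wavelength written exactly as in the excerpt, $\lambda_{\textrm{dB}} = \hbar/(2 M k_{\textrm{B}} T)^{1/2}$ (other conventions for the thermal de Broglie wavelength differ by factors of order $\sqrt{\pi}$):

```python
import numpy as np

hbar = 1.054571817e-34        # reduced Planck constant, J s
k_B = 1.380649e-23            # Boltzmann constant, J / K
M = 87 * 1.66053906660e-27    # mass of a Rb-87 atom in kg (illustrative choice)

def lambda_dB(T):
    """de Broglie wavelength as written in the quoted excerpt:
    lambda = hbar / sqrt(2 M k_B T)."""
    return hbar / np.sqrt(2 * M * k_B * T)

n = 1e19           # atoms per m^3, an assumed trapped-gas density
d = n ** (-1 / 3)  # mean interatomic spacing

for T in (1e-3, 1e-6, 100e-9):   # 1 mK, 1 uK, 100 nK
    print(f"T = {T:9.2e} K  lambda_dB = {lambda_dB(T):.2e} m  "
          f"lambda/d = {lambda_dB(T) / d:.3f}")
```

With these assumed numbers, $\lambda_{\textrm{dB}}$ grows from a few nanometres at 1 mK to roughly 170 nm at 100 nK, at which point it is comparable to the interatomic spacing $d \approx 460$ nm, i.e. the gas is approaching quantum degeneracy.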

Best Answer

As for your specific question about uncertainties, let's just do Galilean relativity: the observer at rest w.r.t. the BEC has the momentum operator $p_0$, another observer has the momentum operator $p_v$, and the difference $c = p_v - p_0$ is a constant (the velocity of the observer times the mass of the system under consideration). We have \begin{align} \sigma_{p_v} & = \sqrt{\langle p_v^2\rangle - \langle p_v\rangle^2} = \sqrt{\langle (p_0 + c)^2\rangle - \langle p_0 + c\rangle^2} \\ & = \sqrt{\langle p_0^2\rangle + 2\langle p_0\rangle c + c^2 - \langle p_0\rangle^2 - 2\langle p_0\rangle c - c^2} \\ & = \sqrt{\langle p_0^2\rangle - \langle p_0\rangle^2 } = \sigma_{p_0}, \end{align} that is, adding a constant to a random variable does not change its standard deviation. So at least in the Galilean case the position and momentum uncertainties of any given state do not depend on the observer, and if you think of temperature as a measure of the momentum uncertainty of a substance, all Galilean observers agree on the temperature of a system.
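The same point can be checked numerically in a couple of lines (a sketch with assumed values for the mass and the boost speed): shifting every momentum sample by the constant $c = Mv$ moves the mean but leaves the standard deviation untouched.

```python
import numpy as np

# Boosting to a frame moving at speed v shifts every momentum sample
# by a constant c = M * v, which changes the mean but not the spread.
rng = np.random.default_rng(0)

M = 1.0          # mass (arbitrary units, assumed)
v = 3.7          # relative speed of the second observer (assumed)
c = M * v        # constant momentum offset between the two frames

# Lab-frame momentum samples drawn from some thermal-like distribution
p_lab = rng.normal(loc=0.0, scale=2.0, size=100_000)
p_boosted = p_lab + c

print(p_lab.std(), p_boosted.std())      # identical up to rounding
print(p_boosted.mean() - p_lab.mean())   # shifted by exactly c
```

This is just the algebraic identity above in sampled form: the cross terms in the variance cancel, so $\sigma_{p_v} = \sigma_{p_0}$.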

In general the interaction of thermodynamics with special relativity is a bit controversial, see this question about the relativity of temperature in general and whether different observers might agree or disagree about the temperature of a system.
