Some plasmas have a refractive index of less than 1, so the phase velocity of light in them can exceed the speed of light. The phase itself carries no information, so there is no paradox there. But suppose I construct a tube filled with such a plasma, with a torch at one end and a photon sensor at the other. When I see a beam of red light travelling past me, I light up the torch, and someone at the other end learns that red light is coming before the red light itself arrives. Isn't that faster-than-light signalling?
[Physics] Faster than light in plasma
faster-than-light, optics, refraction, speed-of-light
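For concreteness, the premise of the question can be sketched numerically. This is a minimal Python sketch with made-up frequencies, assuming the standard cold-plasma refractive index $n = \sqrt{1 - \omega_p^2/\omega^2}$:

```python
import math

C = 3.0e8  # speed of light, m/s

def plasma_refractive_index(omega, omega_p):
    """Cold, collisionless plasma: n = sqrt(1 - (omega_p / omega)^2)."""
    return math.sqrt(1.0 - (omega_p / omega) ** 2)

# Illustrative (made-up) numbers: wave frequency at twice the plasma frequency.
omega_p = 2 * math.pi * 1.0e9   # plasma frequency, rad/s
omega = 2 * omega_p             # wave frequency, rad/s

n = plasma_refractive_index(omega, omega_p)
v_phase = C / n   # exceeds c, since n < 1
v_group = C * n   # stays below c (for this dispersion, v_phase * v_group = c^2)

print(n, v_phase > C, v_group < C)
```

So the phase velocity does exceed $c$ here, while the group velocity, which is what actually carries the pulse, stays below $c$.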
Related Solutions
The front velocity, defined as the propagation speed of the point where the field first differs (by any arbitrarily small amount) from exactly zero, is always no greater than $c$. (In fact, the front velocity is always exactly equal to $c$, no greater or less.)
The problem here is that a Gaussian pulse extends infinitely in both directions, so it simply does not have a "front" to speak of. Of course the amplitude decays super-exponentially on both sides, but that doesn't matter. There is no causality problem with the pulse emerging arbitrarily early in time, because your input pulse made the field start changing long before that.
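The point that a Gaussian pulse has no "front" can be made numerically: however far ahead of the peak you look, the amplitude is tiny but strictly nonzero. A minimal sketch (arbitrary unit width):

```python
import math

def gaussian(t, sigma=1.0):
    """Gaussian envelope centered at t = 0 with width sigma."""
    return math.exp(-t**2 / (2.0 * sigma**2))

# Far ahead of the peak the amplitude decays super-exponentially,
# but it never reaches exactly zero -- there is no "front".
for t in (-3.0, -10.0, -20.0):
    print(t, gaussian(t))
```

The field is already (infinitesimally) nonzero at arbitrarily early times, so nothing that arrives "early" at the detector ever outruns the moment the field first started changing.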
As a thought experiment, let's imagine we have a button we can push to start the input pulse going. If any trace of the output pulse comes out before a signal at $c$ has a chance to propagate from when and where the button was pushed, that's a causality violation. But that is impossible, for the following reason:
Since a theoretically perfect Gaussian pulse has no finite start time, but rather a nonzero amplitude at arbitrarily early times, it's impossible to create such a perfect Gaussian pulse by pushing a button. Of course, you can get arbitrarily close to perfection, but there will always be some distortion, and it gets worse the "faster" you try to make the pulse fire, that is, the shorter you make the separation in time between the button push and the maximum of the pulse. The theorem that the medium responds causally to the field guarantees that the response to this distortion always interferes with the response to the perfect Gaussian so as to cancel exactly at times earlier than a signal travelling at $c$ could arrive.
Of course, in a real experiment, the "button push" (actually some electronic signal to the machine that creates the pulse) happens a relatively long time before any trace of the output pulse is detected.
Here's a great book chapter I just found about this: http://books.google.com/books?id=kE8OUCvt7ecC&pg=PA26 It has lots more math than my answer (though not too high a level) and might clear up a lot of things.
You also ask about single photons, but that opens up a huge can of worms I can't really get into (not least because I don't understand it well enough myself). Let me just say that there is always a minimum amount of noise in any mode / degree of freedom of the electromagnetic field, which is equivalent to half a photon. You could make a pulse of the right shape that's so weak there's only a single photon in it (I'm sure people do things like that all the time), but the problem is if you try to "announce" too early that you've detected that photon, you'll be wrong such a large fraction of the time that you can show statistically that no information is being transmitted. It's really difficult to do quantum-limited measurement in the first place (because you have to severely limit the back-reaction of the measurement apparatus on the field you're measuring), and if you try to do it too fast it becomes literally impossible.
Looks like you are already familiar with the classical explanation but are still curious about the quantum version of it.
2. Phase difference between absorbed and emitted light
Yeah, this is essentially the lowest-order contribution to the phase shift in photon-electron scattering. Here is a sloppy way to visualize it continuously (this is basically the 'classical EM wave scattering' point of view): you can imagine that the "kinetic energy" (hence frequency) of the "photon" increases as it approaches the atom's potential well, and then returns to its normal value upon leaving the atom. This translates to a net phase increase of $(n-1)\omega/c$ per unit length.
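That $(n-1)\omega/c$ per unit length adds up quickly over macroscopic distances. A minimal sketch with hypothetical numbers (red light through a millimetre of glass, $n \approx 1.5$):

```python
import math

C = 3.0e8  # speed of light, m/s

def extra_phase(n, omega, length):
    """Phase picked up over `length` relative to vacuum: (n - 1) * omega * L / c."""
    return (n - 1.0) * omega * length / C

# Hypothetical numbers: red light (~473 THz) through 1 mm of a medium with n = 1.5.
omega = 2.0 * math.pi * 473e12  # rad/s

print(extra_phase(1.5, omega, 1e-3))  # thousands of radians of extra phase
```

In vacuum ($n = 1$) the extra phase is exactly zero; in a plasma with $n < 1$ it would come out negative, which is just the phase advance behind the faster-than-$c$ phase velocity.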
- "Drift velocity" of photons (they aren't the same photons; they are re-emitted all the time)
By "drift velocity" do you mean a pinball-like, zigzag motion of the photon? That won't contribute much, because it requires additional scattering (it is basically a higher-order process).
And also, I still don't really understand the details of the absorption-emission process.
Yes, the absorption still occurs over the whole frequency range. The Hamiltonian of the atom is modified by the field (by $-\mathbf{p} \cdot \mathbf{E}$, where $\mathbf{p}$ is the dipole moment of the atom and $\mathbf{E}$ is the electric-field component of the light). This supplies the required energy level to absorb the photon momentarily, and the photon is then re-emitted by stimulated and spontaneous emission.
edit: clarification, the term 'energy level' is misleading, since the temporarily 'excited' atom is not in an actual energy eigenstate.
See the diagram here: http://en.wikipedia.org/wiki/Raman_scattering
Best Answer
One way to look at that:
The envelope of the red light pulse travels at the group velocity, less than $c$.
The fast oscillations within the main pulse travel at the phase velocity, which here is faster than $c$.
In the end, the moment the torch "lights up" at the far end is set by the group velocity, because what you receive is the envelope of the pulse.
This animation showing a train of pulses is pretty clear. You can see that each pulse goes slowly, but the fast oscillations go fast.