October 11, 2011

Relativity-2: The CERN-Gran Sasso experiment

(For previous posts in this series, see here.)

The nice feature about the experiments involved in the recent reports of faster than light neutrinos is that the basic ideas are so simple that anyone can understand them. It involved producing neutrinos at the CERN laboratory in Switzerland and detecting them at the Gran Sasso laboratory in Italy. By measuring the distance between the two locations and the time taken for the trip, one could calculate the speed of the neutrinos by dividing the distance by the time.

The measured distance was about 730 km so if we take that as the exact value, and if the neutrinos were traveling at exactly the speed of light (299,792 km/s), the time taken would be 2.435022 milliseconds (where a millisecond is one-thousandth of a second) or equivalently 2,435,022 nanoseconds (where a nanosecond is one-billionth of a second). What the experimenters found was that the actual time taken was 60 nanoseconds less than this time, which seemed to require the neutrinos to be traveling slightly faster than the speed of light. Since the existence of faster than light particles has never been confirmed before, this would be a major discovery and so the search is now underway to see if this conclusion holds up under close scrutiny.
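As a quick sanity check, the arithmetic above can be reproduced in a few lines of Python (the figures are the approximate values quoted in the text, not the experiment's precise values):

```python
# Light travel time over the CERN-Gran Sasso baseline,
# using the approximate figures quoted in the text.
distance_km = 730.0        # approximate baseline distance
c_km_per_s = 299_792.0     # speed of light in km/s

time_ns = distance_km / c_km_per_s * 1e9       # travel time in nanoseconds
print(f"light travel time: {time_ns:,.0f} ns")  # roughly 2,435,022 ns

early_ns = 60.0                                 # reported early arrival
fraction = early_ns / time_ns
print(f"60 ns is {fraction:.4%} of the trip")   # about 0.0025%
```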

If the experimental results are at fault and the effect is spurious, the error must lie in the distance measurement and/or the time measurement. Although the time difference that produced the effect is very small (60 nanoseconds out of a total travel time of over 2 million nanoseconds is only about 0.0025% of the total time), the experimenters say their time measurements are accurate to within 10 nanoseconds, much smaller than the 60-nanosecond error needed to resolve the discrepancy, thus ruling that out as the source of error. Similarly, if the actual distance were less than the measured distance by just 18 meters, the effect would again go away. The experimenters used GPS technology to measure the space and time coordinates of the events and say that they can measure distances to an accuracy of 0.2 meters, making that too an unlikely source of error. As for the possibility of random statistical fluctuations causing the effect, the number of neutrino measurements they have taken over the past two years exceeds 16,000, which makes that highly unlikely as the source of error.
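The 18-meter figure follows directly from the 60-nanosecond discrepancy, since light covers about 0.3 meters per nanosecond; a one-line check:

```python
# Distance light covers in the 60 ns discrepancy.
c_m_per_ns = 299_792_458 / 1e9    # speed of light, ~0.2998 m per nanosecond
equivalent_m = 60.0 * c_m_per_ns
print(f"{equivalent_m:.1f} m")    # about 18.0 m
```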

So why is there still skepticism? It is because the very feature of neutrinos that makes this experiment so conceptually simple is also what makes it so difficult to rule out what are called systematic errors. These are artifacts of the experimental setup that can bias the results consistently in one particular direction, unlike random errors that can go either way and can be reduced by repeating the experiment a large number of times, as was done in this case. Unearthing systematic errors is difficult and time consuming because it depends on the esoteric details of the experimental set-up. What some groups will now try to do is identify possible sources of systematic errors that the original experimenters did not consider, while others will repeat the experiment with different experimental set-ups, measuring the time and distance using different techniques so that the likelihood of systematic biases pushing the results in the same direction is reduced. Yet other groups will examine whether any of the side effects that would automatically accompany faster than light travel are also seen. It is this kind of investigation for replicability and consistency that characterizes science.

But getting back to the original experiment, the reason that neutrinos are good for measuring velocities that may exceed the speed of light is that they usually travel at speeds close to or at the speed of light. If a particle has zero mass (as is the case with 'photons', the name given to particles of light), then according to Einstein's theory of relativity, it must travel exactly at the speed of light. If it has a mass, however small, it can approach the speed of light but never attain it because to do so would require an infinite amount of energy. But it takes less energy to accelerate lighter particles to high speeds than it does heavier particles.

In the case of neutrinos, we have not yet been able to detect a non-zero mass directly. All we have been able to do so far is put a small upper limit on the mass a neutrino can have, which is 2 eV/c², or about 3.6×10⁻³⁶ kg. (By comparison, the lightest particle with a measured mass, the electron, has a relatively huge mass of 511,000 eV/c².) It had long been assumed that the mass of the neutrino was exactly zero. But it turns out that there are three kinds of neutrinos and that they may oscillate from one kind to another as they travel through space, and the postulated mechanism for such oscillations requires that they have non-zero mass. The purpose of the CERN-Gran Sasso experiment was to actually look for such oscillations, and it just so happened that it turned up the evidence that neutrinos may be traveling faster than light, completely shifting the focus of attention. Such accidental discoveries when looking for something else are not uncommon in science, the discovery of X-rays being one of the more famous examples.
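To see how a mass quoted in eV/c² translates into kilograms, one simply converts the energy to joules and divides by c²; a short sketch:

```python
# Converting a mass quoted in eV/c^2 into kilograms: m = E / c^2.
EV_IN_JOULES = 1.602176634e-19   # one electron-volt in joules
C = 2.99792458e8                 # speed of light in m/s

def ev_per_c2_to_kg(mass_ev: float) -> float:
    """Convert a mass in eV/c^2 to kilograms."""
    return mass_ev * EV_IN_JOULES / C**2

print(ev_per_c2_to_kg(2.0))        # neutrino upper bound: ~3.6e-36 kg
print(ev_per_c2_to_kg(511_000.0))  # electron mass: ~9.1e-31 kg
```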

Next: The elusive neutrino



I'm wondering how distance and time measurements fit within the 3-sigma threshold that I've heard about.

To be honest, I don't really understand what the 3-sigma thing is, beyond the fact that it represents a certain amount of deviation from an expected result (or something along those lines).

Posted by peter on October 11, 2011 10:48 AM

Sigma is the symbol used for standard deviation of the statistical variation of a result about its mean value. Assuming that the errors are random and normally distributed, it is about 32% likely that the "true" value of a measurement lies outside the range given by the mean plus or minus one sigma.

If the range encompasses two sigmas, then the likelihood of lying outside that range drops to about 5%. For three sigma it drops to about 0.27%, for four sigma it drops to about 0.006%, and so on.

This experiment is a six-sigma one, since the required error to nullify the result is 60 nanoseconds, while the standard deviation is 10 nanoseconds (as far as I understand the results from media reports), so the chances of random errors causing this result are minuscule, of the order of 0.0000002%.
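These tail probabilities follow from the normal distribution's complementary error function, which Python's standard library provides, so anyone can check the numbers:

```python
# Two-sided probability of a normally distributed error
# falling outside +/- n standard deviations.
import math

def outside_prob(n_sigma: float) -> float:
    return math.erfc(n_sigma / math.sqrt(2))

for n in (1, 2, 3, 4, 6):
    print(f"{n} sigma: {outside_prob(n):.2e}")
# 1 sigma ~ 0.32, 2 sigma ~ 0.05, 3 sigma ~ 0.0027,
# 4 sigma ~ 6.3e-5, 6 sigma ~ 2e-9
```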

Posted by Mano on October 11, 2011 11:05 AM

Great series, Professor!

Here's what I'm wondering... the speed of light constant ("c") is the speed of light in a VACUUM. Particles can move faster than light in certain mediums (see "Cerenkov radiation"), though they never exceed c in those mediums.

I'm guessing the neutrinos being measured are not in a perfect vacuum.

So, if I'm understanding things, the neutrinos are not only exceeding c, they're doing so while possibly not even in a vacuum!

Posted by healthphysicist on October 11, 2011 11:49 AM

Cerenkov radiation is one source of corroborative evidence that people will look for.

If the result holds up, several questions will be investigated: whether the speed of light is really the upper bound, whether light itself has some mass so that its speed falls slightly below that bound, and whether the vacuum is really a vacuum as far as light travel is concerned.

Posted by Mano on October 11, 2011 11:57 AM