Entries in "Relativity"
November 18, 2011
Relativity-14: Revised OPERA experiment finds same result
The OPERA experiment that caused such a flurry of interest with its reports of faster-than-light neutrinos has been repeated in a way that addresses one of the criticisms, and the team finds that the neutrinos still seem to be traveling faster than the speed of light. You can read the paper on the revised experiment here. (For previous posts in this topic, see here.)
In the earlier experiments, the neutrinos were sent in clusters that spanned 10 microseconds, much longer than the 60-nanosecond time difference that signaled the faster-than-light effect, and thus the experimenters had to do some fancy statistical analyses to extract the time of flight of each neutrino. Some skeptics had suggested that those statistical analyses were flawed. The new experiment uses clusters that last only 3 nanoseconds, thus ruling out that particular source of systematic error.
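For readers who like to check the numbers, the reported 60-nanosecond lead corresponds to only a tiny fractional speed excess. Here is a rough back-of-the-envelope sketch (the ~730 km CERN-to-Gran Sasso baseline is an approximate figure, not taken from the paper itself):

```python
# Rough check of the speed excess implied by a 60 ns early arrival
# over the (approximate) 730 km CERN-to-Gran Sasso baseline.
c = 299_792_458.0          # speed of light in vacuum, m/s
baseline = 730e3           # approximate flight distance, m
light_time = baseline / c  # time light would take: about 2.4 milliseconds
lead = 60e-9               # reported early arrival: 60 nanoseconds

fractional_excess = lead / light_time
print(f"light travel time: {light_time * 1e3:.2f} ms")
print(f"(v - c)/c is roughly {fractional_excess:.1e}")
```

The result is a few parts in 100,000, which is why such exquisite care over clock synchronization and distance measurement is needed to see the effect at all.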
The other potential sources of error will take longer to check out.
November 10, 2011
Relativity-13: Some concluding thoughts
(For previous posts in this series, see here.)
A lot of things need to happen before the extraordinary claims of faster-than-light neutrinos are accepted as true. As Carl Sagan once said, "Extraordinary claims require extraordinary evidence." The required evidence needs to take many forms: the results should be consistent and reproducible, corroborating evidence will have to be found, consistency with other phenomena will have to be established, and alternative explanations for the phenomenon based on traditional physics will have to be ruled out. All this is going to take some time.
But if the result seems to hold up, even then it is not usually the case that scientists completely discard a highly respected old theory and start from scratch. While a few bolder scientists may take this opportunity to try and create a completely new theory, the majority of them usually seek to find minimal changes in the existing theory that would accommodate the new result.
As physicist Heinrich Päs says:
Even if true, this result neither proves Einstein wrong nor implies that causality has to be violated and time travel is possible. Things can move faster than the speed of light without violating Einstein if either the speed of light is not the limiting velocity as one can observe it for light propagation in media such as, for example, water. This can be modeled with background fields in the vacuum as has been proposed by [Indiana University physicist] Alan Kostelecky.
Or spacetime could be warped in a way so that neutrinos can take a shortcut without really being faster than the speed of light. As our three space plus one time dimensions look pretty flat, this would require an extra dimension (as proposed by [University of Hawaii at Manoa physicist] Sandip Pakvasa, [Vanderbilt University physicist] Tom Weiler and myself).
It was Einstein who suggested in 1905 that there is a limiting speed in nature and that this is the speed of light in a vacuum. I have already discussed in connection with Cherenkov radiation that when traveling in a medium such as water or glass or even air, the speed of light is reduced and it is possible to have other particles travel at speeds greater than light in that medium.
So one possible explanation for the OPERA neutrino results is to decouple the speed of light from the limiting speed. Perhaps what we call the vacuum has properties that slow down light from this potentially larger limiting value, and this new upper limit is what should appear in the theory of relativity. If so, then having neutrinos travel faster than the speed of light in the vacuum would simply mean that neutrinos are slowed down less than light by the vacuum, similar to what happens in other media like the Sun or water or glass. This would require some additional adjustments to theory. Einstein said that the limiting speed must be an invariant for all observers, and he equated this limiting speed to the speed of light because doing so overcame some problems of consistency with Maxwell's electromagnetic theory. Decoupling those two speeds may require us to refine Maxwell's laws as well, at the very minimum. As is well known, there is no free lunch in science. You cannot make changes in one scientific theory without having to make adjustments in other theories so that they all fit together again.
This series has tried to explain why the proper scientific response to reports of a major discovery is skepticism. This should not be equated with dogmatic obstructionism because in the case of dogma, one starts with a belief that cannot be changed whatever the evidence. Skepticism, on the other hand, is merely resistance that can be overcome with sufficient evidence and reason.
Major theories in science are rarely overthrown on the basis of a single experimental result, though textbooks sometimes tend to give that erroneous image of scientific progress. Usually what happens when a surprising result crops up is that a few people start to look at it closely to see if the results can be replicated by other people in different contexts, and if the ancillary consequences of the new result are also seen.
If none of these pans out, then the original result is deemed to be due to an error (usually a subtle one in the case of careful scientists) or to some factor that was overlooked in the data collection or analysis. The latter is often referred to as a systematic error and is more common because it is hard to be sure that you have accounted for all the possible factors that can influence an experiment, especially if you are working at the frontiers of knowledge, pushing the limits. Sometimes, as in the case of cold fusion, an adequate explanation of the phenomenon within the standard framework is not discovered for a long time and a few scientists believe they do have a new effect and continue to work on it. Such theories die only when their advocates die out.
I doubt that the faster-than-light neutrino story will remain similarly ambiguous for too long, but it is a difficult experiment and so may take years to sort out. The quickest resolution to such controversies is when the original experimenters find some error that causes them to withdraw their claim. The OPERA team already has plans to repeat the neutrino experiment with modifications designed to address at least a few of the concerns expressed so far. Another group known as MINOS also plans to repeat the experiment but at locations in the US, with neutrinos produced at Fermilab near Chicago and detectors in northern Minnesota or even South Dakota, the latter being a longer distance than that between CERN and Gran Sasso.
Whatever the final outcome, the faster-than-light neutrino reports have shone a light onto how science really works and that is always a good thing.
Just for fun, I am ending this series with a word cloud made out of this series of posts. (Ignore the href and em items since these are merely html tags and have nothing to do with the content.)
November 08, 2011
Relativity-12: David Hume and causality
(For previous posts in this series, see here.)
Suppose that the claim that neutrinos can travel faster than light holds up. What are the implications?
As I said earlier in the series, this does not mean that Einstein's theory of relativity is overthrown, since it always allowed for faster-than-light particles, though we had never observed them. But it does mean that Einstein causality (the idea that if two events are causally connected by a signal traveling from one event to the other, then all observers' clocks will agree that the signal left the source before it arrived at the other end) will have to go.
How hard would it be to keep the theory of relativity but abandon the idea of Einstein causality? It is not impossible. The idea that if A causes B, then A must occur before B is, after all, just another hypothesis subject to empirical testing. As Victor Stenger points out, long before Einstein came along, the whole idea of causality, that we can know that one event causes another, was challenged by philosopher David Hume (1711-1776).
Wikipedia has a nice synopsis of Hume's views on the relationship of the problem of induction to that of causality:
First, Hume ponders the discovery of causal relations, which form the basis for what he refers to as "matters of fact." He argues that causal relations are found not by reason, but by induction. This is because for any cause, multiple effects are conceivable, and the actual effect cannot be determined by reasoning about the cause; instead, one must observe occurrences of the causal relation to discover that it holds. For example, when one thinks of "a billiard ball moving in a straight line toward another," one can conceive that the first ball bounces back with the second ball remaining at rest, the first ball stops and the second ball moves, or the first ball jumps over the second, etc. There is no reason to conclude any of these possibilities over the others. Only through previous observation can it be predicted, inductively, what will actually happen with the balls. In general, it is not necessary that causal relation in the future resemble causal relations in the past, as it is always conceivable otherwise; for Hume, this is because the negation of the claim does not lead to a contradiction.
Next, Hume ponders the justification of induction. If all matters of fact are based on causal relations, and all causal relations are found by induction, then induction must be shown to be valid somehow. He uses the fact that induction assumes a valid connection between the proposition "I have found that such an object has always been attended with such an effect" and the proposition "I foresee that other objects which are in appearance similar will be attended with similar effects." One connects these two propositions not by reason, but by induction. This claim is supported by the same reasoning as that for causal relations above, and by the observation that even rationally inexperienced or inferior people can infer, for example, that touching fire causes pain. Hume challenges other philosophers to come up with a (deductive) reason for the connection. If the justification of induction cannot be deductive, then it would beg the question for induction to be based on an inductive assumption about a connection. Induction, itself, cannot explain the connection.
In this way, the problem of induction is not only concerned with the uncertainty of conclusions derived by induction, but doubts the very principle through which those uncertain conclusions are derived.
What Hume pointed out is that what we actually observe is always just a sequence of events, and just because in the past we have always seen one event preceding another does not mean that it will always do so in the future or that the first event is the cause of the second. The past is not a predictor of the future, something known as the problem of induction. Just because our neighbor has, ever since we have known him, picked up the morning paper from his driveway in his bathrobe does not mean that he will do so tomorrow. He may appear in a tuxedo. This is true even for events that we consider to be driven by natural laws. Just because the Sun has come up every day of our lives does not allow us to infer that it will do so tomorrow. Just because when I release my pen it falls and hits the ground, and this happens over and over again, does not allow me to conclude that it will do so the very next time I try it. Because inductive thinking is so appealing, we have developed laws that explain correlated sequential phenomena in terms of cause and effect. But just because such laws are so successful does not mean that we can ignore the fact that causality is merely an inference based on an idea of induction that has not been a priori justified.
Hume argued that our ideas of causality suffer for the same reasons that induction does. Going back to our shooting example, if person A fires a gun and the bullet enters person B and causes B to die, we say that A's actions caused the death of B. But all that we actually observed is that there was a temporal sequence of events in which the gun was fired, and the bullet then traveled and entered B who died and so we impute causality to the process. Our belief in causality is so strong because we have constructed laws that explain those temporal correlations in behavior that enable us to predict that if the first event is repeated, the second will too. So A shooting B in the same way will always result in the death of B. But what Hume says is that we cannot be sure of this. Maybe the next time A shoots at B, the bullet will, like a boomerang, stop half way and go back and hit A.
If we view a film and see a bleeding person lying on the floor and blood flowing back inside him followed by the person standing up and a bullet emerging from his body and going back into a gun held by another person, we would conclude that the film was being run backwards because all these things seem to violate causality. If we looked at the clock readings at the locations of the two events, we would expect to find that the clock reading with the man lying on the floor would be later than the clock reading in which the bullet was in the gun even when the film was run backwards. But can we be sure of this?
Hume's idea that causality is merely an assumption that may not always hold true received support in modern physics when it was found that the basic laws of physics are (almost always) time-reversal invariant. This means that the laws of physics are such that if one looked at a film showing reactions between elementary particles, one would not be able to deduce whether the film was being run forwards or backwards, because the basic laws of physics do not discriminate between the two. The only exception we have to date that violates this conclusion is the decay of an elementary particle known as the neutral kaon.
Furthermore, modern physics has shown that not every effect need be associated with a cause. For example, the decay of a radioactive nucleus appears to be totally spontaneous and unpredictable. Nothing causes it, it just happens. So if we can have acausal events and cannot even in principle assign a causal connection between two events, then Einstein causality may turn out to be just one of those convenient assumptions that seemed at one time to be self-evidently true but that we now need to outgrow and replace with a more sophisticated way of thinking, just like we grew out of assuming that the Earth was flat or that it was at the center of the universe.
Not that doing so will be easy. Causality, like the belief in free will, is so deeply ingrained in our psyches that abandoning it will be difficult.
Next: Some concluding thoughts
November 03, 2011
Relativity-11: The cold fusion debacle
(For previous posts in this series, see here.)
The 'cold fusion' episode from back in 1989 illustrates the danger of issuing press releases announcing a major scientific discovery before the scientific community has had a chance to weigh in and sift through the evidence. Two respected scientists, Stanley Pons and Martin Fleischmann, at the University of Utah reported reactions producing enormous amounts of heat when the metal palladium was immersed in what is known as 'heavy water', which contains a large fraction of water molecules in which the ordinary hydrogen atom has been replaced by the heavier isotope deuterium. The experimenters thought that chemical reactions could not account for the scale of the energy release and were convinced that they had discovered a way to produce nuclear fusion reactions at room temperature, thus opening the way to a vast, cheap, and clean new energy source. Needless to say, this would be a revolutionary discovery, both scientifically and practically.
In March 1989 they announced their results at a press conference to loud fanfare. I remember hearing the announcement on BBC news over my short-wave radio and thinking "Wow! This is huge." As in the current case of faster-than-light neutrinos, the initial surprise was quickly followed by considerable skepticism within the scientific community because cold fusion went completely against all that we thought we knew about nuclear fusion. For two nuclei to come close enough to fuse, they have to overcome the strong repulsive forces due to both having positive charges. For the nuclei to overcome this 'Coulomb barrier', they have to have high energies that are associated with high temperatures as found in the Sun and other stars, which is what enables fusion to be their energy source. What Pons and Fleischmann were suggesting would require some new mechanism to overcome the well-known and well-understood obstacles to low-temperature fusion.
Other scientists pointed out that even if we ignored the Coulomb problem, the byproducts of fusion, which should have been copiously produced, were not observed either, throwing doubt on whether fusion was actually occurring. This objection was countered by claiming that perhaps this was a new form of nuclear reaction that did not produce those specific byproducts. As I pointed out in my series on the logic of science, almost any theory can be salvaged by the introduction of such auxiliary hypotheses. But adopting such stratagems tends to weaken the case for a new theory unless they too can be corroborated with other evidence.
If the claims of Pons and Fleischmann were true, the practical benefits and the revolutionary science they would spawn would be enormous, and this persuaded enough scientists to take the cold fusion claims seriously and to spend considerable time, effort, and money investigating them. As far as I am aware, over two decades later there is still no consistent picture of the cold fusion reactions, though some scientists continue to work on it and enthusiasm resurges periodically, enough that the Pentagon is funding further studies. In 2009, 60 Minutes aired a program giving the history of cold fusion and some new developments.
One problem with cold fusion is that the heat reactions cannot be reliably reproduced. "The experiments produce excess heat at best 70 percent of the time; it can take days or weeks for the excess heat to show up. And it's never the same amount of energy twice." This is always a troubling sign. Scientific laws are not idiosyncratic. If they work, they should work all the time in the same way with no exceptions. If there are exceptions, these should also be law-like in that you should be able to predict exactly under what conditions they will or will not occur. Results that occur sometimes with no understanding why are signs that there are some unknown factors at work that are skewing the results.
So what has all this history to do with the recent neutrino story? The fact that this result was also announced via what was essentially a press release, and not at a scientific meeting or in a peer-reviewed journal article, aroused some concern. Press releases do not face the same degree of scrutiny as a journal article, where a sensational claim of this sort would be closely examined before being approved for publication. In the above video, Fleischmann acknowledged this mistake, saying that he had two regrets: calling the nuclear effect "fusion," a name coined by a competitor, and holding that news conference, something he says the University of Utah wanted.
It is not the case that scientists are hidebound dogmatists, determined to cling to old ideas, as is sometimes claimed by non-scientists when their pet theories (such as intelligent design) are rejected. As I said before, part of the strength of science is that because scientific knowledge is the product of a consensus-building process, it does not get easily swayed by each and every claim of a big discovery. It initially views reports of revolutionary developments with skepticism, waiting to see if the results hold up and corroborating evidence is produced. If so, the community can and does accept the new idea. For example, this year's Nobel prize in physics was awarded for the discovery that distant galaxies are not only moving away from us (which agreed with existing theories) but are actually accelerating (which flatly contradicted everything we had thought and has led to the highly counter-intuitive idea of so-called 'dark energy' permeating and dominating all of space). This shows that the community can change its collective mind and accept radically new ideas, and fairly quickly. But the reason such a seemingly outlandish result as dark energy became the conventional wisdom within the short space of less than two decades is that the proponents were able to marshal evidence in its favor that survived close scrutiny and was corroborated.
The history of cold fusion, despite not becoming mainstream, also puts the lie to the claims of the so-called intelligent design movement that scientists conspire to suppress those ideas that challenge conventional wisdom. Despite the fact that most of the scientific community is highly skeptical of it being a real effect, cold fusion advocates actually do have a research program in which they do experiments, produce data, and publicize their results. All that members of the intelligent design community do is write books and articles and give talks whining that the scientific community refuses to give them a platform to promote their ideas and that this is because the community is hidebound and refuses to even consider their bold new idea that challenges the accepted 'dogma' of evolution.
The actual explanation for why the scientific community rejects intelligent design is simple and mundane. More than two decades after the idea was first proposed, intelligent design advocates still have not done a single experiment, nor even established a research program to do any.
Next: What if Einstein causality has to be abandoned?
November 01, 2011
Relativity-10: Science and public relations
(For previous posts in this series, see here.)
Scientists want their work to influence the field and so they would like it to gain the widest possible audience. Most of the time, their peers (and funding agencies) are their target audience because they are the only ones who really understand what they do. But when the work also appeals to the general public because of its practical applicability or its revolutionary implications, tensions can arise over how the work is publicized. In the case of the OPERA experiment on faster-than-light neutrinos, there has been considerable unease over how the whole episode was handled with respect to the media.
The usual process when scientists have something new to say is that they write up a paper with their results and send it to a journal. The journal then sends the paper to referees who work in the same field (the number of referees depends on the journal and the discretion of the editor), who provide feedback to the editor. The referees do not usually check the results or repeat the calculations and experiments. What they do is check whether the paper makes sense, whether the methodology is sound, whether the authors have taken into account all the relevant factors and provided all the information needed for readers to know exactly what was done (and how) so that they could repeat and check the results if so inclined, and whether proper credit has been given for prior related work. Based on this feedback, the editor decides whether to accept the paper, reject it, or send it back to the authors for revisions and/or additional work. Good referees and editors can improve a paper enormously by providing the authors with valuable feedback and useful information and suggestions.
In the sciences, authors also usually simultaneously send out copies of the paper (known as preprints) to colleagues in the field. This serves to give their colleagues advance notice of their work (since publication in a journal can often take over a year), to get feedback, and to establish priority for any discovery. All this occurs out of the public eye. Once the paper has been accepted and published by a journal, then it enters the public discussion and the media can publicize it. If the paper has significant implications, the journals may alert the media and give reporters a copy of the paper before it appears in print so that they can research and prepare an article about it, but the reporter is under an embargo to not publish until the journal article actually appears. Some of the more influential journals will refuse to publish an article if the authors release the information to the media before the journal prints it.
In the pre-internet days, and for research results that do not have revolutionary implications, this system worked reasonably well. Due to the cost of mailing, not too many preprints went out so the pre-publication discussions remained within a fairly small circle. With the internet, it became much easier to send out preprints to huge numbers of people at no cost and it was not long before it was realized that it made sense to create a system that could serve as a permanent archive that would allow scientists to post their preprints online so that anyone could gain access to them and search for those results that interested them. Currently the most popular venue for such preprints is arXiv and Wikipedia has a good article about its history and how it works.
The articles that are found on arXiv are preprints and thus have not been peer-reviewed but the system is minimally moderated to keep out rubbish. In general, scientists are concerned about their reputations among their peers and so most are careful to only post articles that they think would meet the standards of quality required if they were submitting to a peer-reviewed journal. Almost all of them do simultaneously submit their articles to such journals. As a result, the papers that appear on arXiv tend to be of pretty good quality. All the papers associated with the faster-than-light OPERA experiment are on arXiv.
A few scientists feel that peer-reviewed print journals are an anachronism and do not even bother to try to get their work into journals, feeling that the quality of the work will speak for itself. They think that if their work is correct and important, the community of scientists will accept it and build on it, while if it is wrong the community will criticize and reject it. Possibly the worst fate is that the community will think it is useless and a waste of time and completely ignore it. It may well be the case that in the future, expensive peer-reviewed print journals will disappear and that this kind of open-source publication will become the norm, with quality being determined by the consensus judgment of the scientific community. We are not there yet.
In the case of the OPERA experiment, the system broke down somewhat, for several reasons. The OPERA experiment is very difficult and is a huge enterprise, involving many collaborators, lasting over three years, and producing a paper with over 150 authors. Given the culture of the free sharing of information in science, it is very hard to keep preliminary results under wraps, and it was pretty much an open secret that these faster-than-light results had been obtained. But this knowledge stayed within the community. Then, the day after they posted their preprint on arXiv on September 22, the OPERA team issued a press release announcing their results and promoting a big press conference the next day with media and scientists present.
This rubbed some scientists the wrong way. Scientists can be as publicity hungry as celebrities but there are norms and there is a discreet way of making one's name known. Holding press conferences or issuing press releases so early in the game, before the scientific community has had time to pass its verdict on the research, is considered bad form and the OPERA team has received some criticisms on this score.
While some of the carping may be due to jealousy, it is also the case that trumpeting that a scientific revolution has occurred can harm the image of science if the claim has to be later retracted. The reliable knowledge that science produces tends to be the consensus verdict of the community, achieved after a lot of behind-the-scenes work has smoothed out the rough edges and corrected mistakes. Bypassing that filtering process and going public too soon can lead to embarrassing reversals and give ammunition to the critics of science that its results cannot be trusted.
Next: Recalling an earlier public relations debacle
October 27, 2011
Relativity-9: The importance of corroborating evidence in science
(For previous posts in this series, see here.)
In my series on the logic of science, I recounted how philosopher of science Pierre Duhem had pointed out as far back as 1906 that the theories of science are all connected to each other and changes in one area will have unavoidable effects on others that should be discernible. In this case, if neutrinos in the OPERA experiment did in fact travel faster than the speed of light, then we should be able to look at some other effects that should occur and see if they are observed.
One of them is the 'Cherenkov effect'. This effect says that when something travels faster than the speed of light, it should emit a certain kind of radiation that is analogous to the shock waves that are produced when something travels faster than the speed of sound. This is known as the 'sonic boom' that we can hear when jet planes break the speed of sound. It also occurs when bullets are fired at speeds greater than the speed of sound but because bullets are so small the sonic boom is too weak for us to hear it.
The Cherenkov effect is well known and has been studied and confirmed. How can this be if it requires something to travel faster than the speed of light? Recall that the speed-of-light barrier in Einstein's theory applies to light in a vacuum. When light travels through any medium (glass, water, the atmosphere), it is slowed down by the interactions of the medium with the light particles. Other particles such as electrons are also slowed down by the medium, but not necessarily to the same extent, in which case it is possible for some particles in a medium to travel faster than the speed of light in that same medium. If they do so, they should emit the light equivalent of the sonic boom, and this is called Cherenkov radiation. The spectrum of light emitted lies mainly in the ultraviolet region, and its overlap with the visible spectrum produces a characteristic blue glow. One can see this in the cooling water that surrounds nuclear reactors, as in the image on the right, and in this video of a pulse of radiation being sent into the cooling liquid.
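The threshold condition for Cherenkov radiation is simply that the particle's speed exceed c/n, where n is the refractive index of the medium. A quick back-of-the-envelope sketch (using an approximate refractive index for water and the electron rest energy, neither of which appears in the post itself) shows why the effect is so easy to produce with electrons in water:

```python
import math

n_water = 1.33                  # approximate refractive index of water
beta_threshold = 1.0 / n_water  # minimum v/c for Cherenkov emission in water

# Relativistic kinetic energy an electron needs to reach that speed:
gamma = 1.0 / math.sqrt(1.0 - beta_threshold**2)
m_e = 0.511                     # electron rest energy, MeV
ke_threshold = (gamma - 1.0) * m_e

print(f"threshold speed: {beta_threshold:.2f} c")
print(f"electron kinetic-energy threshold: about {ke_threshold:.2f} MeV")
```

An electron needs only a fraction of an MeV of kinetic energy to outrun light in water, which is why beta-decay electrons in reactor cooling water readily produce the blue glow.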
In a paper, Andrew Cohen and Sheldon Glashow calculate that high energy, faster-than-light neutrinos as produced in the OPERA experiment would lose much of their energy due to Cherenkov radiation, mainly by the production of electron-positron pairs, on their way from CERN to Gran Sasso. But that does not seem to have happened, according to a different experiment at Gran Sasso (known as ICARUS) that works with the same neutrino source as the OPERA experiment.
Another concern involving consistency is with the supernova SN1987A that was observed in 1987. It turned out that a cluster of 24 neutrinos was detected in three different detectors on the Earth about three hours before the supernova was observed, i.e. before the light signals reached Earth. That difference was not put down to the neutrinos traveling faster than the speed of light but to the fact that the neutrinos, while created at the same time as the light, escaped from the exploding star three hours before the light did due to their low interactivity with matter, and so had a head start on the journey to Earth, even though they traveled in free space at the same speed as light. The measured time difference was consistent with our understanding of the processes involved in a supernova.
If the neutrinos had speeds greater than that of light by even the small amount given by the OPERA experiment, then because of the huge distance of the supernova from Earth (about 168,000 light years), the supernova neutrinos should have reached Earth about 4.7 years before we saw the supernova. If neutrinos in the OPERA experiment had, in fact, been traveling faster than the speed of light, why had they not done so in other situations, such as the 1987 supernova?
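The scale of that early arrival can be checked with a back-of-the-envelope calculation. This is a minimal sketch, not part of the original analysis; the exact figure depends on the adopted distance to SN1987A and the measured fractional speed excess, so treat the output as an order-of-magnitude check.

```python
# Rough check of the supernova early-arrival estimate.
# OPERA reported neutrinos arriving ~60 ns early over a ~2.435 ms flight,
# giving a fractional speed excess (v - c)/c of a few times 10^-5.
early_ns = 60e-9          # seconds early at Gran Sasso
flight_s = 2.435e-3       # seconds for 730 km at light speed
excess = early_ns / flight_s

distance_ly = 168_000     # light years to SN1987A (figure used in the text)
early_years = distance_ly * excess
print(f"fractional excess: {excess:.2e}")
# Roughly 4 years: the same order as the ~4.7 years quoted above,
# with the exact value depending on the adopted distance.
print(f"early arrival: {early_years:.1f} years")
```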
The working model of science is that things behave in a law-like, repeatable manner and not idiosyncratically. If we observe something in one situation, we expect to see it happening again in similar situations. If a deviation from law-like behavior is observed, we assume that this is due to the existence of another, deeper, hitherto unknown law whose effect only became apparent because of some conditions that had been incorrectly assumed to be unimportant.
In this case, one could postulate that since the OPERA neutrinos have a thousand times as much energy as the supernova neutrinos, faster-than-light speeds only arise for such high-energy neutrinos. Of course, such a new explanation requires new corroborative evidence and so the discussion will go on as explanations and evidence play out their dialectical relationship until a consensus emerges. That is how science works.
Next: Science and public relations
October 24, 2011
Relativity-8: General relativity
(For previous posts in this series, see here.)
To understand the role of Einstein's general theory of relativity, recall that the original OPERA experiment claimed that they had detected neutrinos traveling faster than the speed of light. This posed a challenge to what is known as Einstein's theory of special relativity, proposed in 1905, which said that the relationship between the clock and ruler readings for two observers moving relative to one another would be different from the ones given by the seemingly obvious relationships derived by Galileo centuries earlier. According to Einstein's theory, it is the speed of light that would be the same for all observers, while clock readings could differ, and that Einstein causality (the temporal ordering of any two events that are causally connected by a signal traveling from one to another) would be preserved for all observers. One inference that followed from Einstein causality is that no causal signal can travel faster than the speed of light, and this was what was seemingly violated by the OPERA experiment.
But Einstein had a later and more general theory, proposed in 1915 and called the general theory of relativity, that included the effects of gravity. He showed that clock readings were affected not only by the speed with which the clock was moving but also by the strength of the gravitational field in which the clock found itself. This is the source of the 'gravitational red shift' encountered in cosmology, which causes the light emitted by distant stars and galaxies to be shifted towards longer wavelengths as it escapes the gravitational field of those objects on its journey to us.
To understand what is going on, recall that when we measure the elapsed time between two events, what we are really doing is measuring the number of clock ticks that occur between the events. According to general relativity, the stronger the gravitational field, the slower the rate at which a clock ticks. The slower the rate at which a clock ticks, the less time that it records as having elapsed between two events.
So, for example, since we know that the Earth's gravitational field decreases as we go up, this means that if we take two identical clocks, one on the floor and the other on the ceiling, the one on the floor would have fewer ticks between two events than the one on the ceiling, even if both are stationary. So the clock on the floor would 'run slower' than the one on the ceiling, and hence the time interval between two events measured by clocks on the floor will be less than that measured by clocks on the ceiling.
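The size of this floor-versus-ceiling effect can be estimated with the standard weak-field formula, in which a clock higher by a height h runs faster by the fraction gh/c². This is a sketch with an assumed ceiling height, not a figure from the original post:

```python
# Weak-field estimate of the floor-vs-ceiling clock rate difference.
# A clock higher by h runs faster by roughly the fraction g*h/c^2.
g = 9.8          # m/s^2, surface gravity
h = 3.0          # m, an assumed floor-to-ceiling height
c = 299_792_458  # m/s, speed of light

fraction = g * h / c**2
seconds_per_year = 365.25 * 24 * 3600
print(f"fractional rate difference: {fraction:.1e}")
print(f"accumulated difference per year: {fraction * seconds_per_year:.1e} s")
```

The result is a few parts in 10^16, far too small to notice in daily life but well within reach of modern atomic clocks.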
In the OPERA experiment, the time measurements were made using GPS satellites. These are whizzing by at both high speeds (about 4 km/s) and high altitudes (about four Earth radii). Typically, the signals are handed off from one satellite to another as they appear and disappear over the horizon and the transition is almost seamless and produces such small errors that we do not notice it. But the OPERA experiment requires such high precision that they arranged to do the experiment during the transit time of just a single satellite so that even that source of error was eliminated.
Because the rate at which clocks run depends upon the size of the gravitational field, one has to make corrections to allow for the fact that the time readings given by clock readings of the satellites will be different from the time readings given by clocks on the Earth, and so one needs to make extremely subtle corrections to the GPS time stamp to get the correct clock readings on the Earth. This is why much of the attention has focused on this aspect. It is not that the OPERA experimenters overlooked this obvious feature (such general relativistic corrections are routinely made by GPS software in order to make the GPS system function with sufficient accuracy) but whether they have made all the necessary corrections to the extremely high level of precision required by this experiment.
Carlo Contaldi at Imperial College London has suggested that the clocks at CERN and Gran Sasso were not synchronized properly due to three effects, one of which is the fact that the gravitational field experienced by the satellite is not the same at all points on its path since the Earth is not a perfect sphere. He says that the errors that would be introduced are of the size that could produce the OPERA effect. (You can read Contaldi's paper here.)
Ronald A. J. van Elburg at the University of Groningen has argued that subtle effects due to the motion of the detectors with respect to the satellite could have shifted the time measurements at each clock on the ground by 32 nanoseconds in the directions required to explain the 60 nanosecond discrepancy. (You can read van Elburg's paper here and reader Evan sent me a link to a nice explanation of this work.)
There has been no shortage of ideas and papers pointing out problems and possible alternative explanations for the OPERA results. Sorting and sifting through them all before we arrive at a consensus conclusion will take some time.
October 21, 2011
Relativity-7: What could be other reasons for the CERN-Gran Sasso results?
(For previous posts in this series, see here.)
The reactions to the reports of the CERN-Gran Sasso discovery of possibly faster-than-light neutrinos open a window into how science operates, and the differences in the way that the scientific community and the media and the general public react whenever a result emerges that contradicts the firmly held conclusions of a major theory.
The initial reaction within the scientific community is almost always one of skepticism, that some hitherto unknown and undetected effect has skewed the results, while the media and public are much more likely to think that a major revolution has occurred. There are sound reasons for this skepticism. Science would not have been able to advance as much if the community veered off in a new direction every time an unusual event was reported.
What usually happens is that most of the community goes on as before as if nothing had occurred while a relatively small number who are experts in that area examine the new results closely. Some will try to identify possible sources of systematic errors that the original experimenters did not consider. The experimenters who reported the possibility of faster-than-light neutrinos are reportedly careful people, and if any errors occurred, we can be sure that they are not trivial ones that will be uncovered easily or quickly. Others will examine whether any of the side effects that would accompany faster-than-light travel are also seen. If those two efforts fail to turn up any problems, other groups will try to repeat the basic experiment with different experimental set-ups, measuring the time and distance using different techniques so that the likelihood of systematic biases pushing the results in the same direction is reduced. This last option is very expensive and time-consuming, since these experiments are very difficult to do, which is why it is usually the last resort. During this period, there will often be claims and counter-claims and some confusion until the dust settles and a consensus emerges. But it is this painstaking investigation seeking replicability and consistency that characterizes science and enables it to be confident that, once a consensus emerges, it has produced reliable knowledge.
In this case, recall that the original experiment (which has the acronym OPERA) that aroused such interest involved sending neutrinos over a distance of 730 km and measuring their speed, where the distance and time measurements used GPS satellite technology. Assuming that 730 km was the exact distance, if the neutrinos travelled at exactly the speed of light, it should take them 2.435 milliseconds to make the trip. What was observed was that the neutrinos arrived 60 nanoseconds earlier than expected, thus violating Einstein causality, though not overthrowing the theory of relativity. This effect would go away if there were a 60 nanosecond error in the time measurement and/or an 18 meter error in the distance measurement of the journey, and searching for hitherto unconsidered factors that could produce effects of that size has been the initial focus.
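The numbers quoted above fit together with simple arithmetic; a minimal sketch checking them:

```python
# The basic numbers in the OPERA claim, checked directly.
c = 299_792_458          # m/s, speed of light in vacuum
baseline_m = 730e3       # m, nominal CERN to Gran Sasso distance

tof_s = baseline_m / c
print(f"light-speed flight time: {tof_s * 1e3:.3f} ms")   # ~2.435 ms

early_s = 60e-9          # the reported 60 ns early arrival
# A 60 ns timing error is equivalent to a distance error of about 18 m.
print(f"equivalent distance error: {early_s * c:.0f} m")
```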
There have already been some developments. When it comes to looking at sources of systematic errors, Lubos Motl has a long discussion on possible errors and has compiled a partial list of potential sources that need to be examined closely.
- inconsistencies in the whole GPS methodology of measuring space and time coordinates
- inconsistencies of units (meters, second) used at various places: the errors would have to be huge, indeed, so this is unlikely
- subtle old-fashioned physics issues neglected by GPS measurements: the index of refraction of the troposphere and (even more importantly) ionosphere that slows down and distorts the path of GPS signals; confusing spherical Earth and geoid; neglecting gravitational effects of the Alps; neglecting magnetic fields at CERN that distort things; and so on
- forgetting that 2 milliseconds isn't zero and things change (e.g. satellites move) during this short period, too
- subtle special relativistic effects neglected in the GPS calculations
- subtle general relativistic effects neglected in the GPS calculations
- wrong model of where and when the neutrinos are actually created on the Swiss side
- more radical: wrong model of the wave equation for the neutrinos (regardless of oscillations etc., neutrinos should never move information faster than light in the vacuum, but maybe we're making some mistake about the group vs. phase velocity and entanglement of the two places: recall that the difference between the phase and group velocity for these neutrinos should be negligible, around 10^-19)
Notice that a lot of the suggested errors focus on the GPS or the Global Positioning System. This currently consists of 31 orbiting satellites that are continuously emitting signals that include the time the signal was sent as well as the orbital information of the satellite. Receivers on the ground (such as in your car) take that information and calculate the position of the receivers. The OPERA experiment used such signals to pinpoint the locations of the detectors at CERN and Gran Sasso and the time of travel. Most everyday situations do not require very high levels of accuracy. But since time interval errors of just 60 nanoseconds or distance errors of 18 meters could nullify the results, people have been looking into the possible sources of subtle errors, especially those associated with Einstein's general theory of relativity.
Next: General relativity effects.
October 19, 2011
Relativity-6: Measuring time and space more precisely
(For previous posts in this series, see here.)
In the previous post in this series, I said that Einstein's claim that the speed of light must be the same when measured by all observers irrespective of how they were moving led to the conclusion that the rate at which time elapsed must depend on the state of motion of the observer. But if time is not an invariant entity, then we need to be more precise about how we measure it for observers in relative motion to one another so that we can better determine how their measurements are related.
What we now postulate is that associated with each observer is a grid of rulers that spreads out into all space in all directions. At each point in space are also a clock and a recorder. It is assumed that all the rulers and clocks of all the observers are constructed to be identical to each other, the clocks are properly synchronized, and the recorders never make errors. When an event occurs anywhere at any time, the location and time of that event are those noted by that recorder who happens to be exactly at the location of the event and who notes the ruler and clock readings located at the place at the instant when the event occurred. This rules out the need to make corrections for the time that elapses for the light to travel from the location of the event to the recorder.
If there is another observer who is moving with respect to the first, that person too will have her own set of rulers and clocks and recorders spread out through all space, and the location and time of an event will be that noted by her recorder using her rulers and clocks at the location where the event occurs. This setup seems rather extravagant in its requirement of infinite numbers of rulers and clocks and recorders, but of course all these rulers and clocks and recorders are merely hypothetical except for the ones we actually need in any given experiment. The key point to bear in mind is that the location and time of an event for any observer is now unambiguously defined to be that given by that observer's ruler and clock readings at the location of the event, as noted by the observer's recorder located right there.
What 'Einstein causality' says is that if event A causes event B, then event A must have occurred before event B and this must be true for all observers. If one observer said that one event caused another and thus the two events had a particular ordering in time, all observers would agree on that ordering. Thus causality was assumed to be a universal property.
What we mean by 'causes' is that event B occurs because of some signal sent by A that reaches B. So when the person at B is shot by the person at A, the signal that caused the event is the bullet that traveled from A to B. Hence the clock reading at event A must be earlier than the clock reading at event B, and this must be true for every observer's clocks, irrespective of how that observer is moving, as long as (according to Einsteinian relativity) the observer is moving at a speed less than that of light. The magnitude of the time difference between the two events will vary according to the state of motion of the observer, but the sign will never be reversed. In other words, it will never be the case that any observer's clocks will say that event B occurred at a clock reading that is earlier than the clock reading of event A.
But according to Einstein's theory of relativity, this holds only if the signal that causally connects event A to B travels at speeds less than that of light. If event B is caused by a signal that is sent from A at a speed V that is greater than that of light c (as was claimed to be the case with the neutrinos in the CERN-Gran Sasso experiment) then it can be shown (though I will not do so here) that an observer traveling at a speed of c2/V or greater (but still less than the speed of light) will find that the clock reading of when the signal reached B would actually be earlier than the clock reading of when the signal left A. This would be a true case of the effect preceding the cause. The idea that different observers would not be able to agree on the temporal ordering of events that some observers see as causally connected would violate Einstein causality and this is what the faster-than-light neutrino reports, if confirmed, would imply.
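The c²/V threshold can be seen directly from the Lorentz transformation of the time interval: in units where c = 1, a separation Δt with Δx = VΔt transforms to Δt' = γ(Δt − vΔx), which turns negative exactly when v exceeds 1/V, i.e. c²/V. A minimal sketch (not from the original post), using a made-up tachyon speed of twice the speed of light:

```python
import math

def lorentz_dt(dt, dx, v):
    """Time interval between two events in a frame moving at speed v (c = 1)."""
    gamma = 1.0 / math.sqrt(1.0 - v * v)
    return gamma * (dt - v * dx)

V = 2.0                  # hypothetical signal speed: twice light speed
dt, dx = 1.0, V * 1.0    # signal leaves A at t=0, arrives at B at t=1, x = V*t

slow_obs = lorentz_dt(dt, dx, 0.4)   # below c^2/V = 0.5: ordering preserved
fast_obs = lorentz_dt(dt, dx, 0.6)   # above c^2/V = 0.5: ordering reversed
print(slow_obs > 0, fast_obs < 0)
```

For the sub-threshold observer the interval stays positive (A before B); for the faster one it comes out negative, meaning her clocks record the arrival at B before the departure from A.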
Note that this violation of Einstein causality occurs even though the observer is moving at speeds less than that of light. All it requires is that the signal that was sent from A to B to be traveling faster than light.
(If the observer herself can travel faster than the speed of light (which is far less likely to occur in reality than having an elementary particle like a neutrino doing so), then one can have other odd results. For example, if the speed of light is 1 m/s and I could travel at 2 m/s, then one can imagine the following scenario. I could (say) dance for five seconds. The light signals from the beginning of my dance would have traveled 5 meters away by the time my dance ended. If at the end of my five-second dance, I traveled at 2 m/s for 5 seconds, then I would reach a point 10 meters away at the same time as the light that was emitted at the beginning of my dance. So if I look back to where I came from, I could see me doing my own dance as the light from it reaches me. So I would be observing my own past in real time. This would be weird, no doubt, but in some sense would not be that much different from watching home movies of something I did before. It would not be, by itself, a violation of Einstein causality since there is no sense in which the time ordering of causal events has been reversed.)
So the violation of Einstein causality, not the theory of relativity itself, is really what is at stake in the claims that neutrinos traveling at speeds faster than light have been observed. This is still undoubtedly a major development, which is why the community is abuzz and somewhat wary of immediately accepting it as true.
Next: What could be other reasons for the CERN-Gran Sasso results?
October 17, 2011
Relativity-5: Galilean and Einsteinian relativity
(For previous posts in this series, see here.)
In the previous post in this series, I posed the situation where, seated in my office, I observe two events on the sidewalk outside my window, measure their locations and times, and deduce the distance between them and the time interval according to the rules for using my own ruler and watch. Now suppose another person is moving with respect to me (say in a train that passes right by where the two events occur), sees the same two events as I do, measures their locations and times, and deduces the distance and time interval between them using her ruler and watch. Will her measurements agree with mine?
When it comes to location and distance measurements, it is not hard to see that the two results will be different. When I take ruler readings of the two events, my ruler is not moving relative to the events. But the ruler of the person in the train moves along with her, so her readings of where the two events occurred will be affected by her motion. After she takes the reading on her ruler at the location where event A occurred, by the time the later event B occurs, she and her ruler will have moved along with the train, and so the ruler reading for event B will be different from what would have been obtained if the ruler had been stationary. So the locations and the measured distance between the two events based on her two ruler readings will be different from those based on mine.
What about the time interval between events A and B? It used to be thought that even though the two observers used different clocks and were moving relative to each other, as long as the clocks were identical and synchronized properly, the two observers would at least agree on this, because it seemed so commonsensical that time was some sort of universal property, independent of the observer measuring it or her state of motion. Time measurements were said to be invariants.
These relationships between the location and time measurements made by observers moving with respect to one another were first postulated by Galileo, and are now known as 'Galilean relativity'. Galileo used these relations to show why, even though the Earth was moving quite fast through space (a seemingly absurd idea at that time), a ball thrown vertically upwards would fall back down to the same point from where it was thrown, and not be displaced because the Earth had moved during the time that elapsed. This everyday observation had previously been used to argue that the Earth must be stationary but Galileo turned it around to show that it was consistent with the Earth moving.
But one consequence of the assumption that time is an invariant is that if you measure the speed of light (by taking two events, one consisting of light being emitted at one point and the other of it being detected at another point and dividing the difference in ruler readings between the two events by the time interval between the events), you would get different values for two observers in relative motion to each other, since the distances traveled (i.e., the differences in the ruler readings) would be different for the two observers but the time interval would be the same. In other words, the measured speed of light was not an invariant but depended on the speed with which the observer was moving.
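Under the Galilean assumption this conclusion is immediate: with a shared universal time, measured velocities simply subtract. A toy sketch with an assumed observer speed:

```python
# Under Galilean relativity, a signal traveling at speed u in my frame
# is measured by an observer moving at v to have speed u - v.
# Applied to light, its measured speed would differ between observers.
c = 299_792_458   # m/s, light speed in my frame
v = 30.0          # m/s, speed of an observer on a passing train (assumed)

galilean_c = c - v
print(galilean_c == c)   # False: not invariant under Galilean rules
```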
What Einstein postulated (based on several reasons that I will not get into here) was that the speed of light was the same for all observers. In other words, it is the measured speed of light that is an invariant, the same for all observers irrespective of how they are moving. One important consequence of this is that the elapsed time between two events is no longer an invariant, and depends on the observer. Time is no longer a universal property but depends on who is measuring it. The difference in measured times is tiny for the normal speeds we encounter in everyday life, which is why we don't perceive it. But it does lead to things like the celebrated 'twin paradox' where if you have a pair of identical twins, one remaining on Earth and the other going in a rocket at high speed to a distant star and returning, the traveling twin would have aged much less than the one who stayed home.
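The size of the twin effect follows from the time-dilation factor γ = 1/√(1 − v²/c²). A sketch with assumed numbers (a star 4 light years away, a rocket at 0.8c), ignoring the acceleration phases:

```python
import math

v = 0.8              # rocket speed as a fraction of c (assumed)
distance_ly = 4.0    # one-way distance in light years (assumed)

earth_years = 2 * distance_ly / v       # round-trip duration measured on Earth
gamma = 1.0 / math.sqrt(1.0 - v**2)     # time-dilation factor
traveler_years = earth_years / gamma    # elapsed time on the rocket
print(earth_years, round(traveler_years, 1))  # 10.0 6.0
```

So the stay-at-home twin ages 10 years while the traveler ages only 6.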
Needless to say, this caused some consternation and it took some time for people to be persuaded that this seemingly bizarre result was correct. What Einstein did was force us to be more precise about how we measure the location and time at which events occur, so that we can meaningfully compare the results of different observers viewing the same events.
Next: Measuring time and space more precisely.
October 13, 2011
Relativity-4: Measuring time and space
(For previous posts in this series, see here.)
To get a better grip on what is involved in the theory of relativity, we need to think in terms of 'events', things that occur instantaneously at a point in space and which every observer will agree happened and is unique. An example of an event might be me clapping my hands once. That occurs at one place in space (where my hands meet) at one moment in time (the instant they make contact) and all observers will agree that I did indeed clap my hands. Of course, actual events will be spread out over a region of space (my hands are quite big objects) and over a small but extended interval of time (the period during which my hands are in contact while clapping) but we can imagine idealized events as things that occur at a single point in space at a single instant in time. Specifying an event also uniquely specifies a location and a time since only one event can occur at any point in space at a particular time.
Suppose we have one event A that takes place at one place at one time (say a neutrino created by a nuclear reaction at CERN) and another event B that takes place at another place at another time (say the detection of the arrival of that same neutrino at the Gran Sasso laboratory). Einstein causality says that since event A caused event B, event A must take place before event B. Even if the neutrino were to travel at a speed greater than the speed of light, all that would do is reduce the time difference between the two events, not reverse their order, as was noted in the example given in the first post in this series. So why is this result seen as such a sensational development?
The answer lies in the fact that Einstein causality is believed to hold true for every observer who sees the same two events, irrespective of the state of motion of the observer. And the existence of faster than light neutrinos means that even though we on Earth will continue to see event A before event B, there are observers who are moving relative to us who will see the neutrino being detected at Gran Sasso before it was created at CERN or, more bizarrely in the case of the shooting example, that the bullet will emerge from person B and seem to travel back into the gun of person A. And unlike in that earlier example, this will not be due to an illusion due to the accident of where the observer happened to be located.
To understand how this can happen, we need to go more deeply into the question of how we measure the location and the time of events and how they differ for observers moving with respect to one another. Location and distance measurements seem pretty straightforward and we do it all the time when we measure the length of something. We simply hold a ruler along the line joining the two events, take the ruler readings at the locations of each of the two events, subtract the smaller reading from the larger, and the resulting number gives us the distance between the two events.
As for the time interval between two events, we can look at our watch when we see event A occurring and note the reading, then look again when we see event B occurring and note the reading, and once again subtract the smaller reading from the larger. The resulting number gives us the time that elapsed between the two events. There is a slight complication here in that it takes time for light to travel from one place to another, so the actual time at which event A occurred would be a little earlier than when we see it. But since we know the speed of light, we can take that into account. All we have to do is measure the distance between where we are and the location of event A and divide that by the speed of light to get the time taken for the light to reach us. We then subtract that time from our watch reading to get the 'true' time at which event A occurred. We can do the same thing for event B.
In the earlier example, if you were standing next to the victim at B, you would have seen the bullet at the 2 meter mark 9 seconds after the gun fired. If you had been standing next to the shooter at A, you would have seen it 3 seconds after the gun fired. If you correct for the time of travel for the light to reach you from the bullet at the 2 meter mark, the bullet would be said to be at that point one second after the gun was fired, irrespective of where you were standing. So the time of an event can be specified uniquely in the case of different observers who are not moving with respect to the events.
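Those numbers can be reconstructed with the "subtract the light travel time" correction. This sketch uses the toy units of the running example (light at 1 m/s, the shooter at the 0 m mark) and assumes the victim stands at the 10 m mark, which is what the quoted times imply:

```python
# Correct an observed ('seen') time for the light travel time from the event.
# Toy units from the running example: light speed = 1 m/s.
LIGHT_SPEED = 1.0  # m/s

def true_time(seen_at, observer_pos, event_pos):
    """Time the event actually occurred, given when its light was seen."""
    return seen_at - abs(observer_pos - event_pos) / LIGHT_SPEED

event_pos = 2.0  # bullet at the 2 meter mark
print(true_time(seen_at=3.0, observer_pos=0.0, event_pos=event_pos))   # 1.0, at A
print(true_time(seen_at=9.0, observer_pos=10.0, event_pos=event_pos))  # 1.0, at B
```

Both observers, after correcting, agree the bullet was at the 2 m mark one second after the gun fired.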
What if the observer is moving, though? The question that Einstein pondered is the following. Suppose I, seated in my office, observe two events on the sidewalk outside my window and measure the distance between them and the time interval according to the above methods using my own ruler and watch. Now suppose another person is moving with respect to me (say passing by in a train) and sees the same two events as I do and measures the distance and time interval between them using her ruler and watch. Will that person's measurements of the distance and time intervals agree with mine?
It is the answer to this question that determines whether we live in a world in which Galilean relativity rules or one in which Einsteinian relativity rules.
Next: Galilean and Einsteinian relativity
October 12, 2011
Relativity-3: The elusive neutrino
(For previous posts in this series, see here.)
Neutrinos are very elusive particles that are produced in nuclear reactions. They interact hardly at all with anything, which enables them to penetrate anything easily. In any given second, tens of billions of neutrinos coming from the Sun pass through each square centimeter of our bodies and the Earth without doing anything, and head off into the vast empty reaches of space on the other side. Because of their extremely low interactivity with matter, it is hard to measure their properties, even basic ones like mass, because measurement involves getting the measured object to interact with the detector so that we know something about it. The existence of neutrinos was first postulated in 1930 as a theoretical device to explain missing energy in certain nuclear reactions, but their elusive nature meant that it took until 1956 for direct experimental detection of their existence.
While the fact that neutrinos interact hardly at all with matter makes them hard to detect and makes their properties hard to discern, this same elusiveness makes them attractive candidates for measuring speed. This is because once produced they ignore everything in their path and travel in a straight line at constant speed, so that measuring the distance traveled and the time taken does give you the speed. Even light is not as good for this purpose because both its speed and its trajectory are affected by the matter it passes through, as we all experience when we see how distorted things look through glass prisms or bowls of water. Even slight changes in the density of the atmosphere can affect the path of light, which is why we see mirages. So if you use light, the path taken in going from one point to another may not correspond to the straight geometric line connecting the two points that can be calculated once we know their coordinates, and so calculating the distance traveled by the light is not simple. But in the case of neutrinos, the path taken is dead straight and thus the geometric straight-line distance between two points will be the actual distance traveled by the neutrinos.
Another advantage is that the speed of neutrinos, unlike that of light, is unaffected by the medium they travel through. When light passes through glass or water, its speed is reduced, which is the cause of the distortions we observe. As another example, take the light coming from the Sun. This light is produced as a result of nuclear reactions that produce both photons (particles of light) and neutrinos, among other things. But because the Sun is such a dense gas, it slows down light considerably, and the photons produced at the core of the Sun can take as much as tens of thousands of years merely to reach the surface of the Sun, a distance of roughly 700,000 kilometers. Once there, the light can travel freely in the vacuum of space to cover the remaining 150 million kilometers to the Earth (over 200 times the radius of the Sun) in just over eight minutes. Neutrinos that are also produced in the core, however, travel almost as fast within the Sun as they do in the vacuum of space because matter is almost invisible to them. So if a neutrino and a light photon are produced in the same reaction in the core of the Sun, the neutrino will reach us long before the photon does.
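The eight-minute figure and the "over 200 times" comparison follow from simple division; a minimal check:

```python
# The Sun-to-Earth figures in the paragraph above, checked directly.
c_km_s = 299_792.458       # speed of light, km/s
sun_earth_km = 150e6       # Sun-Earth distance, km
sun_radius_km = 700_000    # solar radius figure used in the text, km

minutes = sun_earth_km / c_km_s / 60
radii = sun_earth_km / sun_radius_km
print(f"vacuum travel time: {minutes:.1f} minutes")   # just over eight minutes
print(f"distance in solar radii: {radii:.0f}")        # over 200
```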
Suppose the CERN-Gran Sasso experimental result holds up and the neutrinos are in fact traveling faster than the speed of light. Does this mean that Einstein's theory of relativity is completely overthrown? No. Einstein's theory does not rule out particles traveling faster than the speed of light. Such particles, known as tachyons, have always been allowed by the theory, but their existence has never been confirmed. There have, however, been various false alarms in the past, which is part of the reason for the skepticism about the present claim.
What Einstein's theory says is that if a particle has zero mass, it travels at exactly the speed of light, but if it has non-zero mass, its speed can approach the speed of light but cannot attain it. Particles can approach the speed of light 'from below' (the normal particles we have experience with, which always have speeds less than that of light) or 'from above' (tachyons, which always have speeds greater than that of light and have never been definitively shown to exist), but neither kind can cross the speed-of-light barrier to the other side. So the existence of faster-than-light particles would not overturn Einstein's theory of relativity completely, since the theory always allowed for them, but it would still be a momentous discovery because it would be a completely new phenomenon.
So does this mean that the existence of tachyons can be easily absorbed into existing knowledge? Not quite. The problem is what their existence does to something known as 'Einstein causality', which is connected to the theory of relativity but is in addition to it. It says that if two events are causally connected (i.e., one event causes the other), then the cause must precede the effect. Going back to the commonly used bloodthirsty example, if person A fires a gun and the bullet enters person B, Einstein causality says that the firing of the gun by A must occur before the bullet enters person B, because one caused the other. This seems eminently reasonable, but we have to bear in mind that it is an assumption based on experience and, like all such assumptions, is subject to empirical scrutiny. If faster-than-light particles exist, the theory of relativity says that Einstein causality can be violated, i.e., effects can precede causes. It is this possibility, sometimes referred to as 'going backwards in time', that boggles the mind.
So how does the existence of tachyons violate Einstein causality? In the first post in this series, I gave an example where there seemed to be a situation of going backwards in time, but said that this was not really so: it was an illusion that arose because we depend on when the light from an event reaches the observer.
To better understand what constitutes violations of Einstein causality, we have to get into the subtleties of what we mean by measuring distance and time, and this lies at the heart of the theory of special relativity. What Einstein did was make our understanding of how to measure distances and time more precise and operational, and in doing so altered our fundamental understanding of those two seemingly mundane concepts.
Next: Measuring time and space
October 11, 2011
Relativity-2: The CERN-Gran Sasso experiment
(For previous posts in this series, see here.)
The nice feature of the experiments behind the recent reports of faster-than-light neutrinos is that the basic ideas are so simple that anyone can understand them. Neutrinos were produced at the CERN laboratory in Switzerland and detected at the Gran Sasso laboratory in Italy. By measuring the distance between the two locations and the time taken for the trip, one can calculate the speed of the neutrinos by dividing the distance by the time.
The measured distance was about 730 km so if we take that as the exact value, and if the neutrinos were traveling at exactly the speed of light (299,792 km/s), the time taken would be 2.435022 milliseconds (where a millisecond is one-thousandth of a second) or equivalently 2,435,022 nanoseconds (where a nanosecond is one-billionth of a second). What the experimenters found was that the actual time taken was 60 nanoseconds less than this time, which seemed to require the neutrinos to be traveling slightly faster than the speed of light. Since the existence of faster than light particles has never been confirmed before, this would be a major discovery and so the search is now underway to see if this conclusion holds up under close scrutiny.
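The arithmetic above is easy to reproduce. A minimal sketch, using the quoted baseline of 730 km and the 60-nanosecond early arrival (these are the rounded public figures, not the experiment's exact calibrated values):

```python
# Light-speed travel time over the measured 730 km baseline, and the
# speed implied by arriving 60 ns early (rounded values from the text).
c = 299_792_458        # speed of light in m/s
distance = 730_000.0   # approximate CERN-Gran Sasso baseline in meters

t_light = distance / c           # time at exactly the speed of light
t_neutrino = t_light - 60e-9     # neutrinos arrived 60 ns earlier

print(f"light-speed time: {t_light * 1e9:,.0f} ns")   # ~2,435,022 ns
print(f"implied speed   : {distance / t_neutrino / c:.6f} c")
```

The implied speed comes out only about 25 parts per million above c, which is why such exquisite timing and distance accuracy is needed.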
If the experimental results are at fault and the effect is spurious, the error must arise from the distance measurement and/or the time measurement. Although the time difference that produced the effect is very small (60 nanoseconds out of a total travel time of over 2 million nanoseconds is only about 0.0025% of the total), the experimenters say their time measurements are accurate to within 10 nanoseconds, much less than the 60 nanoseconds needed to explain away the discrepancy, ruling that out as the source of error. Similarly, if the actual distance were less than the measured distance by just 18 meters, the effect would again go away. The experimenters used GPS technology to measure the space and time coordinates of the events and say they can measure distances to an accuracy of 0.2 meters, making that too an unlikely source of error. As for the possibility of random statistical fluctuations causing the effect, the number of neutrino measurements taken over the past two years exceeds 16,000, making that highly unlikely as the source of error as well.
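The 18-meter and 0.0025% figures follow from the same numbers. A quick check:

```python
# Error-budget checks from the text: how far light travels in 60 ns,
# and what fraction of the total flight time 60 ns represents.
c = 299_792_458          # speed of light in m/s
delta_t = 60e-9          # the 60 ns discrepancy, in seconds
total_t = 2_435_022e-9   # total light-speed flight time, in seconds

print(f"distance equivalent of 60 ns: {c * delta_t:.1f} m")   # ~18 m
print(f"fraction of flight time     : {delta_t / total_t:.4%}")
```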
So why is there still skepticism? It is because the very feature of neutrinos that makes this experiment so conceptually simple is also what makes it so difficult to rule out what are called systematic errors. These are artifacts of the experimental setup that can bias the results consistently in one particular direction, unlike random errors, which can go either way and can be reduced by repeating the experiment a large number of times, as was done in this case. Unearthing systematic errors is difficult and time-consuming because it depends on the esoteric details of the experimental setup. What some groups will now try to do is identify possible sources of systematic error that the original experimenters did not consider, while others will repeat the experiment with different setups, measuring the time and distance using different techniques so that the likelihood of systematic biases pushing the results in the same direction is reduced. Yet other groups will examine whether any of the side effects that would necessarily accompany faster-than-light travel are also seen. It is this kind of investigation into replicability and consistency that characterizes science.
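The distinction between random and systematic error can be illustrated with a toy simulation. This is purely an illustration, not the experiment's actual error model: the noise level and the constant bias below are made-up numbers chosen only to show that averaging many trials shrinks random scatter but leaves a systematic offset untouched.

```python
import random
import statistics

random.seed(0)
true_time = 2_435_022     # ns, the light-speed flight time
random_noise_sd = 10      # ns, hypothetical random error per measurement
systematic_offset = -60   # ns, a hypothetical constant bias

# Averaging 16,000 trials beats down the random scatter,
# but the constant bias survives in full.
trials = [true_time + systematic_offset + random.gauss(0, random_noise_sd)
          for _ in range(16_000)]
mean = statistics.mean(trials)
print(f"mean deviation from true time: {mean - true_time:.1f} ns")  # ~ -60 ns
```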
But getting back to the original experiment, the reason that neutrinos are good for measuring velocities that may exceed the speed of light is that they usually travel at speeds close to or at the speed of light. If a particle has zero mass (as is the case with 'photons', the name given to particles of light), then according to Einstein's theory of relativity, it must travel exactly at the speed of light. If it has a mass, however small, it can approach the speed of light but never attain it because to do so would require an infinite amount of energy. But it takes less energy to accelerate lighter particles to high speeds than it does heavier particles.
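The "infinite energy" point above can be made quantitative with the relativistic factor γ = 1/√(1 − v²/c²): the kinetic energy of a massive particle is (γ − 1)mc², which grows without bound as v approaches c. A minimal illustration (not part of the original experiment):

```python
import math

def kinetic_in_rest_units(v_over_c):
    """Relativistic kinetic energy (gamma - 1) * m * c^2,
    expressed in units of the particle's rest energy m * c^2."""
    gamma = 1.0 / math.sqrt(1.0 - v_over_c**2)
    return gamma - 1.0

# The energy cost climbs steeply as the speed nears c.
for beta in (0.9, 0.99, 0.999, 0.999999):
    print(f"v = {beta} c -> KE = {kinetic_in_rest_units(beta):.1f} x mc^2")
```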
In the case of neutrinos, we have not yet been able to directly detect any non-zero mass. All we have been able to do so far is put a small upper limit on the mass it can have, about 2 eV/c², which is roughly 3.6×10⁻³⁶ kg. (By comparison, the particle with the smallest known mass, the electron, has a relatively huge mass of 511,000 eV/c².) It had long been assumed that the mass of the neutrino was exactly zero. But it turns out that there are three kinds of neutrinos and that they may oscillate from one kind to another as they travel through space, and the postulated mechanism for such oscillations requires that they have non-zero mass. The purpose of the CERN-Gran Sasso experiment was actually to look for such oscillations, and it just so happened to turn up evidence that neutrinos may be traveling faster than light, completely shifting the focus of attention. Such accidental discoveries while looking for something else are not uncommon in science, the discovery of X-rays being one of the more famous examples.
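The unit conversion is worth checking: since m = E/c², one eV/c² corresponds to about 1.78×10⁻³⁶ kg. A quick sketch using standard constants:

```python
# Converting a neutrino mass limit of 2 eV/c^2 into kilograms via m = E / c^2.
eV_in_joules = 1.602176634e-19   # 1 eV in joules (exact, by SI definition)
c = 299_792_458                  # speed of light in m/s

kg_per_eV_c2 = eV_in_joules / c**2
print(f"2 eV/c^2 = {2 * kg_per_eV_c2:.2e} kg")  # ~3.6e-36 kg
```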
Next: The elusive neutrino
October 10, 2011
Relativity-1: Going backwards in time
Part of the reason that recent reports of the detection of neutrinos traveling faster than the speed of light aroused such excitement is because of claims that such a discovery would overthrow Einstein's venerable theory of relativity and that if you could send a signal faster than the speed of light, you could go backwards in time. Are these claims true or simply overheated? If true, what exactly was overthrown? And what does it mean to 'go backwards in time' anyway?
My initial reaction to the faster-than-light neutrino report was one of skepticism, saying that I would wait and see if the result held up but was not hopeful that it would. I did not give my reasons for this pessimism and, reflecting later, thought I should, because understanding what was claimed (and why) serves as a good vehicle for understanding the elements of the theory of special relativity as well as how science works. So the next series of posts will deal with these questions. (I was overdue for a series of posts on a single topic anyway.)
Let's look first at the 'backwards in time' claim. There is a simple (but wrong) way of interpreting this and a more subtle (but correct) way.
To see the simple way in which something traveling faster than the speed of light can cause things to appear to go backwards in time, think of a situation in which a man fires a gun at another man, but with the bullet traveling faster than the speed of light. Nothing requires the shooting of people to understand this phenomenon, but this is the customary example, perhaps because a bullet is the fastest object most people can think of (though still much slower than the speed of light), combined with the fact that shooting someone is so dramatic and final that reversing the process seems impossible, kind of like Jesus rising from the dead.
Suppose the shooter is at point A and the person hit is at point B 10 meters away. Suppose you are standing right next to the person at B. If the bullet travels faster than the speed of light, what will you see? Remember that we 'see' something only when the light from that event enters our eyes. Since the speed of light (at 299,792 km/s) is beyond anything we are familiar with from our everyday experiences, let's greatly slow things down by assuming that it travels at (say) 1 m/s and that the bullet travels at (say) 2 m/s.
You will see the gun at A firing 10 seconds after it fires because the light from that instant will take that much time to travel the 10 meters to reach you. But one second after the gun is fired, the bullet will have traveled two meters towards B (and you), and light emitted by the bullet at that point will take only 8 more seconds to reach you. In other words, you will see the bullet at the 2 meter point 9 seconds after the gun is fired, which is one second before you see the gun firing. Similarly you will see the bullet at the 4 meter mark 8 seconds after the gun fires, at the 6 meter mark 7 seconds after the gun fires, at the 8 meter mark 6 seconds after the gun fires, and the bullet entering the person at B 5 seconds after the gun fires. Put it all together and what you see first is the person at B being hit (five seconds after the gun fires) and then in the next five seconds will see the bullet emerging from the victim and traveling back and entering the gun.
This no doubt looks like going backwards in time. But this example is not what is meant by going backwards in time according to the theory of relativity. After all, the victim was in fact hit five seconds after the gun was fired, so there is no actual reversal of the ordering of the events. What you saw is more like watching a film run backwards, which is not really going backwards in time. The effect is an illusion, an artifact of the fact that light takes time to travel, combined with your special location next to the victim. Had you observed the whole sequence of events while standing next to the shooter at A, you would not have noticed anything unusual: you would have seen the gun fire right at the beginning, the bullet at the 2 meter mark after 3 seconds, at the 4 meter mark after 6 seconds, at the 6 meter mark after 9 seconds, at the 8 meter mark after 12 seconds, and hitting the person at B after 15 seconds. Everything would have seemed normal.
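The observation times in this example can be tabulated directly: for each bullet position, the moment you see it is the emission time plus the light-travel time back to your location. A sketch with the toy speeds used above (light at 1 m/s, bullet at 2 m/s, A and B 10 meters apart):

```python
# Toy speeds from the example: light 1 m/s, bullet 2 m/s, A to B is 10 m.
c_toy, v_bullet, L = 1.0, 2.0, 10.0

print("position  emitted  seen at B  seen at A")
for x in (0, 2, 4, 6, 8, 10):            # bullet position, meters from A
    t_emit = x / v_bullet                # when the bullet is actually there
    seen_B = t_emit + (L - x) / c_toy    # light crosses the remaining gap to B
    seen_A = t_emit + x / c_toy          # light travels back to A
    print(f"{x:>5} m  {t_emit:>6.0f} s  {seen_B:>8.0f} s  {seen_A:>8.0f} s")
```

Reading down the "seen at B" column reproduces the reversed order (10, 9, 8, 7, 6, 5 seconds), while the "seen at A" column shows the normal order (0, 3, 6, 9, 12, 15 seconds).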
What this example does illustrate is that specifying the time at which an event occurs by the time noted by an observer is not satisfactory because it depends on where the observer is situated relative to the events. (For example, the bullet was observed at the 2 meter mark at 3 or 9 seconds after the gun was fired depending on where you were standing.) We will also see later that in addition to the location, the state of motion of the observer (if you were observing the events from a moving train, for example) also affects the time at which they see events.
It is in trying to unambiguously pin down exactly when something happens that we arrive at a deeper understanding of Einstein's theory of relativity and what we really mean by going backwards in time.
Next: The CERN-Gran Sasso experiment