Honours for optics

Published : Dec 02, 2005 00:00 IST

Work on understanding and using optical coherence and contributions to laser-based precision spectroscopy enabled three scientists to share the Nobel Prize for Physics.

R. RAMACHANDRAN in New Delhi

QUANTUM theory tells us that all objects at the microscopic scale are endowed with a dual character - they have both wave-like and particle-like behaviour. This aspect is most clearly demonstrated in the case of light. Light or electromagnetic radiation is the result of the combined oscillations of the electric and magnetic fields in space and time. Its wave-like character - with well-defined properties of frequency, amplitude and phase - forms the basis of all classical optics, electrical engineering and radio and microwave applications and is accurately described by Maxwell's theory of electromagnetism, formulated in the late 19th century.

Its detection, on the other hand, is through the absorption of radiation energy in some material medium. This absorption has been known to occur in packets since the work of Max Planck in 1900. In order to explain the spectral distribution of the so-called black-body radiation (for which Maxwell's theory proved insufficient), Planck had to assume that the exchange of energy between matter and radiation must take place in discrete amounts or quanta. With Einstein's Nobel Prize-winning work in 1905 on the photoelectric effect, the physics underlying Planck's assumption became clear - the radiation itself is `grainy', made up of particles (now called photons) that carry lumps or quanta of energy. Einstein's work laid the foundation for quantum mechanics, resulting in a successful theory of the `quantised' electromagnetic field (known as quantum electrodynamics), which, in the classical limit, reduces to Maxwell's theory.

More importantly, Einstein showed that one photon transferred all its energy to one and only one electron. This electron, the photoelectron, gets emitted from the surface and is detected. That is, photons are detected only indirectly through the detection of photoelectrons produced by a quantum process.
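
In modern notation, the energy bookkeeping behind this statement can be summarised in two lines (a standard textbook formulation, not taken from the article itself):

```latex
E_{\text{photon}} = h\nu , \qquad E_{\text{kin,max}} = h\nu - \phi ,
```

where h is Planck's constant, ν the frequency of the light and φ the `work function' of the surface. If hν is smaller than φ, no photoelectrons are emitted at all, however intense the light - exactly the behaviour a purely wave picture could not explain.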

However, it was generally believed that there was no conflict between the classical and quantum descriptions in typical optical observations at the macro scale, where `photocurrents' generated by photoelectrons are measured. But in 1954-56 the astronomers R. Hanbury Brown and R.Q. Twiss found that their interferometric method of detecting starlight showed a curious feature - the intensities of the photocurrents in the two separated detectors they used were correlated; the photons tended to arrive in bunched pairs instead of being totally random. Although Hanbury Brown and Twiss (and others) rightly attributed this to the quantum nature of light, a correct explanation of the observed effect had to wait until the work of Roy J. Glauber of Harvard University in 1963.

Glauber is the winner of one-half of this year's Nobel Prize in Physics "for his contribution to the quantum theory of optical coherence".

Coherence in optics is a measure of the ability of waves to interfere with one another constructively or destructively (to give a visible interference pattern of dark and bright fringes). Two monochromatic waves (of the same frequency) are coherent if they have a constant relative `phase' between them. The `phase' of a wave refers to the position of a feature, typically a trough or a peak, in the waveform. Two waves have the same phase if their peaks and troughs are matched, in which case they combine constructively. If they have a definite but non-zero relative phase, they interfere constructively or destructively to a degree that depends on that relative phase at the meeting point. Waves that are incoherent, on the other hand, produce rapidly shifting regions of constructive and destructive interference and, therefore, do not produce a visible interference pattern (Figure 1).
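
A rough numerical illustration of this (a minimal sketch, not from the article; the 500 THz frequency and the choice of phases are arbitrary) is to add two waves of the same frequency and compare a fixed relative phase with a randomly drifting one:

```python
import numpy as np

t = np.linspace(0.0, 1.0e-14, 2000)   # a few optical cycles (seconds)
f = 5.0e14                            # ~500 THz, a visible-light frequency (illustrative)

def mean_intensity(phase_difference):
    """Time-averaged intensity of the sum of two unit-amplitude waves."""
    w1 = np.cos(2 * np.pi * f * t)
    w2 = np.cos(2 * np.pi * f * t + phase_difference)
    return np.mean((w1 + w2) ** 2)

print(mean_intensity(0.0))      # in phase: constructive, ~2.0
print(mean_intensity(np.pi))    # opposite phase: destructive, 0.0

# Incoherent case: the relative phase drifts randomly, so bright and dark
# regions wash out to the featureless average (~1.0) and no fringes are seen.
random_phases = np.random.uniform(0.0, 2.0 * np.pi, 500)
print(np.mean([mean_intensity(p) for p in random_phases]))
```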

Light from a thermal source such as a light bulb is a combination of random emissions from different points along the filament and contains a mixture of frequencies and phases. It is, therefore, incoherent. Light in a laser beam, on the other hand, is emitted as a result of a tightly controlled and ordered quantum mechanical atomic process. It thus has a definite frequency and phase and is, therefore, coherent. The invention of the laser in 1960, shortly before Glauber's seminal paper of 1963, had already brought quantum considerations into the realm of optics.

However, quantum effects were still seen as small fluctuations around the classical description of optical processes, which could be handled by semi-classical analysis. A quantum-theoretic description of optical fields, which involve large numbers of photons, was still lacking.

In an interview to the Nobel web site shortly after the award, Glauber said: "[In the early 1960s] it was very well understood that light has a granular structure, even though nearly everything one observed was explained by continuous waves. But there were various things about this granular structure, which were not taken fully seriously, because it didn't appear that they were necessary. In the context of the older optics, which dealt only with the intensity of light, the average intensity, and not with the statistical properties of light, you could get away with using the older form of the theory; and so people were rather lazy about it. I had the impression in the early 1960s that a couple of developments that had taken place were beginning to call for a much more vigorous version of the quantum theory, the full quantum theory that goes by the name... quantum electrodynamics."

Glauber introduced the concept of a coherent state, an appropriately defined quantum state of a statistical ensemble of photons. An essential feature of the quantum description of optical processes in terms of photons is that, when an optical observation is carried out and a photon is absorbed, the state of the field changes, so the initial state for the subsequent observation is not the same as before. Glauber's construction of the coherent state incorporated this aspect. A coherent state is a superposition of quantum states with definite numbers of photons (from zero to infinity) and is as close to a classical state as possible without violating basic quantum mechanical principles.
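
In the standard notation of quantum optics, the state Glauber introduced can be written down explicitly (a textbook form, reproduced here for concreteness rather than taken from the article):

```latex
|\alpha\rangle \;=\; e^{-|\alpha|^2/2} \sum_{n=0}^{\infty} \frac{\alpha^{n}}{\sqrt{n!}}\, |n\rangle ,
\qquad
P(n) \;=\; |\langle n|\alpha\rangle|^{2} \;=\; e^{-|\alpha|^{2}} \frac{|\alpha|^{2n}}{n!} ,
```

where |n⟩ is the state with exactly n photons. The number of photons detected from such a state follows a Poisson distribution with mean |α|², while the complex number α plays the role of the classical amplitude and phase.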

All states of light can be described in terms of mixtures of these quantum coherent states, weighted by appropriate `distribution functions'. This is equivalent to describing any state in terms of photon statistics with different statistical distributions. An incoherent state is thus a mixture of coherent states. The statistics of photons detected from a coherent state, such as a laser beam, is therefore distinct from the statistics of photons from an incoherent state, such as that from a thermal source. Glauber was thus able to explain the correlation observed in the Hanbury Brown-Twiss (HBT) experiment as a direct consequence of the photon statistics of an incoherent beam. A coherent laser beam will, in fact, not show HBT-like correlations at all because of the underlying photon absorption statistics.
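
A small simulation makes this difference in photon statistics concrete (an illustrative sketch, not a model of the HBT apparatus; the mean of two photons per counting window is an arbitrary choice). Coherent light gives Poissonian counts, while single-mode thermal light gives Bose-Einstein (geometric) counts whose larger spread appears as the `bunching' correlation:

```python
import numpy as np

rng = np.random.default_rng(0)
mean_n = 2.0          # average photons per counting window (arbitrary)
windows = 200_000     # number of counting windows simulated

# Coherent (laser-like) light: photon counts are Poisson distributed.
coherent = rng.poisson(mean_n, windows)

# Single-mode thermal (lamp-like) light: Bose-Einstein (geometric) counts
# with the same mean.  numpy's geometric starts at 1, so shift it to start at 0.
p = 1.0 / (1.0 + mean_n)
thermal = rng.geometric(p, windows) - 1

def g2(counts):
    """Normalised intensity correlation g2(0) estimated from photon counts."""
    n = counts.astype(float)
    return np.mean(n * (n - 1)) / np.mean(n) ** 2

print(g2(coherent))   # ~1.0 : no excess correlation, no HBT-style bunching
print(g2(thermal))    # ~2.0 : photons arrive `bunched', as Hanbury Brown and Twiss saw
```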

Glauber's coherent state description thus marked the beginning of the new field of quantum optics. The concept of coherent states in quantum theory itself was not new. They occur in the quantum physics of harmonic oscillators. What was new was their use in describing optical fields. They were ideal because, like classical fields, a coherent quantum state can be ascribed attributes of amplitude and phase.

WITHIN a couple of months of Glauber's work, E.C.G. Sudarshan, the well-known Indian physicist then at the University of Rochester, was able to show that, in most classical measurements on light (like photon detection), the quantum description using the coherent state representation was formally equivalent to the classical description using probability distribution functions. In fact, he showed that the equivalence had general validity to include purely quantum situations as well. Sudarshan called it the Optical Equivalence Theorem. It forms the basis for a convenient description of optical measurements on quantum states in terms of what is now called P-Representation or Glauber-Sudarshan Representation. Together with J.R. Klauder, Sudarshan developed the detailed mathematical formalism of quantum optics.

This framework led to much of the subsequent work on quantum theory of lasers and photon correlation experiments. Many scientists feel that Sudarshan equally deserved the award.

"However, being an exact quantum description, the coherent state representation is also applicable to very low intensity levels when few photons are involved and the granularity of light influences observations," notes the Nobel citation. This is what Glauber showed in detail in his subsequent series of papers - what Sudarshan had noted earlier - that in such cases distribution functions in the coherent state admixture would include functions beyond classical probability distributions that would have no classical analogues.

"At the same time," adds the citation, "the formalism provides a tool to extract the classical limit which governs the applications of optical signals to communications and high-precision measurements. The classical description emerges, but the fundamental quantum fluctuations are still present [as quantum noise], setting limit to what accuracy is attainable in principle [in optical devices]."

Today, quantum optics has developed into an extremely active area of research and technological innovation that invariably brings the quantum nature of light signals into play.

For instance, it is now possible to create what are known as `squeezed states' where the quantum uncertainty in one of the variables - position or momentum - is less than that in coherent states.

This allows a reduction in effects due to `quantum noise'. In the limit of low intensity or few photons, quantum optics can enable secure quantum communications, can be applied to quantum computing and can be used to record ultra-weak signals in high-precision experiments.
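
In terms of the two field `quadratures' X and P - the optical analogues of position and momentum - the idea can be stated compactly (standard textbook relations, in units with ħ = 1; the notation is not from the article):

```latex
\Delta X\,\Delta P \;\ge\; \tfrac{1}{2}, \qquad
\text{coherent state: } \Delta X = \Delta P = \tfrac{1}{\sqrt{2}}, \qquad
\text{squeezed state: } \Delta X = \tfrac{e^{-r}}{\sqrt{2}},\; \Delta P = \tfrac{e^{+r}}{\sqrt{2}} .
```

The uncertainty product never falls below the quantum limit, but it can be redistributed so that the quadrature actually being measured is quieter than it would be in a coherent state.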

THE other half of the Nobel Prize in Physics for 2005 has been equally shared between John L. Hall, of JILA, a joint research centre of the University of Colorado and the U.S. National Institute of Standards and Technology, and Theodore W. Hansch, of the Max-Planck-Institut für Quantenoptik, Munich - "for their contribution to the development of laser-based precision spectroscopy, including the optical frequency comb technique".

With continuous improvement in spectral resolution, atomic spectroscopy, by which the energy levels of atoms are measured, has provided a deeper understanding of the fine structure of atoms and the properties of the atomic nucleus. At very high levels of precision, questions can be asked about the optical transition frequencies, as atoms go from one energy level to another - for instance, whether they remain constant in time, which bears on the constancy of the fundamental constants of nature themselves. Such ideas are linked to the possible asymmetries between matter and anti-matter, an unanswered cosmological question in a matter-dominated world. By determining optical transition frequencies accurately, better atomic clocks can be designed. This, in turn, will allow better Global Positioning Systems (GPS), better space navigation and improved control of astronomical telescope arrays. The technique of the Nobel Prize-winning work opens up these various possibilities.

The definitions of the units of length and time have undergone continuous developments. In 1960, the metre was redefined as a certain number of wavelengths of a spectral line of Krypton-86. Likewise, a second was redefined in 1967 as the duration of 9,192,631,770 cycles of the (9.2 GHz) microwave signal absorbed or emitted by the transition between two hyperfine energy levels of Caesium-133 (Cs-133) atoms.

The velocity of light is just the product of the frequency and wavelength of an electromagnetic signal. With improved measurement methods, the velocity of light could be determined by accurately measuring the frequency and wavelength of a stable radiation source. In 1972, in an experiment in which Hall was involved, the velocity of light in vacuum was measured with unprecedented accuracy, and in 1983 its value was fixed by definition at 299,792,458 m/s. By linking this to the definition of a second, an improved definition of the metre was adopted: the distance travelled by light in 1/299,792,458 of a second.

The question is, how accurately can one determine a second? Direct counting of the waves in the 9.2 GHz signal gives an accuracy of one part in 10 billion (10^10). Improved peak positioning gives an accuracy of one part in a trillion (10^12), because the position of a peak can be determined to about a hundredth of an oscillation. Interferometric techniques and signal averaging raised the accuracy to one part in 10^15, which corresponds to a drift of one second in 30 million years.
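
The `one second in 30 million years' figure is simply the fractional accuracy multiplied by the elapsed time, as a quick back-of-the-envelope check shows (the 30-million-year figure is the one quoted above; everything else is elementary arithmetic):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600        # about 3.16e7 seconds

def drift_seconds(fractional_accuracy, years):
    """Timing error accumulated by a clock with the given fractional accuracy."""
    return fractional_accuracy * years * SECONDS_PER_YEAR

print(drift_seconds(1e-10, 1))       # direct counting: a few milliseconds lost per year
print(drift_seconds(1e-15, 30e6))    # one part in 10^15 over 30 million years: ~0.95 s
```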

For increased accuracy, one needs an oscillator faster than the GHz frequencies of the microwave region. Lasers at optical frequencies, which are around 10^15 Hz, are the obvious candidates, but to measure frequencies in the optical range, counting speeds would have to improve by a factor of about 100,000 over those of the Cs-133 clocks. No electronics can handle this speed. A long chain of highly stabilised lasers (to within 10 mHz) and microwave sources in large specialised laboratories had to be used to overcome this problem, which in principle would improve the accuracy to one part in 10^17. But the problem of counting the ticks of such a clock still remained, and the practical utilisation of the new definition of the metre remained problematic. There was an urgent need for a simpler method to measure optical frequencies accurately.

Enter the `optical frequency comb' (OFC) technique, the new frequency measuring stick. This important development in high-precision metrology solved the mounting problem in an ingenious way. An OFC is the spectrum of a light source that consists of equidistant lines in frequency space. The comb frequencies are like the equally spaced teeth of a comb or the marks on a ruler. If the comb frequencies are known, the comb can be used to measure unknown frequencies by measuring beat notes, which give the differences between the comb frequencies and the frequencies being measured. For such measurements over a wide frequency range, a large overall bandwidth of the frequency comb is needed.

The underlying concept is that if a large number of coherent oscillations of somewhat different frequencies are combined, the result is a train of extremely short pulses, caused by interference (Figure 2). This happens if the different oscillations (modes) in a cavity are locked to each other in what is called mode-locking. The more oscillations that can be locked, the shorter the pulses. A 5 femtosecond (fs, 10^-15 seconds) pulse corresponds to about a million locked frequencies. Nowadays this is achievable in special laser media, such as dyes or titanium-doped sapphire crystals, in devices called mode-locked lasers. Such lasers emit a train of short pulses, but since they also contain a set of sharp frequencies, Hansch and V.P. Chebotayev in Novosibirsk realised in the late 1970s that they could, in principle, serve as rulers for measuring optical frequencies. Chebotayev, however, died in 1992.
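
The scaling behind the `million frequencies' figure is roughly that the total locked bandwidth is the number of modes times the pulse repetition rate, and the pulse cannot be much shorter than one over that bandwidth. A quick order-of-magnitude check (the 100 MHz repetition rate is an assumed typical value, not a figure from the article):

```python
# Rough scaling: locked bandwidth ~ N_modes * f_rep, pulse length ~ 1 / bandwidth.
f_rep = 1.0e8        # assumed pulse repetition rate: 100 MHz (typical, not from the article)
n_modes = 1.0e6      # "about a million frequencies" locked together

bandwidth = n_modes * f_rep        # ~1e14 Hz, i.e. 100 THz
pulse_length = 1.0 / bandwidth     # ~1e-14 s, i.e. 10 fs

print(f"bandwidth ~ {bandwidth:.1e} Hz, pulse length ~ {pulse_length:.1e} s")
# Of the same order as the 5 fs figure quoted above; the exact number depends
# on the pulse shape and the actual repetition rate.
```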

The real breakthrough came in 1999, when Hansch realised that lasers with extremely short pulses could be used to measure optical frequencies directly with respect to the Cs-133 clock. The underlying principle is the following. Roughly speaking, in a mode-locked laser the frequencies of the laser modes are given by f_n = n x f_rep, where f_rep is the pulse repetition frequency of the cavity and n is the mode number. The repetition frequency lies in the radio-frequency range and can be measured and locked accurately against radio-frequency standards, in particular the Cs clock frequency, while the mode number n can be 100,000 or more. The technique thus allows an accurate determination of the mode frequencies and a direct comparison with unknown sharp optical-frequency oscillators, because OFCs can cover the entire visible range and the beat frequencies themselves lie in the easily manageable radio-frequency range.
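
A schematic numerical example of how such a measurement works (all numbers are invented for illustration; a real comb also involves the offset frequency discussed in the next paragraph):

```python
# Schematic comb measurement: an optical frequency is recovered from the mode
# number, the repetition rate (referenced to the Cs clock) and a counted beat note.
f_rep = 1.0e9                  # repetition rate, assumed 1 GHz for illustration
n = 500_000                    # index of the comb line nearest the unknown laser
f_comb_line = n * f_rep        # 5e14 Hz, i.e. a visible-light frequency

f_laser_true = f_comb_line + 3.2e8          # a pretend "unknown" laser frequency
f_beat = abs(f_laser_true - f_comb_line)    # 320 MHz: an easily counted radio frequency

# The measurement: known comb line (from n and f_rep) plus the counted beat note.
f_laser_measured = f_comb_line + f_beat
print(f_laser_measured == f_laser_true)     # True in this idealised, offset-free sketch
```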

Hansch and his colleagues demonstrated with extreme precision that the comb frequencies really are evenly spaced. However, one problem remained - the origin of the frequency comb is not defined, because the whole set of comb lines is displaced from zero frequency by an unknown amount, which makes the absolute determination of a source frequency difficult. It was Hall and his collaborators who demonstrated a practical solution to the problem around the year 2000. They showed that if the bandwidth of the comb can be made broad enough to cover an entire octave (that is, the highest frequency is double the lowest), this displacement can be determined precisely.
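
Written out in the notation now standard for the `carrier-envelope offset' (the symbols are not used in the article itself), each comb line actually sits at f_n = n f_rep + f_0, where f_0 is the unknown displacement. With an octave of bandwidth, frequency-doubling a line from the low end of the comb and beating it against a line at the high end isolates f_0 directly:

```latex
f_n = n\, f_{\mathrm{rep}} + f_0 , \qquad
2 f_n - f_{2n} \;=\; 2\,(n f_{\mathrm{rep}} + f_0) - (2n\, f_{\mathrm{rep}} + f_0) \;=\; f_0 ,
```

and the offset comes out as a radio-frequency beat note that can be counted and locked, fixing the absolute position of every line of the comb.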

It is possible to create pulses with such a broad frequency range in so-called photonic-crystal fibres (Frontline, Interview with Eli Yablonovitch, April 11, 2003), in which the material is partially replaced by air-filled channels. In these fibres, a broad spectrum of frequencies (40 terahertz bandwidth) can be generated by the light itself.

Hansch and Hall and their colleagues have subsequently refined these techniques into a simple instrument that is already in widespread use and is commercially available. Very recently, the frequency comb technique has been extended to the extreme ultraviolet range. This means that extreme precision can be attained at very high frequencies, thereby leading to the possibility of developing even more accurate clocks at X-ray frequencies.

It now seems possible to make frequency measurements in the future to an accuracy of one part in 10^18. This should lead to the realisation of a new optical standard clock. The precision will make better satellite-based navigation using GPS possible. Such precision will also be useful in long space journeys and for space-based telescope arrays for gravitational wave detection and for testing the Theory of Relativity to higher accuracy. It may also lead to new applications in telecommunications.
