Masters of light

Print edition : November 20, 2009

Charles K. Kao, who won half the prize for his research on optical fibres.-AFP

OPTICAL fibres, thin threads of glass in which light flows, are the backbone of today's worldwide communication networks. These low-loss glass fibres facilitate global broadband communication such as the Internet, which carries huge volumes of data, text, voice, images and video. This year's Nobel Prize has been awarded for two scientific achievements that helped lay the foundations for the current information technology revolution and modern networked societies.

One half of the prize of 10 million Swedish kronor has gone to 76-year-old Charles K. Kao of the Standard Telecommunication Laboratories (STL), Harlow, United Kingdom, and the Chinese University of Hong Kong for groundbreaking achievements concerning the transmission of light in fibres for optical communication. If we were to unravel all the fibres that go around the globe to build this intensely networked world, we would get a single thread over one billion kilometres long, enough to encircle the earth more than 25,000 times. And this network is growing by thousands of kilometres every day. A large fraction of this communication traffic is in the form of digitised images that are transmitted across the globe at the click of a computer mouse.

The other half of the prize has gone to 85-year-old Willard S. Boyle and 79-year-old George E. Smith of Bell Laboratories, New Jersey, United States, for the invention of an imaging semiconductor circuit, the Charge-Coupled Device (CCD) sensor. The CCD is a digital camera's electronic eye, and it has made the photographic film obsolete. The digital form enables easy processing, transmission through optical networks and distribution of images. Digital photography has become an indispensable tool in many fields of research, particularly astronomy and space science, enabling imaging of the previously unseen with unbelievable clarity and resolution. CCD technology is also used in medical imaging, both for diagnostics and for microsurgery.

While it was known that light can be transmitted and even bent in media with a higher refractive index than air, such as glass, communication using light over long distances had to wait for a whole lot of other developments and inventions, big and small. Modern glass technology had to evolve before ultra-pure glass fibres could be drawn, and a reliable source of light such as the laser had to be invented, which semiconductor technology eventually provided. Besides, the other components required to make long-distance transmission possible, such as specialised transistors, amplifiers, switches, transmitters and receivers, had to develop alongside.

Willard S. Boyle, who shared the other half of the prize with (below) George E. Smith for inventing the charge-coupled device. A CCD chip is seen inside a digital camera held by Smith.-ANDREW VAUGHAN/THE CANADIAN PRESS/AP

That internal reflection in media such as glass could be used for transmitting light was known in the 19th century. The first ideas of guiding light in glass fibres date back to the 1920s and were explored for transmitting images in medicine, defence and even early television. But glass fibres were quite leaky and did not transmit efficiently. If the fibres touched each other or their surfaces were scratched, light escaped from them; they also wore out easily. In the 1950s, it was demonstrated that cladding the fibres in a sheath of glass with a lower refractive index facilitated total internal reflection and helped light transmission.


In 1954, H.H. Hopkins and Narinder Singh Kapany, an Indian scientist working at Imperial College, London, successfully constructed a bundle of several thousand fibres 75 cm long and demonstrated its good image transmission properties. Combining the bundling of fibres with cladding, in particular for its application in medical imaging with the gastroscope, the development even reached industrial production. Kapany also developed the theory of light propagation in fibres, which was later improved by E. Snitzer. He is, in fact, credited with coining the phrase 'fibre optics'. Kapany's glass-clad fibres, however, did not serve the purpose of long-distance transmission. The 1960s were also the times when electronics and radio technology were riding high and optical technologies did not receive much attention. Satellite communication began to cater to the vastly growing demands of communication, including telephony, radio and television.

Of course, it was only a question of time before light came into play for communication. It was known that the higher the frequency of the carrier signal, the higher the modulation frequency it can support, and consequently the greater the bandwidth, the transmission efficiency and the data transmission rate. Thus the optical region of the spectrum, comprising the visible, the near infrared and the ultraviolet (UV), was always more attractive for communications than the radio wavelengths. But in the 1950s and 1960s, optical communication was not viewed as a viable concept. It was Kao's vision of, and perseverance with, the possibility that has made long-distance optical transmission and communication possible today. While he justly deserves the award, it can be equally justifiably argued that Kapany too deserved to be awarded along with Kao. Was the Nobel Foundation's 'Rule of Three', which limits the number of awardees in a discipline to three, the reason for this omission?

The invention of the laser in the 1960s made a decisive shift in the shape of things to come. The laser is a stable source of light that emits an intense beam which can be focussed and pumped into a thin glass fibre. First came the pulsed ruby laser and very soon the continuous wave (CW) gas-based lasers. Semiconductor lasers also appeared around the same time, but the real technological breakthrough came with semiconductor-heterostructure lasers, which operated at room temperature and thus obviated the cooling requirements of the earlier lasers, making them ideal for optical communication applications. In fact, Willard S. Boyle's invention of the CCD, which has got the Nobel this year, overshadowed his invention of the CW ruby laser in 1961.

These developments boosted the search for a suitable optical transmission medium over long distances because all information could now be converted into extremely fast flashes of light, corresponding to digital 1s and 0s. Optical fibre was still not under consideration because of the high attenuation in it. The first optical fibres typically had an attenuation rate of about 1,000 dB/km, which means a loss of about 99 per cent of the signal during transmission over just 20 m. Different variants of the basic mechanism, including various waveguides, sequences of lenses and gas tubes, were tried without much success.

Kao, who was then a young electronics engineer working at STL on optical communication, took up the challenge of reducing the loss of light in optical fibres. Along with his colleague George Hockham, Kao meticulously studied the properties of glass fibres. Their stated goal was that at least 1 per cent of the light signal that entered the glass fibre should remain over a distance of 1 km. Unlike their contemporaries in the business, the two considered not only the physics of waveguides but also the material properties, in particular scattering from the fibre. Their results were presented and published in 1966. They found that it was not imperfections in the fibre that caused the loss of light but absorption and scattering, the latter predominantly caused by impurities, in particular iron, in the fibre. Including these effects, they showed that the attenuation could be brought down to only a few dB/km, much less than what was seen at the time. They also concluded that fibres made of high-purity glass could greatly improve transmission, and that clad single-mode fibre (which supports only a single propagation path within), as against multi-mode fibre with its higher losses and higher dispersion, was the best transmission medium for optical communication.
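These attenuation figures can be checked with the standard decibel relation, in which the fraction of power remaining is 10 raised to the power of minus one-tenth of the loss in dB. A minimal sketch (the function name is ours, not from any library):

```python
def fraction_remaining(db_per_km, distance_km):
    """Fraction of optical power left after the given distance of fibre."""
    loss_db = db_per_km * distance_km
    return 10 ** (-loss_db / 10)

# Early fibres: about 1,000 dB/km leaves ~1 per cent after just 20 m (0.02 km)
print(fraction_remaining(1000, 0.02))   # 0.01

# Kao and Hockham's goal of 1 per cent remaining over 1 km
# corresponds to an attenuation of exactly 20 dB/km
print(fraction_remaining(20, 1.0))      # 0.01
```

Both cases amount to the same 20 dB total loss, which is why both leave 1 per cent of the light.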

Though Kao was aware of the difficulties in producing ultra-pure glass fibres, his visionary goal of a glass fibre of a transparency never attained before spurred researchers and manufacturers worldwide to achieve it. Quartz, the most abundant mineral on earth, is the basic material used for making glass. Impurities in glass manufacture arise from additives such as lime and soda that are used to simplify the process. Kao suggested that fused quartz, or fused silica, could be used to make ultra-pure glass. The problem, however, lay in the fact that fused quartz melts only at about 2,000° Celsius, making the drawing of thin fibres from it difficult.

In 1971, four years after Kao spelt out his vision, Corning Glass Works, U.S., succeeded in producing a one-km-long ultra-pure optical fibre using the Chemical Vapour Deposition (CVD) technique. To make a core and a cladding with very close refractive indices, they doped the fused silica core with titanium and used pure silica in the cladding. An improved fibre with germanium doping, which achieved a loss of a mere 4 dB/km with light of 850 nm wavelength, appeared a few years later. Researchers at Bell Labs then used a modified CVD technique to achieve an attenuation loss of less than 1 dB/km, far below what Kao had set out to achieve. Extremely thin glass fibres are actually not as fragile as they might seem. A correctly drawn fibre is, in fact, strong, light and flexible, a necessary requirement for fibres that are to be buried under earth or water or bent around corners.

The first non-experimental fibre was installed in the U.K. in 1975, and soon after in the U.S. and Japan. In 1988, the first optical cable connecting Europe and the U.S. across the Atlantic Ocean was laid, with a 6,000-km-long fibre.

Today, attenuation of light at 1.55 micrometres (an infrared wavelength, the most favoured one today for communication because fibre losses are lowest there) is below 0.2 dB/km. Optical fibres of the present day are extraordinarily transparent to light signals, with more than 95 per cent of the light remaining even after traversing one kilometre. The residual attenuation is compensated these days using optical amplifiers, particularly erbium-doped fibre amplifiers, which are placed at regular intervals; earlier this was achieved using electronic repeaters. Techniques such as wavelength division multiplexing (WDM) are used to carry different signals in the same fibre, thus increasing transmission rates several fold. Today the capacity of optical networks is growing at an enormous rate, and transferring terabits of information is now a reality.
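The same decibel arithmetic confirms the modern figure of over 95 per cent per kilometre; the 20 dB loss budget between amplifiers used below is an illustrative assumption, not a figure from this article:

```python
def power_left(loss_db_per_km, km):
    """Fraction of light remaining after `km` kilometres of fibre."""
    return 10 ** (-loss_db_per_km * km / 10)

# Modern fibre at 1.55 micrometres loses about 0.2 dB/km:
print(round(power_left(0.2, 1), 3))   # 0.955 -> over 95 per cent survives each km

# An illustrative 20 dB loss budget between optical amplifiers
# would then allow spans of about 20 / 0.2 = 100 km.
print(20 / 0.2)                       # 100.0
```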

In the 1960s, Kao doing an experiment on optical fibre at the Standard Telecommunications Laboratory in Harlow, U.K.-HO/CHINESE UNIVERSITY OF HONG KONG/AFP

The second part of the Nobel Prize, namely the invention of the CCD, is actually one that happened in a surprising and unanticipated, though not serendipitous, manner. What today forms the key element in digital imaging of all kinds, and is an integral component of modern-day digital cameras, videocams, medical imaging systems, astronomical telescopes and satellite imaging cameras, is not what its inventors, Boyle and Smith, working at Bell Labs 40 years ago, set out to create. Their aim was to create a better electronic memory. Boyle was then the Director of the Semiconductor Device Development Laboratory, and Smith (who, incidentally, holds the record for the shortest Ph.D. thesis, just three pages, at the University of Chicago) was a Department Head under Boyle, both engaged in semiconductor device development. They had been egged on by their boss, the father of transistor electronics, Jack Morton, to come up with something that could compete with the great stuff that others at Bell Labs were doing with magnetic bubbles towards developing a high-capacity memory device. "The heck with transistors. Try and come up with something different," Morton is supposed to have provoked them.

In a recent review, Smith says: "Jack Morton was anxious to speed up the development of magnetic bubbles as a major memory technology and was planning to transfer funds from Bill's division to the other where the bubble work was being done." Magnetic bubble memory uses a thin film of magnetic material to hold small magnetised areas, known as bubbles, each of which stores one bit of data. To escape the threat held out by their boss, Boyle and Smith apparently got together on the afternoon of October 17, 1969. In a discussion lasting not more than an hour, writes Smith, "the basic structure of the CCD was sketched out on the blackboard, the principles of operation were defined and some preliminary ideas concerning applications were developed."

Their idea was to develop an electrical analogue of the magnetic bubble concept, the "charged bubble" if one may call it, based on the semiconductor physics of negatively charged electrons and positively charged holes. The train of thought that evolved, writes Smith, was as follows: "The electric dual is a packet of charge. The next problem is how to store this charge in a confined region. The structure that came to mind was a simple metal-oxide semiconductor (MOS) capacitor in depletion." When they had finished the basic design for the CCD, they found out from the fabrication facility that it would take just a week to assemble the prototype. Apparently, other researchers at Bell Labs were sceptical of the idea. "But, you know, the first model [of the CCD] that we made," Boyle told the interviewer from the Nobel website shortly after being chosen for the award, "worked immediately and it was amazing! We never had that kind of luck before."

As a memory device, the CCD they invented is long forgotten, but it has become an indispensable part of modern imaging technology. In fact, the bubble memory device too, which started out as a promising technology in the 1970s, failed as newer developments in hard-disk storage, which not only vastly increased capacities but brought down prices as well, overtook it.

A CCD device is made of silicon, like most electronic devices. The CCD imaging technique makes use of the photoelectric effect, explained by Albert Einstein. A silicon CCD array, say the size of a stamp, carries millions of photocells that are sensitive to light; when light strikes the photocells, electrons are knocked out. The liberated electrons are gathered in the cells, which act like small wells for them, the number being proportional to the intensity of the light. By applying a voltage to the CCD array, the contents of the wells can be read out one by one; row by row, the electrons are transported out for processing by a mechanism akin to a conveyor belt. For example, an array of 10 x 10 image points is transformed into a linear chain of 100 charge read-outs. The CCD array thus transforms the optical signal into an electrical signal, and the signals are subsequently processed into digital 1s and 0s. Each cell can then be converted back to an image point, called a pixel. The imaging capacity is given in terms of pixels. For example, a 1280 x 1024 CCD array will have a capacity of 1.3 mega (million) pixels.
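The conveyor-belt read-out described above can be sketched in a few lines of Python; the charge values here are made-up light intensities, purely for illustration:

```python
def read_out(ccd):
    """Shift the charge wells out row by row into one linear chain,
    the way a CCD's conveyor-belt mechanism serialises an image."""
    chain = []
    for row in ccd:          # each row is shifted into the serial register
        for charge in row:   # each well is then clocked out in turn
            chain.append(charge)
    return chain

# A toy 3 x 3 sensor (made-up intensities):
sensor = [[5, 9, 1],
          [0, 7, 3],
          [8, 2, 6]]
print(read_out(sensor))        # [5, 9, 1, 0, 7, 3, 8, 2, 6]
print(len(read_out(sensor)))   # 9: a 3 x 3 array becomes 9 charge read-outs
```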

The CCD renders the image in black and white, so various filters are used to obtain the original colours in the image. Over every cell is placed a filter that passes one of the base colours: red, green or blue.
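A minimal sketch of such a filter mosaic, assuming the common Bayer layout with two green cells for every red and blue one (the article does not name a particular layout):

```python
# A 2 x 2 tile of the (assumed) Bayer pattern, repeated across the sensor:
PATTERN = [["G", "R"],
           ["B", "G"]]

def filter_colour(row, col):
    """Base colour recorded by the CCD cell at (row, col)."""
    return PATTERN[row % 2][col % 2]

print(filter_colour(0, 0), filter_colour(0, 1))  # G R
print(filter_colour(1, 0), filter_colour(1, 1))  # B G
```

The full colour of each pixel is then reconstructed by interpolating the two missing colours from neighbouring cells.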

Within a year of the invention, Smith and Boyle had demonstrated a CCD in their video camera. In 1972, the American company Fairchild Corporation constructed the first image sensor, with 100 x 100 pixels, which soon entered production. In 1975, Boyle and Smith themselves built a video camera of sufficiently high resolution to serve television broadcasts. The first camera with a built-in CCD sensor, however, hit the market only in 1981. It was bulky compared with contemporary cameras and somewhat primitive in its operational characteristics, but it ushered in the commercialisation of digital photography. In 1986, the first 1.4 megapixel sensor came on the scene, and in 1995 the world's first fully digital photographic camera appeared.

Since then, the market has just exploded, with more and more compact cameras and cheaper products. And with that, one can say that the era of film photography, which began in 1839 with Louis Daguerre and his Daguerreotypes, has ended. Three years ago, cameras exceeded the limit of 100 megapixels. Current developments have also thrown up a challenger to the CCD in the form of Complementary Metal-Oxide Semiconductor (CMOS) technology, which was invented around the same time. This technology has the advantage of lower energy consumption. However, it is yet to catch up with the CCD in terms of resolution and noise levels. So, for advanced applications, the CCD remains the technology of choice.

A 1974 handout photo of Boyle (left) and Smith with the CCD, which transforms patterns of light into useful digital information.-ALCATEL-LUCENT BELL LABS/AFP

It is thanks to the CCD sensor that the Hubble Space Telescope is able to send back astonishing images of the cosmos. The Hubble CCD camera originally had only 0.64 megapixels; with four sensors interconnected, Hubble has a total of 2.56 megapixels. Today one talks of a 95 megapixel mosaic sensor for the Kepler telescope launched early this year. Quite early in the days of the CCD, astronomers had seen the advantages of the digital CCD sensor. It spans a wide range of the electromagnetic spectrum, from X-rays to infrared rays. It is far more sensitive than conventional film: a CCD can catch up to 90 per cent of the incoming photons, whereas film catches only about one per cent. A CCD camera gathers light from a distant astronomical object in a few seconds where film used to take several hours. Astronomy began to adopt CCD technology in 1974 itself, when it was used to take the first digital image of the moon. In 1979, a digital camera with 320 x 512 pixels was mounted on one of the telescopes at Kitt Peak in Arizona, U.S.

Today the CCD camera is ubiquitous, whether in ordinary photography, videography or advanced imaging such as surveillance and medical imaging for diagnosis and surgery. In a span of just 40 years, the digital image has become virtually a part of everyday life.
