Rapid strides

Published: Jan 15, 2010 00:00 IST

Human embryonic stem cells.

Nanotechnology can be broadly defined as the manipulation of materials, structures, devices and systems in the size range of one to 100 nanometres (nm), a nanometre being a billionth of a metre (10⁻⁹ m). An atom is about 10⁻¹⁰ m in size. It was in the mid-1980s that Eric Drexler explored the idea of molecular manufacturing through a deterministic, rather than stochastic, approach to handling individual atoms and molecules. The discovery in 1985 of large buckyball carbon molecules (fullerenes) by Harold Kroto and others (Nobel Prize 1996), and the techniques that followed to produce gram quantities of fullerenes and to fabricate nanotubes, have now made Drexler's vision realisable. A few significant findings in nanotechnology and their applications have emerged from Indian laboratories as well.

The potential technological impact of another revolutionary discovery has not yet been fully realised: the oxide-based high-temperature superconductor (HTS). The first such material, a lanthanum-barium-copper-oxide (LBCO) compound that breached the 30 K (-243°C) barrier of conventional low-temperature superconductors, was discovered in 1986 (Nobel Prize 1987) by Johannes Bednorz and Karl Müller of IBM, Zurich. This led to the discovery of a whole class of cuprate materials that become superconducting even above the liquid-nitrogen temperature of 77 K (-196°C), and room-temperature superconductivity suddenly seemed possible. The latter, however, has proved elusive, and the original promise of widespread HTS applications has not materialised either. This may be because the mechanism of superconductivity in these layered materials is not yet fully understood. In the 1970s and 1980s, biotechnology made rapid strides in concepts and techniques.

Polymerase chain reaction (PCR) is one such technique, invented in 1983 by Kary Mullis (Nobel Prize 1993). It quickly amplifies a single copy, or a few copies, of a piece of DNA (deoxyribonucleic acid) into thousands to millions of copies. It has become indispensable for a variety of applications such as DNA cloning, functional analysis of genes and DNA fingerprinting, and it has revolutionised infectious-disease detection and diagnostics by enabling rapid screening of large populations. Stem cell therapy, which has gained importance since the late 1990s, is a potentially revolutionary concept to treat disease or injury by introducing new cells into damaged tissues. There are mainly two cell types: embryonic stem cells and adult stem cells. The former are isolated from blastocysts; the latter are found in adult tissues. Embryonic stem cells are pluripotent; that is, unlike adult stem cells, they can differentiate into all kinds of specialised tissues. The potential widespread use of stem cell therapy got a boost after James Thomson and colleagues at the University of Wisconsin, Madison, derived the first human embryonic stem cell line in 1998. Most stem cell treatments, with the exception of bone marrow transplantation, are still at the experimental stage. Scientists believe that stem cell therapies should be able to treat cancer, diabetes, Parkinson's disease, muscle damage, cardiac failure and neurological disorders.
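
To get a rough sense of the scale of PCR amplification (an idealised figure that assumes perfect doubling in every cycle), starting from N₀ template molecules, n cycles yield

N = N₀ × 2ⁿ

copies, so a single fragment put through 30 cycles gives roughly 2³⁰, or about a billion, copies.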

The unravelling of the human genome, the genetic make-up of the human species, is one of the greatest achievements of the global scientific enterprise. The international Human Genome Project (HGP) was launched in 1990 to identify and map the approximately 25,000 genes of the human genome, which is estimated to contain about 3.3 billion base pairs. The project released a rough draft of the genome in 2000 and the complete genome in 2003. Today genomics, the study of the genomes of organisms, has become a major discipline in its own right, and the complete genome sequences of some 26 mammals and a few other animals are known, besides those of thousands of viruses, hundreds of bacterial species, tens of insects and nematodes, 15 fungi, five algae and 10 higher plants.

Human genomics has unleashed a distinct approach to biotechnology and medicine. For example, gene therapy, a concept developed in the 1990s, refers to the insertion of genes into an individual's cells to treat a disease. Hereditary diseases caused by deleterious mutations can, in principle, be treated by replacing the defective genes with functional ones. Genomics has also given rise to new disciplines called functional genomics and pharmacogenomics.

Pharmacogenomics aims to develop rational means of optimising drug therapy with respect to patients' genotypes in order to ensure maximum efficacy. The Indian Genome Variation Consortium (IGVC), a multi-institutional study to map the country's various distinct population groups, is an effort in that direction. Such approaches can lead to personalised medicine, in which drug combinations are optimised for each individual's unique genetic make-up.

Functional genomics attempts to make use of the vast data produced by whole-genome sequencing to describe gene (and protein) functions and their interactions. In contrast to the static aspects of gene sequences, it is concerned with a genome-wide description of the functions of DNA at the levels of genes, RNA (ribonucleic acid) transcripts and protein products, instead of the traditional gene-by-gene approach. This approach has led to what has come to be called systems biology.

The ubiquitous WWW is the result of a project that Tim Berners-Lee, a British computer engineer, and Robert Cailliau, a Belgian computer scientist, undertook in November 1990 while working at the European Organisation for Nuclear Research (CERN) in Geneva. The aim was to build a hypertext project called WorldWideWeb to demonstrate how information could be easily transferred using hypertext. Basically, their WWW was a web of hypertext documents to be viewed by browsers using a client-server architecture. Berners-Lee's breakthrough was in marrying hypertext to the Internet; the protocols for Internet communication were already known from the defence-related U.S. project called ARPANET. Berners-Lee used a NeXT computer as the world's first web server and also to write the first web browser, WorldWideWeb, in 1990. By Christmas 1990, he had built all the tools necessary for a working Web. The first website was built at CERN and was put online on August 6, 1991. On April 30, 1993, CERN announced that the World Wide Web would be free for anyone to use.
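
To make the client-server idea concrete, here is a minimal sketch in Python (an illustration only, not Berners-Lee's original software; it uses HTTP, the Web's transfer protocol, and the reserved placeholder domain example.com) of how a browser-like client requests a hypertext document from a server:

# Minimal illustration of the Web's client-server, request-response model:
# the client connects to a server, asks for a document with an HTTP GET
# request, and receives the hypertext (HTML) in the server's reply.
import http.client

conn = http.client.HTTPConnection("example.com")  # connect to the server
conn.request("GET", "/")                          # ask for a hypertext page
response = conn.getresponse()                     # read the server's reply
print(response.status, response.reason)           # e.g. "200 OK"
html = response.read().decode("utf-8")            # the hypertext document itself
print(html[:200])                                 # show the start of the HTML
conn.close()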

An important development relating to hardware for IT is the hard disk drive (HDD), which can store great amounts of information. The discovery in 1988 of a physical phenomenon called giant magnetoresistance (GMR) by Albert Fert of Paris-South University, Orsay, France, and Peter Grünberg of the Jülich Research Centre, Germany (Nobel Prize 2007), enabled the development of devices that can quickly retrieve densely packed digital data from hard disks. This provided the technological leap for read-out heads in modern digital applications such as MP3 players and video cameras.

A quantum computer, as opposed to the classical computer based on binary digits (0 and 1), may soon become a reality. It will be based on an entirely new kind of computation, with qualitatively new algorithms built on quantum principles. The fundamental principles for the actual realisation of such machines in physical objects were developed during the 1980s, and the most important discovery in this context was made by David Deutsch in 1985. He proved that all the computational capabilities of any finite machine obeying the laws of quantum computation are contained in a single machine, a universal quantum computer. Quantum computing is still in its infancy. Quantum computers have been built, for example, on the basis of the principles of nuclear magnetic resonance (NMR), and have executed operations on a very small number of qubits (quantum bits), but developments clearly point to their realisation in the future.

Advances in space technology have resulted in a space-based global navigational aid called the Global Positioning System (GPS). It is an American system based on a constellation of satellites; its development began during the 1970s and 1980s, and it became fully operational in the mid-1990s. In May 2000, the U.S. administration discontinued its selective availability policy, and civilian users could thereafter access non-degraded signals globally. Today, GPS is widely used as a tool for map-making, land surveying, commerce, tracking and surveillance. Its precise time reference is used in many scientific applications, such as the study of earthquakes, and for time synchronisation in cellular communication protocols.

The International Space Station (ISS) is a research facility maintained in low-earth orbit for experiments in diverse fields in the microgravity environment of outer space. Its construction began in 1998 and is planned to be completed by 2011. Its operations will continue at least until 2015, though there are views that they should be extended up to 2020. With 11 pressurised modules, it is the largest artificial satellite in earth orbit today. It is operated jointly by NASA, the Russian Federal Space Agency, the Japan Aerospace Exploration Agency (JAXA), the Canadian Space Agency (CSA) and the European Space Agency (ESA).

Developments in space technology, particularly in propulsion and on-board power systems, have resulted in advanced space probes being launched by various space agencies of the world. One of the most successful probes of the 1980s was the ESA's Giotto probe, launched for an encounter with Halley's Comet. On March 13, 1986, the probe approached Halley's nucleus to within 596 km and returned the best data ever obtained on the comet. It was also the first mission to capture an image of a cometary nucleus. More complex probes have since been flown, such as the Cassini-Huygens mission to Saturn, launched in 1997, and NASA's Mars Exploration Rovers, Spirit and Opportunity, both launched in 2003.

One of the most remarkable and high-performance science platforms in space is the Hubble Space Telescope (HST), launched aboard a space shuttle in April 1990. Hubble's Ultra Deep Field image is the most detailed visible-light image ever made of the universe's most distant objects. The HST is one of NASA's Great Observatories, along with the Compton Gamma Ray Observatory, the Chandra X-ray Observatory and the Spitzer Space Telescope. It is expected to be operational until 2014, when its successor, the James Webb Space Telescope, is due to be launched.

Two space observatories, Planck and Herschel, were launched by the ESA in May 2009. The Herschel observatory aims to see the coldest and dustiest objects in space; for example, the cool cocoons where stars form and dusty galaxies in which new stars are just being born. Planck, on the other hand, is designed to observe the anisotropies of the cosmic microwave background (CMB) radiation over the entire sky with high sensitivity and angular resolution. The mission will complement the observations made by NASA's Wilkinson Microwave Anisotropy Probe (WMAP), which measured the anisotropies at larger angular scales and with lower sensitivity. WMAP, a highly successful mission launched in June 2001, was itself a successor to the Cosmic Background Explorer (COBE), launched in November 1989. COBE produced the first results that supported the Big Bang theory of cosmology: it showed that the CMB has a near-perfect black-body spectrum (of temperature 2.73 K) with very faint anisotropies.

One of the spectacular experimental achievements in physics in recent years has been the realisation of Bose-Einstein condensation (BEC), 70 years after Einstein predicted the phenomenon on the basis of Satyendra Nath Bose's quantum statistics of light quanta. BEC refers to the collective state of a dilute gas of weakly interacting bosons (particles with integral spin) cooled to temperatures close to absolute zero (0 K, or -273.15°C). Under these conditions, a large fraction of the bosons occupy the lowest possible quantum state, and this collective quantum state becomes apparent at a macroscopic level. Such a gaseous condensate was first produced by Eric Cornell and Carl Wieman in 1995 at the University of Colorado using a gas of rubidium atoms cooled to 170 nanokelvin (nK), or 1.7 × 10⁻⁷ K. For this breakthrough, Cornell and Wieman, along with Wolfgang Ketterle of the Massachusetts Institute of Technology (MIT), received the 2001 Nobel Prize in Physics.

The Standard Model (SM) of fundamental particles and forces of nature is one of the most successful theories in physics. According to it, the constituents of all matter (at low energies) are six quarks (u, d, c, s, t and b) and six leptons (the electron and its two heavier cousins, the muon and the tau, and their respective chargeless neutrinos). In the 1990s, only the t (top) quark, which was predicted in 1973, remained to be discovered. If the Model was right, the top quark, the heaviest of the six, had to be found, especially after the fifth quark (b) had been discovered in 1977. Early searches for the top quark at the accelerator centres SLAC and DESY (in Hamburg) had failed to find it. Between 1992 and 1995, at Fermilab's Tevatron hadron collider in the U.S., two groups working with two different detectors, D-zero and CDF, gathered enough data to show its existence unambiguously, with a mass equivalent to 176 ± 18 GeV (giga electron volt) of energy. In March 2009, D-zero and CDF announced the detection of single top quark events (instead of their production as top-antitop quark pairs), which is an even stronger confirmation of the SM. The Large Hadron Collider, which will begin its physics runs soon, is geared to find the remaining missing piece of the Model, the Higgs boson. Its start-up is another landmark event in high-energy physics.

The SM does not predict the masses of the fundamental particles, including that of the weakly interacting neutrino, which since its detection in 1956 had always been thought to be massless. However, it was known that if neutrinos had mass, they would mix and transform from one kind to another (neutrino oscillations). In 1998, results from the Super-Kamiokande neutrino detector in Japan showed that neutrinos do oscillate and thus have non-zero mass. Various empirical limits suggest that these masses are tiny. Other neutrino detectors worldwide have confirmed this finding, but there are still unknown features in the leptonic sector of the SM, which the proposed underground India-based Neutrino Observatory (INO) could probe.

In astronomy and cosmology, dark matter and dark energy are postulates required to explain, respectively, the missing mass in the universe and the accelerating expansion of the universe. Only about 4.6 per cent of the mass-energy content of the universe is ordinary matter; about 23 per cent is thought to be composed of dark matter, and the remaining roughly 72 per cent is thought to consist of dark energy distributed diffusely in space. The discovery in 1998 that the universe is actually speeding up its expansion came as a total surprise. It was counter-intuitive, but the evidence has become convincing. There have also been observations that can be explained only by invoking the presence of dark matter. In 1989, Margaret Geller and John Huchra, on the basis of redshift survey data, discovered the Great (Galactic) Wall, one of the largest known superstructures in the universe. It is a filament of galaxies approximately 200 million light-years away, measuring over 500 million light-years long, 300 million light-years wide and 15 million light-years thick. According to cosmologists, dark matter dictates the structure of the universe on the grandest of scales, and such structures could form along web-like strings of dark matter. More recently, in August 2006, gravitational lensing observations of the Bullet Cluster showed that much of the lensing mass is separated from the X-ray-emitting baryonic mass; that is, the centre of the total mass (dark plus normal) is displaced with respect to the visible matter. To date, this remains the best evidence for dark matter.

Mathematics has also had its share of spectacular achievements in the last two decades. The most important one is the proof of the famous Fermat's Last Theorem in 1995 by the British mathematician Andrew Wiles. In number theory, Fermat's Last Theorem states that there exist no three positive integers a, b and c that satisfy the equation aⁿ + bⁿ = cⁿ for any integer n greater than two. Pierre de Fermat had conjectured it in 1637 and had famously noted in his copy of Arithmetica: "I have discovered a truly marvellous proof of this, which this margin is too narrow to contain." Given the complexity of Wiles' proof, which uses techniques that evolved only in the 20th century, and the fact that a proof had eluded mathematicians for over 350 years, historians doubt whether Fermat really had a proof as he claimed.

The other major landmark in mathematics is the work of three IIT Kanpur computer scientists, Manindra Agrawal and his students Neeraj Kayal and Nitin Saxena, who provided an algorithm to determine whether a given number (however large) is prime or not. This deterministic, polynomial-time algorithm, established in August 2002, has come to be called the AKS primality test. Its significance is that it is the first primality-proving algorithm to be simultaneously general, polynomial, deterministic and unconditional; previous algorithms had achieved at most three of these properties, but never all four. The three were bestowed with several awards, including the famous Gödel Prize in 2006.
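
At the heart of the AKS test is a classical polynomial identity: an integer n greater than 1 is prime exactly when (X + a)ⁿ ≡ Xⁿ + a (mod n) as polynomials, for an integer a coprime to n; the AKS breakthrough was to show that a reduced form of this congruence can be checked for a small number of values of a in polynomial time. The short Python sketch below, offered purely as an illustration and not as the AKS algorithm itself, tests the identity directly by expanding (X + a)ⁿ modulo n, which is correct but far too slow for large numbers:

# Illustrative (slow) check of the identity underlying the AKS test:
# for gcd(a, n) = 1, n > 1 is prime exactly when (X + a)^n = X^n + a (mod n),
# i.e. every middle coefficient C(n, k) * a^(n-k) is divisible by n and the
# constant term a^n is congruent to a modulo n.
from math import gcd

def is_prime_by_identity(n: int, a: int = 1) -> bool:
    if n < 2:
        return False
    if gcd(a, n) != 1:
        raise ValueError("a must be coprime to n")
    # Constant term: a^n must reduce to a (mod n).
    if pow(a, n, n) != a % n:
        return False
    # Middle terms: C(n, k) * a^(n-k) must vanish (mod n) for 0 < k < n.
    binom = 1                              # running value of C(n, k)
    for k in range(1, n):
        binom = binom * (n - k + 1) // k   # update C(n, k-1) -> C(n, k)
        if (binom * pow(a, n - k, n)) % n != 0:
            return False
    return True

# Small demonstration (impractical beyond small n because of the O(n) loop):
print([m for m in range(2, 40) if is_prime_by_identity(m)])
# -> [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]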

The most significant scientific finding of recent times is the unequivocal evidence, based on climate observations and model studies by the global community of atmospheric scientists, that human activities do adversely affect the global climate. Anthropogenic activities over the last 150 years, particularly the indiscriminate use of fossil fuels, have warmed the earth by nearly 0.8°C. This global scientific exercise to determine the cause of the observed warming began with the creation of the Intergovernmental Panel on Climate Change (IPCC) in 1988 by the U.N. Environment Programme and the World Meteorological Organisation. The IPCC's report in 2007 says that if irreversible climate change that threatens humanity as a whole is to be avoided, greenhouse gas (GHG) concentrations in the atmosphere should not exceed 450 ppmv of carbon dioxide (CO2) equivalent by 2050, so that global warming remains less than 2°C.
