The dark side of the universe

Print edition: April 28, 2017

Figure 3: Inhomogeneities in the universe seen on the large scale.

Saul Perlmutter. Photo: AFP

Adam Riess. Photo: AFP

Brian Schmidt. Photo: AFP

Professor Subir Sarkar of Oxford University.

Figure 1: Pie chart depicting the contents of the universe.

Figure 2: Supernovae characteristics: Tree diagram showing different classes of supernovae.

It is widely accepted that the dominant fraction of the total mass-energy content in the universe consists of “dark energy”, which is pushing galaxies away from each other at a rate that is increasing with time. But a recent analysis has found that the very evidence for accelerated expansion of the universe is weak.

ACCORDING to the currently widely accepted Standard Cosmological Model (SCM), the dominant fraction of the total mass-energy content in the universe (68.3 per cent) consists of some unknown form of energy called “dark energy”, which is pushing galaxies and clusters of galaxies away from each other at a rate that is increasing with time. Of the remaining 31.7 per cent, visible matter that makes up planets, stars, galaxies, and so on accounts for a mere 4.9 per cent, and the rest is some mysterious unknown form of matter called “dark matter” (Figure 1). Following Edwin Hubble’s discovery in 1929, we know that since the Big Bang the universe itself has been expanding in all directions, causing galaxies to move away from us as well as from each other, much like raisin bread expanding in an oven, if you imagine the raisins to be galaxies. This expansion, it is now widely believed, is actually speeding up, and to explain this acceleration it has become necessary to invoke this mysterious “dark energy”, whose nature is completely unknown.

While the evidence for the existence of dark matter appears to be fairly strong, experiments so far have been unable to detect it in any of its many theoretically hypothesised forms. On the other hand, according to a recent analysis, in the case of “dark energy”, even the evidence for it, namely, the accelerated expansion of the universe, seems to be on shaky ground.

Hubble constant

The Big Bang, as a result of which the universe came into being, was not an explosion of matter into empty space but the expansion of space itself with time, as a result of which the physical distance between any two cosmic objects increases. According to Hubble, this expansion is such that the farther a galaxy is, the faster it recedes from us, a relation now known as Hubble’s Law. The proportionality factor in this linear relationship is called the Hubble Constant, which gives a measure of the rate of expansion.

(The recession velocity is inferred from the observed object’s redshift, the amount by which light from the source is shifted towards the red by the Doppler Effect, just as the pitch of a car’s horn drops as it moves away from us. Thus, high-redshift objects are receding from us faster and are at much greater distances.)
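The linear distance-velocity relation and the redshift-to-velocity conversion described above can be sketched numerically. The sketch below is purely illustrative: the value of the Hubble Constant (about 70 km/s per megaparsec) is a commonly quoted figure assumed here, not one taken from this article, and the simple v ≈ cz conversion holds only for small redshifts.

```python
# Illustrative sketch of Hubble's Law (v = H0 * d) and the
# low-redshift Doppler approximation (v ~ c * z).
# H0 = 70 km/s/Mpc is an assumed, commonly quoted value.
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble Constant, km/s per megaparsec (assumed)

def recession_velocity(distance_mpc):
    """Hubble's Law: recession velocity (km/s) grows linearly with distance (Mpc)."""
    return H0 * distance_mpc

def velocity_from_redshift(z):
    """Low-redshift approximation: v ~ c * z (valid only for z << 1)."""
    return C_KM_S * z

# A galaxy 100 Mpc away recedes at 7,000 km/s ...
v = recession_velocity(100.0)
# ... which corresponds to a redshift of roughly 0.023.
z = v / C_KM_S
print(v, round(z, 3))  # 7000.0 0.023
```

Doubling the distance doubles the inferred velocity, which is exactly the linear relationship the Hubble Constant quantifies.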

Since Hubble’s discovery, cosmologists have believed that this expansion should eventually slow down, and perhaps even reverse and cause the universe to contract, because of the mutual gravitational pull of all the matter (both visible and dark) that the universe contains. The ultimate fate of the universe could then be a “Big Crunch”, with the universe collapsing back on itself and all the mass returning to the starting point, or at best it could settle down to a steady state.

But observations made in the 1990s on a certain class of supernovae (called Type Ia) led astronomers to conclude that the expansion of the universe is actually accelerating instead of slowing down as was expected. So it was argued that there ought to be some mysterious repulsive force that is counteracting gravity to cause this acceleration. This hypothetical energy was termed “dark energy” because scientists had no idea what it could be. If dark energy truly exists, then the universe will expand eternally.

The apparent evidence for it came around 1998 when two astronomy research groups—the Supernova Cosmology Project (SCP) team and the High-Z Supernova Search (HZSS) team—independently concluded, on the basis of their observations of several Type Ia supernovae (SN Ia), that the expansion of the universe set off by the Big Bang was actually speeding up. The SCP, headed by Saul Perlmutter, was initiated in 1988, and the HZSS, headed by Brian Schmidt and Adam Riess, was launched in 1994.

But according to a recent analysis by a team of scientists from the Niels Bohr Institute, Copenhagen, the very evidence for the accelerated expansion is weak. The authors of the work are Subir Sarkar (who is also a Professor at the Physics Department of Oxford University), his doctoral student Jeppe Nielsen, and Alberto Guffanti (who has since moved to the University of Torino), and their paper was published last October in the journal Scientific Reports of the Nature group.

Compared with the small set of SN Ia that were known in the 1990s, from whose data the phenomenon of accelerated expansion was inferred, Sarkar and his colleagues looked at the much larger database of such objects that is available today as a result of large-scale all-sky surveys, and they have concluded that any indication of accelerated expansion from this larger data set is at best marginal. Put in the language of statistics, they say in their paper that the statistical significance of the claimed evidence for accelerated expansion is just “3 sigma”, well below the accepted gold standard of “5 sigma” for it to be called a discovery. In lay language, a 3-sigma result has about a three-in-a-thousand chance of being a statistical fluke or just background noise, while a 5-sigma result has only about a one-in-two-million chance of being so. That is, at 3 sigma we are only 99.7 per cent certain that there is accelerated expansion, as against 99.99994 per cent confidence at 5 sigma.
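The sigma-to-probability conversion used above is a generic statistical calculation, not part of the paper's analysis, and can be checked with a few lines of code using the two-sided tail probability of a Gaussian distribution:

```python
import math

def fluke_probability(n_sigma):
    """Two-sided Gaussian tail probability: the chance that pure noise
    produces a fluctuation of at least n_sigma in either direction."""
    return math.erfc(n_sigma / math.sqrt(2))

p3 = fluke_probability(3.0)   # about 0.0027, i.e. roughly 3 in 1,000
p5 = fluke_probability(5.0)   # about 5.7e-7, i.e. well under 1 in a million
print(f"3-sigma: fluke chance {p3:.4f}, confidence {100 * (1 - p3):.2f}%")
print(f"5-sigma: fluke chance {p5:.1e}, confidence {100 * (1 - p5):.5f}%")
```

This makes plain why the jump from 3 sigma to 5 sigma matters: the chance of a fluke drops by a factor of several thousand.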

To understand why evidence is required to reach 5-sigma significance, consider the case of the recently discovered Higgs boson. Even though early on there were 3-sigma indications of the particle in data from the initial runs of the Large Hadron Collider (LHC) at CERN (the European Organisation for Nuclear Research), scientists waited to gather enough data for the statistical significance to build up to the 5-sigma level before claiming a discovery. To give another analogy, Sarkar points to the suggestion, from initial LHC data in December 2015, of a new particle with a mass of about 750 GeV. The evidence initially had 3.9-sigma and 3.4-sigma significance in the two LHC experiments, ATLAS and CMS, respectively. But with additional data the evidence disappeared: in August 2016 it was announced that the significance had dropped to less than 1 sigma. It was just a statistical fluctuation, and there is no such particle. The case of the SN Ia data is similar.

Cosmic explosions

Supernovae are violent explosions of massive stars when they reach the last evolutionary stages of their lives. These are the largest cosmic explosions that take place in the universe: the energy released in a single supernova can exceed the Sun’s total output during its entire lifetime of 10 billion years. They are bright enough to be visible at large distances, and a single explosion, which can last over weeks, can outshine an entire galaxy.

Type I supernovae are a class of supernovae, identified during the 1980s, whose “light curves”, plots of the brightness of supernovae as they wax and wane during the weeks following the explosion, have a very sharp maximum and then die away smoothly and gradually. They also contain practically no hydrogen, as a result of which they do not show any hydrogen emission lines in their spectra. SN Ia, which are end-of-life runaway explosions of extremely compact stars called white dwarfs, are a subclass that, in addition, shows silicon absorption lines (at 615 nm wavelength) in their spectra (Figure 2). Besides these distinct properties, their peak luminosity, their spectra and their “light curves” have been found to be amazingly uniform, matching feature by feature.

Standard candles

To record the expansion history of the universe, astronomers use a distinguishable class of celestial objects of known intrinsic brightness that exist over a wide range of distances as a “standard candle”. The recorded redshift and brightness of such objects will thus provide a measure of the total integrated expansion of the universe since the time the light was emitted. A compilation of such measurements over a sufficient range of distances will give us a record of the history of the universe’s expansion. Since SN Ia are believed to have the same intrinsic luminosity, they serve as excellent “standard candles” for use as cosmic distance markers. (Since the absolute luminosity of SN Ia class of objects is known, their measured luminosity at the detector on earth will give us their distance by applying the inverse square law, which says that the brightness of an object falls off as the square of its distance.)
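The inverse-square-law distance estimate described in the parenthesis above can be sketched as follows; the luminosity and flux figures below are invented numbers chosen only to show the scaling, not measured values for any real supernova.

```python
import math

def luminosity_distance(luminosity_watts, flux_watts_m2):
    """Invert the inverse-square law F = L / (4 * pi * d^2):
    a standard candle's known luminosity L plus its measured
    flux F at the detector give its distance d."""
    return math.sqrt(luminosity_watts / (4 * math.pi * flux_watts_m2))

# Hypothetical standard candle: the absolute numbers are made up,
# but the scaling is the point -- 4x the flux means half the distance.
L = 1.0e36  # assumed intrinsic luminosity, watts
d_faint = luminosity_distance(L, 1.0e-12)   # dimmer, hence farther
d_bright = luminosity_distance(L, 4.0e-12)  # brighter, hence nearer
print(round(d_faint / d_bright, 6))  # 2.0
```

This is precisely why a population of objects with identical intrinsic luminosity is so valuable: any difference in apparent brightness translates directly into a difference in distance.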

The expansion history of the universe is essentially determined by its mass density: the expansion will be slowed down by gravity if the mass density is high. What the two experiments (of Perlmutter and of Schmidt-Riess) found was that the deceleration at a given redshift was far less than expected for a high-mass-density universe such as ours. In particular, the high-redshift supernovae they looked at were much fainter than one would expect even for an empty cosmos. If the cosmic expansion had been slowing down, the objects should have been brighter than they seemed. From this they came to the surprising conclusion that the expansion of the universe was not decelerating but actually accelerating.

While the HZSS observations were based on 16 SN Ia, the SCP observations included 45 such objects. Given that two independent experiments had come to the same conclusion, this unexpected result found general acceptance among astronomers and cosmologists. Later investigations of supernovae with very high redshift—meaning supernovae from the very distant past, when the universe was much denser—led researchers to the much stronger statement that the repulsion causing the speeding up of the expansion set in when the universe was about half its present age. The results have also led to the widespread acceptance of the idea that the universe is dominated by some mysterious energy density called “dark energy” that is driving this accelerated expansion.

The original equations of General Relativity (GR), written down by Einstein in 1915, which form the bedrock of all studies in cosmology, do not suggest the existence of any such repulsive counterforce to gravity in the universe. However, there is a simple way to accommodate dark energy in the GR equations: by introducing into them a constant “lambda”, called the cosmological constant, which represents a constant energy component in the universe. Interestingly, Einstein himself had introduced such a constant so that the solutions of the equations would represent a static universe rather than an expanding or contracting one. The solutions to the GR equations in their original form suggested that the universe should be expanding or shrinking, which Einstein found very disturbing and contrary to his view of a static universe. But after Hubble’s discovery, he removed the constant, calling its introduction his biggest blunder.

Following the apparent discovery of the accelerated expansion of the universe, “lambda” has made a comeback into the GR equations to accommodate dark energy. This constant energy component is now interpreted to represent the total energy of the vacuum in the universe. In quantum theory, the vacuum is a seething sea of virtual particles and antiparticles being constantly created and annihilated.

The interpretation of “lambda” as the vacuum energy has its own pitfalls because the estimated vacuum energy of the universe far exceeds the dark energy density required to cause the measured apparent accelerated expansion. Nevertheless, lambda has come to represent dark energy in Einstein’s equations even though its exact nature is not known. Since this constant energy component does not change with time, it began to dominate once matter had been diluted to a low density by billions of years of expansion. That, according to scientists, would explain why the cosmological constant entered the picture so late in the history of the universe, some five to six billion years ago. The lambda-modified Einstein’s equations, which govern the evolution of the universe, together with the presence of all-pervasive “cold dark matter” (CDM) co-existing with normal matter (see Part I), constitute the generally accepted “Standard Cosmological Model”, or the lambda-CDM model of the universe.

This dramatic finding naturally brought the researchers many awards, most notably the Nobel Prize in 2011 “for the discovery of the accelerated expansion of the universe through the observations of distant supernovae”, to quote the Nobel citation. The research was also awarded the Gruber Cosmology Prize in 2009 and the Breakthrough Prize in Fundamental Physics in 2015. These awards have also reinforced the belief in the concept of dark energy that came to be accepted following the two groups’ 1998 finding. The latest results by Sarkar and his colleagues, if true, will overturn those findings and upset the paradigm shift that has occurred in cosmology since then. In fact, it would be the first-ever case of work that merited the Nobel Prize in Physics being proved wrong.

It is interesting to note that Sarkar was critical of the finding even at the time of the Nobel Award in 2011 (Frontline, November 4, 2011) when he told Frontline: “The evidence for accelerated expansion is also indirect. It is based on interpreting the brightness of distant supernovae (versus their redshift) in the framework of an assumed homogeneous cosmological model whereas the real universe is manifestly not so [Figure 3]…The very fact that this immediately implies that the universe is dominated by the cosmological constant should ring alarm bells and force us to examine the assumptions of the ‘standard cosmological model’ …. Although the universe is inhomogeneous on small scales, averaging over spatial fluctuations and studying time evolution is not easily done in a mathematically rigorous and consistent way. The real problem is we have very few cosmological solutions of the GR equations apart from the simplest ones based on the cosmological principle [which says that the universe is homogeneous and isotropic; that is, the universe looks the same on cosmological scales to all observers, independent of the location and independent of the direction in which they look].”

Statistical technique

For their analysis, Sarkar et al. used the much larger set of 740 SN Ia contained in what is known as the Joint Lightcurve Analysis (JLA) catalogue, which is more than 10 times the number of SN Ia used in the original 1998 analysis by Perlmutter and the Schmidt-Riess combine, which formed the basis for the emergence of the “dark energy” postulate. They carried out what is known as “Maximum Likelihood” analysis on the data, which is a well-known industry-standard statistical technique. The analysis showed that the evidence for accelerated expansion from the data is less than 3-sigma.
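Maximum-likelihood analysis itself is easy to illustrate in miniature: one asks which parameter value makes the observed data most probable. The toy below estimates a single parameter (the mean of simulated Gaussian data) by scanning for the value that maximizes the log-likelihood; this is only an illustration of the principle, not the actual Nielsen-Sarkar-Guffanti fit, which involves 10 parameters and the 740 JLA supernovae.

```python
import random

# Toy maximum-likelihood fit: find the Gaussian mean that makes
# the observed data most probable. Purely illustrative -- the real
# JLA supernova analysis fits a 10-parameter cosmological model.
random.seed(1)
TRUE_MEAN, SIGMA = 5.0, 2.0
data = [random.gauss(TRUE_MEAN, SIGMA) for _ in range(2000)]

def log_likelihood(mu):
    # Log of the product of Gaussian densities (constant terms dropped).
    return sum(-0.5 * ((x - mu) / SIGMA) ** 2 for x in data)

# Scan candidate means on a fine grid; the maximizer is the ML estimate.
candidates = [3.0 + 0.01 * i for i in range(401)]   # 3.00 .. 7.00
mu_hat = max(candidates, key=log_likelihood)
print(mu_hat)  # close to the true mean of 5.0
```

The same principle scales up: with more parameters the "grid scan" becomes a numerical optimisation, and the shape of the likelihood around its maximum is what yields the sigma levels quoted in the paper.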

The authors also point out that another group of researchers, which in 2016 had carried out an equivalent and much more sophisticated statistical analysis, known as the “Bayesian Hierarchical Model”, on the same JLA SN Ia database, arrived at similar conclusions regarding the inference of an accelerating universe. It may be noted, however, that the authors of the larger JLA catalogue itself (M. Betoule and associates, in the journal Astronomy & Astrophysics in 2014) had concluded otherwise from their analysis; that is, that the JLA data reinforce the original conclusion of the 1998 analysis.

However, Sarkar et al. have pointed out several shortcomings in Betoule et al.’s paper. According to Sarkar, Betoule et al. used a flawed statistical technique (a “constrained chi-squared” fit) to analyse the data and arrive at their conclusion of accelerated expansion. “We are,” Sarkar et al. write in their paper, “concerned here solely with performing the analysis in a statistically sound manner to highlight the different conclusions from the previous analysis [by Betoule et al.] of the same data.” Sarkar et al.’s paper has, not unexpectedly, caused quite a flutter in the astronomical community, with several rebuttals appearing in blogs and even social media, including some that have stooped to unbecoming, non-academic levels of criticism and attack. Only one criticism of the work has appeared as a peer-reviewed publication, in The Astrophysical Journal: a paper by D. Rubin and B. Hayden of the Lawrence Berkeley Laboratory, California. According to Rubin and Hayden, there are errors in the Nielsen-Sarkar-Guffanti paper arising from analysing datasets that have been insufficiently corrected for astronomy’s observational biases against fainter objects, especially when measuring highly luminous objects such as SN Ia. Countering this criticism, Sarkar pointed out: “Rubin and Hayden introduce 12 additional parameters (on top of the 10 we have in our [statistical] fit [to the data]) to describe what they say is uncorrected-for ‘Malmquist bias’ in the JLA dataset which we analysed. In fact its authors Betoule et al.…had clearly stated that they had made all necessary corrections.

“Even assuming that this [additional correction by them] is justified, after doubling the number of parameters they [Rubin and Hayden] get barely any improvement in the evidence for acceleration. They confirm our result that the evidence for acceleration is barely 2.8 sigma and even after doing their so-called corrections, it goes up to a mere 3.6 sigma. This is still too low in significance to base the claim of a discovery on even though the dataset has increased from about 50 to 740 SN Ia in the last 20 years.”

One could, of course, argue that there are other pieces of evidence for the existence of dark energy. Indeed, the bulk of the other criticisms, essentially in various blogs, have to do with evidence for dark energy from observations beyond SN Ia measurements. These include measurements of the Cosmic Microwave Background (CMB) radiation, particularly the results from the Planck satellite mission of the European Space Agency (ESA), which operated between 2009 and 2013 to produce a high-resolution map of the CMB and derive fundamental cosmic parameters from it, and the phenomenon called Baryon Acoustic Oscillations (BAO).

The minute fluctuations in the pattern of the CMB are a kind of proxy for the extent of structure within the universe. Dark energy, being repulsive, would act against the formation of such structures, and the distribution of these primordial fluctuations is consistent with the presence of dark energy. Planck’s CMB data are also supposed to tell us about the geometry of the universe, according to which space is close to being flat.

It is the mass-energy density of the universe that determines its geometry. The universe would be flat like a sheet of paper, and infinite in extent, only if its density exactly equals the critical density. However, according to scientists, the observed mass-energy density of the universe is not sufficient to make the geometry “flat”: about 70 per cent seems to be missing. Hypothesising the existence of the mysterious dark energy inferred from the SN Ia data seemed to resolve this discrepancy naturally. So the flatness of the universe as determined by the CMB measurements is taken to be evidence for dark energy.
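The critical density mentioned above follows from standard cosmology as rho_c = 3·H0²/(8·π·G). A quick back-of-the-envelope evaluation is sketched below; the Hubble Constant value of about 70 km/s/Mpc is an assumption for illustration, not a figure taken from this article.

```python
import math

# Critical density of the universe: rho_c = 3 * H0^2 / (8 * pi * G).
# G is the gravitational constant; H0 ~ 70 km/s/Mpc is assumed here.
G = 6.674e-11            # m^3 kg^-1 s^-2
M_PER_MPC = 3.086e22     # metres in one megaparsec
H0 = 70.0 * 1000.0 / M_PER_MPC   # convert km/s/Mpc to s^-1

rho_c = 3.0 * H0 ** 2 / (8.0 * math.pi * G)
print(f"critical density ~ {rho_c:.1e} kg/m^3")  # of order 1e-26 kg/m^3
```

The result, of the order of 10⁻²⁶ kg per cubic metre, is the benchmark against which the observed mass-energy density falls roughly 70 per cent short, which is the gap dark energy was invoked to fill.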

Similarly, BAO are the imprint of the primordial fluctuations seen in the CMB, which in the present epoch manifest in the clustering of galaxies around us. By looking at the distances at which galaxies at different redshifts tend to cluster, it is possible to determine a standard angular-diameter distance of clustering and use it to compare the distances predicted by different cosmological models. The observed angular distances from Planck data apparently support the SCM with an accelerated expansion of the universe.

On cosmological models

Sarkar said in an email to Frontline: “We know that in recent years the SCM of an accelerating universe has been found to be quite consistent with the precision data from the Planck satellite. However, all such tests are indirect—carried out in the framework of an assumed model. For example, dark energy has no direct influence on the CMB. What is deduced from the data is the spatial curvature and the fractional matter content of the universe. Dark energy is then assumed to make up the rest for a spatially flat universe as suggested by the data. But this need not be correct if there are other corrections to the mass-energy density equation.

“So it is quite possible that we are being misled and the apparent manifestation of dark energy is a consequence of analysing the data in an oversimplified theoretical model (which was in fact constructed in the 1930s—before there was any real data).… A more sophisticated theoretical framework accounting for the observation that the universe is not exactly homogeneous and that its matter content may not behave as an ideal gas (two key assumptions of the standard cosmology) may well be able to account for all observations without requiring dark energy, of which we have absolutely no understanding in fundamental theory.”

“Naturally,” Sarkar added, “a lot of work will be necessary to convince the physics community of this, but our work serves to demonstrate that a key pillar of the Standard Cosmological Model is rather shaky indeed. Hopefully, this will motivate better analyses of cosmological data as well as inspire theorists to investigate more nuanced cosmological models.” He also pointed out that one can expect significant progress in resolving the issue when the European Extremely Large Telescope makes observations with a “laser comb” to directly measure over a 10-15-year period whether the expansion rate is indeed growing.
