“Two-sigma things happen all the time. But three sigma, I’m listening now.”
– Dragan Huterer

For more than 25 years, dark energy has been cosmology’s most convenient term: a name for whatever accelerates the expansion of space rather than slowing it down. The conventional narrative is well rehearsed. Type Ia supernovae, the explosions of white dwarfs, burn with nearly uniform intrinsic brightness and so serve as “standard candles”: from how bright a supernova appears, and how much the expansion has redshifted its light, astronomers can work out its distance and how far back in time they are looking. That strategy led, in the late 1990s, to the discovery that cosmic expansion is accelerating.
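The standard-candle logic can be sketched numerically with the textbook distance-modulus relation. The magnitudes below are illustrative values, not from any survey:

```python
def luminosity_distance_pc(m_apparent: float, M_absolute: float) -> float:
    """Distance-modulus relation: m - M = 5 * log10(d / 10 pc),
    solved for the luminosity distance d in parsecs."""
    return 10 ** ((m_apparent - M_absolute + 5) / 5)

# Type Ia supernovae peak near absolute magnitude M ~ -19.3
# (an illustrative standard-candle value).
M_IA = -19.3
d_pc = luminosity_distance_pc(24.0, M_IA)  # a faint, distant supernova
print(f"{d_pc / 1e6:.0f} Mpc")  # prints "4571 Mpc"
```

The real analyses layer corrections (light-curve width, color, dust) on top of this relation, but the core inference from apparent brightness to distance is just this.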

But the same instrument that originally secured the acceleration is now being honed in ways that make the neatly told narrative harder to retain. A group led by Saul Perlmutter has reanalyzed 2,087 Type Ia supernovae drawn from 24 datasets, covering roughly seven billion years of expansion history. The headline is methodological: uncertainties are propagated through a hierarchical model that behaves more realistically, partial information is not thrown away, and subtle instrumental effects, such as telescope-filter drift, are accounted for. By tightening the bookkeeping of what is known versus what is merely inferred, the analysis aims to minimize the chance that any apparent dark-energy behavior is an artifact of combining the data.
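A toy illustration of why careful uncertainty bookkeeping matters when combining datasets. This is simple inverse-variance weighting of Gaussian measurements, not the actual hierarchical Bayesian treatment, and the numbers are made up:

```python
def inverse_variance_combine(values, sigmas):
    """Combine independent Gaussian measurements of the same quantity,
    weighting each by 1 / sigma**2. Returns (combined_mean, combined_sigma).
    A toy stand-in for the full hierarchical propagation of uncertainties."""
    weights = [1.0 / s ** 2 for s in sigmas]
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    return mean, total ** -0.5

# Two hypothetical measurements with very different precision: the tighter
# one dominates the combination, so misstating a sigma shifts the answer.
mean, sigma = inverse_variance_combine([-1.05, -0.80], [0.05, 0.18])
print(round(mean, 3), round(sigma, 3))  # prints "-1.032 0.048"
```

Even in this caricature, the combined result is only as trustworthy as the error bars fed into it, which is why the reanalysis spends its effort on how each dataset's uncertainties are modeled.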
The payoff is not a declaration but a continuing tension: the data once again hint that the cosmological constant may not be constant after all. In the Einstein-inspired picture, with a fixed dark-energy density, the expansion accelerates forever and the universe tends toward an ever thinner, colder future. If the dark-energy density is instead decreasing, the acceleration weakens, altering the long-term trajectory in ways cosmologists have historically filed under “exotic,” including the possibility that gravity eventually reasserts its dominance.
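The distinction between a true cosmological constant and a drifting dark energy can be made concrete with the standard scaling of density with the cosmic scale factor, rho ∝ a^(-3(1+w)), where w is the equation-of-state parameter. A minimal sketch, assuming a constant w:

```python
def dark_energy_density_ratio(a: float, w: float) -> float:
    """Dark-energy density relative to today, rho(a) / rho_0 = a**(-3 * (1 + w)),
    for a constant equation of state w (standard FRW scaling).
    a = 1 is today; a > 1 is the future."""
    return a ** (-3 * (1 + w))

# w = -1 (cosmological constant): the density never changes.
# w = -0.8: the density dilutes as space expands.
for w in (-1.0, -0.8):
    ratios = [round(dark_energy_density_ratio(a, w), 3) for a in (0.5, 1.0, 2.0)]
    print(w, ratios)
```

Any w above -1 means the density thins out as the universe grows, which is exactly the kind of weakening acceleration the new hints point toward.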
The question is also being pressed from directions other than the supernovae. An alternative approach, three-dimensional mapping of the universe using galaxy redshifts and the baryon acoustic oscillation “standard ruler,” has produced its own indications that dark energy may be changing. DESI has already placed over 40 million objects on an expansion timeline spanning 11 billion years. The interesting aspect of that analysis is its direction: a dark-energy effect that appears weaker today than it was in the past.
Those maps industrialize a previously patient art. DESI’s 5,000 robotic fiber positioners lock onto galaxies and quasars within minutes, collecting spectra from which redshifts can be measured precisely. The distances rest on the sound-wave pattern frozen into the matter of the early universe, a characteristic scale of roughly 500 million light-years, which turns large-scale clustering into a ruler that can be laid down again and again across cosmic time.
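The standard-ruler idea reduces, in a simplified small-angle form, to inferring distance from a known physical length and a measured angle on the sky. This sketch ignores the redshift-dependent factors a real BAO analysis applies, and the numbers are illustrative:

```python
import math

def standard_ruler_distance(ruler_length: float, observed_angle_rad: float) -> float:
    """Small-angle standard-ruler relation: distance = L / theta.
    The result carries the same units as ruler_length."""
    return ruler_length / observed_angle_rad

# Illustrative only: a ~150 Mpc BAO scale observed subtending ~1 degree
# implies a distance of roughly 8,600 Mpc, before the corrections
# a real analysis would apply.
theta = math.radians(1.0)
print(round(standard_ruler_distance(150.0, theta)), "Mpc")
```

The supernova and BAO methods thus measure distance through entirely different physics, one from brightness, one from angular scale, which is why agreement between them carries weight.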
In the meantime, the supernova community has been strengthening its own foundations. The Dark Energy Survey’s supernova cosmology program compiled 1,499 high-quality Type Ia supernovae, the largest deep sample from a single telescope, and reported w = -0.80 ± 0.18 from the supernovae alone, and a result consistent with w = -1 when combined with Planck. “w is tantalizingly not exactly on –1, but close enough that it’s consistent with –1,” said Tamara Davis. The number matters less than what it represents: a disciplined attempt to ask whether dark energy behaves exactly like a cosmological constant or only approximately so.
In all these endeavours, the engineering story is as consequential as the cosmological one. The question “Is dark energy changing?” is inseparable from calibration, selection effects, classification uncertainty, and the problem of combining heterogeneous surveys without importing hidden bias. That is why the Perlmutter reanalysis emphasizes probabilistic bookkeeping, and why DES pioneered photometric workflows and machine-learning classification that future surveys can scale up.
The next constraints are already taking shape. Euclid has begun publishing wide-field calibrated images and catalogues, and the mission’s first core cosmology data release is due in October 2026. With ground-based surveys both enlarging the supernova samples and mapping large-scale structure, the field is converging on a straightforward prospect: if dark energy is drifting, the drift will have to survive not one mode of measurement but an array of them, each with different failure modes.

