The hardest thing about studying dark energy is that it does not present itself directly; it manifests only in how the Universe stretches and how matter accumulates. A six-year observational campaign has now turned that indirect evidence into a sharper, internally cross-checked picture, one that narrows the range of possible cosmic histories and exposes a persistent mismatch in how structure appears.

That new clarity comes from the Dark Energy Survey (DES), which mounted the 570-megapixel Dark Energy Camera on the Víctor M. Blanco 4-meter Telescope in Chile. Over 758 nights between 2013 and 2019, DES captured light from 669 million galaxies spread across one-eighth of the sky, objects far beyond our own galaxy whose distribution carries billions of years of cosmic history.
What makes the latest analysis most interesting is that it is not a single measurement but a deliberate convergence. DES combined four probes that register dark energy through four distinct physical mechanisms: Type Ia supernovae (markers of distance), baryon acoustic oscillations (a length scale frozen in by the early Universe), galaxy clustering (patterns in how matter concentrates), and weak gravitational lensing (distortions imprinted on light as it travels through intervening mass). “It is an incredible feeling to see these results based on all the data, and with all four probes that DES had planned,” said Yuanyuan Zhang of NSF NOIRLab. “This was something I would have only dared to dream about when DES started collecting data, and now the dream has come true.”
What makes such a synthesis work is statistical discipline at scale. Weak lensing, in particular, requires galaxy shapes to be measured finely enough that the minute correlations generated by the “shearing” of background images by foreground mass can be extracted from instrumental and astrophysical noise. DES made its analysis more robust by cross-correlating lensing with clustering, checking whether galaxies that appear similarly distorted also occupy the same regions of the sky. The combination can be used to reconstruct how matter has been distributed over the last 6 billion years, turning subtle, noisy signals into a time-resolved map of structure formation.
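As a concrete (and heavily simplified) illustration of the shear statistics described above, here is a toy estimator of the rotation-invariant shear correlation xi_plus for a small mock catalog. Real DES pipelines use optimized tree codes, tomographic redshift bins, and careful shape-measurement calibration; none of that appears in this sketch, and the function name and binning are our own choices, not DES code.

```python
import numpy as np

def xi_plus(pos, e1, e2, theta_bins):
    """Crude flat-sky estimator of the shear correlation xi_+(theta):
    the mean product of ellipticity components for galaxy pairs in
    each separation bin.  Production codes also compute xi_-, which
    requires rotating ellipticities into each pair's frame; xi_+ is
    rotation-invariant, so this sketch can skip that step."""
    n = len(pos)
    # all pairwise separations (O(n^2); fine for a toy catalog)
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)              # count each pair once
    sep = d[iu]
    # Re[e * conj(e')] = e1*e1' + e2*e2', the xi_+ integrand
    e = e1 + 1j * e2
    prod = (e[iu[0]] * np.conj(e[iu[1]])).real
    sums, _ = np.histogram(sep, bins=theta_bins, weights=prod)
    counts, _ = np.histogram(sep, bins=theta_bins)
    return sums / np.maximum(counts, 1)       # mean per bin
```

A uniform applied shear is a quick sanity check: if every galaxy has ellipticity (0.1, 0), every bin of xi_plus should return 0.01.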
On the theory side, DES confronted two well-known frameworks: the standard LCDM model, in which dark energy is a constant energy density, and wCDM, in which the density of dark energy is allowed to evolve with time. The data were consistent with both LCDM and wCDM, with no strong preference for either, a result that narrows but does not collapse the space of possibilities.
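The difference between the two frameworks can be made concrete through the expansion rate of a flat universe. In the sketch below (illustrative parameter values, not DES best fits), setting the equation-of-state parameter w = -1 recovers LCDM's constant dark-energy density, while any other constant w makes that density evolve with redshift:

```python
import numpy as np

def hubble(z, H0=70.0, Om=0.3, w=-1.0):
    """Hubble rate H(z) in a flat universe with matter density Om and
    a dark-energy component of constant equation of state w.  For
    w = -1 the dark-energy term is constant (LCDM); otherwise its
    density scales as (1+z)**(3*(1+w)) (wCDM)."""
    Ode = 1.0 - Om                     # flatness: Omega_DE = 1 - Omega_m
    return H0 * np.sqrt(Om * (1 + z) ** 3
                        + Ode * (1 + z) ** (3 * (1 + w)))

# How much faster would a w = -0.9 universe expand at each redshift?
z = np.linspace(0.0, 2.0, 5)
ratio = hubble(z, w=-0.9) / hubble(z, w=-1.0)
```

Because w = -0.9 gives dark energy a higher density in the past, the ratio rises above 1 at nonzero redshift; distinguishing such percent-level differences is exactly what the survey's distance and clustering probes are built for.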
One tension did sharpen. Both LCDM and wCDM use information about the early Universe to predict how strongly matter ought to cluster in the late Universe; DES once again found that the measured clustering amplitude falls short of those predictions, and the discrepancy grew more pronounced with the full dataset. According to Regina Rameika, the DES findings add a new dimension to our understanding of the Universe and its expansion, showing how years of sustained investment and the combination of multiple forms of analysis can shed light on some of its largest mysteries.
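The clustering amplitude in play here is conventionally quoted through the parameter S8 = sigma8 * sqrt(Omega_m / 0.3), the combination weak lensing constrains best. The numbers in this sketch are illustrative placeholders, not the actual DES measurement or the early-Universe prediction:

```python
import math

def s8(sigma8, omega_m):
    """Clustering amplitude S8 = sigma8 * sqrt(Omega_m / 0.3), the
    standard parameter combination in which the lensing 'tension'
    with early-Universe predictions is usually expressed."""
    return sigma8 * math.sqrt(omega_m / 0.3)

# Illustrative (not official) values: an early-Universe-extrapolated
# prediction versus a late-time lensing measurement a few percent lower.
s8_early = s8(0.81, 0.315)
s8_late = s8(0.78, 0.300)
tension_pct = 100.0 * (s8_early - s8_late) / s8_early
```

Whether a few-percent offset like this reflects new physics or residual systematics is precisely what combining four independent probes is meant to resolve.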
An improvement in precision is already in the pipeline: DES's approach is being scaled up by the Legacy Survey of Space and Time at the Vera C. Rubin Observatory, which will catalog approximately 20 billion galaxies. Rubin's wide-field imaging is intended to turn weak-lensing “cosmic shear” into an even more detailed, more fully three-dimensional accounting of mass through time, precisely the sort of redundancy and scale needed when the object of interest is a force known only through its effects.

