Which outlives the other: a black hole or a neutron star? New calculations put the two on the same clock, and that one change ripples all the way to a surprisingly earlier time for the "last light" of the universe.

The story begins with Stephen Hawking's 1975 proposal that black holes are not entirely black. Quantum fluctuations at the edge of a black hole allow pairs of particles to appear briefly; when one falls in and the other escapes, the escaping particle carries away energy. Over vast stretches of time this leakage of mass causes evaporation, which at the time seemed to conflict with the intuition, rooted in general relativity, that the only thing black holes do is grow.
A team at Radboud University, including Heino Falcke, Michael Wondrak and Walter van Suijlekom, pushes that logic beyond black holes. They show that the curvature of spacetime alone can separate particle pairs even in the absence of an event horizon, generalizing the Hawking effect into the production of "gravitational pairs". In that picture, compact remnants such as neutron stars and white dwarfs are not merely cold relics of stellar evolution but slow emitters, shedding energy at a rate set mainly by their density. The result is a universe with a hard ceiling on how long any compact object can persist, imposed by the same quantum accounting that produces Hawking radiation.
In the paper, framed as an upper limit on the lifetime of stellar remnants, the authors calculate a maximum cosmic timescale of approximately 10^78 years. That is inconceivably long, yet far shorter than earlier estimates, which let the universe drift on for about 10^1100 years by assuming that only black holes evaporate while other remnants effectively survive forever.
One counterintuitive fact makes the argument work: neutron stars and stellar-mass black holes come out with nearly identical evaporation times of roughly 10^67 years. What sets the rate is not the "strength of gravity" in the everyday sense but density. The team explains the match by a physical quirk that slows the black hole's losses: lacking a material surface, a black hole reabsorbs some of what it effectively emits, damping the net evaporation below what simple intuition would suggest.
The long-haul holdouts in this tally are white dwarfs, the burnt-out cores left behind by stars like the Sun. That is precisely why they matter for the universe's final chapter: once all condensed matter has gradually evaporated, the last significant landmark is not the fading of starlight but the disappearance of the remnants of the oldest stars themselves.
The researchers even ran the machinery on deliberately odd cases to stress-test the equations, estimating that a human body and the Moon would each need on the order of 10^90 years to evaporate by the same mechanism. Those figures are not predictions about everyday objects; they are sanity checks showing how the equations behave when applied to wildly different masses and densities.
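The density dependence invites a back-of-the-envelope check. Below is a minimal sketch assuming the evaporation time scales with mean density as τ ∝ ρ^(−3/2), which is the scaling implied by Hawking's τ ∝ M³ for black holes (whose mean density goes as M^(−2)), normalized to the roughly 10^67-year neutron-star figure. The densities and the normalization here are illustrative textbook-level assumptions, not values taken from the paper.

```python
import math

# Illustrative assumptions (not from the paper):
TAU_NS_YEARS = 1e67   # quoted evaporation time for a neutron star, in years
RHO_NS = 5e17         # kg/m^3, typical neutron-star mean density

def evaporation_time(rho_kg_m3: float) -> float:
    """Order-of-magnitude evaporation time in years, assuming a
    density-only scaling tau ∝ rho^(-3/2) normalized at neutron-star
    density."""
    return TAU_NS_YEARS * (rho_kg_m3 / RHO_NS) ** -1.5

# Extrapolate to much less dense objects: lower density means a
# dramatically longer evaporation time under this scaling.
for name, rho in [("white dwarf", 1e9), ("Moon (rock)", 3.3e3)]:
    t = evaporation_time(rho)
    print(f"{name}: ~10^{math.log10(t):.0f} years")
```

Plugging in a white-dwarf-scale density lands in the 10^78–10^80 range, and ordinary rock densities land near 10^90, which is consistent with the order-of-magnitude figures quoted above.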
Set alongside broader end-of-universe models, such as the long "heat death" fade as expansion continues and star formation winds down, the Radboud result constrains one axis of uncertainty: how long the universe can sustain any concentrated structure at all. The focus shifts from dramatic endings to a steady, density-governed erosion in which even the most inert cosmic objects take part in the same sluggish accounting of quantum fields in curved spacetime.

