The distance duality relation tells us how, assuming that photons propagate on null geodesics in a pseudo-Riemannian spacetime and that their number is conserved, the luminosity and angular diameter distances are related:

$$ d_{L}(z) = (1+z)^2 \, d_{A}(z), $$
where d_{L} is the luminosity distance, d_{A} the angular diameter distance and z the redshift. This relation was introduced by Etherington in 1933, and as such is sometimes called the Etherington reciprocity theorem.
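As a quick sanity check of the relation, the two distances can be computed in a standard flat ΛCDM background and their ratio compared against $(1+z)^2$. This is a minimal sketch with illustrative cosmological parameters (not values taken from either paper):

```python
import numpy as np
from scipy.integrate import quad

# Illustrative flat LambdaCDM parameters (assumptions, not the papers' values).
H0 = 70.0          # Hubble constant, km/s/Mpc
Om = 0.3           # matter density parameter
c = 299792.458     # speed of light, km/s

def comoving_distance(z):
    """Line-of-sight comoving distance in Mpc."""
    integrand = lambda zp: 1.0 / np.sqrt(Om * (1 + zp)**3 + (1 - Om))
    return (c / H0) * quad(integrand, 0.0, z)[0]

def d_A(z):
    """Angular diameter distance in Mpc."""
    return comoving_distance(z) / (1 + z)

def d_L(z):
    """Luminosity distance in Mpc."""
    return (1 + z) * comoving_distance(z)

# When duality holds, d_L / d_A = (1+z)^2 for any z, e.g. 4 at z = 1.
ratio = d_L(1.0) / d_A(1.0)
```

Both distances are built from the same comoving distance, so the ratio is exactly $(1+z)^2$ whenever the standard assumptions hold; a measured departure from this ratio is what the rest of the post is about.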

To violate the distance duality relation, you need to introduce some mechanism by which photons can be produced or destroyed, give the photons a mass so that they no longer propagate on null geodesics, or consider a theory of gravity starkly different from general relativity.
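A deviation is usually quantified by a duality parameter $\eta(z) = d_L / ((1+z)^2 d_A)$, which equals one when the relation holds. A common phenomenological choice, used here purely as an illustration (it is not necessarily the specific parameterisation of either paper), is a power law in $(1+z)$ controlled by a single parameter:

```python
import numpy as np

def eta(z, eps):
    """Distance duality parameter eta(z) = d_L / ((1+z)^2 d_A).

    eta = 1 recovers the Etherington relation. The power-law form
    (1+z)^eps is a common phenomenological choice, used here as an
    assumption for illustration.
    """
    return (1.0 + np.asarray(z))**eps

# eps = 0 means no violation at any redshift.
no_violation = eta([0.0, 0.5, 1.0], 0.0)
```

With this form, constraining duality violation reduces to constraining the single number `eps`.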

Detecting a deviation from this relation is therefore an exciting goal for future surveys. Two of the most promising upcoming probes are standard sirens (gravitational wave events with an electromagnetic counterpart) and strongly lensed Type Ia supernovae.

## Standard sirens

In arxiv.org/abs/2007.14335, we considered a specific physical model in which photons decay into axions in the presence of magnetic fields, thus violating the distance duality relation. We created mock datasets for the Einstein Telescope, LSST and DESI to investigate the constraints these surveys will provide on such a model. We found that these surveys will achieve around a 3% constraint on the parameter we used to characterise the violation. We also investigated what happens if the presence of modified gravity is neglected in the analysis, finding that it can lead to extremely significant (~10σ) false detections of distance duality violation.
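The effect of photon loss on the inferred distance can be sketched schematically: if only a fraction of the emitted photons survives the trip, the measured flux is suppressed and the luminosity distance inferred from it is inflated. This is a generic illustration of photon-number non-conservation, not the specific photon–axion survival probability used in the paper:

```python
import numpy as np

def d_L_observed(d_L_true, P_survival):
    """Inferred luminosity distance when photons are lost en route.

    If a probability P_survival of photons reaches the observer (e.g.
    after photon-axion conversion in magnetic fields), the flux drops
    by that factor, and since flux ~ 1/d_L^2 the inferred distance is
    inflated by 1/sqrt(P_survival). Schematic relation only; the
    paper's model specifies its own P(z).
    """
    return np.asarray(d_L_true) / np.sqrt(P_survival)
```

Since gravitational-wave distances are unaffected by photon loss, comparing a standard siren's distance to an electromagnetic one to the same redshift isolates exactly this kind of effect.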

## Strongly lensed supernovae

In our paper out today, arxiv.org/abs/2010.04155, we were more agnostic, forgoing a specific model for distance duality breaking and instead investigating how strongly lensed Type Ia supernovae (SNIa) could allow the violation to be reconstructed as a function of redshift.

Strongly lensed Type Ia supernovae are an excellent tool for this purpose, as they allow both a luminosity distance and an angular diameter distance to be measured to the same object. While only one strongly lensed Type Ia supernova has been observed to date, LSST is expected to observe up to one hundred such systems.
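The per-system test is then straightforward: the standardised light curve yields a luminosity distance, the lensing observables yield an angular diameter distance (schematically; the real analysis works through the time-delay distance and lens modelling), and their ratio tests duality directly. A minimal sketch:

```python
def eta_from_lensed_sn(z, d_L_meas, d_A_meas):
    """Per-system duality test from one strongly lensed SNIa.

    d_L_meas: luminosity distance from the standardised light curve.
    d_A_meas: angular diameter distance from the lensing analysis
              (schematic; in practice this comes via the time-delay
              distance and a lens model).
    Returns eta = 1 when the Etherington relation holds.
    """
    return d_L_meas / ((1.0 + z)**2 * d_A_meas)
```

Each lensed system therefore provides an independent, self-contained measurement of $\eta$ at its own redshift, which is what makes a redshift-resolved reconstruction possible.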

After creating our mock strongly lensed Type Ia supernova datasets, we first used an MCMC analysis to constrain the parameter which controls the violation of the distance duality relation, obtaining constraints for three different mock catalogues: twenty lenses (realistic), one hundred lenses (optimistic) and one thousand lenses (futuristic). We found that with only twenty lensed systems, the constraints are competitive with current combined SNIa and baryon acoustic oscillation (BAO) constraints, while one thousand lensed systems reach a sensitivity comparable with the expected constraining power of the Euclid BAO survey combined with the full LSST (unlensed) SNIa survey (see arxiv.org/abs/2007.16153 for more details on the Euclid constraint).
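The way the constraint tightens with catalogue size can be sketched with a back-of-envelope scaling: for independent systems of comparable quality, the 1σ error on a single violation parameter shrinks as $1/\sqrt{N}$. The per-system error below is a hypothetical number for illustration, not one from the paper:

```python
import numpy as np

def forecast_sigma(sigma_per_system, n_systems):
    """1-sigma forecast on one parameter from n independent,
    equally informative systems (simple 1/sqrt(N) scaling)."""
    return sigma_per_system / np.sqrt(n_systems)

# Hypothetical per-system fractional error of 20% on the duality parameter.
sigma_1 = 0.20
forecasts = {n: forecast_sigma(sigma_1, n) for n in (20, 100, 1000)}
```

Going from twenty to one thousand systems tightens the constraint by a factor of $\sqrt{50} \approx 7$ under this scaling, which is why the futuristic catalogue becomes competitive with combined next-generation surveys.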

We then used two machine learning approaches, genetic algorithms and Gaussian processes, to reconstruct the distance duality relation as a function of redshift in three different fiducial cosmologies: no violation of distance duality, a weak violation and a strong violation. We found that both approaches learn well from the mock data, even in the most conservative case of twenty available lensed systems, recovering the underlying fiducial model and yielding 1σ constraints fully compatible with those from the parameterised approach.
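The Gaussian process side of the reconstruction can be illustrated with a minimal regression sketch: given noisy measurements of the duality parameter at a set of redshifts, an RBF-kernel GP returns a mean curve and a 1σ band at any test redshift. This is a bare-bones illustration of the technique, not the paper's pipeline; the kernel hyperparameters below are assumed values:

```python
import numpy as np

def gp_reconstruct(z_train, y_train, z_test, sigma_n=0.05, ell=0.5, A=1.0):
    """Minimal Gaussian-process regression with an RBF kernel.

    z_train, y_train: noisy measurements of eta(z) (zero-mean prior
    is assumed for simplicity). sigma_n, ell, A are assumed noise and
    kernel hyperparameters, fixed here rather than optimised.
    Returns the predictive mean and 1-sigma band at z_test.
    """
    def k(a, b):
        return A**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

    K = k(z_train, z_train) + sigma_n**2 * np.eye(len(z_train))
    Ks = k(z_test, z_train)
    Kss = k(z_test, z_test)
    mean = Ks @ np.linalg.solve(K, y_train)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))
```

Unlike the parameterised MCMC fit, this reconstruction makes no assumption about the functional form of the violation, which is what makes the agreement between the two approaches a meaningful cross-check.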
