The LISA Data Challenge Radler Analysis and Time-dependent Ultra-compact Binary Catalogues

Kristen Lackeos, Tyson B. Littenberg, Neil J. Cornish, James I. Thorpe
General Relativity and Quantum Cosmology (gr-qc)
A&A 678, A123 (2023)
2023-08-23 16:00:00
Context. Galactic binaries account for the loudest combined continuous gravitational-wave signal in the Laser Interferometer Space Antenna (LISA) band, which spans frequencies from 0.1 mHz to 1 Hz.

Aims. The LISA data stream comprises a superposition of low-frequency Galactic and extragalactic signals and instrument noise. Resolving as many Galactic binary signals as possible, and characterising the unresolved Galactic foreground noise that remains after their subtraction from the data, is a necessary step towards a global-fit solution to the LISA data.

Methods. We analyse a simulated gravitational-wave time series containing tens of millions of ultra-compact Galactic binaries, each hundreds of thousands of years from merger. This data set, called the Radler Galaxy, is part of the LISA Data Challenges. We use a Markov chain Monte Carlo search pipeline specifically designed to perform a global fit to the Galactic binaries and the detector noise. Our analysis is performed for increasingly long observation times of 1.5, 3, 6, and 12 months.

Results. We show that after one year of observing, as many as ten thousand ultra-compact binary signals are individually resolvable. Ultra-compact binary catalogues corresponding to each observation time are presented. The Radler Galaxy is a training data set, so the binary parameters of every signal in the data stream are known. We compare our derived catalogues to the LISA Data Challenge Radler catalogue to quantify the detection efficiency of the search pipeline. The appendix presents a more detailed analysis of two corner cases that provide insight into future improvements to our search pipeline.