The recently published LIGO signal was extremely strong; it was detected using the upgraded, more sensitive version of the previous LIGO setup. Since the signal of a black hole merger is described by only a few parameters, this raises the question of why one couldn't extract information about the probability distribution over the parameters describing the mergers directly from the noise, without necessarily being able to identify any individual events.
1 Answer
For very weak signals, such as those produced by heavier binaries, there is not enough information to do what you suggest. However, if you see the same event in two or more detectors, you can be confident about your observation. It also becomes possible if the signal lasts for a long time. The overall sensitivity of the detector is not the primary concern; what matters is the sensitivity at low frequency. The time to merge depends on the frequency at which you start observing. If you start at 1 Hz, the time to merge is almost a full day. With a signal lasting that long, there is no need for another detector (there is if you want to improve the localization of the signal). However, pushing detector sensitivity down to 1 Hz is far out of reach right now; current detectors are insensitive to any frequency below 25 Hz.
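The dependence of signal duration on starting frequency can be sketched with the leading-order (Newtonian quadrupole) time-to-coalescence formula, τ = (5/256) (GM_c/c³)^(−5/3) (πf)^(−8/3), where M_c is the chirp mass. This is a minimal illustration, not the pipeline's actual waveform model; the chirp-mass values below are assumed examples (a light neutron-star-like binary and a heavy black-hole-like one), not numbers taken from the answer.

```python
import math

# Physical constants (SI units)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def time_to_merge(f_start_hz, chirp_mass_msun):
    """Leading-order (Newtonian quadrupole) time to coalescence, in seconds,
    for a binary observed from gravitational-wave frequency f_start_hz:
        tau = (5/256) * (G*Mc/c^3)**(-5/3) * (pi*f)**(-8/3)
    """
    gm = G * chirp_mass_msun * M_SUN / c**3  # chirp mass expressed in seconds
    return (5.0 / 256.0) * gm ** (-5.0 / 3.0) * (math.pi * f_start_hz) ** (-8.0 / 3.0)

# Assumed illustrative chirp masses: ~1.2 Msun (1.4+1.4 neutron stars),
# ~30 Msun (a GW150914-like black-hole binary).
for mc in (1.2, 30.0):
    for f in (1.0, 25.0):
        print(f"Mc = {mc:5.1f} Msun, f_start = {f:4.1f} Hz: "
              f"tau = {time_to_merge(f, mc):.3g} s")
```

The τ ∝ f^(−8/3) scaling is the key point: lowering the starting frequency from 25 Hz to 1 Hz lengthens the observable signal by a factor of 25^(8/3) ≈ 5000, and the M_c^(−5/3) factor shows why heavier binaries spend much less time in band.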
Marginalizing over these parameters yields much less sensitive results. Having said this, there are plans to search for sub-threshold events.
– OTH Nov 25 '17 at 01:31