FEAT: add option to crop whitened time series #1030
ColmTalbot wants to merge 15 commits into bilby-dev:main from
Conversation
(force-pushed f39871f to 5fbc454)
```python
@property
def frequency_mask(self):
    frequencies = bilby.core.utils.series.create_frequency_series(
        duration=self.ifo.cropped_duration, sampling_frequency=self.sampling_frequency
    )
    return (self.ifo.minimum_frequency <= frequencies) & (
        frequencies <= self.ifo.maximum_frequency
    )
```
We should probably have a utility function that does this.
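A minimal sketch of what such a utility could look like. The helper name `get_frequency_mask` is an assumption for illustration, and `create_frequency_series` below is a simplified stand-in for `bilby.core.utils.series.create_frequency_series`, not the actual bilby implementation:

```python
import numpy as np

def create_frequency_series(duration, sampling_frequency):
    # Simplified stand-in for bilby.core.utils.series.create_frequency_series:
    # frequencies from 0 to the Nyquist frequency, spaced by 1 / duration.
    number_of_samples = int(duration * sampling_frequency) // 2 + 1
    return np.linspace(0, sampling_frequency / 2, number_of_samples)

def get_frequency_mask(duration, sampling_frequency, minimum_frequency, maximum_frequency):
    # Hypothetical utility: boolean mask selecting the in-band frequency bins,
    # so the property above could reduce to a single call.
    frequencies = create_frequency_series(duration, sampling_frequency)
    return (minimum_frequency <= frequencies) & (frequencies <= maximum_frequency)

mask = get_frequency_mask(
    duration=4, sampling_frequency=2048, minimum_frequency=20, maximum_frequency=1024
)
```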
(force-pushed 0b74e2b to 0972d62)
GregoryAshton left a comment:
Overall looks good and I'm supportive, just a few things I wanted to check.
(force-pushed b8f4b97 to c7e762a)
bilby/gw/detector/interferometer.py (outdated)

```python
else:
    return self.whiten_and_crop(frequency_series=frequency_series)

def whiten_and_crop(self, frequency_series: np.array) -> np.array:
```
It seems like this procedure is more fundamental than suggested by its being a method of the interferometer class. The nice thing about using the gwutils SNR functions was that you could calculate SNRs using them without necessarily initializing an interferometer object. Is it possible to pull this out to the utils and just call it here?
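A hedged sketch of what pulling this into the utils could look like. The function name `frequency_domain_whiten` and its signature are assumptions here, not the actual bilby API; the normalization mirrors the diff above:

```python
import numpy as np

def frequency_domain_whiten(frequency_series, amplitude_spectral_density, duration):
    # Hypothetical module-level utility mirroring the interferometer method:
    # divide by the ASD with the usual duration normalization, and zero out
    # any bins where the ASD is zero or infinite.
    whitened = frequency_series / (amplitude_spectral_density * np.sqrt(duration / 4))
    return np.nan_to_num(whitened)

# Usage without constructing an Interferometer object:
asd = np.full(5, 2.0)
data = np.ones(5, dtype=complex)
white = frequency_domain_whiten(data, asd, duration=4.0)
```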
```python
return np.nan_to_num(whitened) * frequency_mask * (4 / duration)**0.5

def whiten_and_crop(frequency_series, amplitude_spectral_density, frequency_mask, time_mask, duration):
```
Should the name state what input it accepts? E.g., 'frequency_domain_whiten', to make it clear it acts on frequency-domain input?
I can make this change, but it's pretty clear from the signature and docstring.
GregoryAshton left a comment:
A few more comments after another look.
```diff
-* np.sqrt(np.sum(self.frequency_mask)) / frequency_window_factor
+* np.sqrt(np.sum(self.frequency_mask))
+/ frequency_window_factor
+* self.time_mask.mean()**0.5
```
Why does this use both np.sqrt and **0.5? It would probably be easier to read if it were consistent.
I think I was trying to avoid explicitly using numpy, but that's unavoidable since we need an FFT, and sqrt is sometimes meaningfully faster.
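A quick sanity check of the two spellings. The timing comparison is machine- and NumPy-version-dependent, so no conclusion is asserted about which is faster; the point is only that they are numerically interchangeable:

```python
import timeit
import numpy as np

x = np.random.uniform(0.0, 1.0, 100_000)

# The two spellings give identical results; any difference is speed,
# which varies by NumPy version and platform.
assert np.allclose(x ** 0.5, np.sqrt(x))

t_power = timeit.timeit(lambda: x ** 0.5, number=200)
t_sqrt = timeit.timeit(lambda: np.sqrt(x), number=200)
print(f"x**0.5: {t_power:.4f}s  np.sqrt: {t_sqrt:.4f}s")
```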
Suggested change:

```diff
-* self.time_mask.mean()**0.5
+* np.sqrt(self.time_mask.mean())
```
```python
    The frequency series, whitened by the ASD
    """
    return frequency_series / (self.amplitude_spectral_density_array * np.sqrt(self.duration / 4))
    if self.crop_duration == 0:
```
What happens if crop_duration is "None" or None, etc.? The comparison would silently be False and trigger the else clause, likely also erroring, but that could be confusing.
This would fail in the crop_duration.setter in the InterferometerStrainData with a reasonably explicit error message.
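For illustration, a minimal sketch of the kind of validating setter being described. The class name, error messages, and accepted types below are assumptions, not the actual `InterferometerStrainData` code:

```python
class StrainDataSketch:
    # Illustrative stand-in for InterferometerStrainData's crop_duration
    # setter: reject non-numeric values with an explicit error so that a
    # None/"None" never silently reaches the `crop_duration == 0` branch.
    def __init__(self):
        self._crop_duration = 0.0

    @property
    def crop_duration(self):
        return self._crop_duration

    @crop_duration.setter
    def crop_duration(self, value):
        # Exclude bool explicitly: isinstance(True, int) is True in Python.
        if not isinstance(value, (int, float)) or isinstance(value, bool):
            raise TypeError(f"crop_duration must be a number, got {value!r}")
        if value < 0:
            raise ValueError("crop_duration must be non-negative")
        self._crop_duration = float(value)

data = StrainDataSketch()
data.crop_duration = 2
```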
```python
)
_mask = interferometer.frequency_mask

if 'recalib_index' in parameters:
```
Why does the signal no longer require masking here?
Unclear, is this tested?
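One way to check this empirically. The simplified `whiten_and_crop` below mirrors the return statement quoted earlier in this thread, which already multiplies by `frequency_mask`; if that is why the explicit masking was dropped, a second multiplication by the mask should be a no-op. This is a hedged test sketch, not the actual bilby test suite:

```python
import numpy as np

def whiten_and_crop(frequency_series, asd, frequency_mask, duration):
    # Simplified version of the utility quoted above: whitening applies the
    # frequency mask itself, zeroing out-of-band bins.
    whitened = frequency_series / (asd * np.sqrt(duration / 4))
    return np.nan_to_num(whitened) * frequency_mask * (4 / duration)**0.5

mask = np.array([False, True, True, False])
signal = np.ones(4, dtype=complex)
asd = np.full(4, 2.0)
out = whiten_and_crop(signal, asd, mask, duration=4.0)

# Out-of-band bins are already zero, so explicit masking downstream
# would be redundant.
assert np.allclose(out, out * mask)
```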
Co-authored-by: Colm Talbot <talbotcolm@gmail.com>
(force-pushed 962bc37 to 85294b4)
This PR implements the proposed modification to the likelihood from https://iopscience.iop.org/article/10.1088/1361-6382/ae1ac7/meta.

The main new feature is adding a `crop_duration` to the `InterferometerStrainData` and some accompanying utility code. I modified how the `GravitationalWaveTransient` and `BasicGravitationalWaveTransient` call the data. I did not have to change the reference likelihood tests, so there is good evidence that this doesn't impact the numerical values when not using the cropping.

I suspect there is a performance penalty due to repeatedly whitening the data. There should be a way to cache this calculation; it just requires some cleverness about when the time/frequency masks might have changed.

I also haven't validated how this interacts with time/calibration marginalization.
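One possible shape for that caching, sketched under assumptions: the class and attribute names below are hypothetical, and the invalidation is driven by an explicit version counter that callers would bump whenever the strain, ASD, or masks change, rather than by anything bilby currently provides:

```python
import numpy as np

class WhitenedDataCache:
    # Sketch: cache the whitened frequency series, keyed on a version
    # counter that is bumped whenever the inputs (strain, ASD, masks)
    # might have changed.
    def __init__(self, whiten):
        self._whiten = whiten          # function: frequency_series -> whitened
        self._version = 0
        self._cached_version = None
        self._cached = None

    def invalidate(self):
        # Call this whenever the strain, ASD, or time/frequency masks change.
        self._version += 1

    def get(self, frequency_series):
        if self._cached_version != self._version:
            self._cached = self._whiten(frequency_series)
            self._cached_version = self._version
        return self._cached

calls = []
def whiten(series):
    calls.append(1)
    return series / 2

cache = WhitenedDataCache(whiten)
data = np.ones(4)
first = cache.get(data)
second = cache.get(data)   # served from cache; whiten is not called again
cache.invalidate()
third = cache.get(data)    # recomputed after invalidation
```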