Abstract
Many baseline correction approaches have been developed to address baseline artifacts observed in measured infrared (IR) absorption spectra during post-processing. These approaches offer distinct advantages and disadvantages, and the choice of which one to employ depends on the complexity of the baseline artifacts present in a particular application. In this paper, we compare the performance of two baseline correction approaches, a frequency-domain polynomial fitting approach and a time-domain modified free induction decay approach, under various baseline scenarios, spectral resolutions, and noise levels for mixtures containing up to 464 species. Our results show that the frequency-domain approach outperformed the time-domain approach by a factor of up to 16 when the baseline was represented by a sine wave with fewer than two cycles over the full spectral range. Conversely, the time-domain approach performed up to 12 times better when the baseline featured two cycles of a sine wave. Additionally, we observed that the time-domain approach was more sensitive to spectral resolution and underperformed when the noise level was high. These findings emphasize the importance of numerically testing a few candidate approaches for a given application, taking into account the baseline characteristics as well as the spectral resolution and noise constraints of the application.
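To illustrate the first of the two approaches, the sketch below shows a common form of frequency-domain polynomial baseline fitting: a low-order polynomial is fit to the spectrum, and points above the fit (likely absorption features) are iteratively clipped so the polynomial tracks the baseline rather than the peaks. This is a minimal, generic sketch using an assumed iterative peak-clipping scheme; the function name and parameters are hypothetical, and the paper's exact implementation may differ.

```python
import numpy as np

def polynomial_baseline_correct(wavenumbers, spectrum, order=3, n_iter=30):
    """Estimate and subtract a polynomial baseline from an absorption spectrum.

    Iterative peak clipping: at each step, fit a polynomial of the given
    order, then clip any point lying above the fit down to the fit, so that
    absorption peaks progressively lose their influence on the baseline
    estimate. Returns (corrected_spectrum, estimated_baseline).
    """
    work = spectrum.copy()
    baseline = np.zeros_like(spectrum)
    for _ in range(n_iter):
        coeffs = np.polyfit(wavenumbers, work, order)
        baseline = np.polyval(coeffs, wavenumbers)
        # Points above the current fit are assumed to be peaks, not baseline.
        work = np.minimum(work, baseline)
    return spectrum - baseline, baseline
```

For a synthetic spectrum consisting of a smooth baseline plus narrow positive peaks, the clipping loop converges toward the true baseline because, away from the peaks, the working spectrum already coincides with it.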
© 2024 The Author(s)