Hi Justin,
The problem is that this isn't a simple thing to do reliably across instrument platforms. Historically, one would take the background-subtracted signal at the apex of the chromatographic peak and divide by the standard deviation of the background (ideally from the blank). The Orbitrap platforms complicated this, as those instruments perform a significant amount of signal processing to eliminate the background, making the S/N appear artificially large.

Because of this, our lab generally uses the reciprocal of the coefficient of variation as the S/N. Basically, the signal is the mean area and the SD is the noise, so a CV of 10% corresponds to S/N = 10, a CV of 20% to S/N = 5, and so on. Importantly, using the standard deviation of the measurement, and not just the background, accounts for shot noise in the estimate of the noise value. That said, with instruments capable of single-ion detection I do see value in calculating the standard deviation of the background, and we have started discussing how to add that to what Skyline calculates.
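In case it helps, here is a minimal sketch of that 1/CV calculation in plain Python (this is not anything Skyline produces; the function name and example areas are purely illustrative):

    # Approximate S/N as 1/CV from replicate peak areas:
    # S/N ~ mean(area) / SD(area) = 1 / CV
    import statistics

    def approx_signal_to_noise(replicate_areas):
        mean_area = statistics.mean(replicate_areas)  # the "signal"
        sd_area = statistics.stdev(replicate_areas)   # the "noise" (sample SD)
        return mean_area / sd_area                    # equals 1 / CV

    # Hypothetical replicate peak areas for one analyte:
    areas = [1.05e6, 0.98e6, 1.12e6, 1.01e6]
    print(approx_signal_to_noise(areas))  # a CV of 10% would give S/N = 10

Note that this uses the sample standard deviation across replicates, which is why it captures shot noise as described above.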
Thanks for requesting this on the support board. We hope to add a feature like this in the future. For the time being, I think 1/CV is your best way of approximating this, as long as you have replicate measurements.
Cheers,
Mike