Anonymous
Guest
Speaking of technical terms, I caught the term "warbling" being used, which I think means the low-level, low-frequency "noise" caused by things such as external signals being demodulated by the correlated double sampling receiver commonly used in pulse induction detectors.
External controls are provided on some detectors to "tweak" the transmit frequency (transmit period or receive period), which I guess also changes the two-sample spacing. On the pulse induction detectors I have built, I have had the ability to adjust the transmit on and off periods, the delays to the first samples (two channels), the delay between samples, and the sample widths. Basically, I could adjust everything in an analog fashion. Tweaking all of these could have an interesting effect, and I could find some real sweet spots where the "noise" (unwanted signal) would go way down, so I could increase the total gain a lot and the sensitivity would go way up.
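Just to make that sweet-spot hunting concrete, here is a rough simulation sketch I put together. Every number in it is an assumption for illustration only (a nearby 49,990 Hz interferer, a pulse repetition rate swept around 1 kHz, a 15 us first-sample delay, a 30 us spacing to the second sample, and a crude 50-pulse moving average standing in for the detector's output filtering), not anything from a real detector:

    import numpy as np

    F_INT   = 49990.0    # Hz, external interfering transmitter (assumed)
    N_PULSE = 5000       # pulses simulated per timing setting

    def warble_level(f_prf):
        k  = np.arange(N_PULSE)
        t1 = k / f_prf + 15e-6                 # first (signal) sample each pulse
        t2 = t1 + 30e-6                        # second (reference) sample
        demod = np.sin(2*np.pi*F_INT*t1) - np.sin(2*np.pi*F_INT*t2)   # sample pair
        smooth = np.convolve(demod, np.ones(50)/50, mode='valid')     # crude low pass
        return smooth.std()                    # what is left wobbles the threshold

    for f_prf in (990, 995, 998, 1000, 1002, 1005, 1010):
        print(f"PRF {f_prf:4d} Hz -> residual warble {warble_level(f_prf):.4f}")

Running something like this shows the residual jumping up when the interferer sits near a harmonic of the repetition rate and dropping way down for small timing tweaks either side of it, which is the same behavior I see on the bench.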
Of course, the external signal is "beating" with the demodulator, and all sorts of beat-type waveforms of various amplitudes are generated as the timing is changed. I see this warbling (if that is what it is) as the greatest limiting factor to a really great pulse induction detector.
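For what it's worth, my simple mental model of why those beats appear (my own simplification, not from anyone's literature): the demodulator effectively samples the external signal once per transmit pulse, so an interfering tone folds down to its offset from the nearest harmonic of the pulse repetition frequency, and that offset is the beat that crawls through the output filter. Roughly:

    def beat_frequency(f_interferer_hz, f_prf_hz):
        # Offset of the interfering tone from the nearest harmonic of the pulse
        # repetition frequency; this is the frequency the demod output warbles at.
        # A simplification that ignores sample width and the sample-pair subtraction.
        n = round(f_interferer_hz / f_prf_hz)
        return abs(f_interferer_hz - n * f_prf_hz)

    print(beat_frequency(49990.0, 1000.0))   # 10 Hz: slow warble, hard to filter out
    print(beat_frequency(49990.0,  998.0))   # 90 Hz: easy for the output filter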
My question is this: has anyone ever really gotten their arms around this effect? Describing it mathematically might help, but are there any clever ways to reduce it? Extreme many-pole low-pass filtering makes it go down, but it also makes the response of the detector slow. I am also not sure how anyone gets away without letting the operator adjust at least one parameter to reduce this, unless they just limit the sensitivity of their detector and make it slow so the operator won't have to mess with it (his loss).
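Here is a rough look at that filtering trade-off as I understand it (corner frequency, warble frequency, sample rate and pole counts are all assumed numbers for illustration): each extra identical single-pole section knocks the beat down by another factor, but the 10-90% rise time of the detector's response grows right along with it.

    import numpy as np

    F_CUT    = 5.0      # Hz, corner of each single-pole section (assumed)
    F_WARBLE = 10.0     # Hz, beat/warble frequency to be suppressed (assumed)
    FS       = 10000.0  # Hz, simulation rate
    alpha    = 1.0 - np.exp(-2 * np.pi * F_CUT / FS)   # one-pole smoothing factor

    def cascade_step(n_poles, t_end=2.0):
        n_samp = int(t_end * FS)
        x = np.ones(n_samp)                      # unit step input (target appears)
        for _ in range(n_poles):                 # run it through n identical poles
            y = np.empty(n_samp)
            state = 0.0
            for i in range(n_samp):              # y[i] = y[i-1] + alpha*(x[i] - y[i-1])
                state += alpha * (x[i] - state)
                y[i] = state
            x = y
        return np.arange(n_samp) / FS, x

    for n in (1, 2, 4, 8):
        # one RC pole attenuates by 1/sqrt(1 + (f/fc)^2); n cascaded poles raise that to n
        atten = (1.0 / np.sqrt(1.0 + (F_WARBLE / F_CUT) ** 2)) ** n
        t, y = cascade_step(n)
        rise = t[np.searchsorted(y, 0.9)] - t[np.searchsorted(y, 0.1)]
        print(f"{n} poles: warble reduced x{atten:.3f}, 10-90% rise {rise*1000:.0f} ms")

That is the bind I am in: the pole count that finally buries the warble also drags the rise time out to where the detector feels sluggish over a target.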
JC