Uniformly most powerful detection for integrate and fire time encoding
Proceedings of the 22nd European Signal Processing Conference (EUSIPCO)
A time encoding of a random signal is a representation of the signal as a random sequence of strictly increasing times. The goal of this paper is to derive a decision rule for testing the mean value of a Gaussian signal from the asynchronous samples produced by Integrate and Fire (IF) time encoding. The optimal likelihood ratio test is calculated, and its statistical performance is compared with that of a synchronous test based on regular samples of the Gaussian signal. Since the IF-based detector reaches a decision at a random time, the regular-sampling test is allowed a random number of samples for a fair comparison. The time encoding significantly reduces the number of samples needed to satisfy a prescribed probability of detection.
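As a rough illustration of the IF time encoding described above, the sketch below integrates a discretized signal and emits a spike time each time the running integral crosses a threshold, then resets the integrator. The threshold value, step size, and signal model are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def if_encode(signal, dt, theta):
    """Integrate-and-fire time encoding (sketch): emit a spike time
    whenever the running integral of the signal crosses the threshold
    theta, then reset the integrator."""
    times = []
    acc = 0.0
    for k, x in enumerate(signal):
        acc += x * dt          # discrete approximation of the integral
        if acc >= theta:
            times.append((k + 1) * dt)  # strictly increasing spike times
            acc = 0.0          # reset after each spike
    return times

# Illustrative Gaussian signal: a larger mean drives the integral up
# faster, so spikes arrive sooner -- the basis for testing the mean.
rng = np.random.default_rng(0)
dt = 1e-3
x = 1.0 + 0.1 * rng.standard_normal(10_000)  # mean-1 Gaussian samples
spikes = if_encode(x, dt, theta=0.5)
```

Under the alternative hypothesis (nonzero mean), spikes accumulate faster, so a detector observing the spike times can decide after only a few of them.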