Lower bounds on scintillation detector timing performance
Authors
Abstract
Fundamental method-independent limits on the timing performance of scintillation detectors are useful for identifying regimes in which either present timing methods are nearly optimal or where a considerable performance gain might be realized using better pulse processing techniques. Several types of lower bounds on mean-squared timing error (MSE) performance have been developed and applied to...
Similar articles
The lower bound on the timing resolution of scintillation detectors.
The timing performance of scintillation detectors is ultimately limited by photon counting statistics. In fact, photon counting statistics form a dominant contribution to the overall timing resolution of many state-of-the-art detectors. A common approach to investigate this contribution is to calculate the variance in the registration times of individual scintillation photons within the photose...
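The variance-of-registration-times approach mentioned above can be illustrated with a small Monte Carlo sketch. This is not the paper's method, just a minimal illustration assuming a single-exponential scintillation decay and no transit-time spread: the time stamp is taken as the k-th earliest detected photon, and its simulated variance is compared against the exact order-statistic result (the spacings between consecutive exponential order statistics are independent exponentials with rates (n-i+1)/tau). All parameter values are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

tau = 30.0      # scintillation decay constant in ns (assumed)
n = 1000        # detected photons per event (assumed)
k = 5           # rank of the photon used as the time stamp
trials = 20000  # number of simulated events

# Simulate emission times and take the k-th earliest photon per event.
t = rng.exponential(tau, size=(trials, n))
t_k = np.partition(t, k - 1, axis=1)[:, k - 1]
mc_var = t_k.var()

# Exact variance of the k-th order statistic of n i.i.d. exponentials:
# successive spacings are independent exponentials with rate (n-i+1)/tau.
i = np.arange(1, k + 1)
exact_var = np.sum(tau**2 / (n - i + 1) ** 2)

print(mc_var, exact_var)
```

The closed-form sum shows directly why early photons in a bright event carry so little timing variance: each term scales as tau^2/(n-i+1)^2, so the contribution shrinks quadratically with the photon count.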
Lower bounds on the signed (total) $k$-domination number
Let $G$ be a graph with vertex set $V(G)$. For any integer $k\ge 1$, a signed (total) $k$-dominating function is a function $f: V(G) \rightarrow \{-1, 1\}$ satisfying $\sum_{x\in N[v]}f(x)\ge k$ ($\sum_{x\in N(v)}f(x)\ge k$) for every $v\in V(G)$, where $N(v)$ is the neighborhood of $v$ and $N[v]=N(v)\cup\{v\}$. The minimum of the values $\sum_{v\in V(G)}f(v)$, taken over all signed (total) $k$-dominating functi...
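The definition above can be made concrete with a brute-force sketch (not from the cited paper): enumerate all $\pm 1$ assignments on a small graph and keep the minimum weight among those satisfying the closed-neighborhood constraint. The 5-cycle used as the test graph is an assumed example.

```python
from itertools import product

def signed_k_domination_number(n_vertices, edges, k=1):
    """Brute-force the minimum of sum(f) over f: V -> {-1, +1} with
    sum of f over every closed neighborhood N[v] at least k."""
    nbrs = {v: {v} for v in range(n_vertices)}   # closed neighborhoods N[v]
    for u, v in edges:
        nbrs[u].add(v)
        nbrs[v].add(u)
    best = None
    for f in product((-1, 1), repeat=n_vertices):
        if all(sum(f[x] for x in nbrs[v]) >= k for v in range(n_vertices)):
            w = sum(f)
            best = w if best is None or w < best else best
    return best

# Example: the 5-cycle C5.
c5 = [(i, (i + 1) % 5) for i in range(5)]
print(signed_k_domination_number(5, c5))  # -> 3
```

For $C_5$ each closed neighborhood is a consecutive triple, so at most one $-1$ can be placed anywhere on the cycle, giving minimum weight $5 - 2 = 3$; the exhaustive search confirms this.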
Lower Bounds for the Performance of Iterative Timing Recovery at low SNR
The push for higher recording densities has motivated the development of iterative error-control codes of unprecedented power, whose large coding gains enable low error rates at very low SNR [1] [2]. In addition, the iterative decoding technique has been extended to turbo equalization, where the equalizer and the decoder iterate [3]. Consequently, timing recovery, which typically derives no bene...
Lower bounds on kernelization
Preprocessing (data reduction or kernelization) to reduce instance size is one of the most commonly deployed heuristics in practice for tackling computationally hard problems. However, a systematic theoretical study of such preprocessing has remained elusive so far. One of the reasons for this is that if an input to an NP-hard problem can be processed in polynomial time to an equivalent one of s...
Journal
Journal title: Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment
Year: 1990
ISSN: 0168-9002
DOI: 10.1016/0168-9002(90)90767-z