Physics and application of photon number resolving detectors based on superconducting parallel nanowires
The Parallel Nanowire Detector (PND) is a photon-number-resolving (PNR)
detector that uses spatial multiplexing on a subwavelength scale to provide a
single electrical output proportional to the photon number. The basic structure
of the PND is the parallel connection of several NbN superconducting nanowires
(100 nm wide, a few nm thick), each folded in a meander pattern. PNDs were fabricated
on 3-4 nm thick NbN films grown on MgO substrates (Ts = 400 °C) by reactive
magnetron sputtering in an Ar/N2 gas mixture. The device performance was
characterized in terms of speed and sensitivity. PNDs showed a counting rate of
80 MHz and a pulse duration as low as 660 ps full width at half maximum (FWHM).
Histograms of the photoresponse peaks show no multiplication-noise
buildup. Electrical and optical equivalent models of the device
were developed in order to study its working principle, define design
guidelines, and develop an algorithm for estimating the photon-number statistics
of an unknown light source. In particular, the modeling provides novel insight into the
physical limits on the detection efficiency and the reset time of these
detectors. The PND significantly outperforms existing PNR detectors in terms of
simplicity, sensitivity, speed, and multiplication noise.
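The spatial-multiplexing idea above can be illustrated with a toy Monte Carlo sketch. This is not the authors' model: it simply assumes each photon is absorbed independently with a fixed efficiency and lands on one of the parallel sections uniformly at random, and that the output click number is the count of distinct sections that fire. The wire count and efficiency values below are illustrative placeholders.

```python
import random
from collections import Counter

def pnd_clicks(n_photons, n_wires=5, efficiency=0.5, rng=random):
    """Simulate one detection event: each photon is absorbed with
    probability `efficiency` and lands on one of `n_wires` parallel
    sections chosen uniformly; the summed output current is taken to be
    proportional to the number of distinct sections that fire."""
    fired = set()
    for _ in range(n_photons):
        if rng.random() < efficiency:
            fired.add(rng.randrange(n_wires))
    return len(fired)

def click_histogram(n_photons, trials=100_000, **kw):
    """Empirical click-number distribution for a fixed photon number."""
    counts = Counter(pnd_clicks(n_photons, **kw) for _ in range(trials))
    return {m: counts[m] / trials for m in sorted(counts)}

print(click_histogram(3, n_wires=5, efficiency=0.9))
```

Such a forward model is the kind of ingredient an estimation algorithm could invert to recover the photon-number statistics of an unknown light source from measured click histograms.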
Robust Kullback-Leibler divergence and its applications in universal hypothesis testing and deviation detection
The Kullback-Leibler (KL) divergence is one of the most fundamental metrics in information theory and statistics, and it admits various operational interpretations in mathematical communication theory and statistical hypothesis testing. For discrete distributions, the KL divergence has the desired continuity property, which leads to some fundamental results in universal hypothesis testing. With continuous observations, however, the KL divergence is only lower semi-continuous, and this lack of continuity creates difficulties in universal hypothesis testing.
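For discrete alphabets the KL divergence is a finite sum, D(p ‖ q) = Σ_x p(x) log(p(x)/q(x)). A minimal sketch (the probability vectors below are illustrative, not from the dissertation):

```python
import math

def kl_divergence(p, q):
    """D(p || q) = sum_x p(x) * log(p(x)/q(x)) for discrete
    distributions given as equal-length probability vectors.
    Returns infinity when q(x) = 0 while p(x) > 0 (absolute
    continuity fails), and treats 0 * log(0/q(x)) as 0."""
    d = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue
        if qi == 0.0:
            return math.inf
        d += pi * math.log(pi / qi)
    return d

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # small positive number
print(kl_divergence(p, p))  # 0.0
```

Continuity in (p, q) holds on a finite alphabet; it is precisely this property that fails to extend to continuous observation spaces, motivating the robust version below.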
This dissertation proposes a robust version of the KL divergence for continuous alphabets. Specifically, the KL divergence from a distribution to the Lévy ball centered at another distribution is shown to be continuous. This robust version of the KL divergence allows one to generalize the universal hypothesis testing results for discrete alphabets to continuous observations. An optimal decision rule is developed, and its robustness for universal hypothesis testing is provably established.
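The dissertation's exact notation is not reproduced here, but the construction can be stated with the classical Lévy metric between distribution functions F and G:

```latex
L(F, G) = \inf\left\{ \varepsilon > 0 :
    F(x - \varepsilon) - \varepsilon \le G(x) \le F(x + \varepsilon) + \varepsilon
    \quad \forall x \right\}
```

The robust KL divergence from a distribution P to the Lévy ball of radius ε around Q is then the infimum of the ordinary divergence over that ball:

```latex
D\bigl(P \,\big\|\, B_\varepsilon(Q)\bigr)
    = \inf_{Q' :\, L(Q', Q) \le \varepsilon} D(P \,\|\, Q')
```

Minimizing over a Lévy neighborhood rather than a single distribution is what restores the continuity that the plain KL divergence lacks on continuous alphabets.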
Another application of the robust KL divergence is in deviation detection: the problem of detecting deviation from a nominal distribution using a sequence of independent and identically distributed observations. An asymptotically -optimal detector is then developed for deviation detection, where the Lévy metric becomes a very natural measure of deviation from the nominal distribution.
Lastly, the dissertation considers the following variation of a distributed detection problem: a sensor may overhear other sensors' transmissions and thus may choose to refine its output in the hope of achieving better detection performance. While this is shown to be possible for the fixed-sample-size test, asymptotically (in the number of samples) there is no performance gain, as measured by the KL divergence achievable at the fusion center, provided that the observations are conditionally independent. For conditionally dependent observations, however, asymptotic detection performance may indeed be improved when overhearing is utilized.