
    Exploring the contributions of bed nets, cattle, insecticides and excitorepellency to malaria control: a deterministic model of mosquito host-seeking behaviour and mortality

    Domestic and personal protection measures against malaria exposure either divert host-seeking vectors to other hosts or kill those attempting to feed. Here, we explicitly model mosquito host-seeking processes in the context of local host availability and elucidate the impacts and mechanisms of pyrethroid-treated bed nets in Africa. It has been suggested that excitorepellent insecticides could increase exposure of unprotected humans by concentrating mosquito biting activity on this vulnerable group. This worst-case scenario is confirmed as a possibility where vector populations lack alternative hosts, but an approximate ‘break-even’ scenario, with users experiencing little overall change in exposure, is more likely because of increased mosquito mortality while foraging for resources. Insecticidal nets are predicted to have epidemiologically significant impacts on transmission experienced by users and non-users at levels of coverage that can be achieved by sustainable net distribution systems, regardless of excitorepellency or the ecological setting. The results are consistent with the outcome of several randomised controlled trials, predicting enormous reductions in transmission at individual and community levels. As financial support, technology and distribution systems for insecticide-treated nets improve, massive reductions in malaria transmission could be realised.
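
    The mechanism described above (host-seeking mosquitoes being diverted to other hosts, killed at the net, or dying while foraging) can be illustrated with a small deterministic bookkeeping exercise. The sketch below is not the paper's model; the parameter names and values (coverage, repellency, kill probability, cattle fraction, foraging mortality) are illustrative assumptions chosen only to show how diversion and mortality trade off.

```python
# Minimal deterministic bookkeeping of one mosquito's host-seeking cycle,
# loosely in the spirit of the abstract. Parameter names and values are
# illustrative assumptions, not quantities taken from the paper.

def feeding_outcomes(coverage, repellency, kill_prob, cattle_fraction,
                     foraging_mortality, max_attempts=20):
    alive = 1.0                          # probability mass still host-seeking
    fed_user = fed_nonuser = fed_cattle = died = 0.0

    for _ in range(max_attempts):
        # Some mosquitoes die while searching, before reaching any host.
        died += alive * foraging_mortality
        alive *= 1.0 - foraging_mortality

        human = 1.0 - cattle_fraction
        p_user, p_nonuser = human * coverage, human * (1.0 - coverage)

        # Encounters with unprotected hosts end in a blood meal.
        fed_nonuser += alive * p_nonuser
        fed_cattle += alive * cattle_fraction

        # Encounters with net users: diverted, killed at the net, or fed.
        at_net = alive * p_user
        diverted = at_net * repellency
        died += at_net * (1.0 - repellency) * kill_prob
        fed_user += at_net * (1.0 - repellency) * (1.0 - kill_prob)

        alive = diverted                 # only diverted mosquitoes keep searching

    return {"fed_user": fed_user, "fed_nonuser": fed_nonuser,
            "fed_cattle": fed_cattle, "died": died, "still_seeking": alive}

if __name__ == "__main__":
    print(feeding_outcomes(coverage=0.6, repellency=0.7, kill_prob=0.8,
                           cattle_fraction=0.3, foraging_mortality=0.1))
```

    Raising coverage or repellency in this toy calculation shifts probability mass from "fed on a user" towards "diverted" and, via the extra foraging time, towards "died", which is the qualitative trade-off the abstract describes.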

    The Harvest Moon / A Witch in the Well / The Grail / Lament

    The Harvest Moon: Where the wolfbane blossoms, my mother said a man cries In fur covered hands, A Witch in the Well: A witch fell down a wishing well while leaning over to see in. The Grail: Hope eternal shrouded in medieval myth, guarded by the angels in some lost Middle-earth Lament: The moon is a horned dilemma And I am on the moon

    Effects of Vacuum Fluctuation Suppression on Atomic Decay Rates

    The use of atomic decay rates as a probe of sub-vacuum phenomena will be studied. Because electromagnetic vacuum fluctuations are essential for radiative decay of excited atomic states, decay rates can serve as a measure of the suppression of vacuum fluctuations in non-classical states, such as squeezed vacuum states. In such states the renormalized expectation value of the square of the electric field or the energy density can be periodically negative, representing suppression of vacuum fluctuations. We explore the extent to which atomic decays can be used to measure the mean squared electric field or energy density. We consider a scheme in which atoms in an excited state transit a closed cavity whose lowest mode contains photons in a non-classical state. The change in the decay probability of the atom in the cavity due to the non-classical state can, under certain circumstances, serve as a measure of the mean squared electric field or energy density in the cavity. We derive a quantum inequality bound on the decrease in this probability. We also show that the decrease in decay rate can sometimes be a measure of negative energy density or negative squared electric field. We make some estimates of the magnitude of this effect, which indicate that an experimental test might be possible. Comment: 19 pages, 3 figures
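
    For readers wanting the standard starting point behind this kind of argument: in first-order perturbation theory with an electric-dipole coupling, the decay probability of an excited atom over a transit time is controlled by the two-point function of the electric field, so suppressed field fluctuations mean suppressed decay. The expression below is a schematic, textbook-style form with illustrative notation; it is not an equation quoted from the paper.

```latex
% Schematic first-order expression (illustrative notation, not from the paper):
% the excited-state decay probability over a time \tau is governed by the
% electric-field correlation function along the atom's trajectory.
\[
  P_{e \to g} \;\simeq\; \frac{1}{\hbar^{2}} \sum_{i,j}
  d_i\, d_j^{*} \int_0^{\tau}\! \mathrm{d}t \int_0^{\tau}\! \mathrm{d}t'\;
  e^{-i\omega_0 (t - t')}\,
  \bigl\langle E_j(t')\, E_i(t) \bigr\rangle ,
  \qquad d_i = \langle g|\hat d_i|e\rangle ,
\]
```

    where \omega_0 is the transition frequency. In a squeezed vacuum the correlation function, and hence the renormalized mean squared field, can dip below its ordinary-vacuum value over part of a cycle, which is the kind of shift in decay probability the cavity-transit scheme aims to detect.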

    Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach: The Report of the Expert Working Group on Human Factors in Latent Print Analysis

    Fingerprints have provided a valuable method of personal identification in forensic science and criminal investigations for more than 100 years. Fingerprints left at crime scenes generally are latent prints—unintentional reproductions of the arrangement of ridges on the skin made by the transfer of materials (such as amino acids, proteins, polypeptides, and salts) to a surface. Palms and the soles of feet also have friction ridge skin that can leave latent prints. The examination of a latent print consists of a series of steps involving a comparison of the latent print to a known (or exemplar) print. Courts have accepted latent print evidence for the past century. However, several high-profile cases in the United States and abroad have highlighted the fact that human errors can occur, and litigation and expressions of concern over the evidentiary reliability of latent print examinations and other forensic identification procedures have increased in the last decade. “Human factors” issues can arise in any experience- and judgment-based analytical process such as latent print examination. Inadequate training, extraneous knowledge about the suspects in the case or other matters, poor judgment, health problems, limitations of vision, complex technology, and stress are but a few factors that can contribute to errors. A lack of standards or quality control, poor management, insufficient resources, and substandard working conditions constitute other potentially contributing factors.

    Caltech Faint Field Galaxy Redshift Survey IX: Source detection and photometry in the Hubble Deep Field Region

    Detection and photometry of sources in the U_n, G, R, and K_s bands in a 9x9 arcmin^2 region of the sky, centered on the Hubble Deep Field, are described. The data permit construction of complete photometric catalogs to roughly U_n=25, G=26, R=25.5 and K_s=20 mag, and significant photometric measurements somewhat fainter. The galaxy number density is 1.3x10^5 deg^{-2} to R=25.0 mag. Galaxy number counts have slopes dlog N/dm=0.42, 0.33, 0.27 and 0.31 in the U_n, G, R and K_s bands, consistent with previous studies and the trend that fainter galaxies are, on average, bluer. Galaxy catalogs selected in the R and K_s bands are presented, containing 3607 and 488 sources, in field areas of 74.8 and 59.4 arcmin^2, to R=25.5 and K_s=20 mag. Comment: Accepted for publication in ApJS; some tables and slightly nicer figures available at http://www.sns.ias.edu/~hogg/deep
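
    The quoted number-count slopes d log N/dm are, in essence, straight-line fits of log-binned source counts against magnitude. The snippet below shows that calculation on synthetic magnitudes; the binning, magnitude range and test slope are arbitrary illustrations, not the survey's actual catalogue or procedure.

```python
# Illustrative calculation of a number-count slope d(log N)/dm from a list of
# magnitudes. Bin edges, magnitude range and the synthetic test slope are
# arbitrary choices for demonstration, not values taken from the survey.
import numpy as np

def count_slope(mags, m_min=20.0, m_max=25.0, bin_width=0.5):
    edges = np.arange(m_min, m_max + bin_width, bin_width)
    counts, _ = np.histogram(mags, bins=edges)
    centers = 0.5 * (edges[:-1] + edges[1:])
    keep = counts > 0
    # Straight-line fit: log10 N(m) ~ slope * m + const
    slope, _ = np.polyfit(centers[keep], np.log10(counts[keep]), 1)
    return slope

if __name__ == "__main__":
    # Draw magnitudes from dN/dm proportional to 10^(a*m), a = 0.3,
    # by inverse-transform sampling, then check the slope is recovered.
    a, m_lo, m_hi = 0.3, 20.0, 25.0
    u = np.random.default_rng(0).random(20000)
    lo, hi = 10 ** (a * m_lo), 10 ** (a * m_hi)
    mags = np.log10(u * (hi - lo) + lo) / a
    print(count_slope(mags))  # ~0.3
```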

    Decision and function problems based on boson sampling

    Boson sampling is a mathematical problem that is strongly believed to be intractable for classical computers, whereas passive linear interferometers can produce samples efficiently. So far, the problem remains a computational curiosity, and the possible usefulness of boson-sampling devices is mainly limited to the proof of quantum supremacy. The purpose of this work is to investigate whether boson sampling can be used as a resource for decision and function problems that are computationally hard, and may thus have cryptographic applications. After the definition of a rather general theoretical framework for the design of such problems, we discuss their solution by means of a brute-force numerical approach, as well as by means of non-boson samplers. Moreover, we estimate the sample sizes required for their solution by passive linear interferometers, and it is shown that they are independent of the size of the Hilbert space. Comment: Close to the version published in PR
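
    As a concrete illustration of why brute-force classical evaluation is costly: for single photons injected into, and detected in, distinct modes, the outcome probability is the squared modulus of the permanent of a submatrix of the interferometer unitary, and the permanent takes exponential time to compute. The sketch below, with function names of our own choosing, uses Ryser's formula; it is meant only to make that scaling concrete, not to reproduce the paper's constructions.

```python
# Illustrative brute-force evaluation of a boson-sampling outcome probability.
# Ryser's inclusion-exclusion formula for the permanent runs in O(2^n * n)
# time, which is what makes large instances classically intractable.
from itertools import combinations
import numpy as np

def permanent(A):
    """Permanent of an n x n matrix via Ryser's formula."""
    A = np.asarray(A)
    n = A.shape[0]
    total = 0.0 + 0.0j
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            row_sums = A[:, list(cols)].sum(axis=1)
            total += (-1) ** r * np.prod(row_sums)
    return (-1) ** n * total

def outcome_probability(U, inputs, outputs):
    """|perm(U_sub)|^2 for single photons in distinct input/output modes."""
    sub = U[np.ix_(outputs, inputs)]
    return abs(permanent(sub)) ** 2

if __name__ == "__main__":
    # Random 4-mode unitary via QR decomposition; distinct modes assumed,
    # so no extra bunching normalisation factors are needed.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    Q, _ = np.linalg.qr(X)
    print(outcome_probability(Q, inputs=[0, 1], outputs=[2, 3]))
```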

    Timing of invasive strategy in non-ST-elevation acute coronary syndrome: a meta-analysis of randomized controlled trials

    AIMS: The optimal timing of an invasive strategy (IS) in non-ST-elevation acute coronary syndrome (NSTE-ACS) is controversial. Recent randomized controlled trials (RCTs) and long-term follow-up data have yet to be included in a contemporary meta-analysis. METHODS AND RESULTS: A systematic review of RCTs that compared an early IS vs. a delayed IS for NSTE-ACS was conducted by searching MEDLINE, Embase, and the Cochrane Central Register of Controlled Trials. A meta-analysis was performed by pooling relative risks (RRs) using a random-effects model. The primary outcome was all-cause mortality. Secondary outcomes included myocardial infarction (MI), recurrent ischaemia, admission for heart failure (HF), repeat revascularization, major bleeding, stroke, and length of hospital stay. This study was registered with PROSPERO (CRD42021246131). Seventeen RCTs with outcome data from 10 209 patients were included. No significant differences in risk were observed for all-cause mortality [RR: 0.90, 95% confidence interval (CI): 0.78-1.04], MI (RR: 0.86, 95% CI: 0.63-1.16), admission for HF (RR: 0.66, 95% CI: 0.43-1.03), repeat revascularization (RR: 1.04, 95% CI: 0.88-1.23), major bleeding (RR: 0.86, 95% CI: 0.68-1.09), or stroke (RR: 0.95, 95% CI: 0.59-1.54). Recurrent ischaemia (RR: 0.57, 95% CI: 0.40-0.81) and length of stay (median difference: -22 h, 95% CI: -36.7 to -7.5 h) were reduced with an early IS. CONCLUSION: In all-comers with NSTE-ACS, an early IS does not reduce all-cause mortality, MI, admission for HF, or repeat revascularization, nor does it increase major bleeding or stroke, when compared with a delayed IS. Risk of recurrent ischaemia and length of stay are significantly reduced with an early IS.
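
    For readers unfamiliar with the pooling step, the sketch below shows how study-level relative risks with 95% confidence intervals are combined on the log scale under a random-effects model, using the common DerSimonian-Laird between-study variance estimator. The trial numbers are invented purely for illustration; this is not the paper's dataset, and the paper's exact estimator may differ.

```python
# Illustrative random-effects pooling of relative risks (RRs) on the log scale,
# using the DerSimonian-Laird between-study variance estimator. The example
# trial data are made up to demonstrate the calculation.
import numpy as np

def pool_relative_risks(rr, ci_lower, ci_upper):
    log_rr = np.log(rr)
    # Recover standard errors from the 95% CI width on the log scale.
    se = (np.log(ci_upper) - np.log(ci_lower)) / (2 * 1.96)
    w_fixed = 1.0 / se ** 2

    # DerSimonian-Laird between-study variance tau^2.
    fixed_mean = np.sum(w_fixed * log_rr) / np.sum(w_fixed)
    Q = np.sum(w_fixed * (log_rr - fixed_mean) ** 2)
    df = len(rr) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (Q - df) / c)

    # Random-effects weights, pooled estimate and 95% CI, back on the RR scale.
    w = 1.0 / (se ** 2 + tau2)
    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return np.exp(pooled), ci

if __name__ == "__main__":
    # Hypothetical per-trial RRs with 95% CIs (invented for illustration).
    rr = np.array([0.85, 1.02, 0.78])
    lo = np.array([0.70, 0.80, 0.55])
    hi = np.array([1.03, 1.30, 1.10])
    print(pool_relative_risks(rr, lo, hi))
```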