
    Technology Assisted Reviews: Finding the Last Few Relevant Documents by Asking Yes/No Questions to Reviewers

    The goal of a technology-assisted review is to achieve high recall with low human effort. Continuous active learning algorithms have demonstrated good performance in locating the majority of relevant documents in a collection; however, their performance reaches a plateau once 80%-90% of them have been found. Finding the last few relevant documents typically requires exhaustively reviewing the collection. In this paper, we propose a novel method to identify these last few, but significant, documents efficiently. Our method rests on the hypothesis that entities carry vital information in documents, and that reviewers can answer questions about the presence or absence of an entity in the missing relevant documents. Based on this, we devise a sequential Bayesian search method that selects the optimal sequence of questions to ask. The experimental results show that our proposed method can greatly improve performance while requiring less reviewing effort. Comment: This paper is accepted by SIGIR 201
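
    A minimal sketch of the kind of greedy, information-gain question selection a sequential Bayesian search over entities might use. It assumes, purely for illustration, that exactly one relevant document remains to be found and that a posterior over the unreviewed documents is maintained; the probabilistic model and function names are ours, not the paper's implementation.

    import numpy as np

    def posterior_entropy(p):
        """Shannon entropy (bits) of a categorical posterior."""
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def best_question(p, has_entity):
        """Greedily pick the entity whose yes/no answer is expected to
        shrink the posterior over the missing relevant document the most.

        p          : (n_docs,) posterior that each unreviewed doc is the target
        has_entity : (n_docs, n_entities) boolean entity-document incidence
        """
        best_e, best_h = None, np.inf
        for e in range(has_entity.shape[1]):
            mask = has_entity[:, e]
            p_yes = p[mask].sum()
            if p_yes <= 0.0 or p_yes >= 1.0:   # answer would carry no information
                continue
            h_yes = posterior_entropy(p[mask] / p_yes)
            h_no = posterior_entropy(p[~mask] / (1.0 - p_yes))
            expected_h = p_yes * h_yes + (1.0 - p_yes) * h_no
            if expected_h < best_h:
                best_e, best_h = e, expected_h
        return best_e

    def update(p, has_entity, e, answer_yes):
        """Bayes update of the posterior after the reviewer's yes/no answer."""
        mask = has_entity[:, e] if answer_yes else ~has_entity[:, e]
        p = np.where(mask, p, 0.0)
        return p / p.sum()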

    School closure in response to epidemic outbreaks: Systems-based logic model of downstream impacts [version 1; peer review: 2 approved]

    Background: School closures have been a recommended non-pharmaceutical intervention in pandemic response owing to their potential to reduce transmission of infection between children, school staff and those with whom they are in contact. However, given the many roles that schools play in society, closure for any extended period is likely to have additional impacts. Literature reviews of research exploring school closure to date have focused upon epidemiological effects; there is an unmet need for research that considers the multiplicity of potential impacts of school closures. Methods: We used systematic searching, coding and synthesis techniques to develop a systems-based logic model. We included literature related to school closures planned in response to epidemics large and small, spanning the 1918-19 ‘flu pandemic through to the emerging literature on the 2019 novel coronavirus. We used over 170 research studies and a number of policy documents to inform our model. Results: The model organises the concepts used by authors into seven higher-level domains: children’s health and wellbeing, children’s education, impacts on teachers and other school staff, the school organisation, considerations for parents and families, public health considerations, and broader economic impacts. The model also collates ideas about potential moderating factors and ethical considerations. Although the model depends upon the nature of the epidemics experienced to date, we aim for it to provide a starting point for theorising about school closures in general, and as part of a wider system that is influenced by contextual and population factors. Conclusions: The model highlights that the impacts of school closures are much broader than those related solely to health, and demonstrates that there is a need for further concerted work in this area. The publication of this logic model should help to frame future research in this area and aid decision-makers when considering future school closure policy and possible mitigation strategies

    Ensemble averaged entanglement of two-particle states in Fock space

    Recent results extending the Schmidt decomposition theorem to wavefunctions of identical particles are reviewed. They are used to give a definition of reduced density operators in the case of two identical particles. Next, a method is discussed to calculate time-averaged entanglement. It is applied to a pair of identical electrons in an otherwise empty band of the Hubbard model, and to a pair of bosons in the Bose-Hubbard model with infinite-range hopping. The effect of degeneracy of the spectrum of the Hamiltonian on the average entanglement is emphasised. Comment: 19 pages Latex, changed title, references added in the conclusion
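
    For the standard bipartite (distinguishable-particle) case, the Schmidt coefficients are simply the singular values of the two-particle coefficient matrix, and the entanglement entropy follows from them; the identical-particle extension reviewed in the paper replaces this with (anti)symmetrized analogues. A minimal sketch of the bipartite computation, for orientation only:

    import numpy as np

    def schmidt_entanglement(C):
        """Von Neumann entanglement entropy of a two-particle pure state.

        C[i, j] is the amplitude of the basis state |i>|j> in some
        single-particle basis; the Schmidt coefficients are the singular
        values of C.
        """
        C = C / np.linalg.norm(C)                 # normalise the state
        s = np.linalg.svd(C, compute_uv=False)    # Schmidt coefficients
        p = s**2                                  # eigenvalues of the reduced density operator
        p = p[p > 1e-15]
        return float(-(p * np.log2(p)).sum())

    # Example: a singlet-like state (|01> - |10>)/sqrt(2) has entropy 1 bit
    C = np.array([[0.0, 1.0], [-1.0, 0.0]]) / np.sqrt(2.0)
    print(schmidt_entanglement(C))                # -> 1.0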

    Fast and Accurate Camera Covariance Computation for Large 3D Reconstruction

    Estimating the uncertainty of camera parameters computed in Structure from Motion (SfM) is an important tool for evaluating the quality of the reconstruction and guiding the reconstruction process. Yet, the quality of the estimated parameters of large reconstructions has rarely been evaluated due to the computational challenges. We present a new algorithm which employs the sparsity of the uncertainty propagation and speeds up the computation by about a factor of ten with respect to previous approaches. Our computation is accurate and does not use any approximations. We can compute uncertainties of thousands of cameras in tens of seconds on a standard PC. We also demonstrate that our approach can be effectively used for reconstructions of any size by applying it to smaller sub-reconstructions. Comment: ECCV 201
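
    For context, the textbook first-order propagation that such methods accelerate takes the covariance of the bundle-adjustment parameters as sigma^2 (J^T J)^-1 at the optimum. A minimal sketch of extracting per-camera covariance blocks with a sparse factorisation; this is the slow baseline, not the paper's algorithm, and it assumes the gauge has been fixed so the approximate Hessian is invertible:

    import numpy as np
    import scipy.sparse.linalg as spla

    def camera_covariance_blocks(J, sigma2, cam_slices):
        """First-order covariance of selected camera parameters.

        J          : sparse Jacobian of the reprojection residuals at the optimum
        sigma2     : estimated measurement variance
        cam_slices : list of slices, one per camera, into the parameter vector
        """
        H = (J.T @ J).tocsc()        # Gauss-Newton approximation of the Hessian
        lu = spla.splu(H)            # sparse LU factorisation, reused for all solves
        n = H.shape[0]
        blocks = []
        for sl in cam_slices:
            cols = []
            for k in range(sl.start, sl.stop):
                e = np.zeros(n)
                e[k] = 1.0
                # column k of H^{-1}, restricted to this camera's parameters
                cols.append(lu.solve(e)[sl])
            blocks.append(sigma2 * np.column_stack(cols))
        return blocks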

    Bezlotoxumab for prevention of recurrent Clostridium difficile infection in patients at increased risk for recurrence

    Background: Bezlotoxumab is a human monoclonal antibody against Clostridium difficile toxin B indicated to prevent C. difficile infection (CDI) recurrence (rCDI) in adults at high risk for rCDI. This post hoc analysis of pooled Monoclonal Antibodies for C. difficile Therapy (MODIFY) I/II data assessed bezlotoxumab efficacy in participants with characteristics associated with increased risk for rCDI. Methods: The analysis population was the modified intent-to-treat population who received bezlotoxumab or placebo (n = 1554), analysed by risk factors for rCDI that were prespecified in the statistical analysis plan: age ≥65 years, history of CDI, compromised immunity, severe CDI, and ribotype 027/078/244. The proportion of participants with rCDI within 12 weeks, fecal microbiota transplant procedures, 30-day all-cause and CDI-associated hospital readmissions, and mortality at 30 and 90 days after randomization are presented. Results: The majority of enrolled participants (75.6%) had ≥1 risk factor; these participants were older, and a higher proportion had comorbidities, compared with participants with no risk factors. The proportion of placebo participants who experienced rCDI exceeded 30% for each risk factor, compared with 20.9% among those without a risk factor, and the rCDI rate increased with the number of risk factors (1 risk factor: 31.3%; ≥3 risk factors: 46.1%). Bezlotoxumab reduced rCDI, fecal microbiota transplants, and CDI-associated 30-day readmissions in participants with risk factors for rCDI. Conclusions: The risk factors prespecified in the MODIFY statistical analysis plan are appropriate for identifying patients at high risk for rCDI. While participants with ≥3 risk factors had the greatest reduction of rCDI with bezlotoxumab, those with 1 or 2 risk factors may also benefit. Clinical Trials Registration: NCT01241552 (MODIFY I) and NCT01513239 (MODIFY II)

    Correlated X-ray and Optical Variability in V404 Cyg in Quiescence

    We report simultaneous X-ray and optical observations of V404 Cyg in quiescence. The X-ray flux varied dramatically, by a factor of >20, during a 60 ks observation. X-ray variations were well correlated with those in Halpha, although the latter include an approximately constant component as well. Correlations can also be seen with the optical continuum, although these are less clear. We see no large lag between the X-ray and optical line variations; this implies that they are causally connected on short timescales. As in previous observations, Halpha flares exhibit a double-peaked profile, suggesting emission distributed across the accretion disk. The peak separation is consistent with material extending outwards to at least the circularization radius. The prompt response in the entire Halpha line confirms that the variability is powered by X-ray (and/or EUV) irradiation. Comment: 5 pages; Accepted for publication in the Astrophysical Journal Letter
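
    The absence of a measurable lag is typically established with a cross-correlation analysis of the two light curves. A minimal sketch for evenly sampled, strictly simultaneous series (real data are often unevenly sampled and would need, e.g., a discrete correlation function instead); the function name and interface are ours:

    import numpy as np

    def ccf_lag(x, y, dt, max_lag):
        """Estimate the lag of y relative to x from their cross-correlation.

        x, y    : evenly sampled, simultaneous light curves (e.g. X-ray and Halpha)
        dt      : sampling step
        max_lag : largest |lag| (same units as dt) to search

        A positive returned lag means y responds after x.
        """
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        n = len(x)
        lags = np.arange(-int(max_lag / dt), int(max_lag / dt) + 1)
        ccf = np.array([
            np.mean(x[max(0, -k):n - max(0, k)] * y[max(0, k):n - max(0, -k)])
            for k in lags
        ])
        return lags[np.argmax(ccf)] * dt, ccf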

    Infinitesimals without Logic

    We introduce the ring of Fermat reals, an extension of the real field containing nilpotent infinitesimals. The construction takes inspiration from Smooth Infinitesimal Analysis (SIA), but provides a powerful theory of actual infinitesimals without any need for a background in mathematical logic. In particular, in contrast to SIA, which admits models only in intuitionistic logic, the theory of Fermat reals is consistent with classical logic. We address the problem of deciding whether a product of powers of nilpotent infinitesimals is zero or not, the identity principle for polynomials, and the definition and properties of the total order relation. The construction is fully constructive, and every Fermat real admits a clear and order-preserving geometrical representation. Using nilpotent infinitesimals, every smooth function becomes a polynomial, because in Taylor's formula the remainder is now zero. Finally, we present several applications to informal classical calculations used in physics: all these calculations now become rigorous and, at the same time, formally equal to the informal ones. In particular, a rigorous deduction of the wave equation is given, which clarifies how to formalize the approximations tied to Hooke's law in this language of nilpotent infinitesimals. Comment: The first part of the preprint is taken directly from arXiv:0907.1872 The second part is new and contains a list of example
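
    The computational payoff of a first-order nilpotent infinitesimal can be illustrated directly from Taylor's formula (Fermat reals also allow higher orders of nilpotency); a minimal sketch:

    \[
      f(x + h) = f(x) + f'(x)\,h + \tfrac{1}{2}f''(x)\,h^{2} + \cdots
      \;\xrightarrow{\;h^{2} = 0\;}\;
      f(x + h) = f(x) + f'(x)\,h ,
    \]

    so the Taylor remainder vanishes identically and f'(x) is read off as the unique coefficient of h.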

    Modelling coseismic displacements during the 1997 Umbria-Marche earthquake (central Italy)

    We propose a dislocation model for the two normal faulting earthquakes that struck the central Apennines (Umbria-Marche, Italy) on 1997 September 26 at 00:33 (Mw 5.7) and 09:40 GMT (Mw 6.0). We fit coseismic horizontal and vertical displacements resulting from GPS measurements at several monuments of the IGMI (Istituto Geografico Militare Italiano) by means of a dislocation model in an elastic, homogeneous, isotropic half-space. Our best-fitting model consists of two normal faults whose mechanisms and seismic moments have been taken from CMT solutions; it is consistent with other seismological and geophysical observations. The first fault, which is 6 km long and 7 km wide, ruptured during the 00:33 event with a unilateral rupture towards the SE and an average slip of 27 cm. The second fault is 12 km long and 10 km wide, and ruptured during the 09:40 event with a nearly unilateral rupture towards the NW. Slip distribution on this second fault is non-uniform and is concentrated in its SE portion (maximum slip is 65 cm), where rupture initiated. The 00:33 fault is deeper than the 09:40 one: the top of the first rupture is deeper than 1.7 km; the top of the second is 0.6 km deep. In order to interpret the observed epicentral subsidence we have also considered the contributions of two further moderate-magnitude earthquakes that occurred on 1997 October 3 (Mw 5.2) and 6 (Mw 5.4), immediately before the GPS survey, and were located very close to the 09:40 event of September 26. We compare the pattern of vertical displacements resulting from our forward modelling of GPS data with that derived from SAR interferograms: the fit to SAR data is very good, confirming the reliability of the proposed dislocation model
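
    Because the surface displacement predicted by a rectangular dislocation in an elastic, homogeneous, isotropic half-space is linear in the slip, fitting GPS offsets with fault geometry fixed (e.g. from CMT solutions) reduces to weighted least squares. A minimal sketch of that step only; okada_unit_slip is a hypothetical placeholder for an Okada-type forward routine, and this layout is ours, not the authors' modelling code (which also allows non-uniform slip on the second fault):

    import numpy as np

    def invert_uniform_slip(okada_unit_slip, faults, stations, d_obs, W):
        """Weighted least-squares estimate of uniform slip on each fault from GPS offsets.

        okada_unit_slip : hypothetical callable returning the stacked E, N, U
                          displacements at the stations for unit slip on one fault
        faults          : list of fixed fault geometries
        stations        : GPS monument coordinates
        d_obs           : observed coseismic E, N, U offsets, stacked
        W               : data weight matrix (inverse covariance of the GPS solution)
        """
        # The forward model is linear in slip: d = G s, one column of G per fault.
        G = np.column_stack([okada_unit_slip(f, stations) for f in faults])
        A = G.T @ W @ G
        b = G.T @ W @ d_obs
        slip = np.linalg.solve(A, b)
        residual = d_obs - G @ slip
        return slip, residual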

    Improving ranking for systematic reviews using query adaptation

    Identifying relevant studies for inclusion in systematic reviews requires significant effort from human experts, who manually screen large numbers of studies. The problem is made more difficult by the growing volume of medical literature, and Information Retrieval techniques have proved useful in reducing this workload. Reviewers are often interested in particular types of evidence, such as Diagnostic Test Accuracy studies. This paper explores the use of query adaptation to identify particular types of evidence and thereby reduce the workload placed on reviewers. A simple retrieval system that ranks studies using TF.IDF-weighted cosine similarity was implemented. The Log-Likelihood, Chi-Squared and Odds-Ratio lexical statistics and relevance feedback were used to generate sets of terms that indicate evidence relevant to Diagnostic Test Accuracy reviews. Experiments using a set of 80 systematic reviews from the CLEF2017 and CLEF2018 eHealth tasks demonstrate that the approach improves retrieval performance
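
    One of the lexical statistics mentioned, the two-corpus Log-Likelihood (G^2), can be sketched as follows for adapting a query toward a target evidence type; the data structures, function names and cut-off k are illustrative, not the paper's configuration:

    import math
    from collections import Counter

    def log_likelihood(term, target_counts, background_counts):
        """Log-likelihood (G^2): how much more often a term occurs in the target
        collection (e.g. known Diagnostic Test Accuracy studies) than expected
        from a background collection.  Both arguments are Counter term tables."""
        a = target_counts[term]
        b = background_counts[term]
        n1 = sum(target_counts.values())
        n2 = sum(background_counts.values())
        e1 = n1 * (a + b) / (n1 + n2)
        e2 = n2 * (a + b) / (n1 + n2)
        g2 = 0.0
        if a > 0:
            g2 += 2 * a * math.log(a / e1)
        if b > 0:
            g2 += 2 * b * math.log(b / e2)
        return g2

    def expand_query(query_terms, target_counts, background_counts, k=20):
        """Adapt the query by appending the k terms most indicative of the target
        evidence type, ranked by the statistic above."""
        scored = sorted(
            (t for t in target_counts if t not in query_terms),
            key=lambda t: log_likelihood(t, target_counts, background_counts),
            reverse=True,
        )
        return list(query_terms) + scored[:k]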