
    Some practicable applications of quadtree data structures/representation in astronomy

    The development of the quadtree as a hierarchical data-structuring technique for representing spatial data (points, regions, surfaces, lines, curves, volumes, etc.) has been motivated to a large extent by the storage requirements of images, maps, and other multidimensional (spatially structured) data. For many spatial algorithms, the execution-time efficiency of quadtrees may be as important as their storage efficiency. Briefly, the quadtree is a class of hierarchical data structures based on the recursive partition of a square region into quadrants and sub-quadrants until a predefined criterion is met. Beyond the wide applicability of quadtrees in image processing, spatial information analysis, and the building of digital databases (processes that are becoming routine for the astronomical community), there may be numerous further applications in astronomy. Some of these practicable applications, based on quadtree representation of astronomical data, are presented and suggested for further consideration. Examples are shown for the use of both point and region quadtrees. Statistics of leaf and non-leaf nodes (homogeneous and heterogeneous sub-quadrants, respectively) at different levels may provide useful information on the spatial structure of the astronomical data in question. By altering the principle guiding the decomposition process, different types of spatial data may be brought into focus. Finally, a sampling method based on the quadtree representation of an image is proposed, which may prove efficient in designing a sampling strategy for a region previously observed at a different resolution and/or in different bands.
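    The recursive decomposition described above is straightforward to sketch in code. The following is a minimal region-quadtree builder in Python, assuming a square, power-of-two grey-scale image and a simple max-minus-min homogeneity threshold; the names QuadNode, build_quadtree, and leaf_counts_by_level are illustrative, not taken from the paper. The second helper tallies leaves per depth level, the kind of node statistics the abstract proposes as a probe of spatial structure.

        import numpy as np

        class QuadNode:
            def __init__(self, x, y, size, children=None, value=None):
                self.x, self.y, self.size = x, y, size  # top-left corner and side length
                self.children = children                # four sub-quadrants, or None for a leaf
                self.value = value                      # mean pixel value of a homogeneous leaf

        def build_quadtree(image, x=0, y=0, size=None, threshold=10.0, min_size=1):
            """Recursively partition a square region into quadrants until each
            block is homogeneous (max - min <= threshold) or of minimal size."""
            if size is None:
                size = image.shape[0]  # assume a square, power-of-two image
            block = image[y:y + size, x:x + size]
            if size <= min_size or block.max() - block.min() <= threshold:
                return QuadNode(x, y, size, value=float(block.mean()))  # leaf node
            half = size // 2
            children = [build_quadtree(image, x + dx, y + dy, half, threshold, min_size)
                        for dy in (0, half) for dx in (0, half)]
            return QuadNode(x, y, size, children=children)

        def leaf_counts_by_level(node, level=0, counts=None):
            """Tally leaf nodes per depth level."""
            if counts is None:
                counts = {}
            if node.children is None:
                counts[level] = counts.get(level, 0) + 1
            else:
                for child in node.children:
                    leaf_counts_by_level(child, level + 1, counts)
            return counts

        # Example: a 64x64 image with one bright square subdivides deeply
        # only along the square's edges.
        img = np.zeros((64, 64))
        img[16:40, 16:40] = 100.0
        print(leaf_counts_by_level(build_quadtree(img)))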

    Competitive Exclusion and Limiting Similarity: A Unified Theory

    The robustness of coexistence against parameter changes is investigated in a model-independent manner by analyzing the feedback loop of population regulation. We define coexistence as a fixed point of the community dynamics at which no population has zero size. It is demonstrated that the parameter range allowing coexistence shrinks and disappears as the Jacobian of the dynamics decreases to zero. A general notion of regulating factors/variables is introduced. For each population, its 'impact' and 'sensitivity' niches are defined as the differential impact on, and the differential sensitivity towards, the regulating variables, respectively. Either similarity of the impact niches or similarity of the sensitivity niches results in a small Jacobian and a reduced likelihood of coexistence. For the case of a resource continuum, this result reduces to the usual "limited niche overlap" picture for both kinds of niche. As an extension of these ideas to the coexistence of infinitely many species, we demonstrate that Roughgarden's example for the coexistence of a 'continuum' of populations is structurally unstable.
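    The setup can be summarised compactly. A minimal formal sketch, with notation assumed here rather than quoted from the paper: populations n_1, ..., n_L are regulated through a vector of factors R,

        \[
          \frac{dn_i}{dt} = n_i \, r_i\bigl(R(n_1,\dots,n_L)\bigr), \qquad i = 1,\dots,L ,
        \]

    and coexistence is a fixed point R^* with r_i(R^*) = 0 and every n_i > 0. The sensitivity niche of population i corresponds to \partial r_i / \partial R and the impact niche to \partial R / \partial n_i; if either family of vectors is nearly parallel (similar niches), the Jacobian of the combined feedback map approaches zero and the parameter range admitting the fixed point shrinks accordingly.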

    Changing a semantics: opportunism or courage?

    The generalized models for higher-order logics introduced by Leon Henkin, and their multiple offspring over the years, have become a standard tool in many areas of logic. Even so, discussion has persisted about their technical status, and perhaps even their conceptual legitimacy. This paper gives a systematic view of generalized model techniques, discusses what they mean in mathematical and philosophical terms, and presents a few technical themes and results about their role in algebraic representation, calibrating provability, lowering complexity, understanding fixed-point logics, and achieving set-theoretic absoluteness. We also show how thinking about Henkin's approach to the semantics of logical systems in this generality can yield new results, dispelling the impression that the approach is ad hoc. This paper is dedicated to Leon Henkin, a deep logician who has changed the way we all work, while also being an always open, modest, and encouraging colleague and friend.
    Comment: 27 pages. To appear in: The life and work of Leon Henkin: Essays on his contributions (Studies in Universal Logic), eds: Manzano, M., Sain, I. and Alonso, E., 201
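    For readers outside logic, the core construction can be stated in one line; this is the standard textbook formulation, not quoted from the paper. In a full model of second-order logic the set quantifiers range over the entire power set of the first-order domain D,

        \[
          \mathcal{M} = (D, \mathcal{P}(D), \ldots),
        \]

    whereas in a Henkin general model they range over a designated family G \subseteq \mathcal{P}(D) closed under the comprehension axioms. Validity over general models is recursively axiomatizable, so completeness and compactness are recovered at the price of admitting more models.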

    The impact of the introduction of fidaxomicin on the management of Clostridium difficile infection in seven NHS secondary care hospitals in England: a series of local service evaluations.

    Clostridium difficile infection (CDI) is associated with high mortality. Reducing its incidence is a priority for patients, clinicians, the National Health Service (NHS) and Public Health England alike. In June 2012, fidaxomicin (FDX) was launched for the treatment of adults with CDI. The objective of this evaluation was to collect robust real-world data to understand the effectiveness of FDX in routine practice. In seven hospitals introducing FDX between July 2012 and July 2013, data were collected retrospectively from medical records on CDI episodes occurring in the 12 months before and after the introduction of FDX. All hospitalised patients aged ≥18 years with primary CDI (diarrhoea with presence of toxin A/B and no CDI in the previous 3 months) were included. Recurrence was defined as the in-patient re-emergence of diarrhoea requiring treatment at any time within 3 months after the first episode. Each hospital had a different protocol for the use of FDX. In hospitals A and B, where FDX was used first line for all primary and recurrent episodes, the recurrence rate fell from 10.6% to 3.1% and from 16.3% to 3.1%, with significant reductions in 28-day mortality from 18.2% to 3.1% (p < 0.05) and from 17.3% to 6.3% (p < 0.05) for hospitals A and B, respectively. In hospitals using FDX in selected patients only, the changes in recurrence rates and mortality were less marked. The pattern of adoption of FDX appears to affect its impact on CDI outcomes, with the greatest reduction in recurrence and all-cause mortality where it is used as first-line treatment.

    Hunt for new phenomena using large jet multiplicities and missing transverse momentum with ATLAS in 4.7 fb−1 of √s = 7 TeV proton-proton collisions

    Results are presented of a search for new particles decaying to large numbers of jets in association with missing transverse momentum, using 4.7 fb−1 of pp collision data at √s = 7 TeV collected by the ATLAS experiment at the Large Hadron Collider in 2011. The event selection requires missing transverse momentum, no isolated electrons or muons, and from ≥6 to ≥9 jets. No evidence is found for physics beyond the Standard Model. The results are interpreted in the context of an MSUGRA/CMSSM supersymmetric model, where, for large universal scalar mass m0, gluino masses smaller than 840 GeV are excluded at the 95% confidence level, extending previously published limits. Within a simplified model containing only a gluino octet and a neutralino, gluino masses smaller than 870 GeV are similarly excluded for neutralino masses below 100 GeV.
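    As a schematic illustration of how such a selection acts on event records, the sketch below is illustrative Python, not ATLAS analysis code; the field names and the met_min threshold are assumptions for illustration, not values from the paper.

        def passes_selection(event, n_jets_min=6, met_min=160.0):
            """Keep events with no isolated electrons or muons, at least
            n_jets_min jets, and missing transverse momentum above met_min (GeV).
            Field names and met_min are placeholders, not the paper's values."""
            if event["n_isolated_electrons"] > 0 or event["n_isolated_muons"] > 0:
                return False  # lepton veto
            if event["n_jets"] < n_jets_min:
                return False  # multijet requirement (>=6 to >=9 in the paper)
            return event["missing_et"] > met_min  # missing transverse momentum cut

        # Example: scan signal regions of increasing jet multiplicity.
        events = [{"n_isolated_electrons": 0, "n_isolated_muons": 0,
                   "n_jets": 7, "missing_et": 210.0}]
        for n_min in (6, 7, 8, 9):
            selected = [e for e in events if passes_selection(e, n_jets_min=n_min)]
            print(n_min, len(selected))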

    Search for the standard model Higgs boson at LEP


    Measurements of Higgs boson production and couplings in diboson final states with the ATLAS detector at the LHC

    Measurements are presented of production properties and couplings of the recently discovered Higgs boson using the decays into boson pairs, H → γγ, H → ZZ∗ → 4ℓ and H → WW∗ → ℓνℓν. The results are based on the complete pp collision data sample recorded by the ATLAS experiment at the CERN Large Hadron Collider at centre-of-mass energies of √s = 7 TeV and √s = 8 TeV, corresponding to an integrated luminosity of about 25 fb−1. Evidence for Higgs boson production through vector-boson fusion is reported. Results of combined fits probing Higgs boson couplings to fermions and bosons, as well as anomalous contributions to loop-induced production and decay modes, are presented. All measurements are consistent with expectations for the Standard Model Higgs boson.

    Standalone vertex finding in the ATLAS muon spectrometer

    A dedicated reconstruction algorithm to find decay vertices in the ATLAS muon spectrometer is presented. The algorithm searches the region just upstream of or inside the muon spectrometer volume for multi-particle vertices that originate from the decay of particles with long decay paths. The performance of the algorithm is evaluated using both a sample of simulated Higgs boson events, in which the Higgs boson decays to long-lived neutral particles that in turn decay to bb̄ final states, and pp collision data at √s = 7 TeV collected with the ATLAS detector at the LHC during 2011.

    Measurement of the top quark-pair production cross section with ATLAS in pp collisions at √s = 7 TeV

    A measurement of the production cross-section for top quark pairs (tt̄) in pp collisions at √s = 7 TeV is presented using data recorded with the ATLAS detector at the Large Hadron Collider. Events are selected in two different topologies: single lepton (electron e or muon μ) with large missing transverse energy and at least four jets, and dilepton (ee, μμ or eμ) with large missing transverse energy and at least two jets. In a data sample of 2.9 pb−1, 37 candidate events are observed in the single-lepton topology and 9 events in the dilepton topology. The corresponding expected backgrounds from non-tt̄ Standard Model processes are estimated using data-driven methods and determined to be 12.2 ± 3.9 events and 2.5 ± 0.6 events, respectively. The kinematic properties of the selected events are consistent with SM tt̄ production. The inclusive top quark pair production cross-section is measured to be σtt̄ = 145 ± 31 +42/−27 pb, where the first uncertainty is statistical and the second systematic. The measurement agrees with perturbative QCD calculations.
    Comment: 30 pages plus author list (50 pages total), 9 figures, 11 tables, CERN-PH number and final journal added
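    For a rough sense of the overall precision, the quoted uncertainties can be combined in quadrature; a minimal sketch assuming the statistical and systematic components are independent (the paper itself quotes them separately):

        \[
          \Delta^{+} = \sqrt{31^2 + 42^2} \approx 52\ \text{pb}, \qquad
          \Delta^{-} = \sqrt{31^2 + 27^2} \approx 41\ \text{pb},
        \]

    giving roughly σtt̄ ≈ 145 +52/−41 pb overall.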