    Models of Dynamic Data for Emergency Response: A Comparative Study

    The first hours after a disaster are chaotic and difficult, yet perhaps the most important for successfully fighting the consequences, saving human lives, and reducing damage to private and public property. Despite some advances, a complete inventory of the information needed during emergency response remains challenging. In recent years, several nationally and internationally funded projects have concentrated on inventorying emergency response processes, structures for storing dynamic information, and standards and services for accessing the needed data sets. A good inventory would clarify many aspects of the information exchange, such as data sets, models, and representations; good structuring would facilitate fast access to a desired piece of information, as well as automated analysis of the information. Consequently, the information can be used better in the decision-making process. This paper presents our work on models of dynamic data for different disasters and incidents in Europe. The Dutch data models are derived from a thorough study of emergency response procedures in the Netherlands. Two more models, developed within the project HUMBOLDT, reflect several cross-border disaster management scenarios in Europe. These models are compared with the Geospatial Data Model of the Department of Homeland Security in the USA. The paper draws conclusions about the type of geographical information needed to perform emergency response operations and the possibility of a generic model to be used world-wide.

    Syntactic structure and artificial grammar learning : The learnability of embedded hierarchical structures

    Embedded hierarchical structures, such as "the rat the cat ate was brown", constitute a core generative property of natural language theory. Several recent studies have reported learning of hierarchical embeddings in artificial grammar learning (AGL) tasks and have described the functional specificity of Broca's area for processing such structures. In two experiments, we investigated whether alternative strategies can explain the learning success in these studies. We trained participants on hierarchical sequences and found no evidence for the learning of hierarchical embeddings in test situations identical to those from other studies in the literature. Instead, participants appeared to solve the task by exploiting surface distinctions between legal and illegal sequences and applying strategies such as counting or repetition detection. We suggest alternative interpretations for the observed activation of Broca's area, in terms of the application of calculation rules or of a differential role of working memory. We claim that the learnability of hierarchical embeddings in AGL tasks remains to be demonstrated.
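    The counting strategy mentioned above can be illustrated with a minimal sketch (the A/B category labels and test strings are hypothetical, not taken from the paper): in many AGL studies the "hierarchical" grammar A^n B^n is tested against mismatched strings A^n B^m with n != m, and a learner that merely counts category tokens classifies such items perfectly without building any embedded structure.

```python
def counting_strategy(sequence):
    """Accept iff the number of A-category tokens equals the number of
    B-category tokens -- a pure surface strategy, no hierarchy needed."""
    return sequence.count("A") == sequence.count("B")

legal = ["AB", "AABB", "AAABBB"]    # A^n B^n sequences
illegal = ["AAB", "ABB", "AAABB"]   # A^n B^m sequences, n != m

print(all(counting_strategy(s) for s in legal))    # True
print(any(counting_strategy(s) for s in illegal))  # False
```

    This is why such test sets cannot, on their own, demonstrate learning of hierarchical embeddings: the surface statistic already separates legal from illegal items.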

    Modeling and prediction of surgical procedure times

    Accurate prediction of medical operation times is of crucial importance for cost-efficient operating room planning in hospitals. This paper investigates the possible dependence of procedure times on surgeon factors such as age, experience, gender, and team composition. The effect of these factors is estimated for over 30 different types of medical operations in two hospitals by means of ANOVA models for logarithmic case durations. The estimation data set contains about 30,000 observations from 2005 through 2008. The relevance of surgeon factors depends on the type of operation. The factors most often found to be significant are team composition, experience, and time of day. Contrary to widespread opinions among surgeons, gender almost never has a significant effect. By incorporating surgeon factors, the accuracy of out-of-sample prediction of case durations for about 1,250 surgical operations in 2009 is improved by up to 15 percent or more compared to current planning procedures. Keywords: planning; ANOVA model; European hospital; current procedure terminology (CPT); health care management; lognormal distribution; operation room; surgeon factors
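    The core modeling idea — treat case durations as lognormal and model the mean of the *log* duration per factor combination — can be sketched as follows. The data, operation name, and the single "experience" factor here are invented for illustration; the paper's actual ANOVA models include many more factors.

```python
import math
from statistics import mean

# Made-up historical records: (operation, experienced_surgeon, duration_minutes)
records = [
    ("appendectomy", True, 42), ("appendectomy", True, 38),
    ("appendectomy", False, 55), ("appendectomy", False, 61),
]

def fit_log_means(rows):
    """Estimate the mean log duration for each factor combination --
    the simplest possible ANOVA-style cell-means model."""
    groups = {}
    for op, experienced, duration in rows:
        groups.setdefault((op, experienced), []).append(math.log(duration))
    return {cell: mean(logs) for cell, logs in groups.items()}

def predict(model, op, experienced):
    # Back-transforming the mean log duration gives the *median*
    # duration under the lognormal assumption (the geometric mean).
    return math.exp(model[(op, experienced)])

model = fit_log_means(records)
print(predict(model, "appendectomy", False))  # geometric mean of 55 and 61
```

    Taking logs first is what makes an additive ANOVA appropriate for durations, which are right-skewed and strictly positive.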

    ISM composition through X-ray spectroscopy of LMXBs

    The diffuse interstellar medium (ISM) is an integral part of the evolution of the entire Galaxy. Metals are produced by stars, and their abundances are direct testimony to the history of stellar evolution. However, the interstellar dust composition is not well known, and the total abundances are yet to be accurately determined. We probe ISM dust composition, total abundances, and abundance gradients through the study of interstellar absorption features in the high-resolution X-ray spectra of Galactic low-mass X-ray binaries (LMXBs). We use high-quality grating spectra of nine LMXBs taken with XMM-Newton. We measure the column densities of O, Ne, Mg, and Fe with an empirical model and estimate the Galactic abundance gradients. The column densities of the neutral gas species are in agreement with those found in the literature. Solids are a significant reservoir of metals like oxygen and iron: 15-25% of the total O I and 65-90% of the total Fe I is found in dust. The dust amount and mixture seem to be consistent along all the lines of sight (LOS). Our estimates of abundance gradients and predictions of local interstellar abundances are in agreement with those measured at longer wavelengths. Our work shows that X-ray spectroscopy is a very powerful method to probe the ISM. For instance, on a large scale the ISM appears to be chemically homogeneous, showing similar gas ionization ratios and dust mixtures. The agreement between the abundances of the ISM and those of stellar objects suggests that the local Galaxy is also chemically homogeneous. Comment: 13 pages, 10 figures, 5 tables, accepted to A&A
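    The dust fractions quoted above follow from a simple ratio of column densities. A minimal sketch, with made-up column densities (the real values come from fitting the absorption features in the grating spectra):

```python
# Dust depletion fraction: the solid-phase column density divided by
# the total (gas + dust) column along a line of sight.

def dust_fraction(n_gas, n_dust):
    """Both arguments are column densities in cm^-2 for one element."""
    return n_dust / (n_gas + n_dust)

# Hypothetical oxygen columns toward one LMXB (illustrative only):
n_O_gas, n_O_dust = 8.0e17, 2.0e17
print(dust_fraction(n_O_gas, n_O_dust))  # fraction of O locked in dust
```

    With these invented numbers, 20% of the oxygen is in solids, inside the 15-25% range the abstract reports for O I.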

    Right-handed charged currents in the era of the Large Hadron Collider

    We discuss the phenomenology of right-handed charged currents in the framework of the Standard Model Effective Field Theory, in which they arise from a single gauge-invariant dimension-six operator. We study the manifestations of the nine complex couplings of the W boson to right-handed quarks in collider physics, flavor physics, and low-energy precision measurements. We first obtain constraints on the couplings under the assumption that the right-handed operator is the dominant correction to the Standard Model at observable energies. We subsequently study the impact of degeneracies with other Beyond-the-Standard-Model effective interactions and identify observables, both at colliders and in low-energy experiments, that would uniquely point to right-handed charged currents. Comment: 50 pages plus appendices and references
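    In standard SMEFT conventions (the Warsaw operator basis; the normalization shown here is my assumption, not taken from this abstract), the single gauge-invariant dimension-six operator that generates a right-handed charged current can be sketched as:

```latex
% Warsaw-basis form; i, j are flavor indices, so the Wilson
% coefficients form a 3x3 complex matrix in flavor space --
% the nine complex couplings referred to above.
\mathcal{O}_{\varphi u d}^{\,ij}
  = i\,\bigl(\tilde{\varphi}^{\dagger} D_{\mu} \varphi\bigr)
    \bigl(\bar{u}_{R}^{\,i} \gamma^{\mu} d_{R}^{\,j}\bigr) + \text{h.c.}
```

    After electroweak symmetry breaking, the Higgs doublet's vacuum expectation value in the first factor turns this operator into a direct coupling of the W to right-handed quark currents.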

    Image Ellipticity from Atmospheric Aberrations

    We investigate the ellipticity of the point-spread function (PSF) produced by imaging an unresolved source with a telescope subject to the effects of atmospheric turbulence. It is important to quantify these effects in order to understand the errors in shape measurements of astronomical objects, such as those used to study weak gravitational lensing of field galaxies. The PSF modeling involves either a Fourier transform of the phase information in the pupil plane or a ray-tracing approach, which has the advantage of requiring fewer computations than the Fourier transform. Using a standard method involving the Gaussian-weighted second moments of intensity, we then calculate the ellipticity of the PSF patterns. We find significant ellipticity for the instantaneous patterns (up to more than 10%). Longer exposures, which we approximate by combining multiple (N) images from uncorrelated atmospheric realizations, yield progressively lower ellipticity (decreasing as 1/sqrt(N)). We also verify that the measured ellipticity does not depend on the sampling interval in the pupil plane when using the Fourier method. However, we find that the results using the ray-tracing technique do depend on the pupil sampling interval, representing a gradual breakdown of the geometric approximation at high spatial frequencies. Therefore, ray tracing is generally not an accurate method of modeling PSF ellipticity induced by atmospheric turbulence unless an additional procedure is implemented to correctly account for the effects of high-spatial-frequency aberrations. The Fourier method, however, can be used directly to accurately model PSF ellipticity, which can give insight into errors in the statistics of field galaxy shapes used in studies of weak gravitational lensing. Comment: 9 pages, 5 color figures (some reduced in size). Accepted for publication in the Astrophysical Journal
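    The Gaussian-weighted second-moment ellipticity measurement can be sketched as follows. The weight width, grid size, and synthetic elongated-Gaussian "PSF" are illustrative assumptions; the convention e1 = (Qxx - Qyy)/(Qxx + Qyy), e2 = 2 Qxy/(Qxx + Qyy) is a standard one in weak-lensing work.

```python
import numpy as np

def weighted_ellipticity(img, sigma=5.0):
    """Ellipticity from Gaussian-weighted second moments of intensity,
    with the weight centered on the image center."""
    ny, nx = img.shape
    y, x = np.mgrid[:ny, :nx].astype(float)
    cy, cx = (ny - 1) / 2.0, (nx - 1) / 2.0
    w = img * np.exp(-((x - cx)**2 + (y - cy)**2) / (2 * sigma**2))
    norm = w.sum()
    qxx = (w * (x - cx)**2).sum() / norm
    qyy = (w * (y - cy)**2).sum() / norm
    qxy = (w * (x - cx) * (y - cy)).sum() / norm
    return (qxx - qyy) / (qxx + qyy), 2 * qxy / (qxx + qyy)

# Synthetic "PSF": a Gaussian wider along x than along y, so the
# measured e1 should be positive and e2 should vanish by symmetry.
ny = nx = 65
y, x = np.mgrid[:ny, :nx].astype(float)
img = np.exp(-((x - 32)**2 / (2 * 4.0**2) + (y - 32)**2 / (2 * 3.0**2)))
e1, e2 = weighted_ellipticity(img)
print(e1 > 0, abs(e2) < 1e-6)
```

    Averaging N such measurements over uncorrelated atmospheric realizations would drive the residual ellipticity down as 1/sqrt(N), matching the long-exposure behavior described above.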

    Sustainability of small reservoirs and large scale water availability under current conditions and climate change

    Semi-arid river basins often rely on reservoirs for water supply. Small reservoirs may affect large-scale water availability both by enhancing availability in a distributed sense and by subtracting water from large downstream user communities, e.g. those served by large reservoirs. Both of these impacts of small reservoirs are subject to climate change. Using a case study in North-East Brazil, this paper shows that climate change impacts on water availability may be severe, and that impacts on distributed water availability from small reservoirs may exceed impacts on centralised water availability from large reservoirs. The paper further shows that the effect of small reservoirs on water availability from large reservoirs may be significant, and may increase in both a relative and an absolute sense under unfavourable climate change.