
    Using 21-cm absorption surveys to measure the average HI spin temperature in distant galaxies

    We present a statistical method for measuring the average HI spin temperature in distant galaxies using the expected detection yields from future wide-field 21-cm absorption surveys. As a demonstrative case study we consider a simulated all-southern-sky survey of 2 h per pointing with the Australian Square Kilometre Array Pathfinder for intervening HI absorbers at intermediate cosmological redshifts between z = 0.4 and 1. For example, if such a survey yielded 1000 absorbers we would infer a harmonic-mean spin temperature of T̄_spin ∼ 100 K for the population of damped Lyman-α absorbers (DLAs) at these redshifts, indicating that more than 50 per cent of the neutral gas in these systems is in a cold neutral medium (CNM). Conversely, a lower yield of only 100 detections would imply T̄_spin ∼ 1000 K and a CNM fraction of less than 10 per cent. We propose that this method can be used to provide independent verification of the spin temperature evolution reported in recent 21-cm surveys of known DLAs at high redshift, and to measure the spin temperature at intermediate redshifts below z ≈ 1.7, where the Lyman-α line is inaccessible to ground-based observatories. Increasingly sensitive and larger surveys with the Square Kilometre Array should provide stronger statistical constraints on the average spin temperature. However, these will ultimately be limited by the accuracy to which we can determine the HI column density frequency distribution, the covering factor and the redshift distribution of the background radio source population.
    Comment: 11 pages, 9 figures, 1 table. Proof corrected version
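    For context, the population-averaged quantity such a survey constrains is the column-density-weighted harmonic mean of the spin temperature, via the standard 21-cm absorption relation (this display is editorial background, not reproduced from the abstract):

        % Velocity-integrated optical depth vs column density (N_HI in cm^-2,
        % velocities in km/s, c_f the source covering factor):
        N_\mathrm{HI} = 1.823\times10^{18}\,\frac{T_\mathrm{spin}}{c_f}\int\tau(v)\,\mathrm{d}v
        % At fixed N_HI, hotter gas absorbs more weakly, so a survey's detection
        % yield probes the harmonic mean over the absorber population:
        \overline{T}_\mathrm{spin} = \frac{\sum_i N_{\mathrm{HI},i}}{\sum_i N_{\mathrm{HI},i}/T_{\mathrm{spin},i}}

    This is why a high yield (many absorbers above the optical-depth limit) points to cold gas and a low yield to warm gas.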

    Stressors in anaesthesiology: development and validation of a new questionnaire: A cross-sectional study of Portuguese anaesthesiologists

    BACKGROUND: Stress in anaesthesiologists is a common and multifactorial problem related to patients, colleagues and organisations. The consequences of stress include depression, work-home conflicts and burnout. Stress can be reduced by reducing the number and magnitude of stressors or by strengthening resilience strategies. OBJECTIVES: We created the self-reporting 'Stress Questionnaire in Anaesthesiologists' (SQA) to characterise the sources of stress in anaesthesiologists' professional lives and to measure the level of associated stress. Our study aimed to develop and validate the SQA using exploratory and confirmatory factor analyses. Construct validity was assessed through correlations between the SQA and negative psychological outcomes, as well as by comparing the perception of stress among different known groups. DESIGN: A questionnaire-based cross-sectional, correlational, observational study. SETTINGS: The study was conducted between January 2014 and December 2014 across anaesthesia departments in Portuguese hospitals. Data were collected from a representative subset at one specific time point. PARTICIPANTS: A sample of 710 anaesthesia specialists and residents from Portugal. MAIN OUTCOME MEASURES: The primary outcome was to identify specific stressors in anaesthesiologists. The secondary outcome was the association between stressors and burnout, depression symptoms, anxiety, stress, rumination, satisfaction with life and functional impairment. RESULTS: The exploratory analysis showed that the SQA is a tri-dimensional instrument, and the confirmatory analysis showed that the tri-dimensional structure presented good model fit. The three dimensions of the SQA correlated positively with other stress measures and burnout, but negatively with satisfaction with life. CONCLUSION: The SQA is a well-adjusted measure for assessing stressors in anaesthesia physicians and includes clinical, organisational and team stress factors. The results showed that the SQA is a robust and reliable instrument.
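    To illustrate the kind of factor-analytic step the abstract describes, here is a minimal sketch in Python. It is not the authors' pipeline; the file name, column layout and use of scikit-learn are assumptions, and the three-factor choice simply mirrors the tri-dimensional result reported above.

        # Sketch of an exploratory factor analysis on questionnaire items.
        # Assumption: 'sqa_items.csv' (hypothetical) holds one column per SQA item,
        # one row per respondent, with numeric Likert-type responses.
        import pandas as pd
        from sklearn.decomposition import FactorAnalysis

        items = pd.read_csv("sqa_items.csv")
        fa = FactorAnalysis(n_components=3, rotation="varimax")  # three dimensions
        scores = fa.fit_transform(items.values)  # per-respondent factor scores

        # Loadings (items x factors): high loadings group the items into the three
        # dimensions (clinical, organisational and team stressors in the paper).
        loadings = pd.DataFrame(fa.components_.T, index=items.columns,
                                columns=["F1", "F2", "F3"])
        print(loadings.round(2))

    Construct validity would then be checked by correlating the factor scores with external scales (burnout, anxiety, satisfaction with life), as the abstract reports.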

    A homomorphism between link and XXZ modules over the periodic Temperley-Lieb algebra

    We study finite loop models on a lattice wrapped around a cylinder. A section of the cylinder has N sites. We use a family of link modules over the periodic Temperley-Lieb algebra EPTL_N(\beta, \alpha) introduced by Martin and Saleur, and by Graham and Lehrer. These are labelled by the number of sites N and the number of defects d, and extend the standard modules of the original Temperley-Lieb algebra. Besides the defining parameters \beta = u^2 + u^{-2} with u = e^{i\lambda/2} (the weight of contractible loops) and \alpha (the weight of non-contractible loops), this family also depends on a twist parameter v that keeps track of how the defects wind around the cylinder. The transfer matrix T_N(\lambda, \nu) depends on the anisotropy \nu and on the parameter \lambda that fixes the model. (The thermodynamic limit of T_N is believed to describe a conformal field theory of central charge c = 1 - 6\lambda^2/(\pi(\pi-\lambda)).) The family of periodic XXZ Hamiltonians is extended to depend on the new parameter v, and the relationship between this family and the loop models is established. The Gram determinant for the natural bilinear form on these link modules is shown to factorise in terms of an intertwiner i_N^d between these link representations and the eigenspaces of S^z of the XXZ models. This map is shown to be an isomorphism for generic values of u and v, and the critical curves in the plane of these parameters for which i_N^d fails to be an isomorphism are given.
    Comment: Replacement of "The Gram matrix as a connection between periodic loop models and XXZ Hamiltonians", 31 pages
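    As a quick consistency check on the central charge formula (an editorial note, not part of the abstract; the identification of dense polymers with \beta = 0 is the standard one), the stated parametrisation gives \beta = 2\cos\lambda, and the dense polymer point reproduces the c = -2 theory quoted in the dimer-model entry below:

        % beta = u^2 + u^{-2} = 2 cos(lambda); dense polymers have beta = 0,
        % i.e. lambda = pi/2. Substituting into c = 1 - 6 lambda^2 / (pi (pi - lambda)):
        c = 1 - \frac{6(\pi/2)^2}{\pi(\pi - \pi/2)} = 1 - \frac{6\pi^2/4}{\pi^2/2} = 1 - 3 = -2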

    A numerical terrain modelling methodology for two-dimensional hydrodynamic simulation

    This article addresses the problem of constructing a Numerical Terrain Model (NTM) in the context of two-dimensional (2D) hydraulic studies, here related to floods. The main difficulty lies in the heterogeneity of the data sets, which differ in precision, spatial coverage, distribution and density, and georeferencing, among other things. Within the framework of hydrodynamic modelling, the entire region under study must be documented and the information carried on a homogeneous support. We propose an efficient strategy, fully supported by a software tool called MODELEUR, which makes it possible to import, gather and merge very heterogeneous data sets, whether scalar like topography or vectorial like wind, to preserve their integrity, and to give access to them in their original form at every step of the analysis and modelling process. Thus, whatever the environmental purpose of the terrain model (development planning, habitat conservation, flood risk analysis, sedimentology), the method allows the data to be projected onto a homogeneous support, a finite element mesh, while the originals are kept intact as the ultimate reference. The method is based on a partition of the domain under study for each type of information: topography, substrate, surface roughness, etc. Each partition is composed of sub-domains, and each sub-domain associates a data set with a portion of the domain in a declarative way. This conceptual model is, in our view, the NTM proper.
    The process of transferring data from the partitions to an analysis mesh is considered a result of the NTM and not the NTM itself. It is performed by interpolation, for instance with the finite element method. Following the huge Saguenay flood of 1996, the efficiency of the method was tested and validated successfully; that application serves here as an illustration.

    An accurate description of both the river main channel and the flood plain is essential to any hydrodynamic simulation, especially if extreme discharges are considered and the two-dimensional approach is used. The ground altitude and the various flow resistance factors are basic information that the modeller must pass on to the simulator. For too long this task remained the 'poor relation' of the modelling process, because a priori it does not seem to raise any particular difficulty. In practice, however, it represents a very significant workload in the mobilisation of the models, besides hiding many pitfalls liable to compromise the quality of the hydraulic results. Just as the velocity and water level fields are results of the hydrodynamic model, the variables describing the terrain and transferred to the simulation mesh constitute the results of the NTM. Because this is strictly speaking a modelling exercise, a validation of the results that assesses the quality of the model is necessary.

    In this paper we propose a methodology for integrating heterogeneous data sets into an NTM, with the aim of simulating the 2D hydrodynamics of natural streams with the finite element method. The methodology is fully supported by a software tool, MODELEUR, developed at INRS-Eau (Secretan and Leclerc, 1998; Secretan et al., 2000). This tool, which can be viewed as a Geographical Information System (GIS) dedicated to 2D flow simulation, carries out all the steps of integrating the raw data sets into a complete NTM. Furthermore, it facilitates the set-up and piloting of hydrodynamic simulations with the simulator HYDROSIM (Heniche et al., 1999).

    Scenarios for flow analysis require frequent and substantial changes to the mesh carrying the data. A return to the base data sets is then required, which obliges us to preserve them in their entirety, to have easy access to them and to transfer them efficiently onto the mesh. That is why the NTM should put the emphasis on the base data rather than on their transformed, and inevitably degraded, form after transfer to a mesh. Data integrity should be preserved as far as possible, in the sense that it is imperative to keep different data sets distinct and separately accessible. Two measuring campaigns are not mixed; for example, topography resulting from digitised maps is kept separate from that resulting from echo-sounding campaigns. This approach allows one at any time to return to the measurements, to control them, to validate them, to correct them and, if need be, to substitute one data set for another. Homogeneity of the data support with respect to the location of the data points is essential to allow algebraic interaction between the different information layers. The operational objective that ultimately underlies the creation of the NTM in the present context is to transfer the spatial base data (measurements, geometry of civil works, etc.), each carried by diverse discretisations, efficiently onto a single carrying structure.

    With these objectives of integrity, accessibility, efficiency and homogeneity, the proposed method consists of the following steps (a code sketch of step 4 follows this entry):
    1. Import the data sets into the database, which may imply digitising maps and/or reformatting the raw files to a compatible file format.
    2. Construct and assemble the NTM proper, which consists, for each variable (topography, roughness, etc.), of creating a partition of the domain under study, that is, subdividing it into juxtaposed sub-domains and associating with each sub-domain the data set that describes the variable on it. More precisely, this declarative procedure uses irregular polygons to specify, in the corresponding sub-domains, the data source to be used in the construction of the NTM. As it is also possible to transform regions of the domain with algebraic functions, for example to represent civil works in the river (dikes, levees, etc.), the NTM integrates all the validated data sets and the instructions to transform them locally. From this stage the NTM exists as an entity-model and has a conceptual character.
    3. Construct a finite element mesh.
    4. Transfer by interpolation and assemble the data of the different components of the NTM onto the finite element mesh, according to the instructions contained in the various partitions. The result is an instance of the NTM, and its quality depends on the density of the mesh and the variability of the data; it therefore requires validation against the original data.
    5. Carry out the analysis tasks and/or hydrodynamic simulations. If the mesh must be modified for a project variant or an analysis scenario, only tasks 3 and 4 need to be redone, and task 4 is completely automated in MODELEUR.

    The heterogeneity of the data sources, which constitutes one of the main difficulties of the exercise, can be classified in three groups: by the measuring technique used; by the format or representation model used; and by the geographic datum and projection system. For topography, the measuring techniques include satellite techniques (conventional or radar), airborne techniques (photogrammetry or laser scanning), ground techniques (total station or GPS) and boat-borne techniques such as the echo-sounder. The data come in the form of paper maps that have to be digitised, regular or random data points, isolines of altitude, or even transects. They can be expressed in different datums and projections, and are sometimes not even georeferenced and must first be positioned. As for the bed roughness that determines the resistance to flow, here too the data sets differ from one another in many respects. Data can again have been collected as regular or random points, as homogeneous zones or as transects. They can represent the average grain size of the materials present, the dimension of the passing fraction (D85, D50 or median), the percentage of the surface corresponding to each fraction of the grain assemblage, and so on. In the absence of such basic data, the NTM can only represent the value of the friction parameter, typically Manning's n, which must be obtained by calibration of the hydrodynamic model. For the vegetation present in the flood plain or for aquatic plants, source data can be as variable as for the bed roughness.
    Except where such data exist, the vegetation model often consists of the roughness parameter obtained during the calibration exercise. The method has been applied successfully in numerous contexts, as demonstrated by the application to the Chicoutimi River after the catastrophic 1996 flood in the Saguenay region. The huge heterogeneity of the available data in that case called for precisely such a method: elevation data obtained by photogrammetry, by total station and by echo-sounder on transects could be coordinated and exploited simultaneously for the purposes of hydrodynamic simulation and of sediment balance in the zones strongly affected by the flood.
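    To make step 4 of the method concrete, here is a minimal, hedged sketch of a partition-driven transfer of scattered topography onto mesh nodes. It is not MODELEUR's actual implementation (which this summary does not specify); the data layout, function names and the use of shapely and scipy are assumptions, with linear interpolation standing in for the finite-element-based transfer.

        # Sketch: transfer scattered topography onto mesh nodes, partition by partition.
        # Assumption: each sub-domain is a polygon tied to exactly one data set.
        import numpy as np
        from scipy.interpolate import griddata
        from shapely.geometry import Point, Polygon

        def transfer_topography(partitions, mesh_nodes):
            """partitions: list of (Polygon, (N,2) xy array, (N,) z array);
            mesh_nodes: (M,2) array of node coordinates. Returns (M,) elevations."""
            z_mesh = np.full(len(mesh_nodes), np.nan)
            for polygon, xy, z in partitions:
                # Declarative step: this data set applies only inside its sub-domain.
                inside = np.array([polygon.contains(Point(p)) for p in mesh_nodes])
                if inside.any():
                    z_mesh[inside] = griddata(xy, z, mesh_nodes[inside], method="linear")
            return z_mesh  # NaNs flag nodes covered by no partition -> validate

    Keeping the polygons and raw point sets as the inputs, and treating the interpolated node values as a disposable result, mirrors the paper's insistence that the mesh-borne values are an instance of the NTM, not the NTM itself.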

    Environmental Control and Life Support Integration Strategy for 6-Crew Operations

    The International Space Station (ISS) crew complement has increased in size from 3 to 6 crew members. In order to support this increase in crew on the ISS, the United States On-orbit Segment (USOS) has been outfitted with a suite of regenerative Environmental Control and Life Support (ECLS) hardware, including an Oxygen Generation System (OGS), a Waste and Hygiene Compartment (WHC), and a Water Recovery System (WRS). The WRS includes the Urine Processor Assembly (UPA) and the Water Processor Assembly (WPA). With this additional life support hardware, the ISS has achieved full redundancy in its on-orbit life support system between the USOS and the Russian Segment (RS). The additional redundancy created by the regenerative ECLS hardware creates the opportunity for independent support capabilities between segments and, for the first time since the start of the ISS, the necessity to revise the Life Support strategy agreements. Independent operating strategies, coupled with the loss of the Space Shuttle supply and return capabilities in 2010, offer new and unique challenges. This paper discusses the evolution of the ISS Life Support hardware strategy in support of a 6-crew complement on the ISS, as well as the continued work that is necessary to ensure the support of the crew and ISS Program objectives through the life of the station.

    Refined conformal spectra in the dimer model

    Working with Lieb's transfer matrix for the dimer model, we point out that the full set of dimer configurations may be partitioned into disjoint subsets (sectors) closed under the action of the transfer matrix. These sectors are labelled by an integer or half-integer quantum number we call the variation index. In the continuum scaling limit, each sector gives rise to a representation of the Virasoro algebra. We determine the corresponding conformal partition functions and their finitizations, and observe an intriguing link to the Ramond and Neveu-Schwarz sectors of the critical dense polymer model as described by a conformal field theory with central charge c = -2.
    Comment: 44 pages
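    Schematically (an editorial paraphrase of the statement above, not a formula taken from the paper), the sector decomposition of the partition function reads:

        % Dimer configurations split into sectors labelled by the variation index v,
        % each closed under the transfer matrix; in the scaling limit each sector
        % contributes a c = -2 conformal partition function:
        Z_N = \sum_{v \in \frac{1}{2}\mathbb{Z}} Z_N^{(v)}, \qquad
        Z_N^{(v)} \xrightarrow{\;N \to \infty\;} \text{conformal partition function at } c = -2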