1,417 research outputs found

    Using 21-cm absorption surveys to measure the average HI spin temperature in distant galaxies

    We present a statistical method for measuring the average HI spin temperature in distant galaxies using the expected detection yields from future wide-field 21-cm absorption surveys. As a demonstrative case study we consider a simulated all-southern-sky survey with 2 h per pointing with the Australian Square Kilometre Array Pathfinder for intervening HI absorbers at intermediate cosmological redshifts between z = 0.4 and 1. For example, if such a survey yielded 1000 absorbers we would infer a harmonic-mean spin temperature of T_spin ~ 100 K for the population of damped Lyman-α absorbers (DLAs) at these redshifts, indicating that more than 50 per cent of the neutral gas in these systems is in a cold neutral medium (CNM). Conversely, a lower yield of only 100 detections would imply T_spin ~ 1000 K and a CNM fraction of less than 10 per cent. We propose that this method can be used to provide independent verification of the spin temperature evolution reported in recent 21-cm surveys of known DLAs at high redshift, and to measure the spin temperature at intermediate redshifts below z ≈ 1.7, where the Lyman-α line is inaccessible to ground-based observatories. Larger and more sensitive surveys with the Square Kilometre Array should provide stronger statistical constraints on the average spin temperature, but these will ultimately be limited by the accuracy to which we can determine the HI column density frequency distribution, the covering factor and the redshift distribution of the background radio source population.
    Comment: 11 pages, 9 figures, 1 table. Proof corrected version
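To make the quoted scaling concrete, here is a minimal Python sketch. It assumes only the inverse proportionality between survey yield and harmonic-mean spin temperature implied by the two quoted cases (1000 absorbers → ~100 K; 100 absorbers → ~1000 K); the calibration constant K_CALIB is a hypothetical placeholder, since the true relation also folds in the HI column density distribution, the covering factor and the background radio source population.

```python
# Sketch of the yield -> spin-temperature scaling implied by the abstract's
# quoted numbers; K_CALIB is hypothetical, chosen so 1000 detections -> 100 K.

K_CALIB = 1.0e5  # K * detections (illustrative, not from the paper)

def harmonic_mean_spin_temperature(n_detections: float) -> float:
    """Harmonic-mean spin temperature (K) inferred from the number of
    21-cm absorbers detected, assuming N_detections ∝ 1/T_spin."""
    return K_CALIB / n_detections

for n in (1000, 100):
    print(f"{n:4d} absorbers -> T_spin ~ {harmonic_mean_spin_temperature(n):.0f} K")
```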

    Ultra-violet photo-ionisation in far-infrared selected sources

    It has been reported that there is a deficit of stellar-heated dust, as evidenced by a lack of far-infrared (FIR) emission, in sources within the Herschel-SPIRE sample with X-ray luminosities exceeding a "critical value" of L ~ 10^37 W. Such a scenario would be consistent with the suppression of star formation by the AGN, as required by current theoretical models. Since absorption of the 21-cm transition of neutral hydrogen (HI), which traces the star-forming reservoir, also exhibits a critical value in the ultra-violet band (above ionising photon rates of Q ~ 3 x 10^56 s^-1), we test the SPIRE sample for the incidence of 250-micron emission detections as a function of Q. The highest value at which FIR emission is detected above the SPIRE confusion limit is Q = 8.9 x 10^57 s^-1, which is ~30 times that for the HI, with no critical value apparent. Since complete ionisation of the neutral atomic gas is expected at Q > 3 x 10^56 s^-1, this may suggest that much of the FIR arises from heating of the dust by the AGN. However, integrating the ionising photon rate of each star over the initial mass function, we cannot rule out that the high observed ionising photon rates are due to a population of hot, massive stars.
    Comment: Accepted by A&A
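A hedged sketch of the closing step, integrating per-star ionising photon rates over an initial mass function (IMF). The Salpeter slope (2.35), the power-law per-star rate q(m), and all numerical parameters are illustrative assumptions, not values from the paper; the point is only that a modest population of hot, massive stars can reach Q of order the quoted critical value.

```python
# Illustrative IMF integration; all parameter values are assumptions.
import numpy as np

def _trapz(y, x):
    """Trapezoidal integration (kept explicit to avoid NumPy version drift)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def total_ionising_rate(n_stars=1e10, m_lo=0.1, m_hi=100.0, alpha=2.35,
                        q0=1e48, m0=20.0, a=4.0):
    """Total Q (photons/s) for n_stars drawn from a Salpeter IMF
    phi(m) ~ m^-alpha, with a per-star rate q(m) = q0*(m/m0)^a above m0
    and negligible output from cooler stars (all hypothetical)."""
    m = np.logspace(np.log10(m_lo), np.log10(m_hi), 2000)  # masses in Msun
    phi = m ** (-alpha)
    phi /= _trapz(phi, m)                                  # normalise to one star
    q = np.where(m >= m0, q0 * (m / m0) ** a, 0.0)         # hot stars only
    return n_stars * _trapz(phi * q, m)

# ~3e56 photons/s for these assumed inputs, comparable to the quoted
# critical value Q ~ 3 x 10^56 s^-1.
print(f"Q ~ {total_ionising_rate():.2e} photons/s")
```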

    Effect of rainfall patterns on soil surface CO2 efflux, soil moisture, soil temperature and plant growth in a grassland ecosystem of northern Ontario, Canada: implications for climate change

    BACKGROUND: The effect of rainfall patterns on soil surface CO2 efflux, soil moisture, soil temperature and plant growth was investigated in a grassland ecosystem of northern Ontario, Canada, where climate change is predicted to introduce new precipitation regimes. Rain shelters were established in a fallow field consisting mainly of Trifolium hybridum L., Trifolium pratense L., and Phleum pratense L. Daytime ambient air temperatures within the shelters increased by an average of 1.9°C, similar to the predicted future increase in air temperature for this region. To simulate six precipitation regimes covering the maximum range expected under climate change, a portable irrigation system was designed to modify the frequency of monthly rainfall events at a constant water delivery rate, while maintaining contemporary average precipitation volumes. Controls consisted of blocks irrigated with frequencies and total monthly precipitation consistent with the 25-year average rainfall for this location. RESULTS: Seasonal soil moisture correlated with soil surface CO2 efflux (R = 0.756, P < 0.001) and above-ground plant biomass (R = 0.447, P = 0.029). Reducing irrigation frequency decreased soil surface CO2 efflux by 80% (P < 0.001) and soil moisture content by 42% (P < 0.001). CONCLUSIONS: Manipulating the number of precipitation events and inter-rainfall intervals, while maintaining monthly rainfall averages, impacted CO2 efflux and plant growth. Even with monthly rainfall totals similar to contemporary monthly precipitation averages, decreasing the number of monthly rainfall events reduced soil surface CO2 efflux and plant growth through soil moisture deficits. Although many have speculated that climate change will increase ecosystem productivity, our results show that a reduction in the number of monthly rainfall events, while maintaining monthly averages, will limit carbon dynamics.
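For readers unfamiliar with the statistic reported above, the analysis pattern is a Pearson correlation with its p-value. The sketch below uses synthetic stand-in arrays (NOT the study's measurements); only the analysis pattern mirrors the abstract.

```python
# Pearson correlation as reported in the abstract; data below are synthetic.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
soil_moisture = rng.uniform(10.0, 40.0, size=24)              # vol. %, synthetic
co2_efflux = 0.15 * soil_moisture + rng.normal(0.0, 1.0, 24)  # synthetic efflux

r, p = pearsonr(soil_moisture, co2_efflux)
print(f"R = {r:.3f}, P = {p:.3g}")  # the study reports R = 0.756, P < 0.001
```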

    A numerical terrain modelling methodology for two-dimensional hydrodynamic simulation

    This article addresses the problem of constructing a Numerical Terrain Model (NTM) in the context of two-dimensional (2D) hydraulic studies, here related to floods. The main difficulty lies in the heterogeneity of the data sets, which differ in precision, spatial coverage, distribution, density and georeferencing, among other things. In a hydrodynamic modelling exercise, the entire region under study must be documented and the information carried on a homogeneous support. This article proposes an efficient strategy, supported by a software tool, MODELEUR, which makes it possible to merge rapidly the various data sets available for each variable, whether scalar like topography or vectorial like wind, to preserve their integrity, and to access them efficiently in their original form at every step of the analysis and modelling process. Thus, whatever the environmental purpose of the NTM (planning of engineering works, conservation of habitats, flood analysis, sedimentology), the method allows one to work with the data projected onto a homogeneous support, a finite element mesh, while keeping the original intact as the ultimate reference. The method is based on a partition of the analysis domain by type of information: topography, substrate, surface roughness, etc. A partition is composed of sub-domains, each of which declaratively associates a data set with a portion of the analysis domain. In our view this conceptual model constitutes the NTM proper.
The process of transferring data from the partitions to the final grid is considered a result of the NTM, not the NTM itself. It is performed by interpolation, using a technique such as the finite element method. Following the catastrophic Saguenay flood of 1996, the method was tested and validated successfully; that application serves here as an illustration.

Accurately describing the characteristics of both the river main channel and the flood plain is essential to any hydrodynamic simulation, especially when extreme discharges are considered and the two-dimensional approach is used. Ground elevation and the various flow resistance factors are basic information that the modeller must pass on to the simulator. For too long this task remained the "poor relation" of the modelling process, because a priori it does not seem to raise any particular difficulty. In practice, however, it represents a very significant share of the workload of mobilising the models, and it hides many pitfalls that can compromise the quality of the hydraulic results. Just as the velocity and water-level fields are results of the hydrodynamic model, the terrain variables transferred onto the simulation mesh constitute the results of the NTM. Because this is, strictly speaking, a modelling exercise, a validation of the results that establishes the quality of the model is necessary.

In this paper we propose a methodology for integrating heterogeneous data sets into an NTM, with the aim of simulating the 2D hydrodynamics of natural streams with the finite element method. The methodology is fully supported by the software MODELEUR, developed at INRS-Eau (Secretan and Leclerc, 1998; Secretan et al., 2000). This tool, which can be regarded as a Geographical Information System (GIS) dedicated to 2D flow simulation, carries out all the steps of integrating raw data sets into a complete NTM, and it also facilitates the set-up and piloting of hydrodynamic simulations with the simulator HYDROSIM (Heniche et al., 1999).

Scenarios for flow analysis require frequent and substantial changes to the mesh carrying the data. This forces a return to the basic data sets, which must therefore be preserved in their entirety, remain easily accessible, and be transferred efficiently onto the mesh. That is why the NTM should emphasise the basic data rather than their transformed, and inevitably degraded, form after transfer to a mesh. Data integrity should be preserved as far as possible, in the sense that distinct data sets must be kept separate and separately accessible: two measuring campaigns should not be mixed, and, for example, topography digitised from maps should be kept apart from topography obtained by echo-sounding. This approach makes it possible at any time to return to the measurements, to check, validate and correct them, and if necessary to substitute one data set for another. Homogeneity of the data support, with respect to the location of the data points, is essential to allow algebraic interaction between the different information layers. The operational objective that ultimately underlies the creation of the NTM in this context is to transfer efficiently the spatial basic data (measurements, geometry of civil works, etc.), each carried by its own discretisation, onto a single carrying structure.

With these objectives of integrity, accessibility, efficiency and homogeneity, the proposed method consists of the following steps:

1. Import the data sets into the database, which may require digitising maps and/or reformatting raw files into a compatible format.

2. Construct and assemble the NTM proper: for each variable (topography, roughness, etc.), create a partition of the domain under study, that is, subdivide it into juxtaposed sub-domains and associate with each sub-domain the data set that describes the variable there. This declarative procedure uses irregular polygons to specify, within each sub-domain, the data source to be used in constructing the NTM (a toy illustration of this partition concept follows the abstract). Since regions of the domain can also be transformed with algebraic functions, for example to represent civil works in the river (dikes, levees, etc.), the NTM integrates all the validated data sets together with the instructions to transform them locally. At this stage the NTM exists as an entity-model and has a conceptual character.

3. Construct a finite element mesh.

4. Transfer by interpolation and assemble the data of the different components of the NTM onto the finite element mesh, according to the instructions contained in the various partitions. The result is an instance of the NTM; its quality depends on the density of the mesh and the variability of the data, and it therefore requires validation against the original data.

5. Carry out the analysis tasks and/or hydrodynamic simulations. If the mesh has to be modified for a project variant or an analysis scenario, only steps 3 and 4 need to be redone, and step 4 is completely automated in MODELEUR.

The heterogeneity of the data sources, one of the main difficulties of the exercise, takes three forms: the measuring technique used, the format or representation model used, and the geographic datum and projection system. For topography, the measuring techniques include satellite (conventional or radar), airborne (photogrammetry or laser scanning), ground (total station or GPS) and boat-mounted (echo-sounder) techniques. The data may arrive as paper maps that must be digitised, as regular or random data points, as isolines of altitude, or as transects; they can be expressed in different datums and projections, and are sometimes not georeferenced at all and must first be positioned. The data sets for bed roughness, which determines the resistance to flow, differ from one another in just as many respects: they may have been collected as regular or random points, as homogeneous zones or as transects, and may represent the average grain size of the materials present, the dimension of a passing fraction (D85, D50 or the median), or the percentage of the surface corresponding to each fraction of the grain assemblage. In the absence of such basic data, the NTM can only carry the value of the friction parameter, typically Manning's n, which must then be obtained by calibrating the hydrodynamic model. For vegetation in the flood plain, or for aquatic plants, the source data are as variable as for the bed roughness; except where measured data exist, the vegetation model often reduces to the roughness parameter obtained during calibration.

The method has been applied successfully in numerous contexts, as demonstrated by the application to the Chicoutimi River after the catastrophic 1996 flood in the Saguenay region. The great heterogeneity of the available data in that case demanded precisely the kind of method proposed here: elevation data obtained by photogrammetry, by total station and by echo-sounder on transects could be coordinated and exploited simultaneously, both for hydrodynamic simulation and for sediment budgets in the zones most affected by the flood.
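To make the partition concept of step 2 concrete, here is a minimal Python sketch. The names (SubDomain, Partition, transfer_to_mesh) are hypothetical illustrations, not the MODELEUR API, and the nearest-neighbour lookup is a crude stand-in for the finite element interpolation described in step 4.

```python
# Toy model of "partition = sub-domains, each binding a data set to a
# polygon" and of the transfer of the NTM onto a mesh (step 4).
from dataclasses import dataclass
from typing import Callable, List, Tuple

Point = Tuple[float, float]

@dataclass
class SubDomain:
    contains: Callable[[Point], bool]      # polygon membership test
    dataset: List[Tuple[Point, float]]     # raw (x, y) -> value samples

@dataclass
class Partition:
    """One partition per variable (topography, roughness, ...)."""
    variable: str
    subdomains: List[SubDomain]

    def value_at(self, p: Point) -> float:
        # Declarative lookup: the first sub-domain containing p supplies the
        # data; interpolation here is nearest-neighbour for brevity.
        for sd in self.subdomains:
            if sd.contains(p):
                return min(sd.dataset,
                           key=lambda s: (s[0][0]-p[0])**2 + (s[0][1]-p[1])**2)[1]
        raise ValueError(f"point {p} not covered by partition '{self.variable}'")

def transfer_to_mesh(partition: Partition, mesh_nodes: List[Point]) -> List[float]:
    """Project the NTM onto a mesh; the output is an *instance* of the NTM,
    to be validated against the raw data."""
    return [partition.value_at(p) for p in mesh_nodes]

# Hypothetical usage: one rectangular sub-domain holding three survey points.
sd = SubDomain(contains=lambda p: 0 <= p[0] <= 1 and 0 <= p[1] <= 1,
               dataset=[((0.0, 0.0), 10.0), ((1.0, 0.0), 12.0), ((0.5, 1.0), 11.0)])
topo = Partition("topography", [sd])
print(transfer_to_mesh(topo, [(0.2, 0.1), (0.9, 0.8)]))  # -> [10.0, 11.0]
```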

    Refined conformal spectra in the dimer model

    Working with Lieb's transfer matrix for the dimer model, we point out that the full set of dimer configurations may be partitioned into disjoint subsets (sectors) closed under the action of the transfer matrix. These sectors are labelled by an integer or half-integer quantum number we call the variation index. In the continuum scaling limit, each sector gives rise to a representation of the Virasoro algebra. We determine the corresponding conformal partition functions and their finitizations, and observe an intriguing link to the Ramond and Neveu-Schwarz sectors of the critical dense polymer model, as described by a conformal field theory with central charge c = -2.
    Comment: 44 pages
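As standard conformal field theory background (textbook material, not a result quoted from this abstract): c = -2 fits the Kac parametrisation with (p, p') = (1, 2), and the fractional conformal weights -1/8 and 3/8 that appear in the Ramond/Neveu-Schwarz sectors of critical dense polymers sit in the corresponding Kac table.

```latex
% Kac parametrisation at c = -2 (standard background, not from the paper):
\[
  c = 1 - \frac{6 (p - p')^2}{p\,p'}, \qquad (p, p') = (1, 2)
  \;\Longrightarrow\; c = -2,
\]
\[
  h_{r,s} = \frac{(r - 2s)^2 - 1}{8}, \qquad
  h_{1,1} = 0, \quad h_{2,1} = -\tfrac{1}{8}, \quad
  h_{4,1} = \tfrac{3}{8}, \quad h_{1,2} = 1.
\]
```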