
    Chemical complexity in the Horsehead photodissociation region

    The interstellar medium is known to be chemically complex. Organic molecules with up to 11 atoms have been detected in the interstellar medium and are believed to form on the ices around dust grains. The ices can be released into the gas phase either through thermal desorption, when a newly formed star heats the surrounding medium and completely evaporates the ices, or through non-thermal desorption mechanisms such as photodesorption, in which a single far-UV photon releases only a few molecules from the ices. The former dominates in hot cores, hot corinos, and strongly UV-illuminated PDRs, while the latter dominates in colder regions, such as low-UV-field PDRs. This is the case of the Horsehead, where dust temperatures are ~20-30 K; it therefore offers a clean environment in which to investigate the role of photodesorption. We have carried out an unbiased spectral line survey at 3, 2, and 1 mm with the IRAM-30m telescope in the Horsehead nebula, with an unprecedented combination of bandwidth, spectral resolution, and sensitivity. Two positions were observed: the warm PDR and a cold condensation shielded from the UV field (dense core), located just behind the PDR edge. We summarize our recently published results from this survey and present the first detection of the complex organic molecules HCOOH, CH2CO, CH3CHO, and CH3CCH in a PDR. These species, together with CH3CN, show enhanced abundances in the PDR compared to the dense core. This suggests that photodesorption is an efficient mechanism to release complex molecules into the gas phase in far-UV-illuminated regions.
    Comment: 15 pages, 7 figures, 7 tables. Accepted in Faraday Discussions 16

    Herschel observations of interstellar chloronium

    Using the Herschel Space Observatory's Heterodyne Instrument for the Far-Infrared (HIFI), we have observed para-chloronium (H2Cl+) toward six sources in the Galaxy. We detected interstellar chloronium absorption in foreground molecular clouds along the sight-lines to the bright submillimeter continuum sources Sgr A (+50 km/s cloud) and W31C. Both the para-H2-35Cl+ and para-H2-37Cl+ isotopologues were detected, through observations of their 1(11)-0(00) transitions at rest frequencies of 485.42 and 484.23 GHz, respectively. For an assumed ortho-to-para ratio of 3, the observed optical depths imply that chloronium accounts for ~ 4 - 12% of chlorine nuclei in the gas phase. We detected interstellar chloronium emission from two sources in the Orion Molecular Cloud 1: the Orion Bar photodissociation region and the Orion South condensation. For an assumed ortho-to-para ratio of 3 for chloronium, the observed emission line fluxes imply total beam-averaged column densities of ~ 2.0E+13 cm-2 and ~ 1.2E+13 cm-2, respectively, for chloronium in these two sources. We obtained upper limits on the para-H2-35Cl+ line strengths toward H2 Peak 1 in the Orion Molecular Cloud and toward the massive young star AFGL 2591. The chloronium abundances inferred in this study are typically at least a factor of ~10 larger than the predictions of steady-state theoretical models for the chemistry of interstellar molecules containing chlorine. Several explanations for this discrepancy were investigated, but none has proven satisfactory, and thus the large observed abundances of chloronium remain puzzling.
    Comment: Accepted for publication in the Astrophysical Journal

    Neural network-based emulation of interstellar medium models

    The interpretation of observations of atomic and molecular tracers in the galactic and extragalactic interstellar medium (ISM) requires comparisons with state-of-the-art astrophysical models to infer physical conditions. Usually, ISM models are too time-consuming for such inference procedures, as these call for numerous model evaluations. As a result, they are often replaced by an interpolation of a grid of precomputed models. We propose a new general method to derive faster, lighter, and more accurate approximations of the model from a grid of precomputed models. These emulators are defined with artificial neural networks (ANNs) designed and trained to address the specificities inherent in ISM models. Indeed, such models often predict many observables (e.g., line intensities) from just a few input physical parameters and can yield outliers due to numerical instabilities or physical bistabilities. We propose five strategies to address these characteristics: 1) an outlier removal procedure; 2) a clustering method that yields homogeneous subsets of lines that are simpler to predict with different ANNs; 3) a dimension reduction technique that enables us to adequately size the network architecture; 4) an augmentation of the physical inputs with a polynomial transform to ease the learning of nonlinearities; and 5) a dense architecture to ease the learning of simple relations. We compare the proposed ANNs with standard classes of interpolation methods to emulate the Meudon PDR code, a representative ISM numerical model. Combinations of the proposed strategies outperform all interpolation methods by a factor of 2 on the average error, reaching 4.5% on the Meudon PDR code. These networks are also 1000 times faster than accurate interpolation methods and require ten to forty times less memory. This work will enable efficient inferences on wide-field multiline observations of the ISM.
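    The emulation idea of strategies 4 and 5 (polynomial input augmentation feeding a small dense network mapping a few physical parameters to many line intensities) can be sketched as follows. This is a minimal numpy illustration, not the paper's actual architecture or training procedure; the layer sizes, parameter names, and random toy inputs are assumptions made for the example.

    ```python
    import numpy as np

    def poly_augment(x, degree=3):
        # Strategy 4: augment each physical parameter with its powers up to
        # `degree` to ease the learning of nonlinearities.
        return np.concatenate([x ** d for d in range(1, degree + 1)], axis=-1)

    def relu(z):
        return np.maximum(z, 0.0)

    class DenseEmulator:
        """Tiny dense feed-forward emulator: few physical inputs -> many observables."""
        def __init__(self, n_in, n_hidden, n_out, rng):
            self.W1 = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_in, n_hidden))
            self.b1 = np.zeros(n_hidden)
            self.W2 = rng.normal(0.0, 1.0 / np.sqrt(n_hidden), (n_hidden, n_out))
            self.b2 = np.zeros(n_out)

        def predict(self, x):
            h = relu(x @ self.W1 + self.b1)  # strategy 5: dense hidden layer
            return h @ self.W2 + self.b2     # one intensity per output unit

    rng = np.random.default_rng(0)
    params = rng.uniform(size=(5, 3))           # toy stand-ins for e.g. density, UV field
    feats = poly_augment(params, degree=3)      # 3 parameters -> 9 augmented features
    model = DenseEmulator(feats.shape[1], 32, 100, rng)  # 100 "line intensities"
    pred = model.predict(feats)
    print(pred.shape)                           # (5, 100)
    ```

    In a real emulator the weights would of course be fitted to the grid of precomputed models; the sketch only shows how the augmented inputs and the dense mapping fit together.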

    Gas kinematics around filamentary structures in the Orion B cloud

    Context. Understanding the initial properties of star-forming material and how they affect the star formation process is key. From an observational point of view, the feedback of young high-mass stars on future star formation properties is still poorly constrained. Aims. In the framework of the IRAM 30m ORION-B large program, we obtained observations of the translucent (2 ≤ AV < 6 mag) and moderately dense gas (6 ≤ AV < 15 mag), which we used to analyze the kinematics over a field of 5 deg² around the filamentary structures. Methods. We used the Regularized Optimization for Hyper-Spectral Analysis (ROHSA) algorithm to decompose and de-noise the C18O(1-0) and 13CO(1-0) signals, taking the spatial coherence of the emission into account. We produced gas column density and mean velocity maps to estimate the relative orientation of their spatial gradients. Results. We identified three cloud velocity layers at different systemic velocities and extracted the filaments in each velocity layer. The filaments are preferentially located in regions of low centroid velocity gradients. By comparing the relative orientation between the column density and velocity gradients of each layer in the ORION-B observations with synthetic observations from 3D kinematic toy models, we distinguish two types of behavior in the dynamics around filaments: (i) radial flows perpendicular to the filament axis, which can be either inflows (increasing the filament mass) or outflows, and (ii) longitudinal flows along the filament axis. The former is seen in the Orion B data, while the latter is not identified. We have also identified asymmetrical flow patterns, usually associated with filaments located at the edge of an H II region. Conclusions. This is the first observational study to highlight the feedback of H II regions on filament formation and, thus, on star formation in the Orion B cloud. This simple statistical method can be used for any molecular cloud to obtain coherent information on the kinematics.
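    The core statistic here, the relative orientation between the spatial gradients of the column density and centroid velocity maps, can be sketched with `numpy.gradient`. This is a minimal illustration on synthetic maps, not the paper's pipeline; the function name and the toy linear maps are assumptions made for the example.

    ```python
    import numpy as np

    def relative_orientation(coldens, velocity):
        """Angle between the spatial gradients of column density and velocity.

        Returns a map of angles in [0, 90] degrees: ~0 means the two gradients
        are parallel, ~90 means perpendicular (the quantity compared between
        observed and synthetic maps).
        """
        gNy, gNx = np.gradient(coldens)
        gVy, gVx = np.gradient(velocity)
        dot = gNx * gVx + gNy * gVy
        norm = np.hypot(gNx, gNy) * np.hypot(gVx, gVy)
        cos = np.clip(np.abs(dot) / np.where(norm == 0, 1.0, norm), 0.0, 1.0)
        return np.degrees(np.arccos(cos))

    # Toy maps: column density varies along x only, velocity along y only,
    # so the two gradients are everywhere perpendicular.
    y, x = np.mgrid[0:32, 0:32].astype(float)
    angles = relative_orientation(x, y)
    print(angles.mean())  # 90.0
    ```

    On real maps one would histogram these angles (per velocity layer, near and far from the extracted filaments) rather than average them, but the geometric ingredient is the same.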

    Deep learning denoising by dimension reduction: Application to the ORION-B line cubes

    Context. The availability of large-bandwidth receivers for millimeter radio telescopes allows the acquisition of position-position-frequency data cubes over a wide field of view and a broad frequency coverage. These cubes contain much information on the physical, chemical, and kinematical properties of the emitting gas. However, their large size coupled with an inhomogeneous signal-to-noise ratio (SNR) are major challenges for consistent analysis and interpretation. Aims. We search for a denoising method for the low-SNR regions of the studied data cubes that would allow us to recover the low-SNR emission without distorting the signals with high SNR. Methods. We perform an in-depth analysis of the 13CO and C17O (1-0) data cubes obtained as part of the ORION-B large program performed at the IRAM 30m telescope. We analyse the statistical properties of the noise and the evolution of the correlation of the signal in a given frequency channel with that of the adjacent channels. This allows us to propose significant improvements to typical autoassociative neural networks, often used to denoise hyperspectral Earth remote sensing data. Applying this method to the 13CO (1-0) cube, we compare the denoised data with those derived with the multiple-Gaussian fitting algorithm ROHSA, considered the state-of-the-art procedure for line data cubes. Results. The nature of astronomical spectral data cubes is distinct from that of the hyperspectral data usually studied in the Earth remote sensing literature, because the observed intensities become statistically independent beyond a short channel separation. This lack of redundancy in the data has led us to adapt the method, notably by taking into account the sparsity of the signal along the spectral axis. The application of the proposed algorithm leads to an increase of the SNR in voxels with weak signal, while preserving the spectral shape of the data in high-SNR voxels. Conclusions. The proposed algorithm, which combines a detailed analysis of the noise statistics with an innovative autoencoder architecture, is a promising path to denoise radio-astronomy line data cubes. In the future, exploring whether a better use of the spatial correlations of the noise may further improve the denoising performance seems a promising avenue.
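    The principle of denoising by dimension reduction can be illustrated with its simplest linear instance: projecting each spectrum onto the leading principal components of the cube. This is not the paper's autoassociative neural network (which generalizes this projection with nonlinear layers and noise-aware adaptations); the synthetic Gaussian-line spectra, noise level, and component count below are assumptions made for the sketch.

    ```python
    import numpy as np

    def pca_denoise(cube, n_comp=8):
        """Denoise spectra by projection onto the leading principal components.

        `cube` has shape (n_pixels, n_channels). Keeping only `n_comp`
        spectral components discards most of the channel-independent noise
        while retaining the low-dimensional line signal.
        """
        mean = cube.mean(axis=0)
        centered = cube - mean
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        basis = vt[:n_comp]                      # spectral "eigen-lines"
        return centered @ basis.T @ basis + mean

    rng = np.random.default_rng(1)
    chan = np.linspace(-10, 10, 64)
    # Synthetic spectra: one Gaussian line per pixel with a shifting centroid.
    lines = np.exp(-0.5 * (chan[None, :] - rng.uniform(-2, 2, (200, 1))) ** 2)
    noisy = lines + rng.normal(0.0, 0.3, lines.shape)
    clean = pca_denoise(noisy, n_comp=8)

    rms_before = np.sqrt(np.mean((noisy - lines) ** 2))
    rms_after = np.sqrt(np.mean((clean - lines) ** 2))
    print(rms_after < rms_before)  # residual RMS drops after projection
    ```

    The fraction of noise retained scales roughly with n_comp / n_channels, which is why a low-dimensional bottleneck, linear here, nonlinear in an autoencoder, acts as a denoiser.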

    Language and Corporate Culture: does the company have a language of its own?


    Sequence numbering device for graphic and magnetic recorders


    Physical restraint in psychiatry (observations and questions)

    Restraint has been used since antiquity. Since that time, opinions have been divided between its use for safety and therapeutic purposes and its non-use. In France, restraint has no specific legal framework and is not mentioned in any statute. The practitioner's only reference points are the law of 27 June 1990 and the Veil circular (of 19 July 1993), which define the framework for involuntary hospitalization. He may also refer to the institutional protocols established by the decrees of 16 February and 15 March 1993. For their part, the American public authorities have defined a strict framework for the use of restraint: mandatory reporting of each episode, reporting of each associated incident, and prescription restricted to the protection of persons (the patient and those around them). These measures were put in place in 1998, following a media controversy: a large-circulation newspaper reported, after an investigation, that 50 to 150 restrained people were dying unexpectedly every year in the United States. A surge of publications followed these events. The literature shows very large variability in the incidence of restraint between studies. This variability has been attributed to factors specific to the patient population (pathology, sex, age, history, etc.) but also to factors related to the care teams (specific training) and to institutional policy. There are two types of indication for restraint: for protection or for therapeutic purposes, the latter being more controversial. Several restraint methods exist, the most frequently cited being four- or five-point mechanical restraint in the supine position. Regarding the unexpected deaths of restrained patients, the most frequent etiologies are respiratory and cardiac disorders, but for a large proportion of deaths no cause has been determined.
    The absence of thromboembolic complications from the literature is surprising. This may be partly explained by the fact that, according to the various recommendations, restraint episodes should never exceed 24 hours. Finally, several experiments show that a change of policy and specific training of care teams lead to a major reduction in both the incidence and the duration of restraint. Notably, this reduction is associated with a reduction in the number of injuries among caregivers and patients. One may then conclude that restraining better means restraining less.