Radiolysis of ammonia-containing ices by energetic, heavy and highly charged ions inside dense astrophysical environments
Deeply inside dense molecular clouds and protostellar disks, the interstellar
ices are protected from stellar energetic UV photons. However, X-rays and
energetic cosmic rays can penetrate inside these regions triggering chemical
reactions, molecular dissociation and evaporation processes. We present
experimental studies on the interaction of heavy, highly charged and energetic
ions (46 MeV Ni^13+) with ammonia-containing ices in an attempt to simulate the
physical chemistry induced by heavy ion cosmic rays inside dense astrophysical
environments. The measurements were performed inside a high vacuum chamber
coupled to the heavy ion accelerator GANIL (Grand Accelerateur National d'Ions
Lourds) in Caen, France. In-situ analysis was performed with a Fourier
transform infrared (FTIR) spectrometer at different fluences. The averaged
dissociation cross sections of water, ammonia and carbon monoxide due to heavy
cosmic ray ion analogs are ~2x10^{-13}, 1.4x10^{-13} and 1.9x10^{-13} cm^2,
respectively. In the presence of a typical heavy cosmic ray field, the
estimated half-life of the studied species is 2-3x10^6 years.
The ice compaction (micropore collapse) due to heavy cosmic rays seems to be at
least 3 orders of magnitude higher than that promoted by (0.8 MeV) protons.
In the case of the irradiated H2O:NH3:CO ice, the infrared spectrum at room
temperature reveals five bands that were tentatively assigned to vibration
modes of the zwitterionic glycine (+NH3CH2COO-).
Comment: Accepted for publication in Astronomy and Astrophysics; Number of
pages: 12; Number of Figures: 7; Number of Tables:
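The half-life quoted above follows from first-order destruction kinetics, N(t) = N0 exp(-sigma * phi * t). A minimal sketch, assuming an illustrative heavy-ion flux (the flux value below is a placeholder for the purpose of the arithmetic, not the value used in the paper):

```python
import math

# Half-life of an ice species under ion bombardment, assuming simple
# first-order destruction: N(t) = N0 * exp(-sigma * phi * t), so the
# half-life is ln(2) / (sigma * phi).
# sigma: dissociation cross section (cm^2); phi: ion flux (cm^-2 s^-1).
def half_life_years(sigma_cm2, flux_cm2_s):
    seconds_per_year = 3.156e7
    return math.log(2) / (sigma_cm2 * flux_cm2_s) / seconds_per_year

# Cross sections quoted in the abstract:
for name, sigma in [("H2O", 2e-13), ("NH3", 1.4e-13), ("CO", 1.9e-13)]:
    # phi = 5e-2 cm^-2 s^-1 is an illustrative heavy-ion flux chosen so
    # the result lands in the quoted range; it is NOT from the paper.
    print(name, f"{half_life_years(sigma, 5e-2):.2e} yr")
```

With that assumed flux, the H2O cross section of ~2x10^{-13} cm^2 gives a half-life of about 2x10^6 years, consistent with the 2-3x10^6 year range quoted above.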
Thermal radiation of various gravitational backgrounds
We present a simple and general procedure for calculating the thermal
radiation coming from any stationary metric. The physical picture is that the
radiation arises as the quasi-classical tunneling of particles through a
gravitational barrier. We show that our procedure can reproduce the results of
Hawking and Unruh radiation. We also show that under certain kinds of
coordinate transformations the temperature of the thermal radiation will change
in the case of the Schwarzschild black holes. In addition we apply our
procedure to a rotating/orbiting system and show that in this case there is no
radiation, which has experimental implications for the polarization of
particles in circular accelerators.
Comment: 6 pages, RevTeX; added references, publication version. To be
published in IJMP.
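For the Schwarzschild case, the tunneling picture sketched above reproduces the standard result: the imaginary part of the action for a particle of energy E crossing the horizon gives a thermal emission probability, from which the Hawking temperature can be read off (to lowest order in E):

```latex
% Quasi-classical tunneling through the horizon gives an emission
% probability \Gamma \sim e^{-2\,\mathrm{Im}\,S}. For Schwarzschild,
\Gamma \sim \exp\!\left(-\frac{8\pi G M E}{\hbar c^{3}}\right)
= \exp\!\left(-\frac{E}{k_B T_H}\right)
\quad\Longrightarrow\quad
k_B T_H = \frac{\hbar c^{3}}{8\pi G M}
```

This is the standard Hawking temperature; the abstract's claim is that the same tunneling procedure generalizes to other stationary metrics.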
Systematic reduction of complex tropospheric chemical mechanisms using sensitivity and time-scale analyses
Explicit mechanisms describing the complex degradation pathways of atmospheric volatile organic compounds (VOCs) are important, since they allow the study of the contribution of individual VOCs to secondary pollutant formation. However, they are computationally expensive to solve, since they contain large numbers of species and a wide range of time-scales, causing stiffness in the resulting equation systems. This paper and the following companion paper describe the application of systematic and automated methods for reducing such complex mechanisms, whilst maintaining the accuracy of the model with respect to important species and features. The methods are demonstrated via application to version 2 of the Leeds Master Chemical Mechanism. The methods of local concentration sensitivity analysis and overall rate sensitivity analysis proved to be efficient and capable of removing the majority of redundant reactions and species in the scheme across a wide range of conditions relevant to the polluted troposphere. The application of principal component analysis of the rate sensitivity matrix was computationally expensive due to its use of the decomposition of very large matrices, and did not produce significant reduction over and above the other sensitivity methods. The use of the quasi-steady state approximation (QSSA) proved to be an extremely successful method of removing the fast time-scales within the system, as demonstrated by a local perturbation analysis at each stage of reduction. QSSA species were automatically selected via the calculation of instantaneous QSSA errors based on user-selected tolerances. The application of the QSSA led to the removal of a large number of alkoxy radicals and excited Criegee bi-radicals via reaction lumping.
The resulting reduced mechanism was shown to reproduce the concentration profiles of the important species selected from the full mechanism over a wide range of conditions, including those outside of which it was generated. As a result of a factor-of-2 reduction in the number of species in the scheme, and a reduction in stiffness, the computational time required for simulations was reduced by a factor of 4 compared with the full scheme.
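The QSSA step described above removes fast intermediates by setting their net production rate to zero. A minimal sketch on a toy A -> B -> C chain (the species and rate constants are illustrative, not taken from the Master Chemical Mechanism):

```python
import math

# Toy chain A --k1--> B --k2--> C with k2 >> k1, so B is a fast
# intermediate of the kind the QSSA removes from the ODE system.
k1, k2 = 1.0, 1.0e3

def exact_B(t, A0=1.0):
    # Closed-form solution of the linear chain for [B](t).
    return A0 * k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))

def qssa_B(t, A0=1.0):
    # QSSA: set d[B]/dt = k1*[A] - k2*[B] = 0, giving [B] = k1*[A]/k2,
    # which eliminates B's fast time-scale (~1/k2) from the system.
    return k1 * A0 * math.exp(-k1 * t) / k2

t = 0.1  # well past the initial fast transient (~1/k2 = 1e-3)
rel_err = abs(qssa_B(t) - exact_B(t)) / exact_B(t)
print(f"relative QSSA error at t={t}: {rel_err:.2e}")
```

After the short transient, the QSSA concentration tracks the exact solution to within ~0.1% here, which is the sense in which an "instantaneous QSSA error" against a user-selected tolerance can be used to select candidate species automatically.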
A computerized Langmuir probe system
For low pressure plasmas it is important to record entire single or double Langmuir probe characteristics accurately. For plasmas with a depleted high energy tail, the accuracy of the recorded ion current plays a critical role in determining the electron temperature. Even for high density Maxwellian distributions, it is necessary to accurately model the ion current to obtain the correct electron density. Since the electron and ion current saturation values are, at best, orders of magnitude apart, a single current sensing resistor cannot provide the required resolution to accurately record these values. We present an automated, personal computer based data acquisition system for the determination of fundamental plasma properties in low pressure plasmas. The system is designed for single and double Langmuir probes, whose characteristics can be recorded over a bias voltage range of ±70 V with 12 bit resolution. The current flowing through the probes can be recorded within the range of 5 nA–100 mA. The use of a transimpedance amplifier for current sensing eliminates the requirement for traditional current sensing resistors and hence the need to correct the raw data. The large current recording range is realized through the use of a real time gain switching system in the negative feedback loop of the transimpedance amplifier
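For a Maxwellian plasma, the electron temperature is recovered from the exponential (electron-retardation) region of the recorded I-V characteristic, where I_e = I_es * exp(e(V - V_p)/(k_B T_e)), so the slope of ln(I_e) versus V is 1/T_e in eV. A minimal sketch on synthetic data (the numbers are illustrative, not from the instrument described above):

```python
import math

# Synthetic ideal retardation-region data for an assumed Te of 3 eV.
Te_eV = 3.0
V = [-5 + 0.5 * i for i in range(10)]            # probe bias (V)
I = [1e-6 * math.exp(v / Te_eV) for v in V]      # electron current (A)

# Least-squares slope of ln(I) against V gives 1/Te (eV^-1).
lnI = [math.log(i) for i in I]
n = len(V)
Vm, Lm = sum(V) / n, sum(lnI) / n
slope = sum((v - Vm) * (l - Lm) for v, l in zip(V, lnI)) / \
        sum((v - Vm) ** 2 for v in V)
print(f"recovered Te = {1 / slope:.2f} eV")
```

The currents in this region span decades (here ~1e-7 to ~1e-6 A, and far less near ion saturation), which is why the abstract stresses a wide-dynamic-range, gain-switched transimpedance stage rather than a single sensing resistor.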
Systematic reduction of complex tropospheric chemical mechanisms, Part II: Lumping using a time-scale based approach
This paper presents a formal method of species lumping that can be applied automatically to intermediate compounds within detailed and complex tropospheric chemical reaction schemes. The method is based on grouping species with reference to their chemical lifetimes and reactivity structures. A method for determining the forward and reverse transformations between individual and lumped compounds is developed. Preliminary application to the Leeds Master Chemical Mechanism (MCMv2.0) has led to the removal of 734 species and 1777 reactions from the scheme, with minimal degradation of accuracy across a wide range of test trajectories relevant to polluted tropospheric conditions. The lumped groups are seen to relate to groups of peroxy acyl nitrates, nitrates, carbonates, oxepins, substituted phenols, oxeacids and peracids with similar lifetimes and reaction rates with OH. In combination with other reduction techniques, such as sensitivity analysis and the application of the quasi-steady state approximation (QSSA), a reduced mechanism has been developed that contains 35% of the number of species and 40% of the number of reactions compared to the full mechanism. This has led to a factor-of-8 speed-up in computation time within box model simulations.
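The core of lifetime-based lumping is grouping species whose chemical lifetimes (the reciprocal of their loss rate) are close enough that a single lumped variable can carry their summed concentration. A minimal sketch of such a grouping step, with illustrative species names and lifetimes (not MCM values, and not the paper's actual selection algorithm):

```python
# Greedy lifetime-based grouping: species within a factor `tol` in
# lifetime share a lumped group. Lifetimes are in seconds; the names
# and values below are hypothetical.
species = {"PAN1": 1.0e4, "PAN2": 1.2e4,
           "NIT1": 3.0e2, "NIT2": 2.8e2,
           "RAD": 1.0e-3}

def lump_by_lifetime(taus, tol=2.0):
    groups = []
    for name, tau in sorted(taus.items(), key=lambda kv: kv[1]):
        # Join the last group if within `tol` of its shortest lifetime,
        # otherwise start a new group.
        if groups and tau / groups[-1][0][1] <= tol:
            groups[-1].append((name, tau))
        else:
            groups.append([(name, tau)])
    return [[n for n, _ in g] for g in groups]

print(lump_by_lifetime(species))
```

Here the two PAN-like species and the two nitrate-like species each collapse into one lumped variable, while the fast radical stays separate (and would instead be a QSSA candidate), mirroring the division of labour between lumping and the QSSA described above.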
Systematic reduction of complex tropospheric chemical mechanisms, Part I: sensitivity and time-scale analyses
Explicit mechanisms describing the complex degradation pathways of atmospheric volatile organic compounds (VOCs) are important, since they allow the study of the contribution of individual VOCs to secondary pollutant formation. However, they are computationally expensive to solve, since they contain large numbers of species and a wide range of time-scales, causing stiffness in the resulting equation systems. This paper and the following companion paper describe the application of systematic and automated methods for reducing such complex mechanisms, whilst maintaining the accuracy of the model with respect to important species and features. The methods are demonstrated via application to version 2 of the Leeds Master Chemical Mechanism. The methods of Jacobian analysis and overall rate sensitivity analysis proved to be efficient and capable of removing the majority of redundant reactions and species in the scheme across a wide range of conditions relevant to the polluted troposphere. The application of principal component analysis of the rate sensitivity matrix was computationally expensive due to its use of the decomposition of very large matrices, and did not produce significant reduction over and above the other sensitivity methods. The use of the quasi-steady state approximation (QSSA) proved to be an extremely successful method of removing the fast time-scales within the system, as demonstrated by a local perturbation analysis at each stage of reduction. QSSA species were automatically selected via the calculation of instantaneous QSSA errors based on user-selected tolerances. The application of the QSSA led to the removal of a large number of alkoxy radicals and excited Criegee bi-radicals via reaction lumping.
The resulting reduced mechanism was shown to reproduce the concentration profiles of the important species selected from the full mechanism over a wide range of conditions, including those outside of which it was generated. As a result of a factor-of-2 reduction in the number of species in the scheme, and a reduction in stiffness, the computational time required for simulations was reduced by a factor of 4 compared with the full scheme.
Systematic lumping of complex tropospheric chemical mechanisms using a time-scale based approach
This paper presents a formal method of species lumping that can be applied automatically to intermediate compounds within detailed and complex tropospheric chemical reaction schemes. The method is based on grouping species with reference to their chemical lifetimes and reactivity structures. A method for determining the forward and reverse transformations between individual and lumped compounds is developed. Preliminary application to the Leeds Master Chemical Mechanism (MCMv2.0) has led to the removal of 734 species and 1777 reactions from the scheme, with minimal degradation of accuracy across a wide range of test trajectories relevant to polluted tropospheric conditions. The lumped groups are seen to relate to groups of peroxy acyl nitrates, nitrates, carbonates, oxepins, substituted phenols, oxeacids and peracids with similar lifetimes and reaction rates with OH. In combination with other reduction techniques, such as sensitivity analysis and the application of the quasi-steady state approximation (QSSA), a reduced mechanism has been developed that contains 35% of the number of species and 40% of the number of reactions compared to the full mechanism. This has led to a factor-of-8 speed-up in computation time within box model simulations.
Dynamic interpersonal therapy for moderate to severe depression: A pilot randomized controlled and feasibility trial
Background: Improving Access to Psychological Therapies (IAPT) services treat most patients in England who present to primary care with major depression. Psychodynamic psychotherapy is one of the psychotherapies offered. Dynamic Interpersonal Therapy (DIT) is a psychodynamic and mentalization-based treatment for depression; sixteen sessions are delivered over approximately 5 months. Neither DIT's effectiveness relative to low-intensity treatment (LIT) nor the feasibility of randomizing patients to psychodynamic or cognitive-behavioural treatments (CBT) in an IAPT setting has been demonstrated.
Methods: 147 patients were randomized in a 3:2:1 ratio to DIT (n = 73), LIT (control intervention; n = 54) or CBT (n = 20) in four IAPT treatment services in a combined superiority and feasibility design. Patients meeting criteria for major depressive disorder were assessed at baseline, mid-treatment (3 months) and post-treatment (6 months) using the Hamilton Rating Scale for Depression (HRSD-17), Beck Depression Inventory-II (BDI-II) and other self-rated questionnaire measures. Patients receiving DIT were also followed up 6 months post-completion.
Results: The DIT arm showed significantly lower HRSD-17 scores at the 6-month primary end-point compared with LIT (d = 0.70). Significantly more DIT patients (51%) showed clinically significant change on the HRSD-17 compared with LIT (9%). The DIT and CBT arms showed equivalence on most outcomes. Results were similar with the BDI-II. DIT showed benefit across a range of secondary outcomes. Conclusions: DIT delivered in a primary care setting is superior to LIT and can be appropriately compared with CBT in future RCTs.
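The d = 0.70 reported above is a Cohen's d standardized mean difference: the between-group difference in HRSD-17 scores divided by the pooled standard deviation. A minimal sketch of the computation, using hypothetical group means and SDs chosen only to illustrate the formula (not the trial's actual data):

```python
import math

# Cohen's d: standardized mean difference between two groups,
# d = (m1 - m2) / s_pooled, with the SD pooled by degrees of freedom.
def cohens_d(m1, s1, n1, m2, s2, n2):
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2)
                         / (n1 + n2 - 2))
    return (m1 - m2) / s_pooled

# Hypothetical example with the trial's arm sizes (LIT n=54, DIT n=73):
# a 4.2-point HRSD-17 difference with a common SD of 6 yields d = 0.70.
print(f"d = {cohens_d(16.0, 6.0, 54, 11.8, 6.0, 73):.2f}")
```

This shows the scale of difference a d of 0.70 implies on the HRSD-17; the actual group means and variances are in the paper, not reproduced here.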
Impact on sales of adding a smaller serving size of beer and cider in licensed premises: an A-B-A reversal design
Background: Smaller serving sizes of alcoholic drinks could reduce alcohol consumption across populations, thereby lowering the risk of many diseases. The effect of modifying the available range of serving sizes of beer and cider in a real-world setting has yet to be studied. The current study assessed the impact on beer and cider sales of adding a serving size of draught beer and cider (2/3 pint) that was between the current smallest (1/2 pint) and largest (1 pint) standard serving sizes. Methods: Twenty-two licensed premises in England consented to take part in the study. The study used an A-B-A reversal design set over three 4-weekly periods, with A representing the non-intervention periods, during which standard serving sizes were served, and B the intervention period, when a 2/3 pint serving size of draught beer and cider was added to the existing range alongside the smaller 1/2 pint and larger 1 pint serving sizes. The primary outcome was the daily volume of beer and cider sold, extracted from sales data. Results: Fourteen premises started the study, of which thirteen completed it. Twelve of those did so per protocol and were included in the primary analysis. After adjusting for pre-specified covariates, the intervention did not have a significant effect on the volume of beer and cider sold per day (3.14 ml; 95% CI -2.29 to 8.58; p = 0.257). Conclusions: In licensed premises, there was no evidence that adding a smaller serving size for draught beer and cider (2/3 pint), when the smallest (1/2 pint) and largest (1 pint) sizes were still available, affected the volume of beer and cider sold. Studies are warranted to assess the impact of removing the largest serving size. Trial registration: ISRCTN: https://doi.org/10.1186/ISRCTN33169631 (08/09/2021), OSF: https://osf.io/xkgdb/ (08/09/2021).