6,081 research outputs found

    High-throughput Quantum Chemistry: Empowering the Search for Molecular Candidates behind Unknown Spectral Signatures in Exoplanetary Atmospheres

    The identification of molecules in exoplanetary atmospheres is only possible thanks to the availability of high-resolution molecular spectroscopic data. However, due to its intensive and time-consuming generation process, at present only on the order of 100 molecules have high-resolution spectroscopic data available, limiting new molecular detections. Using routine quantum chemistry calculations (i.e., scaled harmonic frequency calculations using the B97-1/def2-TZVPD model chemistry with median errors of 10 cm-1), here we present a complementary high-throughput approach to rapidly generate approximate vibrational spectral data for 2743 molecules made from the biologically most important elements C, H, N, O, P and S. Though these data are not accurate enough to enable definitive molecular detections and do not seek to replace the need for high-resolution data, they have powerful applications in identifying potential molecular candidates responsible for unknown spectral features. We explore this application for the 4.1 micron (2439 cm-1) feature in the atmospheric spectrum of WASP-39b, listing potential alternative molecular species, alongside SO2, that could be responsible for this spectral feature. Further applications of this big-data compilation include identifying molecules with strong absorption features that are likely detectable at quite low abundances, and providing a training set for machine-learning predictions of vibrational frequencies. Characterising exoplanetary atmospheres through molecular spectroscopy is essential to understanding a planet's physico-chemical processes and its likelihood of hosting life. Our rapidly generated quantum chemistry big data set will play a crucial role in supporting this understanding by pointing towards possible initial identifications of the more unusual molecules to emerge.
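
    As a rough illustration of the scaling step this abstract describes, the sketch below applies a single empirical scaling factor to harmonic vibrational frequencies; the factor 0.96 and the example frequencies are illustrative assumptions, not the paper's actual values.

        # Minimal sketch of a scaled-harmonic-frequency estimate (Python).
        # The scale factor and the frequencies below are illustrative assumptions.
        harmonic_cm1 = [1650.0, 2980.0, 3750.0]   # computed harmonic frequencies (cm-1)
        scale_factor = 0.96                        # assumed empirical scaling factor

        scaled_cm1 = [f * scale_factor for f in harmonic_cm1]
        print(scaled_cm1)  # approximate fundamental band positions (cm-1)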

    Automated Exploration of Reaction Network and Mechanism via Meta-dynamics Nanoreactor

    We developed an automated approach to construct complex reaction networks and explore reaction mechanisms for several reactant molecules. Nanoreactor-type molecular dynamics was employed to generate possible chemical reactions, in which meta-dynamics was used to overcome reaction barriers and the semi-empirical GFN2-xTB method was used to reduce computational cost. The identification of reaction events from trajectories was conducted using a hidden Markov model based on the evolution of the molecular connectivity. This provided the starting points for further transition-state searches at more accurate electronic-structure levels to obtain the reaction mechanism. The whole reaction network, with multiple pathways, was then obtained. The feasibility and efficiency of this automated construction of the reaction network were examined with two examples. The first reaction under study was the HCHO + NH3 bimolecular reaction. The second example focused on the reaction network of a multi-species system composed of dozens of HCN and H2O molecules. The results indicated that the proposed approach is a valuable and effective tool for the automated exploration of reaction networks.
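
    To make the trajectory-analysis step more concrete, the sketch below flags candidate reaction events from a bond-connectivity time series by requiring a connectivity change to persist for several frames. This is a simplified stand-in for the hidden Markov model used in the paper; the array shapes and persistence window are assumptions.

        # Simplified stand-in for HMM-based reaction-event detection:
        # flag frames where the bond-connectivity matrix changes and the
        # new connectivity persists (filters out transient bond vibrations).
        import numpy as np

        def reaction_events(connectivity, persistence=5):
            """connectivity: array of shape (n_frames, n_atoms, n_atoms) with 0/1 bonds.
            Returns frame indices where a connectivity change persists for
            `persistence` consecutive frames."""
            events = []
            n_frames = connectivity.shape[0]
            for t in range(1, n_frames - persistence):
                changed = not np.array_equal(connectivity[t], connectivity[t - 1])
                stable = all(np.array_equal(connectivity[t], connectivity[t + k])
                             for k in range(1, persistence))
                if changed and stable:
                    events.append(t)
            return events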

    THz Instruments for Space

    Terahertz technology has been driven largely by applications in astronomy and space science. For more than three decades cosmochemists, molecular spectroscopists, astrophysicists, and Earth and planetary scientists have used submillimeter-wave or terahertz sensors to identify, catalog and map lightweight gases, atoms and molecules in Earth and planetary atmospheres, in regions of interstellar dust and star formation, and in new and old galaxies, back to the earliest days of the universe, from both ground-based and, more recently, orbital platforms. The past ten years have witnessed the launch and successful deployment of three satellite instruments with spectral-line heterodyne receivers above 300 GHz (SWAS, Odin, and MIRO) and a fourth platform, Aura MLS, that reaches to 2520 GHz, crossing the terahertz threshold from the microwave side for the first time. The former Soviet Union launched the first bolometric detectors for the submillimeter as far back as 1974 and operated the first space-based submillimeter-wave telescope on the Salyut 6 station for four months in 1978. In addition, continuum, Fourier transform and spectrophotometer instruments on IRAS, ISO, COBE, the recent Spitzer Space Telescope and Japan's Akari satellite have all encroached into the submillimeter from the infrared using direct-detection bolometers or photoconductors. At least two more major satellites carrying submillimeter-wave instruments are nearing completion, Herschel and Planck, and many more are on the drawing boards of international and national space organizations such as NASA, ESA, DLR, CNES, and JAXA. This paper reviews some of the programs that have been proposed, completed, and are still envisioned for space applications in the submillimeter and terahertz spectral range.

    Measurement of stratospheric and mesospheric winds with a submillimeter wave limb sounder: results from JEM/SMILES and simulation study for SMILES-2

    Satellite missions for measuring winds in the troposphere and thermosphere will be launched in the near future. There is no plan to observe winds in the altitude range between 30 and 90 km, though middle-atmospheric winds are recognized as an essential parameter in various atmospheric research areas. Sub-millimetre limb sounders have the capability to fill this altitude gap. In this paper, we summarize the wind retrievals obtained from the Japanese Superconducting Submillimeter-Wave Limb-Emission Sounder (SMILES), which operated from the International Space Station between September 2009 and April 2010. The results illustrate the potential of such instruments to measure winds. They also show the need to improve the representation of winds in models in the tropics, and globally in the mesosphere. A wind measurement sensitivity study has been conducted for its successor, SMILES-2, which is being studied in Japan. If it is realized, sub-millimeter and terahertz molecular lines suitable for determining line-of-sight winds will be measured. It is shown that with the current instrument definition, line-of-sight winds can be observed from 20 km up to more than 160 km. Winds can be retrieved with a precision better than 5 m s^-1 and a vertical resolution of 2-3 km between 35 and 90 km. Above 90 km, the precision is better than 10 m s^-1 with a vertical resolution of 3-5 km. Measurements can be performed day and night with a similar sensitivity. Requirements on observation parameters such as the antenna size and the satellite altitude are discussed. An alternative setting for the spectral bands is examined. The new setting is compatible with the general scientific objectives of the mission and the instrument design. It improves the wind measurement sensitivity between 35 and 90 km by a factor of 2. It is also shown that retrievals can be performed with a vertical resolution of 1 km and a precision of 5-10 m s^-1 between 50 and 90 km.
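
    As a back-of-the-envelope illustration of how a line-of-sight wind follows from the Doppler shift of a molecular line, the sketch below evaluates v = c * delta_nu / nu0; the 625 GHz line frequency and 10 kHz shift are assumed numbers for illustration, not SMILES measurements.

        # Doppler relation underlying line-of-sight wind retrieval (Python sketch).
        # The line frequency and shift below are illustrative assumptions.
        c = 2.998e8                 # speed of light (m/s)
        nu0 = 625.0e9               # assumed rest frequency of the observed line (Hz)
        delta_nu = 10.0e3           # assumed measured Doppler shift (Hz)

        v_los = c * delta_nu / nu0  # line-of-sight wind speed (m/s)
        print(f"{v_los:.1f} m/s")   # ~4.8 m/s, comparable to the quoted precision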

    HCl and ClO Profiles Inside the Antarctic Vortex as Observed by SMILES in November 2009: Comparisons with MLS and ACE-FTS Instruments

    We present vertical profiles of hydrogen chloride (HCl) and chlorine monoxide (ClO) as observed by the Superconducting Submillimeter-Wave Limb-Emission Sounder (SMILES) on the International Space Station (ISS) inside the Antarctic vortex on 19-24 November 2009. The SMILES HCl values reveal 2.8-3.1 ppbv between the 450 K and 500 K levels in potential temperature (PT). This high HCl value is highlighted since it is suggested that HCl is a main component of the total inorganic chlorine Cly, defined as Cly ≃ HCl + ClO + chlorine nitrate (ClONO2), inside the Antarctic vortex in spring, owing to low ozone values. To confirm the quality of the two SMILES level 2 (L2) data products provided by the Japan Aerospace Exploration Agency (JAXA) and Japan's National Institute of Information and Communications Technology (NICT), vis-a-vis the partitioning of Cly, comparisons are made using other satellite data from the Aura Microwave Limb Sounder (MLS) and the Atmospheric Chemistry Experiment Fourier Transform Spectrometer (ACE-FTS). HCl values from the SMILES NICT L2 product agree to within 10% (0.3 ppbv) with the MLS HCl data between the 450 and 575 K levels in PT and with the ACE-FTS HCl data between 425 and 575 K. The SMILES JAXA L2 product is 10 to 20% (0.2-0.5 ppbv) lower than that from MLS between 400 and 700 K and from ACE-FTS between 500 and 700 K. For ClO in daytime, the difference between SMILES (JAXA and NICT) and MLS is less than ±0.05 ppbv (100%) between 500 K and 650 K, with ClO values less than 0.2 ppbv. ClONO2 values as measured by ACE-FTS also reveal 0.2 ppbv at the 475-500 K level, resulting in HCl/Cly ratios of 0.91-0.95. The HCl/Cly ratios derived from each retrieval agree to within -5 to 8% with regard to their averages. The high HCl values and HCl/Cly ratios observed by the three instruments in the lower stratospheric Antarctic vortex are consistent with previous observations in late austral spring.
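
    For readers following the partitioning argument, the short check below recomputes an HCl/Cly ratio from representative values quoted in the abstract; the specific numbers chosen (mid-range HCl, an assumed daytime ClO) are illustrative.

        # Arithmetic check of the Cly partitioning, Cly ~ HCl + ClO + ClONO2,
        # using representative values from the abstract (ppbv).
        hcl    = 2.9   # mid-range of the quoted 2.8-3.1 ppbv
        clo    = 0.1   # assumed daytime value (quoted as < 0.2 ppbv)
        clono2 = 0.2   # ACE-FTS value at the 475-500 K level

        cly = hcl + clo + clono2
        print(round(hcl / cly, 2))   # ~0.91, consistent with the quoted 0.91-0.95 ratios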

    Deep learning-based kcat prediction enables improved enzyme-constrained model reconstruction

    Enzyme turnover numbers (kcat) are key to understanding cellular metabolism, proteome allocation and physiological diversity, but experimentally measured kcat data are sparse and noisy. Here we provide a deep learning approach (DLKcat) for high-throughput kcat prediction for metabolic enzymes from any organism, merely from substrate structures and protein sequences. DLKcat can capture kcat changes for mutated enzymes and identify amino acid residues with a strong impact on kcat values. We applied this approach to predict genome-scale kcat values for more than 300 yeast species. Additionally, we designed a Bayesian pipeline to parameterize enzyme-constrained genome-scale metabolic models from predicted kcat values. The resulting models outperformed the corresponding original enzyme-constrained genome-scale metabolic models from previous pipelines in predicting phenotypes and proteomes, and enabled us to explain phenotypic differences. DLKcat and the enzyme-constrained genome-scale metabolic model construction pipeline are valuable tools to uncover global trends of enzyme kinetics and physiological diversity, and to further elucidate cellular metabolism on a large scale.
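
    The sketch below only illustrates the kind of interface the abstract implies, mapping a substrate structure and an enzyme sequence to a predicted kcat; predict_kcat and its inputs are hypothetical placeholders, not the actual DLKcat API.

        # Hypothetical interface sketch: substrate structure + enzyme sequence -> kcat.
        def predict_kcat(substrate_smiles: str, protein_sequence: str) -> float:
            """Return a predicted turnover number (s^-1).  A real model would embed
            the SMILES string and the sequence and pass both through a trained
            neural network; here a constant stands in for that prediction."""
            return 1.0  # placeholder value (s^-1)

        kcat = predict_kcat("CC(=O)O", "MKTAYIAKQR...")  # hypothetical example inputs

    Predictions of this form would then feed the Bayesian pipeline that parameterizes the enzyme-constrained genome-scale models described above.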

    Artificial Intelligence and Machine Learning in Computational Nanotoxicology: Unlocking and Empowering Nanomedicine.

    Advances in nanomedicine, coupled with novel methods of creating advanced materials at the nanoscale, have opened new perspectives for the development of healthcare and medical products. Special attention must be paid toward safe design approaches for nanomaterial-based products. Recently, artificial intelligence (AI) and machine learning (ML) have provided computational tools for enhancing and improving the simulation and modeling process for nanotoxicology and nanotherapeutics. In particular, the correlation of in vitro generated pharmacokinetics and pharmacodynamics to in vivo application scenarios is an important step toward the development of safe nanomedicinal products. This review portrays how in vitro and in vivo datasets are used in in silico models to unlock and empower nanomedicine. Physiologically based pharmacokinetic (PBPK) modeling and absorption, distribution, metabolism, and excretion (ADME)-based in silico methods, along with dosimetry models, are described as a focus area for nanomedicine. Computational omics, colloidal particle determination, algorithms to establish dosimetry for inhalation toxicology, and quantitative structure-activity relationships at the nanoscale (nano-QSAR) are revisited. The challenges and opportunities facing the blind spots in nanotoxicology in this computationally dominated era are highlighted as future directions to accelerate the clinical translation of nanomedicine.

    Artificial Intelligence based Approach for Rapid Material Discovery: From Chemical Synthesis to Quantum Materials

    With the advent of machine learning (ML) in the field of materials science, it has become obvious that trained models are limited by the amount and quality of the data used for training, and materials researchers do not have access to the breadth and depth of labeled data that fields like image processing and natural language processing enjoy. In the specific application of materials discovery, there is also the issue of continuity in atomistic datasets: if one relies on experimental data mined from literature and patents, data are often only available for the most favorable systems, which ultimately leads to bias in the training dataset. To provide a solution, this research focuses on investigating the deployment of ML models trained on synthetic data and the development of a language-based approach for synthetically generating training datasets. It has been applied to three materials-science-related problems to demonstrate that these approaches work. The first problem was the prediction of dielectric properties, the second was the synthetic generation of chemical reaction datasets, and the third was the synthetic generation of quantum material datasets. All three applications proved successful and demonstrated the ability to generate continuous datasets that resolve the issue of dataset bias. The first study investigated the synthetic generation of complex dielectric properties of granular powders and their ability to train an ML network. The neural network was trained using a supervised learning approach and standard backpropagation, and was validated against experimental data collected from a coaxial airline experiment. The second study demonstrated the synthetic generation of a chemical reaction database. An artificial intelligence model based on a variational autoencoder (VAE) was developed and investigated to synthetically generate continuous datasets. The approach involves sampling the latent space to generate new chemical reactions that are assembled into the synthetic dataset. The technique is demonstrated by generating over 7,000,000 new reactions from a training dataset containing only 7,000 reactions; the generated reactions include molecular species that are larger and more diverse than those in the training set. The third study investigated a similar variational autoencoder approach, applied to generating a synthetic dataset of quantum materials for quantum sensing applications. The specific quantum sensors of interest are two-level quantum molecules that exhibit dipole blockade. This study offers an improved sampling scheme that continuously feeds newly generated materials back into the sampler to produce a more normally distributed dataset; it generated over 1,000,000 new quantum materials from a small dataset of only 8,000 materials. From the generated dataset, several iodine-containing molecules were identified as candidate quantum sensor materials for future studies.
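
    As a generic illustration of the latent-space sampling loop described above, the sketch below draws random latent vectors and decodes them into synthetic records; decode is a hypothetical placeholder for a trained VAE decoder, and the latent dimension and sample count are assumed.

        # Generic VAE latent-space sampling sketch (Python, numpy only).
        import numpy as np

        def decode(z):
            """Placeholder for a trained VAE decoder mapping a latent vector to a
            generated record (e.g. a reaction or material descriptor)."""
            return {"latent": z.tolist()}

        rng = np.random.default_rng(0)
        latent_dim, n_samples = 32, 1000          # assumed sizes
        synthetic_dataset = [decode(rng.standard_normal(latent_dim))
                             for _ in range(n_samples)]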