
    MolGAN: An implicit generative model for small molecular graphs

    Deep generative models for graph-structured data offer a new angle on the problem of chemical synthesis: by optimizing differentiable models that directly generate molecular graphs, it is possible to side-step expensive search procedures in the discrete and vast space of chemical structures. We introduce MolGAN, an implicit, likelihood-free generative model for small molecular graphs that circumvents the need for the expensive graph matching procedures or node ordering heuristics of previous likelihood-based methods. Our method adapts generative adversarial networks (GANs) to operate directly on graph-structured data. We combine our approach with a reinforcement learning objective to encourage the generation of molecules with specific desired chemical properties. In experiments on the QM9 chemical database, we demonstrate that our model is capable of generating close to 100% valid compounds. MolGAN compares favorably both to recent proposals that use string-based (SMILES) representations of molecules and to a likelihood-based method that directly generates graphs, albeit being susceptible to mode collapse. (Comment: 11 pages, 3 figures, 3 tables)
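
    A minimal sketch of the kind of architecture the abstract describes, under stated assumptions: a generator that maps a latent vector to dense node-type and bond-type tensors, made differentiable with a Gumbel-softmax relaxation. This is not the authors' code; the sizes (9 heavy atoms, 5 atom types, 4 bond types, as in common QM9 setups) and layer widths are illustrative, and PyTorch is assumed.

        import torch
        import torch.nn as nn

        N_NODES, N_ATOM_TYPES, N_BOND_TYPES, Z_DIM = 9, 5, 4, 32

        class GraphGenerator(nn.Module):
            def __init__(self):
                super().__init__()
                self.mlp = nn.Sequential(nn.Linear(Z_DIM, 128), nn.Tanh(),
                                         nn.Linear(128, 256), nn.Tanh())
                self.node_head = nn.Linear(256, N_NODES * N_ATOM_TYPES)
                self.edge_head = nn.Linear(256, N_NODES * N_NODES * N_BOND_TYPES)

            def forward(self, z, tau=1.0):
                h = self.mlp(z)
                # Sample discrete node types with a Gumbel-softmax relaxation so
                # gradients can flow through the generated graph.
                nodes = nn.functional.gumbel_softmax(
                    self.node_head(h).view(-1, N_NODES, N_ATOM_TYPES), tau=tau, dim=-1)
                edges = self.edge_head(h).view(-1, N_NODES, N_NODES, N_BOND_TYPES)
                edges = (edges + edges.transpose(1, 2)) / 2  # symmetric bond tensor
                edges = nn.functional.gumbel_softmax(edges, tau=tau, dim=-1)
                return nodes, edges

        gen = GraphGenerator()
        nodes, edges = gen(torch.randn(16, Z_DIM))  # a batch of 16 candidate graphs

    In the full method, a discriminator and a reward network scoring desired chemical properties would both operate on these (nodes, edges) pairs, and the generator loss would mix the adversarial term with the reinforcement-learning reward.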

    Assessing Deep Generative Models in Chemical Composition Space

    The computational discovery of novel materials has been one of the main motivations behind research in theoretical chemistry for several decades. Despite much effort, this is far from a solved problem. Among other reasons, this is due to the enormous space of possible structures and compositions that could potentially be of interest. In the case of inorganic materials, this is exacerbated by the combinatorics of the periodic table, since even a single crystal structure can in principle display millions of compositions. Consequently, there is a need for tools that enable a more guided exploration of the materials design space. Here, generative machine learning models have recently emerged as a promising technology. In this work, we assess the performance of a range of deep generative models based on reinforcement learning, variational autoencoders, and generative adversarial networks for the prototypical case of designing Elpasolite compositions with low formation energies. By relying on the fully enumerated space of 2 million main-group Elpasolites, the precision, coverage, and diversity of the generated materials are rigorously assessed. Additionally, a hyperparameter selection scheme for generative models in chemical composition space is developed.
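
    The precision, coverage, and diversity assessment mentioned in the abstract can be illustrated with a small sketch. This is an assumption rather than the paper's evaluation code: the inputs generated and target_set are hypothetical collections of composition strings, and the exact metric definitions used in the paper may differ.

        def evaluate(generated, target_set):
            """Score generated compositions against an enumerated reference set."""
            target_set = set(target_set)
            unique = set(generated)
            precision = sum(g in target_set for g in generated) / len(generated)
            coverage = len(unique & target_set) / len(target_set)  # targets recovered
            diversity = len(unique) / len(generated)               # distinct samples
            return precision, coverage, diversity

        # Toy example with made-up elpasolite-like formulas:
        print(evaluate(
            generated=["Cs2KAlF6", "Cs2KAlF6", "Rb2NaGaF6"],
            target_set={"Cs2KAlF6", "Rb2NaGaF6", "K2NaInF6"}))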

    Inverse Design of Solid-State Materials via a Continuous Representation

    The non-serendipitous discovery of materials with targeted properties is the ultimate goal of materials research, but to date, materials design does not incorporate all available knowledge to plan the synthesis of the next material. This work presents a framework for learning a continuous representation of materials and building a model for new discovery in that latent space. The ability of autoencoders to generate experimental materials is demonstrated with vanadium oxides via the rediscovery of experimentally known structures that were withheld from training. Approximately 20,000 hypothetical materials are generated, leading to several completely new metastable V_xO_y materials that may be synthesizable. Comparison with genetic algorithms for crystal structure prediction suggests that generative models are computationally efficient because they learn the distributions of known materials and can thereby explore chemical composition space effectively. These results are an important step toward the machine-learned inverse design of inorganic functional materials using generative models.
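
    As a rough illustration of the autoencoder-based workflow described above (a sketch under assumptions, not the authors' model): an autoencoder is fit to fixed-length representations of known structures, and new candidates are proposed by perturbing latent codes and decoding. The input dimension, layer widths, and random stand-in data are placeholders; PyTorch is assumed.

        import torch
        import torch.nn as nn

        D_IN, D_LATENT = 512, 32  # hypothetical fingerprint and latent sizes

        class MaterialsAE(nn.Module):
            def __init__(self):
                super().__init__()
                self.enc = nn.Sequential(nn.Linear(D_IN, 128), nn.ReLU(),
                                         nn.Linear(128, D_LATENT))
                self.dec = nn.Sequential(nn.Linear(D_LATENT, 128), nn.ReLU(),
                                         nn.Linear(128, D_IN))

            def forward(self, x):
                z = self.enc(x)
                return self.dec(z), z

        model = MaterialsAE()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        x = torch.rand(64, D_IN)  # stand-in for encoded known structures
        recon, z = model(x)
        opt.zero_grad()
        nn.functional.mse_loss(recon, x).backward()  # reconstruction objective
        opt.step()

        # Generation step: jitter latent codes of known materials and decode,
        # yielding hypothetical structures to screen for stability.
        with torch.no_grad():
            candidates = model.dec(z + 0.1 * torch.randn_like(z))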