345 research outputs found

    Linear Scaling Relationships to Predict pKa's and Reduction Potentials for Bioinspired Hydrogenase Catalysis

    Biomimetic catalysts inspired by the active site of the [FeFe] hydrogenase enzyme can convert protons into molecular hydrogen. Minimizing the overpotential of the electrocatalytic process remains a major challenge for practical application of these catalysts. The catalytic cycle of hydrogen production follows an ECEC mechanism (E denotes an electron transfer step and C a chemical step), in which the electron and proton transfer steps can be either sequential or coupled (proton-coupled electron transfer, PCET). In this study, we have calculated the pKa's and reduction potentials for a series of commonly used ligands (80 different complexes) using density functional theory. We establish that the acid strength required for protonation at the Fe–Fe site correlates with the standard reduction potential of the di-iron complexes through a linear energy relationship. These linear relationships allow for fast screening of ligands and tuning of the properties of the catalyst. Our study also suggests that bridgehead ligand properties, such as bulkiness and aromaticity, can be exploited to alter or even break the linear scaling relationships.
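
    As a rough illustration of how such a linear scaling relationship could be fitted and then used to screen candidate ligands, here is a minimal sketch; the numerical values and the `predicted_pka` helper are hypothetical placeholders, not data or code from the study.

```python
# Illustrative sketch only: fit a linear scaling relationship pKa = a*E0 + b
# from a small set of hypothetical DFT-computed values, then use the fit to
# screen further ligand candidates. All numbers are placeholders, not data
# from the paper.
import numpy as np

# Hypothetical DFT results: standard reduction potentials E0 (in volts) and
# pKa's of the protonated Fe-Fe site for a few model complexes.
e0 = np.array([-1.60, -1.45, -1.30, -1.15, -1.00])
pka = np.array([16.2, 13.8, 11.5, 9.1, 6.9])

# Least-squares fit of the linear relationship pKa = a*E0 + b.
a, b = np.polyfit(e0, pka, deg=1)
print(f"slope a = {a:.2f} pKa units/V, intercept b = {b:.2f}")

def predicted_pka(e0_volts: float) -> float:
    """Predict the pKa expected at a given reduction potential via the fitted line."""
    return a * e0_volts + b

# Fast screening of candidate complexes by their computed reduction potential.
for candidate_e0 in (-1.55, -1.25, -0.95):
    print(f"E0 = {candidate_e0:+.2f} V -> predicted pKa ~ {predicted_pka(candidate_e0):.1f}")
```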

    Multi-Objective Optimization via Equivariant Deep Hypervolume Approximation

    Optimizing multiple competing objectives is a common problem across science and industry. The inherent trade-off between those objectives leads to the task of exploring their Pareto front. A meaningful quantity for this purpose is the hypervolume indicator, which is used in Bayesian Optimization (BO) and Evolutionary Algorithms (EAs). However, the computational complexity of calculating the hypervolume scales unfavorably with the number of objectives and data points, which restricts its use in these common multi-objective optimization frameworks. To overcome these restrictions, we propose to approximate the hypervolume function with a deep neural network, which we call DeepHV. For better sample efficiency and generalization, we exploit the fact that the hypervolume is scale-equivariant in each of the objectives as well as permutation-invariant with respect to both the objectives and the samples, by using a deep neural network that is equivariant with respect to the combined group of scalings and permutations. We evaluate our method against exact and approximate hypervolume methods in terms of accuracy, computation time, and generalization. We also apply and compare our methods to state-of-the-art multi-objective BO methods and EAs on a range of synthetic benchmark test cases. The results show that our methods are promising for such multi-objective optimization tasks.
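
    To make the quantity being approximated concrete, the sketch below computes the exact hypervolume indicator for a two-objective minimization problem; the points, reference point, and function name are illustrative assumptions, not the paper's neural approximation, which targets the many-objective case where this exact computation becomes expensive.

```python
# Illustrative sketch only: exact hypervolume indicator for a 2-objective
# minimization problem. DeepHV replaces this kind of exact computation with
# a learned, equivariant approximation that scales to more objectives.
import numpy as np

def hypervolume_2d(points: np.ndarray, reference: np.ndarray) -> float:
    """Area dominated by `points` and bounded by `reference` (2 objectives, minimization)."""
    # Keep only points that strictly dominate the reference point.
    pts = points[np.all(points < reference, axis=1)]
    # Sweep in increasing order of the first objective, accumulating rectangular slabs.
    pts = pts[np.argsort(pts[:, 0])]
    hv, best_f2 = 0.0, reference[1]
    for f1, f2 in pts:
        if f2 < best_f2:  # point is non-dominated in the sweep so far
            hv += (reference[0] - f1) * (best_f2 - f2)
            best_f2 = f2
    return hv

# Illustrative Pareto front and reference point.
front = np.array([[0.1, 0.9], [0.3, 0.6], [0.5, 0.4], [0.8, 0.2]])
ref = np.array([1.0, 1.0])
print(f"hypervolume = {hypervolume_2d(front, ref):.3f}")  # 0.440 for these points
```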

    Chemistry in Water: First Principles Computer Simulations

    Baerends, E.J. [Promotor]
    • …