297 research outputs found

    Uncertainty Quantification for Molecular Property Predictions with Graph Neural Architecture Search

    Graph Neural Networks (GNNs) have emerged as a prominent class of data-driven methods for molecular property prediction. However, a key limitation of typical GNN models is their inability to quantify uncertainty in their predictions, a capability that is crucial for the trustworthy use and deployment of models in downstream tasks. To that end, we introduce AutoGNNUQ, an automated uncertainty quantification (UQ) approach for molecular property prediction. AutoGNNUQ leverages architecture search to generate an ensemble of high-performing GNNs, enabling the estimation of predictive uncertainties. Our approach employs variance decomposition to separate data (aleatoric) and model (epistemic) uncertainties, providing valuable insights for reducing them. In our computational experiments, we demonstrate that AutoGNNUQ outperforms existing UQ methods in terms of both prediction accuracy and UQ performance on multiple benchmark datasets. Additionally, we use t-SNE visualization to explore correlations between molecular features and uncertainty, offering insight for dataset improvement. AutoGNNUQ has broad applicability in domains such as drug discovery and materials science, where accurate uncertainty quantification is crucial for decision-making.
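
    As a rough illustration of the ensemble-based variance decomposition described in this abstract (not the AutoGNNUQ code itself), the minimal sketch below assumes each ensemble member predicts a mean and a variance per molecule; the aleatoric term is then the average predicted variance and the epistemic term is the spread of the predicted means. All names, shapes, and data are illustrative.

```python
import numpy as np

def decompose_uncertainty(means, variances):
    """Split ensemble predictions into aleatoric and epistemic components.

    means     : (n_models, n_samples) array of per-model predicted means
    variances : (n_models, n_samples) array of per-model predicted variances
    """
    # Aleatoric (data) uncertainty: average of the per-model predictive variances.
    aleatoric = variances.mean(axis=0)
    # Epistemic (model) uncertainty: spread of the per-model means across the ensemble.
    epistemic = means.var(axis=0)
    # Under this decomposition the total predictive variance is their sum.
    total = aleatoric + epistemic
    return aleatoric, epistemic, total

# Toy usage: 5 hypothetical ensemble members, 3 molecules.
rng = np.random.default_rng(0)
mu = rng.normal(size=(5, 3))
var = rng.uniform(0.1, 0.3, size=(5, 3))
print(decompose_uncertainty(mu, var))
```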

    Environmentally Friendly Method for the Separation of Cellulose from Steam-Exploded Rice Straw and Its High-Value Applications

    Separation of cellulose from agricultural straw is one of the key bottlenecks hindering the application of this kind of biomass resource. In this chapter, we present three environmentally friendly routes for separating cellulose from agricultural straw pretreated by steam explosion: delignification with a recyclable water/polar aprotic organic solvent, selective biodegradation of the lignin component, and extraction of cellulose from the steam-exploded rice straw with imidazolium-based ionic liquids. The isolated rice straw celluloses have been used as reinforcement in all-cellulose composites (ACCs) and cellulose/cement composites. An ultra-high tensile strength (650.2 MPa) can be achieved for ACCs containing the activated straw cellulose fiber (A-SCF), and the cellulose/cement composites show significant improvements in flexural strength and fracture toughness. A new non-derivatizing solvent for cellulose, an aqueous tetrabutylammonium hydroxide (TBAH) solution with urea as an additive, has also proved practical for dissolving cellulose.

    New Image Restoration Method Based on Multiple Aperture Defocus Images for Microscopic Images

    Abstract: Image deconvolution is an effective image restoration technique for improving the quality of digital microscopic images degraded by out-of-focus blur. To address the severely ill-posed nature of the traditional Richardson-Lucy method, and considering that the point spread function differs across directions, a new microscope image restoration method based on multiple defocused images captured with different apertures is proposed. Maximum-likelihood estimation is used to suppress ringing artifacts and the noise sensitivity of microscope images. Experimental results show that the proposed algorithm performs better than the Richardson-Lucy method and improves the peak signal-to-noise ratio by about 4 dB.
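
    For context on the baseline this abstract improves upon, here is a minimal sketch of the classical single-image Richardson-Lucy iteration (not the proposed multi-aperture, maximum-likelihood method), using NumPy and SciPy; the PSF and test image are synthetic and purely illustrative.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, n_iter=30, eps=1e-12):
    """Classical Richardson-Lucy deconvolution for a single image and PSF."""
    estimate = np.full_like(blurred, blurred.mean())
    psf_flipped = psf[::-1, ::-1]  # mirrored PSF for the adjoint step
    for _ in range(n_iter):
        reblurred = fftconvolve(estimate, psf, mode="same")
        ratio = blurred / (reblurred + eps)  # multiplicative data-fit correction
        estimate *= fftconvolve(ratio, psf_flipped, mode="same")
    return estimate

# Toy usage: blur a synthetic image with a small Gaussian PSF, then restore it.
x, y = np.mgrid[-3:4, -3:4]
psf = np.exp(-(x**2 + y**2) / 2.0)
psf /= psf.sum()
img = np.zeros((64, 64))
img[20:40, 20:40] = 1.0
blurred = fftconvolve(img, psf, mode="same")
restored = richardson_lucy(blurred, psf)
```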

    Bayesian Stochastic Neural Network Model for Turbomachinery Damage Prediction

    Turbomachinery often suffers from defects such as impeller cracking, resulting in forced outages, increased maintenance costs, and reduced productivity. Condition monitoring and damage prognostics have been widely used as increasingly important and powerful tools to improve system availability, reliability, performance, and maintainability, but they remain very challenging because of multiple sources of data uncertainty and the complexity of analytics modeling. This paper presents an intelligent probabilistic methodology for anomaly prediction in high-fidelity turbomachines that accounts for multiple data imperfections and multivariate correlation. The proposed method integrates several state-of-the-art signal processing and artificial intelligence techniques: wavelet multi-resolution decomposition, Bayesian hypothesis testing, probabilistic principal component analysis, and fuzzy stochastic neural network modeling. Advanced signal processing is employed to reduce dimensionality and to address multivariate correlation and data uncertainty for damage prediction. Instead of the conventional use of raw time series data, principal components are used to build the stochastic neural network model and predict anomalies. A Bayesian interval hypothesis testing metric is then presented to quantitatively compare predicted and measured data for model validation and anomaly evaluation, providing a confidence indicator for judging model quality and evaluating equipment status. The Bayesian method developed in this study denoises the raw data with multi-resolution wavelet decomposition, quantifies model accuracy, and assesses equipment status. The dynamic stochastic neural network model is established via a nonlinear autoregressive moving average with exogenous inputs (NARMAX) approach; it seamlessly integrates fuzzy clustering and an independent Bernoulli random function into a radial basis function neural network. A natural gradient method based on the Kullback-Leibler distance criterion is employed to maximize the log-likelihood objective. The effectiveness of the proposed methodology and procedure is demonstrated with 11-variable time series data and a forced outage event of a real-world centrifugal compressor.
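
    A minimal sketch of the preprocessing chain mentioned above, wavelet multi-resolution denoising followed by principal component extraction, is given below. It is not the paper's Bayesian denoising or fuzzy stochastic neural network model; the soft-threshold rule, channel count, and random data are illustrative assumptions only.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients of a multi-resolution decomposition."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Universal threshold estimated from the finest detail level (a common heuristic,
    # standing in for the paper's Bayesian denoising step).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Hypothetical 11-channel condition-monitoring record (2048 time steps).
rng = np.random.default_rng(1)
raw = rng.normal(size=(2048, 11))
denoised = np.column_stack([wavelet_denoise(raw[:, i]) for i in range(raw.shape[1])])

# Reduce the correlated channels to a few principal components that would
# feed the downstream stochastic neural network model.
pcs = PCA(n_components=3).fit_transform(denoised)
print(pcs.shape)  # (2048, 3)
```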

    CNV discovery for milk composition traits in dairy cattle using whole genome resequencing

    General statistics of 487 differential CNVRs between the high and low groups, based on UMD3.1 (XLSX 28 kb).

    A Multilayer Perceptron-based Fast Sunlight Assessment for the Conceptual Design of Residential Neighborhoods under Chinese Policy

    In Chinese building codes, residential buildings are required to receive a minimum number of hours of natural, direct sunlight on a specified winter day, which represents the worst sunlight condition of the year. This requirement is a prerequisite for obtaining a building permit during the conceptual design of a residential project, so officially sanctioned software is usually used to assess the sunlight performance of buildings. These programs predict sunlight hours through repeated shading calculations, which is time-consuming. This paper proposes a multilayer perceptron-based, one-stage prediction method that outputs the shading time interval caused by an input cuboid-form building. The sunlight hours at a site can then be obtained by taking the complement of the union of the shading time intervals of all buildings. Three numerical experiments, namely horizontal-plane and slope analyses and simulation-based optimization, show that the method reduces computation time to between 1/84 and 1/50 of the baseline while achieving 96.5% to 98% accuracy. A residential neighborhood layout planning plug-in for Rhino 7/Grasshopper is also developed based on the proposed model. This paper indicates that deep learning techniques can be adopted to accelerate sunlight hour simulations at the conceptual design phase.
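
    The interval bookkeeping implied by this abstract can be sketched as follows: assuming a trained MLP already supplies one shading interval per surrounding cuboid building (hypothetical inputs here), the sunlight hours at a site point are the analysis window minus the union of those intervals. The window bounds and intervals below are illustrative, not the paper's values.

```python
def sunlight_hours(shading_intervals, day_start=8.0, day_end=16.0):
    """Sunlight hours at a point given per-building shading intervals (in hours).

    shading_intervals : list of (start, end) tuples, one per surrounding building,
                        e.g. as predicted by the trained MLP for each cuboid.
    """
    # Clip each interval to the analysis window, then merge overlapping intervals.
    clipped = [(max(s, day_start), min(e, day_end))
               for s, e in shading_intervals if e > day_start and s < day_end]
    merged = []
    for s, e in sorted(clipped):
        if merged and s <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    shaded = sum(e - s for s, e in merged)
    # Sunlight time is the complement of the merged shading time within the window.
    return (day_end - day_start) - shaded

# Toy usage: three hypothetical buildings shading a site point on the design day.
print(sunlight_hours([(9.0, 10.5), (10.0, 11.0), (14.5, 15.25)]))  # 5.25
```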