4 research outputs found

    Using deep neural network with small dataset to predict material defects

    Deep neural networks (DNNs) exhibit state-of-the-art performance in many fields, including microstructure recognition, where big datasets are available for training. However, a DNN trained by conventional methods on a small dataset commonly performs worse than traditional machine learning methods such as shallow neural networks and support vector machines. This inherent limitation has prevented the wide adoption of DNNs in materials research, because collecting and assembling big datasets in materials science is a challenge. In this study, we attempted to predict solidification defects by DNN regression with a small dataset of 487 data points. We found that a pre-trained and fine-tuned DNN generalizes better than a shallow neural network, a support vector machine, and a DNN trained by conventional methods. The trained DNN transforms scattered experimental data points into a high-accuracy map over the high-dimensional space of chemistry and processing parameters. Though a DNN with a big dataset is the optimal solution, a DNN with a small dataset and pre-training can be a reasonable choice when big datasets are unavailable in materials research.
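    The pre-train-then-fine-tune strategy the abstract describes can be sketched in a few lines. The sketch below uses PyTorch with synthetic stand-in data; the network size, the learning rates, and the assumption that a larger related dataset exists for pre-training are illustrative choices, not the authors' actual setup.

```python
import torch
import torch.nn as nn

class DNNRegressor(nn.Module):
    """Small fully connected regressor; layer sizes are illustrative."""
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.head = nn.Linear(hidden, 1)  # scalar defect prediction

    def forward(self, x):
        return self.head(self.body(x))

def train(model, x, y, epochs, lr):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

# Synthetic stand-ins for a larger related dataset and the small
# 487-point experimental dataset (chemistry + processing features).
x_pre, y_pre = torch.randn(5000, 10), torch.randn(5000, 1)
x_small, y_small = torch.randn(487, 10), torch.randn(487, 1)

model = DNNRegressor(n_features=10)
train(model, x_pre, y_pre, epochs=200, lr=1e-3)      # 1) pre-train
train(model, x_small, y_small, epochs=100, lr=1e-4)  # 2) fine-tune at a lower rate
```

    The key design choice is the second call: fine-tuning continues training the pre-trained weights on the small dataset with a reduced learning rate, so the features learned during pre-training are adjusted rather than overwritten.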

    Application of deep transfer learning to predicting crystal structures of inorganic substances

    A convolutional neural network (CNN) consists of an automatic feature extractor followed by a shallow learning machine. The feature extractor of a CNN well trained on a big dataset can be reused in related tasks with small datasets. This technique, called deep transfer learning, not only bypasses manual feature engineering but also improves the generalization of new models. In this study, we attempted to predict crystal structures of inorganic substances, a challenge for materials science, with CNNs and transfer learning. CNNs were trained on a big dataset of 228k compounds from the Open Quantum Materials Database (OQMD). The feature extractors of the well-trained CNNs were reused to extract features from a phase-prototypes dataset (containing 17k inorganic substances and involving 170 crystal structures) and two high-entropy alloy datasets. The extracted features were then fed into a random forest classifier as input. High classification accuracy (above 0.9) was achieved on all three datasets. Visualization of the extracted features confirmed the effectiveness of the transferable feature extractors. This method can be readily adopted to quickly build machine learning models of good performance without resorting to time-consuming manual feature engineering.
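    A minimal sketch of that pipeline, in PyTorch plus scikit-learn: the CNN's classification head is discarded, the remaining convolutional feature extractor is frozen and reused, and its outputs train a random forest. The toy network, input shape, and random data below are assumptions for illustration; in the study the CNN is first trained on the OQMD-scale dataset.

```python
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

# Stand-in for a CNN already trained on the big dataset.
cnn = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),        # feature extractor ends here
    nn.Linear(32, 170),  # original classification head, discarded below
)
feature_extractor = nn.Sequential(*list(cnn.children())[:-1])

# Toy small dataset with 5 classes (placeholder for the real
# phase-prototype or high-entropy-alloy data).
x_small = torch.randn(100, 1, 16, 16)
y_small = torch.randint(0, 5, (100,))

with torch.no_grad():                            # features stay frozen
    feats = feature_extractor(x_small).numpy()

clf = RandomForestClassifier(n_estimators=200).fit(feats, y_small.numpy())
print(clf.score(feats, y_small.numpy()))
```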

    A general and transferable deep learning framework for predicting phase formation in materials

    Machine learning has been widely exploited in developing new materials. However, challenges remain: small datasets are common for most tasks; new datasets, special descriptors, and specific models must be built from scratch for each new task; and knowledge cannot be readily transferred between independent models. In this paper we propose a general and transferable deep learning (GTDL) framework for predicting phase formation in materials. The proposed GTDL framework maps raw data to pseudo-images with a special 2-D structure, e.g., the periodic table, automatically extracts features and gains knowledge through a convolutional neural network, and then transfers knowledge by sharing feature extractors between models. Case studies on glass-forming ability and high-entropy alloys show that the GTDL framework outperformed previous models for glass-forming ability and correctly predicted newly reported amorphous alloy systems; for high-entropy alloys it discriminates five types of phases (BCC, FCC, HCP, amorphous, mixture) with accuracy and recall above 94% in fivefold cross-validation. In addition, the periodic-table knowledge embedded in the data representation and the knowledge shared between models are beneficial for tasks with small datasets. The method can easily be applied to new materials development with small datasets by reusing well-trained models for related materials.
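    The core representational idea, writing a composition into a periodic-table-shaped grid so a CNN can exploit element-neighbourhood information, is easy to illustrate. The NumPy sketch below is a minimal version: the five-element coordinate table and the grid shape are assumptions for the example, whereas the actual framework encodes the full periodic table.

```python
import numpy as np

# (row, column) positions of a few elements on a periodic-table grid;
# the real framework maps every element.
PT_POS = {"Fe": (3, 7), "Ni": (3, 9), "Cr": (3, 5),
          "Al": (2, 12), "Co": (3, 8)}

def composition_to_image(composition, shape=(9, 18)):
    """Place each element's atomic fraction at its periodic-table cell."""
    img = np.zeros(shape, dtype=np.float32)
    for element, fraction in composition.items():
        img[PT_POS[element]] = fraction
    return img

# Equiatomic CoCrFeNiAl high-entropy alloy as an example input.
image = composition_to_image(
    {"Co": 0.2, "Cr": 0.2, "Fe": 0.2, "Ni": 0.2, "Al": 0.2})
print(image.shape, image.sum())  # (9, 18) 1.0
```

    Such a pseudo-image can then be fed to a CNN like the one in the previous sketch, which is what allows the feature extractor to be shared across tasks.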

    SCOPE: SCUBA-2 Continuum Observations of Pre-protostellar Evolution - survey description and compact source catalogue

    We present the first release of the data and compact-source catalogue for the JCMT Large Program SCUBA-2 Continuum Observations of Pre-protostellar Evolution (SCOPE). SCOPE consists of 850 μm continuum observations of 1235 Planck Galactic Cold Clumps (PGCCs) made with the Submillimetre Common-User Bolometer Array 2 on the James Clerk Maxwell Telescope. These data have an angular resolution of 14.4 arcsec, significantly improving upon the 5 arcmin resolution of Planck at 353 GHz, and yield a catalogue of 3528 compact sources in 558 PGCCs. We find that the detected PGCCs have significant sub-structure: 61 per cent of them contain three or more compact sources, and filamentary structure is also prevalent within the sample. A detection rate of 45 per cent is found across the survey, which is 95 per cent complete to Planck column densities of N(H2) > 5 × 10^21 cm^−2. By positionally associating the SCOPE compact sources with young stellar objects, we find that the star formation efficiency, as measured by the ratio of luminosity to mass, in nearby clouds is similar to that in the more distant Galactic Plane, with the column density distributions also indistinguishable from each other.