
    Effect of sample selection on the susceptibility assessment of geological hazards: A case study in Liulin County, Shanxi Province

    The rational selection of non-geological hazard samples is of great significance for improving the accuracy of geological hazard susceptibility prediction. This study uses Liulin County as a case study, where appropriate impact factors were selected and the random forest (RF) model was employed for susceptibility assessment based on GIS technology. A total of twenty model configurations were created by varying the ratio of geological hazard to non-geological hazard points (1∶1, 1∶1.5, 1∶3, 1∶5 and 1∶10) and the distance from non-geological hazard points to known hazard points (100, 500, 800 and 1000 m). The results demonstrate that: (1) According to the error indices, confusion matrix, and ROC curve tests, the sample proportion and the distance from known hazard points significantly influenced the geological hazard susceptibility evaluation. As the sample proportion decreased and the distance from known hazard points increased, the overall MAE and RMSE of the models decreased, while the overall ACC increased. All models achieved AUC values greater than 0.8, indicating excellent predictive performance. When the sample proportion was less than 1∶3, the effect of increasing distance from known hazard points on model error and accuracy became less pronounced, and the results stabilized. The most suitable model for the study area used a sample ratio of 1∶10 and a distance of 1000 m from known hazard points. (2) High and very high susceptibility areas were primarily located in the central and northern regions, adjacent to roads and rivers, making them key areas for hazard prevention and reduction in Liulin County. (3) Differences in sample selection led to varying susceptibility results mainly because of changes in the features the RF model collects and evaluates during modeling, as well as differences in the representativeness of the samples. These findings hold significant implications for the implementation of hazard prevention and reduction measures.
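    As a rough illustration of this experimental design (not the study's actual data or GIS workflow: the synthetic points, the invented impact factors, and the scikit-learn random forest settings below are all assumptions), one can sweep the non-hazard sampling ratio and buffer distance and report the same error and accuracy metrics:

    ```python
    # Hypothetical sketch of the sample-ratio / buffer-distance experiment.
    # All data here are synthetic stand-ins; the study's real impact factors
    # and GIS preprocessing are not reproduced.
    import numpy as np
    from scipy.spatial import cKDTree
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import (roc_auc_score, accuracy_score,
                                 mean_absolute_error, mean_squared_error)

    rng = np.random.default_rng(0)

    # Synthetic stand-ins: 300 known hazard points and a pool of candidate
    # non-hazard points, each later described by a few invented impact factors.
    hazard_xy = rng.normal(loc=[5000, 5000], scale=1500, size=(300, 2))
    candidate_xy = rng.uniform(0, 10000, size=(20000, 2))

    def factors(xy):
        # Invented impact factors: coordinates, distance to basin centre,
        # and a synthetic terrain-like feature.
        return np.column_stack([xy,
                                np.hypot(xy[:, 0] - 5000, xy[:, 1] - 5000),
                                np.sin(xy[:, 0] / 700) + np.cos(xy[:, 1] / 900)])

    # Distance from every candidate non-hazard point to its nearest hazard point.
    dist_to_hazard = cKDTree(hazard_xy).query(candidate_xy)[0]

    for d in (100, 500, 800, 1000):                  # buffer distances (m)
        eligible = candidate_xy[dist_to_hazard >= d]
        for r in (1.0, 1.5, 3.0, 5.0, 10.0):         # non-hazard : hazard ratio
            idx = rng.choice(len(eligible), int(r * len(hazard_xy)), replace=False)
            neg = eligible[idx]
            X = np.vstack([factors(hazard_xy), factors(neg)])
            y = np.r_[np.ones(len(hazard_xy), dtype=int), np.zeros(len(neg), dtype=int)]
            X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                      stratify=y, random_state=0)
            rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
            p = rf.predict_proba(X_te)[:, 1]
            print(f"d={d:>4} m  ratio=1:{r:<4g} "
                  f"AUC={roc_auc_score(y_te, p):.3f} "
                  f"ACC={accuracy_score(y_te, (p > 0.5).astype(int)):.3f} "
                  f"MAE={mean_absolute_error(y_te, p):.3f} "
                  f"RMSE={np.sqrt(mean_squared_error(y_te, p)):.3f}")
    ```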

    The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe

    The preponderance of matter over antimatter in the early Universe, the dynamics of the supernova bursts that produced the heavy elements necessary for life, and whether protons eventually decay --- these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our Universe, its current state and its eventual fate. The Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed plan for a world-class experiment dedicated to addressing these questions. LBNE is conceived around three central components: (1) a new, high-intensity neutrino source generated from a megawatt-class proton accelerator at Fermi National Accelerator Laboratory, (2) a near neutrino detector just downstream of the source, and (3) a massive liquid argon time-projection chamber deployed as a far detector deep underground at the Sanford Underground Research Facility. This facility, located at the site of the former Homestake Mine in Lead, South Dakota, is approximately 1,300 km from the neutrino source at Fermilab -- a distance (baseline) that delivers optimal sensitivity to neutrino charge-parity symmetry violation and mass ordering effects. This ambitious yet cost-effective design incorporates scalability and flexibility and can accommodate a variety of upgrades and contributions. With its exceptional combination of experimental configuration, technical capabilities, and potential for transformative discoveries, LBNE promises to be a vital facility for the field of particle physics worldwide, providing physicists from around the globe with opportunities to collaborate in a twenty- to thirty-year program of exciting science. In this document we provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess.
    Comment: Major update of previous version. This is the reference document for the LBNE science program and current status. Chapters 1, 3, and 9 provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess. 288 pages, 116 figures.
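    For context (textbook material, not drawn from the document itself), the leading-order vacuum appearance probability shows how the baseline L enters the physics LBNE targets:

    ```latex
    % Leading-order nu_mu -> nu_e appearance probability in vacuum (illustrative;
    % subleading solar terms and the matter effects LBNE exploits are omitted).
    \[
      P(\nu_\mu \to \nu_e) \;\simeq\;
      \sin^2\theta_{23}\,\sin^2 2\theta_{13}\,\sin^2\!\Delta_{31}
      \;+\; \bigl(\text{subleading terms depending on } \delta_{CP}
      \text{ and } \operatorname{sgn}\Delta m^2_{31}\bigr),
      \qquad
      \Delta_{31} \equiv \frac{\Delta m^2_{31}\,L}{4E}.
    \]
    ```

    For |Δm²₃₁| ≈ 2.5 × 10⁻³ eV², a 1,300 km baseline places the first oscillation maximum (Δ₃₁ = π/2) near E ≈ 2.5 GeV, within reach of a conventional accelerator-based neutrino beam, while matter effects over such a long baseline additionally distinguish the two mass orderings.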

    Robust estimation of bacterial cell count from optical density

    Optical density (OD) is widely used to estimate the density of cells in liquid culture, but it cannot be compared between instruments without a standardized calibration protocol and is challenging to relate to actual cell count. We address this with an interlaboratory study comparing three simple, low-cost, and highly accessible OD calibration protocols across 244 laboratories, applied to eight strains of constitutive GFP-expressing E. coli. Based on our results, we recommend calibrating OD to estimated cell count using serial dilution of silica microspheres, which produces highly precise calibration (95.5% of residuals < 1.2-fold), is easily assessed for quality control, also assesses the instrument's effective linear range, and can be combined with fluorescence calibration to obtain units of Molecules of Equivalent Fluorescein (MEFL) per cell, allowing direct comparison and data fusion with flow cytometry measurements: in our study, fluorescence per cell measurements showed only a 1.07-fold mean difference between plate reader and flow cytometry data.
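    A minimal sketch of the recommended calibration idea, with invented dilution-series numbers and a deliberately simple origin-constrained linear fit (the study's actual protocol and statistics are more involved):

    ```python
    # Sketch of microsphere-based OD calibration: fit particles-per-OD in the
    # linear range, then convert sample OD readings to estimated cell counts.
    # All numbers below are hypothetical.
    import numpy as np

    # Known particle counts per well from a serial dilution of silica microspheres
    # and the corresponding blank-corrected OD600 readings (invented values).
    particles = np.array([3.0e8, 1.5e8, 7.5e7, 3.75e7, 1.875e7, 9.375e6])
    od600     = np.array([0.90, 0.46, 0.23, 0.115, 0.058, 0.029])

    # Fit particles-per-OD within an assumed effective linear range (OD <= 0.5),
    # forcing the fit through the origin (least squares: sum(x*y) / sum(x^2)).
    mask = od600 <= 0.5
    particles_per_od = np.sum(particles[mask] * od600[mask]) / np.sum(od600[mask] ** 2)

    def od_to_cell_count(od_reading: float) -> float:
        """Convert a blank-corrected OD600 reading to an estimated cell count,
        treating cells and calibrant microspheres as equivalent scatterers."""
        return particles_per_od * od_reading

    print(f"Calibration factor: {particles_per_od:.3e} particles per OD unit")
    print(f"OD 0.25 -> ~{od_to_cell_count(0.25):.2e} cells")
    ```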

    Combined Gaussian Mixture Model and Pathfinder Algorithm for Data Clustering

    Data clustering is one of the most influential branches of machine learning and data analysis, and Gaussian Mixture Models (GMMs) are frequently adopted in data clustering due to their ease of implementation. However, this approach has notable limitations: the number of clusters must be specified manually, and a poor initialization may fail to capture the information within the dataset. To address these issues, a new clustering algorithm called PFA-GMM has been proposed. PFA-GMM is based on GMMs and the Pathfinder algorithm (PFA), and it aims to overcome the shortcomings of GMMs. The algorithm automatically determines the optimal number of clusters based on the dataset. PFA-GMM then treats clustering as a global optimization problem in order to avoid becoming trapped in local optima during initialization. Finally, we conducted a comparative study of our proposed clustering algorithm against other well-known clustering algorithms using both synthetic and real-world datasets. The results of our experiments indicate that PFA-GMM outperformed the competing approaches.
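    As a simplified stand-in for the automatic cluster-count selection PFA-GMM performs (the Pathfinder algorithm itself is not reproduced here), a plain BIC sweep over a scikit-learn Gaussian mixture illustrates the idea:

    ```python
    # Automatic cluster-count selection for a GMM via a BIC sweep. This is a
    # simple substitute for illustration only; PFA-GMM instead drives the search
    # with the Pathfinder algorithm as a global optimizer.
    import numpy as np
    from sklearn.datasets import make_blobs
    from sklearn.mixture import GaussianMixture

    X, _ = make_blobs(n_samples=600, centers=4, cluster_std=1.0, random_state=0)

    best_k, best_bic, best_model = None, np.inf, None
    for k in range(1, 11):                       # candidate cluster counts
        gmm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(X)
        bic = gmm.bic(X)                         # lower BIC = better fit/complexity trade-off
        if bic < best_bic:
            best_k, best_bic, best_model = k, bic, gmm

    labels = best_model.predict(X)
    print(f"Selected {best_k} clusters (BIC = {best_bic:.1f})")
    ```

    The greedy per-k fitting above is where PFA-GMM differs: the paper frames the whole problem as a global optimization so that the result does not hinge on any single initialization.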

    Observer-based stabilization of nonhomogeneous semi-Markov jump linear systems with mode-switching delays

    This technical note addresses the problem of simultaneously designing an observer and a controller for discrete-time nonhomogeneous semi-Markov jump linear systems (S-MJLSs) subject to mode mismatches between the system and the observer-based controller. The considered systems are more general than homogeneous S-MJLSs, and the mode mismatches are caused by delays in the controller's mode switching. Based on the semi-Markov kernel approach, together with a Lyapunov function that depends not only on the system and controller modes but also on the time elapsed in the current mode, sufficient conditions are presented for the existence of the desired mode-dependent observer-based controller such that the augmented nonhomogeneous S-MJLS, composed of the closed-loop control system and the estimation error system, is sigma-error mean square stable. Finally, a practical example of a quarter-car active suspension is provided to demonstrate the advantages of the proposed control strategy.
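    For orientation, the equations below sketch the generic discrete-time observer-based feedback structure in the matched-mode special case (controller mode equal to the system mode r_k); the notation is assumed for illustration, and the paper's actual formulation, which lets the controller mode lag the system mode and works with the semi-Markov kernel, is not reproduced:

    ```latex
    % Standard observer-based feedback in the matched-mode case (illustrative).
    \begin{align}
      x_{k+1} &= A_{r_k} x_k + B_{r_k} u_k, & y_k &= C_{r_k} x_k, \\
      \hat{x}_{k+1} &= A_{r_k}\hat{x}_k + B_{r_k} u_k
                      + L_{r_k}\bigl(y_k - C_{r_k}\hat{x}_k\bigr),
                      & u_k &= K_{r_k}\hat{x}_k, \\
      \begin{bmatrix} x_{k+1} \\ e_{k+1} \end{bmatrix}
        &= \begin{bmatrix}
             A_{r_k} + B_{r_k}K_{r_k} & -B_{r_k}K_{r_k} \\
             0 & A_{r_k} - L_{r_k}C_{r_k}
           \end{bmatrix}
           \begin{bmatrix} x_k \\ e_k \end{bmatrix},
        & e_k &\equiv x_k - \hat{x}_k.
    \end{align}
    ```

    When the controller/observer mode lags the system mode, the lower-left block of this augmented matrix is no longer zero and the separation between controller and observer design breaks down, which is why the note designs the two sets of gains jointly.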