Earthquake Hazard Safety Assessment of Existing Buildings Using Optimized Multi-Layer Perceptron Neural Network
Recent earthquakes have shown that many existing buildings, particularly in developing countries, are not adequately protected against earthquake damage. A variety of statistical and machine-learning approaches have been proposed to identify vulnerable buildings for the prioritization of retrofitting. The present work investigates earthquake susceptibility by combining six building performance variables to obtain an optimal prediction of the damage state of reinforced concrete buildings using an artificial neural network (ANN). To this end, a multi-layer perceptron network is trained and optimized using a database of 484 damaged buildings from the Düzce earthquake in Turkey. The results demonstrate the feasibility and effectiveness of the selected ANN approach for classifying concrete structural damage, which can be used as a preliminary assessment technique to identify vulnerable buildings in disaster risk-management programs.
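As a minimal sketch of the kind of ANN damage classifier described above, the snippet below trains and tunes a multi-layer perceptron on tabular building data with scikit-learn; the feature names, synthetic data, and hyper-parameter grid are placeholder assumptions, not the study's actual variables or optimization scheme.

```python
# Minimal sketch of an MLP damage-state classifier on tabular building data.
# Feature names, synthetic data, and the grid are placeholders, not the paper's.
import numpy as np
import pandas as pd
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
features = ["n_storeys", "building_age", "soft_storey", "overhang",
            "short_column", "plan_irregularity"]               # hypothetical names
df = pd.DataFrame(rng.normal(size=(484, 6)), columns=features)  # placeholder data
damage = (df["n_storeys"] + 0.5 * df["soft_storey"]
          + rng.normal(scale=0.5, size=484) > 0).astype(int)    # placeholder labels

X_train, X_test, y_train, y_test = train_test_split(
    df, damage, stratify=damage, random_state=0)

pipe = make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0))
# A simple grid search stands in for the paper's optimization procedure.
grid = GridSearchCV(pipe,
                    {"mlpclassifier__hidden_layer_sizes": [(8,), (16,), (16, 8)],
                     "mlpclassifier__alpha": [1e-4, 1e-3, 1e-2]},
                    cv=5)
grid.fit(X_train, y_train)
print("held-out accuracy:", grid.score(X_test, y_test))
```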
Universal spatial correlations in the anisotropic Kondo screening cloud: analytical insights and numerically exact results from a coherent state expansion
We analyze the spatial correlations in the spin density of an electron gas in
the vicinity of a Kondo impurity. Our analysis extends to the spin-anisotropic
regime, which was not investigated in the literature. We use an original and
numerically exact method, based on a systematic coherent-state expansion of the
ground state of the underlying spin-boson Hamiltonian, which we apply to the
computation of observables that are specific to the fermionic Kondo model. We
also present an important technical improvement to the method, which obviates
the need to discretize the modes of the Fermi sea and allows one to tackle the
problem in the thermodynamic limit. One can thus obtain excellent spatial
resolution over arbitrary length scales, for a relatively low computational
cost, a feature that gives the method an advantage over popular techniques such
as NRG and DMRG. We find that the anisotropic Kondo model shows rich universal
scaling behavior in the spatial structure of the entanglement cloud. First,
SU(2) spin symmetry is dynamically restored in a finite domain of parameter
space in the vicinity of the isotropic line, as expected from poor man's scaling.
We are also able to obtain in closed analytical form a set of different, yet
universal, scaling curves for strong exchange asymmetry, which are parametrized
by the longitudinal exchange coupling. Deep inside the cloud, i.e. for
distances smaller than the Kondo length, the correlation between the electron
spin density and the impurity spin oscillates between ferromagnetic and
antiferromagnetic values at the scale of the Fermi wavelength, an effect that
is drastically enhanced at strongly anisotropic couplings. Our results also
provide further numerical checks and alternative analytical approximations for
the recently computed Kondo overlaps [PRL 114, 080601 (2015)]. Comment: 27 pages + 2 pages of supplementary materials. The manuscript was largely extended in v2 and now contains a comparison to the Toulouse limit, as well as a detailed study of the restoration of SU(2) symmetry. The displayed HTML abstract has been shortened compared to the PDF version.
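For reference, a standard form of the spin-anisotropic Kondo Hamiltonian and of the Kondo length scale mentioned above is sketched below; normalization conventions (factors of 2, band cutoffs, the precise definition of T_K) may differ from those used in the paper.

```latex
\begin{align}
  H &= \sum_{k\sigma} \epsilon_k\, c^{\dagger}_{k\sigma} c_{k\sigma}
     + J_z\, S^z s^z(0)
     + \frac{J_\perp}{2}\left[ S^{+} s^{-}(0) + S^{-} s^{+}(0) \right], \\
  \xi_K &\sim \frac{\hbar v_F}{k_B T_K}.
\end{align}
% s^a(0): conduction-electron spin density at the impurity site;
% J_z, J_perp: longitudinal and transverse exchange couplings;
% xi_K: Kondo length; T_K: Kondo temperature; v_F: Fermi velocity.
```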
Determining Utility System Value of Demand Flexibility From Grid-interactive Efficient Buildings
This report focuses on how current methods and practices for establishing the value of distributed energy resource (DER) investments to electric utility systems can be enhanced to determine the value of demand flexibility in grid-interactive efficient buildings that can provide grid services. The report introduces key valuation concepts applicable to the demand flexibility these buildings can provide and links to other documents that describe these concepts and their implementation in more detail. The scope of this report is limited to the valuation of economic benefits to the utility system. These are the foundational values on which other benefits (and costs) can be built. Establishing the economic value to the grid of demand flexibility provides the information needed to design programs, market rules, and rates that align the economic interests of utility customers with those of building owners and occupants. By nature, DERs directly affect customers and provide societal benefits external to the utility system. Jurisdictions can use utility system benefits and costs as the foundation of their economic analysis but align their primary cost-effectiveness metric with all applicable policy objectives, which may include customer and societal (non-utility system) impacts. This report suggests enhancements to current methods and practices that state and local policymakers, public utility commissions, state energy offices, utilities, state utility consumer representatives, and other stakeholders might support. These enhancements can improve the consistency and robustness of economic valuation of demand flexibility for grid services. The report concludes with a discussion of considerations for prioritizing implementation of these improvements.
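Purely as an illustrative sketch (not taken from the report), one elementary building block of such utility-system valuation is to value hourly load reductions at assumed avoided energy costs plus an assumed avoided capacity cost; every figure below is a placeholder assumption.

```python
# Illustrative toy avoided-cost valuation of a demand-flexibility measure.
# All load shapes and cost values are placeholder assumptions, not the report's.
hourly_load_reduction_kw = [0.0] * 16 + [1.5, 2.0, 2.0, 1.0] + [0.0] * 4        # assumed kW by hour
avoided_energy_cost_per_kwh = [0.03] * 16 + [0.12, 0.15, 0.15, 0.10] + [0.03] * 4  # assumed $/kWh
avoided_capacity_cost_per_kw_yr = 80.0    # assumed $/kW-yr
coincident_peak_reduction_kw = 2.0        # assumed kW reduction at system peak

energy_value_per_day = sum(kw * cost for kw, cost in
                           zip(hourly_load_reduction_kw, avoided_energy_cost_per_kwh))
annual_value = (365 * energy_value_per_day
                + avoided_capacity_cost_per_kw_yr * coincident_peak_reduction_kw)
print(f"Illustrative annual utility-system value: ${annual_value:.2f}")
```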
Effect of a machine learning-based severe sepsis prediction algorithm on patient survival and hospital length of stay: a randomised clinical trial.
Introduction: Several methods have been developed to electronically monitor patients for severe sepsis, but few provide predictive capabilities to enable early intervention; furthermore, no severe sepsis prediction systems have been previously validated in a randomised study. We tested the use of a machine learning-based severe sepsis prediction system for reductions in average length of stay and in-hospital mortality rate. Methods: We conducted a randomised controlled clinical trial at two medical-surgical intensive care units at the University of California, San Francisco Medical Center, evaluating the primary outcome of average length of stay, and secondary outcome of in-hospital mortality rate from December 2016 to February 2017. Adult patients (18+) admitted to participating units were eligible for this factorial, open-label study. Enrolled patients were assigned to a trial arm by a random allocation sequence. In the control group, only the current severe sepsis detector was used; in the experimental group, the machine learning algorithm (MLA) was also used. On receiving an alert, the care team evaluated the patient and initiated the severe sepsis bundle, if appropriate. Although participants were randomly assigned to a trial arm, group assignments were automatically revealed for any patients who received MLA alerts. Results: Outcomes from 75 patients in the control and 67 patients in the experimental group were analysed. Average length of stay decreased from 13.0 days in the control to 10.3 days in the experimental group (p=0.042). In-hospital mortality decreased by 12.4 percentage points when using the MLA (p=0.018), a relative reduction of 58.0%. No adverse events were reported during this trial. Conclusion: The MLA was associated with improved patient outcomes. This is the first randomised controlled trial of a sepsis surveillance system to demonstrate statistically significant differences in length of stay and in-hospital mortality. Trial registration: NCT03015454.
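A back-of-envelope implication of the reported figures (subject to rounding): a 12.4-percentage-point absolute reduction that corresponds to a 58.0% relative reduction implies approximate mortality rates of

```latex
\[
  \text{control mortality} \approx \frac{12.4\ \text{pp}}{0.580} \approx 21.4\%,
  \qquad
  \text{experimental mortality} \approx 21.4\% - 12.4\ \text{pp} \approx 9.0\%.
\]
```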
Application of support vector machines on the basis of the first Hungarian bankruptcy model
In our study we apply a data mining procedure known as the support vector machine (SVM) to the database of the first Hungarian bankruptcy model. The models constructed are then contrasted with the results of earlier bankruptcy models in terms of classification accuracy and the area under the ROC curve. In using the SVM technique, in addition to conventional kernel functions, we also examine the possibility of applying the ANOVA kernel function and take a detailed look at the data preparation tasks recommended when using the SVM method (handling of outliers). The results of the models assembled suggest that a significant improvement in classification accuracy can be achieved on the database of the first Hungarian bankruptcy model when using the SVM method as opposed to neural networks.
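A minimal sketch of the kind of SVM comparison described above, contrasting a conventional RBF kernel with an ANOVA-type kernel and scoring by the area under the ROC curve; the synthetic data, kernel parameters, and thresholds are placeholders, not the paper's settings.

```python
# Sketch: compare an RBF kernel with an ANOVA-type kernel for binary
# bankruptcy classification, scored by ROC AUC. Data are synthetic placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def anova_kernel(X, Y, sigma=1.0, d=2):
    """ANOVA RBF kernel: (sum_i exp(-sigma * (x_i - y_i)^2))^d."""
    diffs = X[:, None, :] - Y[None, :, :]
    return np.exp(-sigma * diffs**2).sum(axis=2) ** d

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 6))                      # placeholder financial ratios
y = (X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=0.5, size=400) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

for name, clf in [("RBF", SVC(kernel="rbf", probability=True)),
                  ("ANOVA", SVC(kernel=anova_kernel, probability=True))]:
    clf.fit(X_train, y_train)
    auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
    print(f"{name} kernel AUC: {auc:.3f}")
```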
Lattice QCD at finite temperature and density
QCD at finite temperature and density is becoming increasingly important for
various experimental programmes, ranging from heavy ion physics to
astro-particle physics. The non-perturbative nature of non-abelian quantum
field theories at finite temperature leaves lattice QCD as the only tool by
which we may hope to come to reliable predictions from first principles. This
requires careful extrapolations to the thermodynamic, chiral and continuum
limits in order to eliminate systematic effects introduced by the
discretization procedure. After an introduction to lattice QCD at finite
temperature and density, its possibilities and current systematic limitations,
a review of present numerical results is given. In particular, plasma
properties such as the equation of state, screening masses, static quark free
energies and spectral functions are discussed, as well as the critical
temperature and the QCD phase structure at zero and finite density. Comment: 32 pages, typos corrected, reference added. Lectures given at the 45. Internationale Universitätswochen für Theoretische Physik (Schladming Winter School on Theoretical Physics): Conceptual and Numerical Challenges in Femto-Scale and Peta-Scale Physics, Schladming, Styria, Austria, 24 Feb - 3 Mar 200
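Two standard lattice-QCD relations lie behind the quantities mentioned above (conventions vary between groups): the static quark free energy extracted from Polyakov-loop correlators, and the complex fermion determinant responsible for the sign problem at finite density.

```latex
\[
  F_{q\bar q}(r,T) = -T \,\ln \big\langle L(\mathbf{0})\, L^{\dagger}(\mathbf{r}) \big\rangle + C(T),
  \qquad
  L(\mathbf{x}) = \frac{1}{N_c}\,\mathrm{Tr} \prod_{\tau=0}^{N_\tau-1} U_4(\mathbf{x},\tau),
\]
\[
  \big[\det M(\mu)\big]^{*} = \det M(-\mu^{*})
  \quad\Rightarrow\quad
  \det M(\mu)\ \text{is complex for real } \mu \neq 0 .
\]
```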
Freeze-Thaw Durability and Long-Term Performance Evaluation of Shotcrete in Cold Regions
This study's aim was to evaluate the freeze-thaw durability of shotcrete in cold regions and predict its long-term performance. One benchmark mix design from the WSDOT was chosen to prepare samples for performance evaluation. Shotcrete specimens were conditioned in accordance with ASTM C666. The long-term freeze-thaw performance after a given number of cycles was evaluated using the dynamic modulus of elasticity test (ASTM C215), the fracture energy test (RILEM 50-FMC), and X-ray CT microstructure imaging analysis. Probabilistic damage analysis was conducted to establish the relation between durability life and the damage parameter for different probabilities of reliability using the three-parameter Weibull distribution model. The fracture energy test was found to be more sensitive than the dynamic modulus of elasticity test for screening material deterioration over time and for capturing the cumulative material damage caused by rapid freeze-thaw action, because it yielded smaller durability factors (degradation ratios). X-ray CT imaging analysis is capable of detecting microcrack formation and pore evolution in the aggregate and interface transition zone of conditioned samples. Moreover, the continuum damage mechanics-based model shows potential for predicting long-term material degradation and the service life of shotcrete.
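For orientation, the standard quantities underlying this kind of analysis are sketched below: the relative dynamic modulus and durability factor of ASTM C666/C215, and a three-parameter Weibull reliability function for durability life; the exact parameterization used in the study may differ.

```latex
\[
  P_c = \frac{n_c^{2}}{n_0^{2}} \times 100, \qquad
  \mathrm{DF} = \frac{P\,N}{M}, \qquad
  R(N) = \exp\!\left[-\left(\frac{N-\gamma}{\eta}\right)^{\beta}\right], \quad N \ge \gamma .
\]
% n_0, n_c: fundamental transverse frequencies at 0 and c freeze-thaw cycles;
% P: relative dynamic modulus (%) at N cycles; M: specified number of cycles (typically 300);
% gamma, eta, beta: Weibull location, scale, and shape parameters.
```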
Highly accurate model for prediction of lung nodule malignancy with CT scans
Computed tomography (CT) examinations are commonly used to predict lung nodule malignancy in patients and have been shown to improve noninvasive early diagnosis of lung cancer. It remains challenging for computational approaches to achieve performance comparable to that of experienced radiologists. Here we present
NoduleX, a systematic approach to predict lung nodule malignancy from CT data,
based on deep learning convolutional neural networks (CNN). For training and
validation, we analyze >1000 lung nodules in images from the LIDC/IDRI cohort.
All nodules were identified and classified by four experienced thoracic
radiologists who participated in the LIDC project. NoduleX achieves high
accuracy for nodule malignancy classification, with an AUC of ~0.99. This is
commensurate with the analysis of the dataset by experienced radiologists. Our
approach, NoduleX, provides an effective framework for highly accurate nodule
malignancy prediction with the model trained on a large patient population. Our
results are replicable with software available at
http://bioinformatics.astate.edu/NoduleX
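Illustrative only: a minimal 3D CNN for binary nodule-malignancy classification on CT patches, written in PyTorch. This is not the NoduleX architecture; the patch size, layer widths, and channel counts are placeholder assumptions.

```python
# Toy 3D CNN for nodule-malignancy classification on CT patches.
# Not the NoduleX architecture; all sizes are placeholder assumptions.
import torch
import torch.nn as nn

class TinyNoduleCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8 * 8, 64), nn.ReLU(),
            nn.Linear(64, 1),             # logit for P(malignant)
        )

    def forward(self, x):                 # x: (batch, 1, 32, 32, 32) HU patches
        return self.classifier(self.features(x))

model = TinyNoduleCNN()
dummy = torch.randn(4, 1, 32, 32, 32)     # four placeholder CT patches
print(model(dummy).shape)                  # torch.Size([4, 1])
```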
CrY2H-seq: a massively multiplexed assay for deep-coverage interactome mapping.
Broad-scale protein-protein interaction mapping is a major challenge given the cost, time, and sensitivity constraints of existing technologies. Here, we present a massively multiplexed yeast two-hybrid method, CrY2H-seq, which uses a Cre recombinase interaction reporter to intracellularly fuse the coding sequences of two interacting proteins and next-generation DNA sequencing to identify these interactions en masse. We applied CrY2H-seq to investigate sparsely annotated Arabidopsis thaliana transcription factor interactions. By performing ten independent screens testing a total of 36 million binary interaction combinations and uncovering a network of 8,577 interactions among 1,453 transcription factors, we demonstrate CrY2H-seq's improved screening capacity, efficiency, and sensitivity over those of existing technologies. The deep-coverage network resource, which we call AtTFIN-1, recapitulates one-third of previously reported interactions derived from diverse methods, expands the number of known plant transcription factor interactions three-fold, and reveals previously unknown family-specific interaction module associations with plant reproductive development, root architecture, and circadian coordination.
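As a purely conceptual illustration (not the published CrY2H-seq analysis pipeline), interaction calling from such sequencing data amounts to tallying the ORF-pair tags recovered from each read; the identifiers and read-count threshold below are placeholders.

```python
# Conceptual illustration only (not the published CrY2H-seq pipeline):
# after Cre-mediated fusion, each read reports a bait/prey ORF pair, so
# interaction calling reduces to counting ORF-pair occurrences across reads.
from collections import Counter

def call_interactions(read_pairs, min_count=3):
    """read_pairs: iterable of (bait_orf_id, prey_orf_id) tuples parsed from reads."""
    counts = Counter(tuple(sorted(pair)) for pair in read_pairs)
    # Keep pairs supported by at least `min_count` reads (arbitrary placeholder threshold).
    return {pair: n for pair, n in counts.items() if n >= min_count}

# Placeholder reads: TF_A/TF_B observed five times, TF_A/TF_C only once.
reads = [("TF_A", "TF_B")] * 5 + [("TF_A", "TF_C")]
print(call_interactions(reads))   # {('TF_A', 'TF_B'): 5}
```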