172 research outputs found

    Assessment of Financial Risk Prediction Models with Multi-criteria Decision Making Methods

    A wide range of classification models have been explored for financial risk prediction, but conclusions about which technique performs best may vary when different performance evaluation measures are employed. Accordingly, this paper proposes the use of multi-criteria decision making tools to rank the algorithms. More specifically, the selection of the most appropriate credit risk prediction method is modeled here as a multi-criteria decision making problem involving a number of performance measures (criteria) and classification techniques (alternatives). An empirical study is carried out to evaluate the performance of ten algorithms over six real-life credit risk data sets. The results reveal that the use of a single performance measure may lead to unreliable conclusions, a situation that can be overcome by the application of multi-criteria decision making techniques.
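    The abstract does not say which multi-criteria method is applied, so as a hedged illustration only, the sketch below ranks a hypothetical classifier-by-measure performance matrix with a simple TOPSIS procedure; the classifier names, measures, and scores are invented, not results from the paper.

```python
import numpy as np

# Hypothetical performance matrix: rows = classifiers, columns = measures
# (all measures assumed benefit-type, i.e. higher is better).
classifiers = ["LogReg", "SVM", "RandomForest", "NaiveBayes"]
measures = ["accuracy", "AUC", "F1"]
X = np.array([
    [0.81, 0.86, 0.78],
    [0.83, 0.88, 0.80],
    [0.85, 0.90, 0.82],
    [0.78, 0.84, 0.75],
])

# TOPSIS: normalise, locate ideal/anti-ideal points, rank by relative closeness.
norm = X / np.sqrt((X ** 2).sum(axis=0))          # vector normalisation
weights = np.full(X.shape[1], 1.0 / X.shape[1])   # equal criterion weights
V = norm * weights
ideal, anti = V.max(axis=0), V.min(axis=0)
d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))   # distance to ideal point
d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))   # distance to anti-ideal point
closeness = d_minus / (d_plus + d_minus)

# Rank classifiers by closeness to the ideal solution (higher is better).
for name, score in sorted(zip(classifiers, closeness), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```

    With equal weights this ranking aggregates all measures at once, which is the point the paper makes against relying on any single metric.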

    Modeling oscillatory microtubule polymerization

    Polymerization of microtubules is ubiquitous in biological cells and under certain conditions becomes oscillatory in time. Here, simple reaction models are analyzed that capture such oscillations as well as the length distribution of microtubules. We assume reaction conditions that are stationary over many oscillation periods, and it is a Hopf bifurcation that leads to persistent oscillatory microtubule polymerization in these models. Analytical expressions are derived for the threshold of the bifurcation and the oscillation frequency in terms of the reaction rates, and typical trends of their parameter dependence are presented. Both a catastrophe rate that depends on the density of guanosine triphosphate (GTP) liganded tubulin dimers and a delay reaction, such as the depolymerization of shrinking microtubules or the decay of oligomers, support oscillations. For a tubulin dimer concentration below the threshold, oscillatory microtubule polymerization occurs transiently on the route to a stationary state, as shown by numerical solutions of the model equations. Close to threshold, a so-called amplitude equation is derived, and it is shown that the bifurcation to microtubule oscillations is supercritical. (21 pages, 12 figures)
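    The abstract does not reproduce the amplitude equation itself; as a sketch only, the generic normal form that such an analysis yields near a supercritical Hopf bifurcation can be written as below, with placeholder coefficients that would be determined by the model's reaction rates.

```latex
% Generic Stuart--Landau amplitude equation near a supercritical Hopf bifurcation.
% A(t): complex oscillation amplitude; \varepsilon: distance from threshold;
% \omega_c: critical frequency; g_r > 0 is what makes the bifurcation supercritical.
\begin{equation}
  \partial_t A = (\varepsilon + i\,\omega_c)\, A \;-\; (g_r + i\, g_i)\, |A|^2 A
\end{equation}
```

    Below threshold ($\varepsilon < 0$) the amplitude decays, consistent with the transient oscillations reported; above threshold a finite amplitude with $|A|^2 = \varepsilon / g_r$ is selected.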

    3D zero-thickness coupled interface finite element: Formulation and application

    In many fields of geotechnical engineering, the modelling of interfaces requires special numerical tools. This paper presents the formulation of a 3D fully coupled hydro-mechanical interface finite element. The element belongs to the zero-thickness family, and the contact constraint is enforced by the penalty method. Fluid flow is discretised through a three-node scheme, with additional nodes discretising the inner flow. The element is able to reproduce the contact/loss of contact between two solids as well as shearing/sliding of the interface. Fluid flow both along and across the interface can be modelled. Opening of a gap within the interface influences the longitudinal transmissivity as well as the storage of water inside the interface. Moreover, the computation of an effective pressure within the interface, according to Terzaghi's principle, creates an additional hydro-mechanical coupling. The uplifting simulation of a suction caisson embedded in a soil layer illustrates the main features of the element. Friction is progressively mobilised along the shaft of the caisson and sliding finally takes place. A gap is created below the top of the caisson and filled with water, illustrating the storage capacity within the interface and the transversal flow. Longitudinal fluid flow is highlighted between the shaft of the caisson and the soil. The fluid flow depends on the opening of the gap and is related to the cubic law.
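    The cubic law mentioned at the end is the standard relation for flow in a thin open gap: the transmissivity scales with the cube of the hydraulic aperture. A minimal sketch of that relation, with illustrative values rather than the paper's parameters, is:

```python
def cubic_law_flow(aperture, dp_dx, mu=1.0e-3):
    """Longitudinal flow rate per unit width through an open interface (cubic law).

    aperture : hydraulic aperture of the gap [m]
    dp_dx    : pressure gradient along the interface [Pa/m]
    mu       : dynamic viscosity of the fluid [Pa.s] (water by default)

    Returns the flow rate per unit width q [m^2/s]; transmissivity scales with
    the cube of the aperture, which is why gap opening dominates the flow.
    """
    return -(aperture ** 3) / (12.0 * mu) * dp_dx

# Example: a 0.5 mm gap under a 10 kPa/m pressure gradient.
print(cubic_law_flow(0.5e-3, 1.0e4))  # ~ -1.0e-4 m^2/s
```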

    Measurement of the p-pbar -> Wgamma + X cross section at sqrt(s) = 1.96 TeV and WWgamma anomalous coupling limits

    The WWgamma triple gauge boson coupling parameters are studied using p-pbar -> l nu gamma + X (l = e, mu) events at sqrt(s) = 1.96 TeV. The data were collected with the D0 detector from an integrated luminosity of 162 pb^{-1} delivered by the Fermilab Tevatron Collider. The cross section times branching fraction for p-pbar -> W(gamma) + X -> l nu gamma + X with E_T^{gamma} > 8 GeV and Delta R_{l gamma} > 0.7 is 14.8 +/- 1.6 (stat) +/- 1.0 (syst) +/- 1.0 (lum) pb. The one-dimensional 95% confidence level limits on anomalous couplings are -0.88 < Delta kappa_{gamma} < 0.96 and -0.20 < lambda_{gamma} < 0.20. (Submitted to Phys. Rev. D Rapid Communication)

    Completion Dissection or Observation for Sentinel-Node Metastasis in Melanoma.

    Sentinel-lymph-node biopsy is associated with increased melanoma-specific survival (i.e., survival until death from melanoma) among patients with node-positive intermediate-thickness melanomas (1.2 to 3.5 mm). The value of completion lymph-node dissection for patients with sentinel-node metastases is not clear. In an international trial, we randomly assigned patients with sentinel-node metastases detected by means of standard pathological assessment or a multimarker molecular assay to immediate completion lymph-node dissection (dissection group) or nodal observation with ultrasonography (observation group). The primary end point was melanoma-specific survival. Secondary end points included disease-free survival and the cumulative rate of nonsentinel-node metastasis. Immediate completion lymph-node dissection was not associated with increased melanoma-specific survival among 1934 patients with data that could be evaluated in an intention-to-treat analysis or among 1755 patients in the per-protocol analysis. In the per-protocol analysis, the mean (±SE) 3-year rate of melanoma-specific survival was similar in the dissection group and the observation group (86±1.3% and 86±1.2%, respectively; P=0.42 by the log-rank test) at a median follow-up of 43 months. The rate of disease-free survival was slightly higher in the dissection group than in the observation group (68±1.7% and 63±1.7%, respectively; P=0.05 by the log-rank test) at 3 years, based on an increased rate of disease control in the regional nodes at 3 years (92±1.0% vs. 77±1.5%; P<0.001 by the log-rank test); these results must be interpreted with caution. Nonsentinel-node metastases, identified in 11.5% of the patients in the dissection group, were a strong, independent prognostic factor for recurrence (hazard ratio, 1.78; P=0.005). Lymphedema was observed in 24.1% of the patients in the dissection group and in 6.3% of those in the observation group. Immediate completion lymph-node dissection increased the rate of regional disease control and provided prognostic information but did not increase melanoma-specific survival among patients with melanoma and sentinel-node metastases. (Funded by the National Cancer Institute and others; MSLT-II ClinicalTrials.gov number, NCT00297895.)
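    The survival comparisons reported here rest on standard Kaplan-Meier and log-rank machinery. As a hedged sketch only, the snippet below shows the mechanics of such a two-arm comparison with the `lifelines` package on entirely synthetic, made-up follow-up data; it is not the trial's analysis code.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)

# Synthetic follow-up times (months) and event indicators for two arms;
# the numbers are invented purely to illustrate the comparison, not trial data.
t_dissection = rng.exponential(120, 500)
t_observation = rng.exponential(120, 500)
e_dissection = rng.random(500) < 0.15
e_observation = rng.random(500) < 0.15

# Log-rank test between the two arms.
result = logrank_test(t_dissection, t_observation,
                      event_observed_A=e_dissection,
                      event_observed_B=e_observation)
print(f"log-rank p-value: {result.p_value:.3f}")

# Kaplan-Meier estimate of 3-year (36-month) survival in one arm.
km = KaplanMeierFitter().fit(t_dissection, e_dissection, label="dissection")
print(km.survival_function_at_times([36.0]))
```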

    Gap-filling eddy covariance methane fluxes: Comparison of machine learning model predictions and uncertainties at FLUXNET-CH4 wetlands

    Time series of wetland methane fluxes measured by eddy covariance require gap-filling to estimate daily, seasonal, and annual emissions. Gap-filling methane fluxes is challenging because of high variability and complex responses to multiple drivers. To date, there is no widely established gap-filling standard for wetland methane fluxes, with regard either to the best model algorithms or to the predictors. This study synthesizes results of different gap-filling methods systematically applied at 17 wetland sites spanning boreal to tropical regions and including all major wetland classes and two rice paddies. Procedures are proposed for: 1) creating realistic artificial gap scenarios, 2) training and evaluating gap-filling models without overstating performance, and 3) predicting half-hourly methane fluxes and annual emissions with realistic uncertainty estimates. Performance is compared between a conventional method (marginal distribution sampling) and four machine learning algorithms. The conventional method achieved similar median performance to the machine learning models but was worse than the best machine learning models and relatively insensitive to predictor choices. Of the machine learning models, decision tree algorithms performed best in cross-validation experiments, even with a baseline predictor set, and artificial neural networks showed comparable performance when using all predictors. Soil temperature was frequently the most important predictor, whilst water table depth was important at sites with substantial water table fluctuations, highlighting the value of data on wetland soil conditions. Raw gap-filling uncertainties from the machine learning models were underestimated, and we propose a method to calibrate uncertainties to observations. The Python code for model development, evaluation, and uncertainty estimation is publicly available. This study outlines a modular and robust machine learning workflow and makes recommendations for, and evaluates an improved baseline of, methane gap-filling models that can be implemented in multi-site syntheses or standardized products from regional and global flux networks (e.g., FLUXNET).
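    The study's own Python code is released separately; the sketch below is only a minimal, hypothetical version of the general workflow it describes (artificial gaps plus a decision-tree-based regressor from scikit-learn). The file name and predictor columns (FCH4, TS, WTD, TA, PA, NEE) are illustrative placeholders, not the study's exact feature set.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Half-hourly data with the observed CH4 flux (FCH4) and meteorological predictors.
df = pd.read_csv("site_halfhourly.csv")
predictors = ["TS", "WTD", "TA", "PA", "NEE"]

# 1) Create artificial gaps in otherwise observed data to evaluate the model.
rng = np.random.default_rng(42)
observed = df["FCH4"].notna()
artificial_gap = observed & (rng.random(len(df)) < 0.1)

# Train on the remaining observed half-hours.
train = observed & ~artificial_gap
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(df.loc[train, predictors], df.loc[train, "FCH4"])

# 2) Evaluate on the artificial gaps (held-out observed values)...
pred_gap = model.predict(df.loc[artificial_gap, predictors])
rmse = np.sqrt(np.mean((pred_gap - df.loc[artificial_gap, "FCH4"]) ** 2))
print(f"artificial-gap RMSE: {rmse:.3f}")

# 3) ...then fill the real gaps to build a continuous series for annual sums.
real_gap = ~observed & df[predictors].notna().all(axis=1)
df.loc[real_gap, "FCH4_filled"] = model.predict(df.loc[real_gap, predictors])
```

    Training and scoring only on observed half-hours with artificial gaps held out is what keeps the reported performance from being overstated.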

    Association of Clonal Hematopoiesis With Incident Heart Failure

    Background: Age-related clonal hematopoiesis of indeterminate potential (CHIP), defined as clonally expanded leukemogenic sequence variations (particularly in DNMT3A, TET2, ASXL1, and JAK2) in asymptomatic individuals, is associated with cardiovascular events, including recurrent heart failure (HF). Objectives: This study sought to evaluate whether CHIP is associated with incident HF. Methods: CHIP status was obtained from whole exome or genome sequencing of blood DNA in participants without prevalent HF or hematological malignancy from 5 cohorts. Cox proportional hazards models were performed within each cohort, adjusting for demographic and clinical risk factors, followed by fixed-effect meta-analyses. Large CHIP clones (defined as variant allele frequency >10%), HF with or without baseline coronary heart disease, and left ventricular ejection fraction were evaluated in secondary analyses. Results: Of 56,597 individuals (59% women, mean age 58 years at baseline), 3,406 (6%) had CHIP, and 4,694 developed HF (8.3%) over up to 20 years of follow-up. CHIP was prospectively associated with a 25% increased risk of HF in meta-analysis (hazard ratio: 1.25; 95% confidence interval: 1.13-1.38), with consistent associations across cohorts. ASXL1, TET2, and JAK2 sequence variations were each associated with an increased risk of HF, whereas DNMT3A sequence variations were not associated with HF. Secondary analyses suggested that large CHIP was associated with a greater risk of HF (hazard ratio: 1.29; 95% confidence interval: 1.15-1.44), and the associations of CHIP with HF were homogeneous with and without prior coronary heart disease. ASXL1 sequence variations were associated with reduced left ventricular ejection fraction. Conclusions: CHIP, particularly sequence variations in ASXL1, TET2, and JAK2, represents a new risk factor for HF.
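    The pooling step described here (per-cohort Cox models followed by a fixed-effect meta-analysis) can be sketched as inverse-variance weighting of the per-cohort log hazard ratios. The cohort values below are invented for illustration and are not the study's estimates.

```python
import numpy as np

# Hypothetical per-cohort estimates: log hazard ratio and its standard error.
log_hr = np.array([0.28, 0.18, 0.25, 0.12, 0.30])
se = np.array([0.10, 0.12, 0.15, 0.11, 0.20])

# Fixed-effect (inverse-variance) meta-analysis of the log hazard ratios.
w = 1.0 / se**2
pooled = np.sum(w * log_hr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = pooled + 1.96 * pooled_se * np.array([-1.0, 1.0])

print(f"pooled HR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(ci[0]):.2f}-{np.exp(ci[1]):.2f})")
```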

    DES science portal: Computing photometric redshifts

    A significant challenge facing photometric surveys for cosmological purposes is the need to produce reliable redshift estimates. The estimation of photometric redshifts (photo-zs) has been consolidated as the standard strategy to bypass the high production costs and incompleteness of spectroscopic redshift samples. Training-based photo-z methods require the preparation of a high-quality list of spectroscopic redshifts, which needs to be constantly updated. The photo-z training, validation, and estimation must be performed in a consistent and reproducible way in order to accomplish the scientific requirements. To meet this purpose, we developed an integrated web-based data interface that not only provides the framework to carry out the above steps in a systematic way, enabling easy testing and comparison of different algorithms, but also addresses the processing requirements by parallelizing the calculation in a way that is transparent to the user. This framework, called the Science Portal (hereafter Portal), was developed in the context of the Dark Energy Survey (DES) to facilitate scientific analysis. In this paper, we show how the Portal can provide a reliable environment to access vast data sets and provide validation algorithms and metrics, even in the case of multiple photo-z methods. It is possible to maintain the provenance between the steps of a chain of workflows while ensuring reproducibility of the results. We illustrate how the Portal can be used to provide photo-z estimates using the DES first year (Y1A1) data. While the DES collaboration is still developing techniques to obtain more precise photo-zs, having a structured framework like the one presented here is critical for the systematic vetting of DES algorithmic improvements and the consistent production of photo-zs in future DES releases.
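    A single training-based photo-z step of the kind the Portal orchestrates can be sketched with a regressor mapping magnitudes to spectroscopic redshift. The file name, column names, and choice of random forest below are illustrative assumptions; the Portal itself wraps several such algorithms and its own validation metrics.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical training table: griz magnitudes matched to spectroscopic redshifts.
spec = pd.read_csv("spec_training_sample.csv")
features = ["mag_g", "mag_r", "mag_i", "mag_z"]

X_train, X_val, z_train, z_val = train_test_split(
    spec[features], spec["z_spec"], test_size=0.3, random_state=0)

# Train a regressor and predict photo-zs for the validation split.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, z_train)
z_phot = model.predict(X_val)

# A common validation metric: normalised median absolute deviation of dz/(1+z).
dz = (z_phot - z_val) / (1.0 + z_val)
sigma_nmad = 1.48 * np.median(np.abs(dz - np.median(dz)))
print(f"sigma_NMAD = {sigma_nmad:.4f}")
```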

    A method for detection of muon induced electromagnetic showers with the ANTARES detector

    The primary aim of ANTARES is neutrino astronomy with upward-going muons created in charged-current muon neutrino interactions in the detector and its surroundings. Downward-going muons are a background for neutrino searches. These muons are the decay products of cosmic-ray collisions in the Earth's atmosphere far above the detector. This paper presents a method to identify and count electromagnetic showers induced along atmospheric muon tracks with the ANTARES detector. The method is applied to both cosmic muon data and simulations, and its applicability to the reconstruction of muon event energies is demonstrated. (20 pages, 7 figures)