
    SATzilla: Portfolio-based Algorithm Selection for SAT

    It has been widely observed that there is no single "dominant" SAT solver; instead, different solvers perform best on different instances. Rather than following the traditional approach of choosing the best solver for a given class of instances, we advocate making this decision online on a per-instance basis. Building on previous work, we describe SATzilla, an automated approach for constructing per-instance algorithm portfolios for SAT that use so-called empirical hardness models to choose among their constituent solvers. This approach takes as input a distribution of problem instances and a set of component solvers, and constructs a portfolio optimizing a given objective function (such as mean runtime, percent of instances solved, or score in a competition). The excellent performance of SATzilla was independently verified in the 2007 SAT Competition, where our SATzilla07 solvers won three gold, one silver, and one bronze medal. In this article, we go well beyond SATzilla07 by making the portfolio construction scalable and completely automated, and by improving it through the integration of local search solvers as candidate solvers, the prediction of performance score instead of runtime, and the use of hierarchical hardness models that take into account different types of SAT instances. We demonstrate the effectiveness of these new techniques in extensive experimental results on data sets including instances from the most recent SAT competition.
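
    To make the selection step concrete, here is a minimal sketch of per-instance portfolio selection in the SATzilla spirit: one performance model per component solver, with the predicted-best solver dispatched at runtime. The scikit-learn model choice and the feature handling are illustrative assumptions, not the paper's empirical hardness models (though ridge regression on log runtime is in the same spirit).

```python
# Sketch of SATzilla-style per-instance algorithm selection: fit one
# regression model per solver that predicts its runtime from instance
# features, then dispatch each new instance to the predicted-best solver.
import numpy as np
from sklearn.linear_model import Ridge


class PortfolioSelector:
    def __init__(self, solver_names):
        self.models = {name: Ridge(alpha=1.0) for name in solver_names}

    def fit(self, features, runtimes):
        # features: (n_instances, n_features) array of instance features
        # runtimes: dict mapping solver name -> (n_instances,) observed runtimes
        for name, model in self.models.items():
            # Log-transform runtimes, a common trick for heavy-tailed solver times
            model.fit(features, np.log1p(runtimes[name]))

    def choose(self, instance_features):
        # Return the solver with the lowest predicted (log) runtime
        x = np.asarray(instance_features).reshape(1, -1)
        return min(self.models, key=lambda name: self.models[name].predict(x)[0])
```

    A portfolio optimizing a competition score rather than runtime, as the article describes, would regress on the score and take the argmax instead of the argmin.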

    Receiver Function and Gravity Constraints on Crustal Structure and Vertical Movements of the Upper Mississippi Embayment and Ozark Uplift

    The Upper Mississippi Embayment (UME), where the seismically active New Madrid Seismic Zone resides, experienced two phases of subsidence commencing in the Late Precambrian and Cretaceous, respectively. To provide new constraints on models proposed for the mechanisms responsible for the subsidence, we computed and stacked P-to-S receiver functions recorded by 49 USArray and other seismic stations located in the UME and the adjacent Ozark Uplift, and modeled Bouguer gravity anomaly data. The inferred thickness, density, and Vp/Vs of the upper and lower crustal layers suggest that the UME is characterized by a mafic, high-density upper crustal layer of ~30 km thickness, underlain by a higher-density lower crustal layer up to ~15 km thick. Those measurements, considered alongside previously published geological observations on the subsidence and uplift history of the UME, are in agreement with the model that the Cretaceous subsidence, which was suggested to have been preceded by an approximately 2 km uplift, was the consequence of the passage of a previously proposed thermal plume. The thermoelastic effects of the plume would have induced widespread intrusion of mafic mantle material into the weak UME crust fractured by Precambrian rifting, increasing the crust's density and resulting in renewed subsidence after the thermal source was removed. In contrast, the Ozark Uplift has crustal density, thickness, and Vp/Vs measurements comparable to those observed in cratonic areas, suggesting an overall normal crust without significant modification by the proposed plume, probably owing to its relatively strong and thick lithosphere.
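
    As a point of reference for the thickness estimates quoted above, the standard single-layer relation used in receiver-function analysis converts the P-to-S (Ps) delay time into crustal thickness. A minimal sketch follows; the velocity and ray-parameter defaults are generic continental-crust assumptions, not values from this study.

```python
import numpy as np


def crustal_thickness(t_ps, vp=6.5, vs=3.7, p=0.06):
    """Invert a Ps-P delay time (s) for crustal thickness (km).

    Single-layer relation used in H-kappa receiver-function analysis
    (e.g., Zhu & Kanamori, 2000):
        t_Ps = H * (sqrt(1/Vs^2 - p^2) - sqrt(1/Vp^2 - p^2))
    vp, vs are average crustal velocities (km/s); p is the ray
    parameter (s/km). Defaults here are generic, not this study's values.
    """
    eta_s = np.sqrt(1.0 / vs**2 - p**2)  # S-wave vertical slowness
    eta_p = np.sqrt(1.0 / vp**2 - p**2)  # P-wave vertical slowness
    return t_ps / (eta_s - eta_p)


# A Ps delay near 5.7 s maps to roughly 47 km of crust with these defaults,
# comparable to the ~30 km upper plus ~15 km lower crustal layers above.
print(crustal_thickness(5.7))
```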

    Self-Titrating Anticoagulant Nanocomplexes That Restore Homeostatic Regulation of the Coagulation Cascade

    Antithrombotic therapy is a critical component of the treatment regimen for a number of life-threatening conditions, including cardiovascular disease, stroke, and cancer; yet, proper clinical management of anticoagulation remains a challenge because existing agents increase the propensity for bleeding in patients. Here, we describe the development of a bioresponsive peptide–polysaccharide nanocomplex that utilizes a negative feedback mechanism to self-titrate the release of anticoagulant in response to varying levels of coagulation activity. This nanoscale self-titrating activatable therapeutic, or nanoSTAT, consists of a cationic thrombin-cleavable peptide and heparin, an anionic polysaccharide and widely used clinical anticoagulant. Under nonthrombotic conditions, nanoSTATs circulate inactively, neither releasing anticoagulant nor significantly prolonging bleeding time. However, in response to life-threatening pulmonary embolism, nanoSTATs locally release their drug payload and prevent thrombosis. This autonomous negative feedback regulator may improve antithrombotic therapy by increasing the therapeutic window and decreasing the bleeding risk of anticoagulants.
    Funding: National Institutes of Health (U.S.) (R01CA124427-01); National Cancer Institute (U.S.) (U54CA119349); National Cancer Institute (U.S.) (U54CA119335); National Cancer Institute (U.S.) (Center of Cancer Nanotechnology Excellence at MIT-Harvard U54CA151884); David & Lucile Packard Foundation (Fellowship); David H. Koch Institute for Integrative Cancer Research at MIT (Marie D. and Pierre Casimir-Lambert Fund); National Cancer Institute (U.S.) (Koch Institute Support (Core) Grant P30-CA14051); MIT-Harvard Center of Cancer Nanotechnology Excellence (5 U54 CA151884-03); National Institutes of Health (U.S.) Medical Scientist Training Program (T32GM007753); National Institutes of Health (U.S.) (Ruth L. Kirschstein National Research Service Award F32CA159496-02); Burroughs Wellcome Fund (Career Award at the Scientific Interface).
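
    The self-titration logic is easiest to see as a negative-feedback loop: release is driven by thrombin activity, and the released heparin in turn suppresses that activity. The toy simulation below is purely illustrative; the linear kinetics and all rate constants are assumptions, not the paper's measured chemistry.

```python
# Toy negative-feedback model of a self-titrating anticoagulant.
# All rate constants and the linear kinetics are illustrative assumptions.

def simulate(steps=500, dt=0.01, k_thrombin=1.0, k_release=2.0,
             k_inhibit=3.0, k_clear=0.5):
    thrombin, drug = 0.0, 0.0
    for _ in range(steps):
        # Thrombin is produced at an injury-driven rate and inhibited by drug.
        d_thrombin = k_thrombin - k_inhibit * drug * thrombin
        # Anticoagulant is released in proportion to thrombin activity
        # (thrombin cleaves the peptide, freeing heparin) and is cleared.
        d_drug = k_release * thrombin - k_clear * drug
        thrombin = max(thrombin + dt * d_thrombin, 0.0)
        drug = max(drug + dt * d_drug, 0.0)
    return thrombin, drug


# With no coagulation activity there is no release; a burst of activity
# drives release until inhibition balances production.
print(simulate())
```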

    Tailoring Capture-Recapture Methods to Estimate Registry-Based Case Counts Based on Error-Prone Diagnostic Signals

    Surveillance research is of great importance for effective and efficient epidemiological monitoring of case counts and disease prevalence. Taking specific motivation from ongoing efforts to identify recurrent cases based on the Georgia Cancer Registry, we extend the recently proposed "anchor stream" sampling design and estimation methodology. Our approach offers a more efficient and defensible alternative to traditional capture-recapture (CRC) methods by leveraging a relatively small random sample of participants whose recurrence status is obtained through a principled application of medical records abstraction. This sample is combined with one or more existing signaling data streams, which may yield data based on arbitrarily non-representative subsets of the full registry population. The key extension developed here accounts for the common problem of false-positive or false-negative diagnostic signals from the existing data stream(s). In particular, we show that the design only requires documentation of positive signals in these non-anchor surveillance streams, and it permits valid estimation of the true case count based on an estimable positive predictive value (PPV) parameter. We borrow ideas from the multiple imputation paradigm to provide accompanying standard errors, and we develop an adapted Bayesian credible interval approach that yields favorable frequentist coverage properties. We demonstrate the benefits of the proposed methods through simulation studies, and we provide a data example targeting estimation of the breast cancer recurrence case count among Metro Atlanta area patients from the Georgia Cancer Registry-based Cancer Recurrence Information and Surveillance Program (CRISP) database.
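
    The core of the PPV correction can be sketched in a few lines: the error-prone stream's flagged count is rescaled by a PPV estimated from the gold-standard anchor sample. The numbers and the binomial-only uncertainty below are illustrative; the paper's anchor-stream estimator and its adapted Bayesian interval account for more sources of variability.

```python
# Simplified sketch of PPV-corrected case counting: a surveillance stream
# flags n_flagged cases, some falsely; a small random "anchor" sample with
# gold-standard record review estimates the stream's positive predictive
# value (PPV), which rescales the flagged count. All counts are made up.
import numpy as np

rng = np.random.default_rng(0)

n_flagged = 1200      # cases flagged positive by the surveillance stream
anchor_flagged = 80   # anchor-sample members who were also flagged
anchor_true = 68      # of those, confirmed true cases by record review

ppv_hat = anchor_true / anchor_flagged
count_hat = n_flagged * ppv_hat
print(f"estimated true case count: {count_hat:.0f}")

# Crude uncertainty via a Bayesian posterior on the PPV (uniform Beta
# prior), ignoring other variance sources the full method accounts for.
ppv_draws = rng.beta(anchor_true + 1, anchor_flagged - anchor_true + 1, 10000)
lo, hi = np.percentile(n_flagged * ppv_draws, [2.5, 97.5])
print(f"95% credible interval (PPV-only): [{lo:.0f}, {hi:.0f}]")
```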

    J/Psi Propagation in Hadronic Matter

    We study J/ψ propagation in hot hadronic matter using a four-flavor chiral Lagrangian to model the dynamics and using QCD sum rules to model the finite-size effects manifested in vertex interactions through form factors. Charmonium breakup due to scattering with light mesons is the primary impediment to continued propagation. Breakup rates introduce nontrivial temperature and momentum dependence into the J/ψ spectral function.
    Comment: 6 pages LaTeX, 3 postscript figures. Proceedings for Strangeness in Quark Matter 2003, Atlantic Beach, NC, March 12-17, 2003; minor corrections in version 2, to appear in J. Phys.
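
    Schematically, the temperature and momentum dependence of the breakup rate enters through a thermal average over the light-meson gas; a standard form (our notation, not necessarily the paper's exact convention) is

    \Gamma_{\mathrm{diss}}(p, T) = \sum_{i=\pi,\rho,K,\ldots} \int \frac{d^3k}{(2\pi)^3}\, f_i(\omega_k, T)\, \sigma_{\psi i}(s)\, v_{\mathrm{rel}},

    where f_i is the thermal distribution of meson species i, \sigma_{\psi i}(s) is the form-factor-suppressed breakup cross section at pair invariant mass s, and v_{\mathrm{rel}} is the relative velocity. The J/ψ momentum p enters through s and v_{\mathrm{rel}}, which is how the spectral function acquires its nontrivial momentum dependence.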

    Telomere dysfunction accurately predicts clinical outcome in chronic lymphocytic leukaemia, even in patients with early stage disease

    © 2014 John Wiley & Sons Ltd. Defining the prognosis of individual cancer sufferers remains a significant clinical challenge. Here we assessed the ability of high-resolution single telomere length analysis (STELA), combined with an experimentally derived definition of telomere dysfunction, to predict the clinical outcome of patients with chronic lymphocytic leukaemia (CLL). We defined the upper telomere length threshold at which telomere fusions occur and then used the mean of the telomere 'fusogenic' range as a prognostic tool. Patients with telomeres within the fusogenic range had a significantly shorter overall survival (P < 0.0001; hazard ratio [HR] = 13.2, 95% confidence interval [CI] = 11.6-106.4), and this was preserved in early-stage disease patients (P < 0.0001, HR = 19.3, 95% CI = 17.8-802.5). Indeed, our assay allowed the accurate stratification of Binet stage A patients into those with indolent disease (91% survival at 10 years) and those with poor prognosis (13% survival at 10 years). Furthermore, patients with telomeres above the fusogenic mean showed superior prognosis regardless of their IGHV mutation status or cytogenetic risk group. In keeping with this finding, telomere dysfunction was the dominant variable in multivariate analysis. Taken together, this study provides compelling evidence for the use of high-resolution telomere length analysis coupled with a definition of telomere dysfunction in the prognostic assessment of CLL.