
    Linear-time algorithms for testing the satisfiability of propositional horn formulae

    New algorithms for deciding whether a (propositional) Horn formula is satisfiable are presented. If the Horn formula A contains K distinct propositional letters and it is assumed that these are exactly P1,…, PK, the two algorithms presented in this paper run in time O(N), where N is the total number of occurrences of literals in A. By representing a Horn proposition as a graph, the satisfiability problem can be formulated as a data flow problem, a certain type of pebbling. The two algorithms differ in the strategy used for pebbling the graph. The first is based on the principle used for finding the set of nonterminals of a context-free grammar from which the empty string can be derived. The second is a graph traversal and uses a “call-by-need” strategy; it uses an attribute grammar to translate a propositional Horn formula to its corresponding graph in linear time. Our formulation of the satisfiability problem as a data flow problem appears to be new and suggests the possibility of improving efficiency using parallel processors.
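    The pebbling idea behind the first algorithm can be sketched as a linear-time propagation: a variable is "pebbled" (forced true) once every literal in the body of some clause deriving it is pebbled, and the formula is unsatisfiable exactly when a clause with no positive literal fires. The clause encoding below is an illustrative sketch, not the paper's exact graph construction.

```python
from collections import defaultdict

def horn_sat(clauses):
    """Decide satisfiability of a propositional Horn formula.

    `clauses` is a list of (head, body) pairs: `head` is the clause's
    positive literal, or None for a goal clause with none, and `body`
    is a list of positive literals.  Each literal occurrence is touched
    O(1) times, so the running time is O(N) as in the abstract.
    """
    # count[i]: body literals of clause i not yet derived
    # watch[p]: indices of clauses whose body mentions variable p
    count = [len(body) for _, body in clauses]
    watch = defaultdict(list)
    for i, (_, body) in enumerate(clauses):
        for p in body:
            watch[p].append(i)

    true = set()
    queue = [i for i, c in enumerate(count) if c == 0]  # facts fire first
    while queue:
        head = clauses[queue.pop()][0]
        if head is None:          # goal clause fired: contradiction
            return False
        if head in true:
            continue
        true.add(head)            # pebble the variable
        for j in watch[head]:
            count[j] -= 1
            if count[j] == 0:
                queue.append(j)
    return True                   # the pebbled set satisfies the formula
```

For example, the formula {P1, P1 → P2, ¬P1 ∨ ¬P2} is unsatisfiable, and the propagation discovers this after pebbling P1 and P2.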

    Detonation wave diffraction in H₂-O₂-Ar mixtures

    In the present study, we have examined the diffraction of detonation in weakly unstable hydrogen–oxygen–argon mixtures. High accuracy and computational efficiency are obtained using a high-order WENO scheme together with adaptive mesh refinement, which enables handling realistic geometries with resolution at the micrometer level. Both detailed chemistry and spectroscopic models of laser-induced fluorescence and chemiluminescence were included to enable a direct comparison with experimental data. Agreement was found between the experiments and the simulations in terms of detonation diffraction structure for both sub-critical and super-critical regimes. The predicted wall reflection distance is about 12–14 cell widths, in accordance with previous experimental studies. Computations show that the re-initiation distance is relatively constant, at about 12–15 cell widths, slightly above the experimental value of 11 cell widths. The predicted critical channel height is 10–11 cell widths, which differs from experiments in circular tubes but is consistent with rectangular channel results.

    Structural Insights into the Inhibition of Cytosolic 5′-Nucleotidase II (cN-II) by Ribonucleoside 5′-Monophosphate Analogues

    Cytosolic 5′-nucleotidase II (cN-II) regulates the intracellular nucleotide pools within the cell by catalyzing the dephosphorylation of 6-hydroxypurine nucleoside 5′-monophosphates. Besides this physiological function, high levels of cN-II expression are correlated with poor patient outcomes under treatment with cytotoxic nucleoside analogues. To identify its specific role in the resistance phenomenon observed during cancer therapy, we screened a particular class of chemical compounds, namely ribonucleoside phosphonates, to assess their potential as cN-II inhibitors. These compounds incorporate a chemically and enzymatically stable phosphorus–carbon linkage instead of a regular phosphoester bond. Amongst them, six compounds were predicted to be better ligands than the natural substrate of cN-II, inosine 5′-monophosphate (IMP). The study of purine- and pyrimidine-containing analogues and the introduction of chemical modifications within the phosphonate chain has allowed us to define general rules governing the theoretical affinity of such ligands. The binding strength of these compounds was scrutinized in silico and explained by an impressive number of van der Waals contacts, highlighting the decisive role of three cN-II residues, namely Phe 157, His 209 and Tyr 210. Docking predictions were confirmed by experimental measurements of the nucleotidase activity in the presence of the three best available phosphonate analogues. These compounds were shown to induce a total inhibition of cN-II activity at 2 mM. Altogether, this study emphasizes the importance of the non-hydrolysable phosphonate bond in the design of new competitive cN-II inhibitors and the crucial hydrophobic stacking promoted by three protein residues.

    Force plate monitoring of human hemodynamics

    Background: Noninvasive recording of movements caused by the heartbeat and the blood circulation is known as ballistocardiography. Several studies have shown the capability of a force plate to detect cardiac activity in the human body. The aim of this paper is to present a new method based on the differential geometry of curves to handle multivariate time series obtained by ballistocardiographic force plate measurements. Results: We show that the recoils of the body caused by cardiac motion and blood circulation provide a noninvasive method of displaying the motions of the heart muscle and the propagation of the pulse wave along the aorta and its branches. The results are compared with data obtained invasively during a cardiac catheterization. We show that the described noninvasive method is able to determine the moment of a particular heart movement or the time when the pulse wave reaches a certain morphological structure. Conclusions: Monitoring of heart movements and pulse wave propagation may be used, for example, to estimate the aortic pulse wave velocity, which is widely accepted as an index of aortic stiffness.
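    As a rough illustration of treating a multivariate recording as a curve, one can compute its generalized curvature: sharp curvature peaks mark abrupt direction changes of the curve, which is one way geometric quantities can flag cardiac events. This is a generic sketch, not the paper's exact construction.

```python
import numpy as np

def curve_curvature(X, dt):
    """Curvature of a multivariate time series viewed as a curve t -> X(t).

    X has shape (n_samples, n_channels), e.g. three force-plate
    components (Fx, Fy, Fz) sampled every `dt` seconds.  Uses the
    frame-free formula
        kappa = sqrt(|X'|^2 |X''|^2 - (X'.X'')^2) / |X'|^3,
    which reduces to the usual planar/space-curve curvature.
    """
    d1 = np.gradient(X, dt, axis=0)      # velocity  X'(t)
    d2 = np.gradient(d1, dt, axis=0)     # acceleration X''(t)
    speed2 = np.sum(d1 * d1, axis=1)     # |X'|^2
    num2 = speed2 * np.sum(d2 * d2, axis=1) - np.sum(d1 * d2, axis=1) ** 2
    # clip guards against tiny negative values from round-off
    return np.sqrt(np.clip(num2, 0.0, None)) / np.clip(speed2, 1e-12, None) ** 1.5
```

A quick sanity check: a circle of radius R traversed at constant speed has constant curvature 1/R.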

    Explaining Support Vector Machines: A Color Based Nomogram.

    PROBLEM SETTING: Support vector machines (SVMs) are very popular tools for classification, regression and other problems. Thanks to the large choice of kernels they can be applied with, a large variety of data can be analysed using these tools. Machine learning owes its popularity to the good performance of the resulting models. However, interpreting the models is far from obvious, especially when non-linear kernels are used. Hence, these methods are often used as black boxes. As a consequence, the use of SVMs is less supported in areas where interpretability is important and where people are held responsible for the decisions made by models. OBJECTIVE: In this work, we investigate whether SVMs using linear, polynomial and RBF kernels can be explained such that interpretations for model-based decisions can be provided. We further indicate when SVMs can be explained and in which situations interpretation of SVMs is (hitherto) not possible. Here, explainability is defined as the ability to produce the final decision based on a sum of contributions which depend on one single or at most two input variables. RESULTS: Our experiments on simulated and real-life data show that the explainability of an SVM depends on the chosen parameter values (degree of the polynomial kernel, width of the RBF kernel and regularization constant). When several combinations of parameter values yield the same cross-validation performance, combinations with a lower polynomial degree or a larger kernel width have a higher chance of being explainable. CONCLUSIONS: This work summarizes SVM classifiers obtained with linear, polynomial and RBF kernels in a single plot. Linear and polynomial kernels up to the second degree are represented exactly. For other kernels an indication of the reliability of the approximation is presented. The complete methodology is available as an R package, and two apps and a movie are provided to illustrate the possibilities offered by the method.
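    The definition of explainability above is easiest to see for a linear kernel, where the decision function f(x) = Σᵢ aᵢ K(sᵢ, x) + b collapses to w·x + b and therefore splits exactly into one contribution wⱼ·xⱼ per input variable (the quantities a nomogram draws as scales). The support vectors, dual coefficients and bias below are made-up numbers for illustration, not fitted values.

```python
import numpy as np

S = np.array([[1.0, 0.5],    # support vectors s_i (hypothetical)
              [-0.5, 1.0],
              [0.0, -1.0]])
a = np.array([0.8, -0.3, -0.5])   # dual coefficients alpha_i * y_i
b = 0.1                           # bias term

# For a linear kernel, the weight vector is w = sum_i a_i * s_i.
w = a @ S

x = np.array([0.7, -0.2])
contributions = w * x             # one additive term per input variable
f_linear = contributions.sum() + b

# Standard kernel-form decision function: sum_i a_i K(s_i, x) + b.
f_kernel = a @ (S @ x) + b

# Both forms agree, so the decision is a sum of per-variable contributions.
assert np.isclose(f_linear, f_kernel)
```

For an RBF kernel no such exact per-variable split exists in general, which is why the paper reports only an approximation together with a reliability indication.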

    Adoption of high-sensitivity cardiac troponin for risk stratification of patients with suspected myocardial infarction: a multicentre cohort study

    Background: Guidelines recommend high-sensitivity cardiac troponin to risk stratify patients with possible myocardial infarction and identify those eligible for discharge. Our aim was to evaluate adoption of this approach in practice and to determine whether effectiveness and safety vary by age, sex, ethnicity, or socioeconomic deprivation status. Methods: A multicentre cohort study was conducted in 13 hospitals across the United Kingdom from November 1st, 2021, to October 31st, 2022. Routinely collected data including high-sensitivity cardiac troponin I or T measurements were linked to outcomes. The primary effectiveness and safety outcomes were the proportion discharged from the Emergency Department, and the proportion dead or with a subsequent myocardial infarction at 30 days, respectively. Patients were stratified using peak troponin concentration as low, intermediate, or high risk, with the high-risk threshold defined by the sex-specific 99th percentile. Findings: In total 137,881 patients (49% [67,709/137,881] female) were included, of whom 60,707 (44%), 42,727 (31%), and 34,447 (25%) were stratified as low-, intermediate- and high-risk, respectively. Overall, 65.8% (39,918/60,707) of low-risk patients were discharged from the Emergency Department, but this varied from 26.8% [2200/8216] to 93.5% [918/982] by site. The safety outcome occurred in 0.5% (277/60,707) and 11.4% (3917/34,447) of patients classified as low- or high-risk, of whom 0.03% (18/60,707) and 1% (304/34,447) had a subsequent myocardial infarction at 30 days, respectively. A similar proportion of male and female patients were discharged (52% [36,838/70,759] versus 54% [36,113/67,109]), but discharge was more likely if patients were <70 years old (61% [58,533/95,227] versus 34% [14,428/42,654]), from areas of low socioeconomic deprivation (48% [6697/14,087] versus 43% [12,090/28,116]), or were Black or Asian compared to Caucasian (62% [5458/8877] and 55% [10,026/18,231] versus 46% [35,138/75,820]).
Interpretation: Despite high-sensitivity cardiac troponin correctly identifying half of all patients with possible myocardial infarction as being at low risk, only two-thirds of these patients were discharged. Substantial variation in the discharge of patients by age, ethnicity, socioeconomic deprivation, and site was observed, identifying important opportunities to improve care. Funding: UK Research and Innovation.
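The shape of the stratification rule can be sketched as a small function. The numeric cut-offs are assay-specific and are not given in the abstract, so they are left as caller-supplied parameters here; only the structure (a lower threshold for low risk, the sex-specific 99th percentile for high risk) follows the study.

```python
def stratify(peak_troponin, sex, low_threshold, uln_female, uln_male):
    """Assign a risk category from the peak high-sensitivity troponin.

    `low_threshold` is the assay-specific cut-off below which a patient
    is low risk; `uln_female`/`uln_male` are the sex-specific 99th
    percentile upper limits (all hypothetical parameters, in ng/L).
    """
    uln = uln_female if sex == "female" else uln_male
    if peak_troponin < low_threshold:
        return "low"
    if peak_troponin > uln:
        return "high"
    return "intermediate"
```

Because the high-risk cut-off is sex-specific, the same peak concentration can classify a female patient as high risk and a male patient as intermediate risk.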