
    An optimized energy potential can predict SH2 domain-peptide interactions

    Peptide recognition modules (PRMs) are used throughout biology to mediate protein-protein interactions, and many PRMs are members of large protein domain families. Members of these families are often quite similar to each other, but each domain recognizes a distinct set of peptides, raising the question of how peptide recognition specificity is achieved using similar protein domains. The analysis of individual protein complex structures often gives answers that are not easily applicable to other members of the same PRM family. Bioinformatics-based approaches, on the other hand, may be difficult to interpret physically. Here we integrate structural information with a large, quantitative data set of SH2-peptide interactions to study the physical origin of domain-peptide specificity. We develop an energy model, inspired by protein folding, based on interactions between the amino acid positions in the domain and peptide. We use this model to successfully predict which SH2 domains and peptides interact and to uncover the positions in each that are important for specificity. The energy model is general enough that it can be applied to other members of the SH2 family or to new peptides, and the cross-validation results suggest that these energy calculations will be useful for predicting binding interactions. It can also be adapted to study other PRM families, to predict optimal peptides for a given SH2 domain, or to study other biological interactions, e.g. protein-DNA interactions.
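    A minimal sketch of the kind of position-pair additive energy score described above; this is an illustration rather than the authors' fitted potential, and the ENERGY table, position lists, and threshold are hypothetical placeholders that would normally be learned from the quantitative SH2-peptide binding data.

        # Sketch of an additive, position-pair energy model for domain-peptide
        # binding. ENERGY is a hypothetical placeholder for fitted parameters:
        # the energy of pairing amino acid `a` at domain position i with amino
        # acid `b` at peptide position j (lower = more favorable).
        from itertools import product

        ENERGY = {}  # {(i, a, j, b): float}, populated by fitting to binding data

        def binding_energy(domain_seq, peptide_seq, domain_positions, peptide_positions):
            """Sum pairwise contact energies over the selected positions."""
            total = 0.0
            for i, j in product(domain_positions, peptide_positions):
                key = (i, domain_seq[i], j, peptide_seq[j])
                total += ENERGY.get(key, 0.0)  # unseen pairs contribute nothing
            return total

        def predicts_binding(domain_seq, peptide_seq, positions_d, positions_p, threshold=0.0):
            """Call an SH2 domain-peptide pair a binder if its energy is below a cutoff."""
            return binding_energy(domain_seq, peptide_seq, positions_d, positions_p) < threshold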

    THE PROPOSITION VALUE OF CORPORATE RATINGS - A RELIABILITY TESTING OF CORPORATE RATINGS BY APPLYING ROC AND CAP TECHNIQUES

    We analyze the Altman model, a Logit model, and the KMV model in order to evaluate their performance, using a random sample of 132 US firms. We create a yearly and a quarterly sample set to construct a portfolio of defaulting companies and a counterpart portfolio of non-defaulting companies. Staying close to the recommendations of the Basel Capital Accord framework for model evaluation, we use Receiver Operating Characteristic (ROC) and Cumulative Accuracy Profile (CAP) techniques. We find that the Logit model outperforms both the Altman and the KMV model. Furthermore, we find that the Altman model outperforms the KMV model, which is nearly as accurate as a random model. Keywords: Altman Model, Cumulative Accuracy Profile (CAP), Distance to Default, Logit Model, Moody’s KMV, Receiver Operating Characteristic (ROC), Z-score.
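    As a sketch of the evaluation machinery (ROC area and CAP accuracy ratio) rather than of any of the three rating models themselves, the following uses the standard identity AR = 2*AUC - 1; the scores and default flags are toy values, not data from the study.

        # Compute the area under the ROC curve and the CAP accuracy ratio from
        # model risk scores (higher = riskier) and observed default indicators.
        import numpy as np

        def roc_auc(scores, defaulted):
            """P(random defaulter is scored riskier than a random survivor), ties = 0.5."""
            scores = np.asarray(scores, dtype=float)
            defaulted = np.asarray(defaulted, dtype=bool)
            pos, neg = scores[defaulted], scores[~defaulted]
            greater = (pos[:, None] > neg[None, :]).mean()
            ties = (pos[:, None] == neg[None, :]).mean()
            return greater + 0.5 * ties

        def accuracy_ratio(scores, defaulted):
            """CAP accuracy ratio: 0 for a random model, 1 for a perfect one."""
            return 2.0 * roc_auc(scores, defaulted) - 1.0

        # Toy illustration only (not the 132-firm sample):
        scores = [0.9, 0.4, 0.7, 0.2, 0.8, 0.1]
        defaulted = [1, 0, 1, 0, 0, 0]
        print(roc_auc(scores, defaulted), accuracy_ratio(scores, defaulted))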

    How should prey animals respond to uncertain threats?

    A prey animal surveying its environment must decide whether there is a dangerous predator present or not. If there is, it may flee. Flight has an associated cost, so the animal should not flee if there is no danger. However, the prey animal cannot know the state of its environment with certainty, and is thus bound to make some errors. We formulate a probabilistic automaton model of a prey animal's life and use it to compute the optimal escape decision strategy, subject to the animal's uncertainty. The uncertainty is a major factor in determining the decision strategy: only in the presence of uncertainty do economic factors (like mating opportunities lost due to flight) influence the decision. We performed computer simulations and found that in silico populations of animals subject to predation evolve to display the strategies predicted by our model, confirming our choice of objective function for our analytic calculations. To the best of our knowledge, this is the first theoretical study of escape decisions to incorporate the effects of uncertainty, and to demonstrate the correctness of the objective function used in the model. Comment: 5 figures, 10 pages of TeX.
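    The following is a bare-bones decision-theoretic sketch of the flee-or-stay trade-off described above, not the paper's probabilistic automaton or its evolutionary simulations; the prior, cue likelihoods, and costs are invented placeholders.

        # Flee when the expected cost of staying (probability of a predator times
        # the cost of being caught) exceeds the certain cost of flight, which
        # stands in for economic losses such as missed mating opportunities.

        def posterior_danger(p_prior, p_cue_given_danger, p_cue_given_safe, cue_seen):
            """Bayes update of the probability that a predator is present."""
            if not cue_seen:
                p_cue_given_danger = 1.0 - p_cue_given_danger
                p_cue_given_safe = 1.0 - p_cue_given_safe
            num = p_prior * p_cue_given_danger
            return num / (num + (1.0 - p_prior) * p_cue_given_safe)

        def should_flee(p_danger, cost_caught, cost_flight):
            """Escape rule under uncertainty: flee iff p_danger * cost_caught > cost_flight."""
            return p_danger * cost_caught > cost_flight

        p = posterior_danger(p_prior=0.05, p_cue_given_danger=0.8,
                             p_cue_given_safe=0.2, cue_seen=True)
        print(p, should_flee(p, cost_caught=100.0, cost_flight=3.0))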

    ROC and the bounds on tail probabilities via theorems of Dubins and F. Riesz

    For independent $X$ and $Y$ in the inequality $P(X \leq Y + \mu)$, we give sharp lower bounds for unimodal distributions having finite variance, and sharp upper bounds assuming symmetric densities bounded by a finite constant. The lower bounds depend on a result of Dubins about extreme points and the upper bounds depend on a symmetric rearrangement theorem of F. Riesz. The inequality was motivated by medical imaging: find bounds on the area under the Receiver Operating Characteristic curve (ROC). Comment: Published at http://dx.doi.org/10.1214/08-AAP536 in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics (http://www.imstat.org).
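    The connection between this inequality and the ROC area rests on a standard identity (not specific to this paper): for independent continuous scores, the area under the ROC curve equals the probability that a randomly chosen signal score exceeds a randomly chosen noise score. A worked version, with $X$ the noise score and $Y$ the signal score:

        % Thresholding at t gives FPR(t) = P(X > t) and TPR(t) = P(Y > t), so
        \mathrm{AUC} \;=\; \int_{0}^{1} \mathrm{TPR}\, d\mathrm{FPR}
                     \;=\; \int_{-\infty}^{\infty} P(Y > t)\, f_X(t)\, dt
                     \;=\; P(Y > X) \;=\; P(X \le Y),
        % so bounds on P(X \le Y + \mu) translate into bounds on the AUC,
        % with the shift \mu accommodating a bias between the two score scales.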

    Measures of metacognition on signal-detection theoretic models

    Analysing metacognition, specifically knowledge of the accuracy of internal perceptual, memorial or other knowledge states, is vital for many strands of psychology, including determining the accuracy of feelings of knowing and discriminating conscious from unconscious cognition. Quantifying metacognitive sensitivity is, however, more challenging than quantifying basic stimulus sensitivity. Under popular signal detection theory (SDT) models for stimulus classification tasks, approaches based on type II receiver operating characteristic (ROC) curves or type II d-prime risk confounding metacognition with response biases in either the type I (classification) or type II (metacognitive) tasks. A new approach introduces meta-d′: the type I d-prime that would have led to the observed type II data had the subject used all the type I information. Here we (i) further establish the inconsistency of the type II d-prime and ROC approaches with new explicit analyses of the standard SDT model, and (ii) analyse, for the first time, the behaviour of meta-d′ under non-trivial scenarios, such as when metacognitive judgments utilize enhanced or degraded versions of the type I evidence. Analytically, meta-d′ values typically reflect the underlying model well and are stable under changes in decision criteria; however, in relatively extreme cases meta-d′ can become unstable. We explore the bias and variance of in-sample measurements of meta-d′ and supply MATLAB code for estimation in general cases. Our results support meta-d′ as a useful measure of metacognition and provide rigorous methodology for its application. Our recommendations are useful for any researchers interested in assessing metacognitive accuracy.
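    For context, a sketch of the standard type I sensitivity computation that meta-d′ is calibrated against is given below; the full meta-d′ estimate additionally fits the SDT model to the observed type II (confidence) data, as in the MATLAB code the authors supply, and the counts here are toy values.

        # Equal-variance Gaussian SDT: d' = z(hit rate) - z(false-alarm rate),
        # with a simple correction so rates of exactly 0 or 1 stay finite.
        from statistics import NormalDist

        def d_prime(hits, misses, false_alarms, correct_rejections):
            h = (hits + 0.5) / (hits + misses + 1.0)
            f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
            z = NormalDist().inv_cdf
            return z(h) - z(f)

        # Toy counts, purely illustrative:
        print(d_prime(hits=40, misses=10, false_alarms=12, correct_rejections=38))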

    THE EFFECT OF RANDOM BLOOD GLUCOSE LEVELS ON NEUROLOGICAL OUTCOME IN NONDIABETIC PATIENTS WITH ACUTE-PHASE ISCHEMIC STROKE

    Background: Increased blood glucose levels are presumed to aggravate the neurological outcome of ischemic stroke patients. Research on the association between increased blood glucose levels in nondiabetic patients with ischemic stroke and neurological outcome as measured by the NIHSS is still scarce. Objective: To determine the effect of blood glucose levels on neurological outcome in nondiabetic patients with acute ischemic stroke. Methods: A cohort study of 32 nondiabetic patients with acute ischemic stroke treated at Dr. Kariadi Hospital. Blood glucose was first drawn at 48 hours and again at 72 hours after the onset of ischemic stroke. Neurological status was assessed with the NIHSS scale at 48 hours, 72 hours, and day 7 after onset. Differences in NIHSS scores between 48 hours, 72 hours, and day 7 were analyzed with the Friedman test and Wilcoxon post hoc tests. The correlation between blood glucose levels at 48 and 72 hours after onset and neurological outcome was analyzed with the Spearman correlation test. Results: The mean NIHSS scores at 48 hours, 72 hours, and day 7 differed significantly (p < 0.0001). The Spearman correlation between blood glucose levels at 48 hours and neurological outcome at day 7 yielded p = 0.386. Conclusion: There is no correlation between blood glucose levels and the neurological outcome of nondiabetic patients with acute ischemic stroke.
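    A sketch of the statistical workflow described above using SciPy; the arrays are hypothetical placeholders, not the study's data, and serve only to show which tests map to which comparisons.

        # Friedman test across the three NIHSS time points, a Wilcoxon post hoc
        # comparison for one pair, and the Spearman correlation between 48-hour
        # blood glucose and the day-7 NIHSS score.
        from scipy.stats import friedmanchisquare, spearmanr, wilcoxon

        nihss_48h = [8, 12, 6, 15, 9]            # placeholder NIHSS at 48 hours
        nihss_72h = [7, 11, 6, 14, 8]            # placeholder NIHSS at 72 hours
        nihss_d7 = [5, 10, 4, 13, 6]             # placeholder NIHSS at day 7
        glucose_48h = [132, 160, 118, 175, 140]  # placeholder glucose (mg/dL) at 48 h

        stat, p_friedman = friedmanchisquare(nihss_48h, nihss_72h, nihss_d7)
        w_stat, p_wilcoxon = wilcoxon(nihss_48h, nihss_d7)  # one post hoc pair
        rho, p_spearman = spearmanr(glucose_48h, nihss_d7)
        print(p_friedman, p_wilcoxon, rho, p_spearman)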