
    Damage control surgery: Its evolution during the last 20 years

    In less than twenty years, what began as a concept for the treatment of exsanguinating truncal trauma patients has become the primary treatment model for numerous emergent, life-threatening surgical conditions in patients incapable of tolerating traditional methods. Its core concepts are relatively straightforward: first, proper identification of the patient who needs this paradigm; second, truncation of the initial surgical procedure to the minimal necessary operation; third, aggressive, focused resuscitation in the intensive care unit; fourth, definitive care only once the patient is optimized to tolerate the procedure. These simple underlying principles can be molded to a variety of emergencies, from the original application in combined major vascular and visceral trauma to the septic abdomen and orthopedics. A host of new resuscitation strategies and technologies has been developed over the past two decades, from permissive hypotension and damage control resuscitation to advanced ventilators and hemostatic agents, allowing for a more focused resuscitation and reducing some of the morbidity of this model. The combination of this simple, malleable paradigm with a better understanding of resuscitation has proven to be a potent blend. As such, what was once an almost lethal injury (combined vascular and visceral injury) has become a survivable one.

    Fairness Expectations and Altruistic Sharing in 15-Month-Old Human Infants

    Human cooperation is a key driving force behind the evolutionary success of our hominin lineage. At the proximate level, biologists and social scientists have identified other-regarding preferences – such as fairness based on egalitarian motives, and altruism – as likely candidates for fostering large-scale cooperation. A critical question concerns the ontogenetic origins of these constituents of cooperative behavior, as well as whether they emerge independently or in an interrelated fashion. The answer to this question will shed light on the interdisciplinary debate regarding the significance of such preferences for explaining how humans become such cooperative beings. We investigated 15-month-old infants' sensitivity to fairness, and their altruistic behavior, assessed via infants' reactions to a third-party resource distribution task and via a sharing task. Our results challenge current models of the development of fairness and altruism in two ways. First, in contrast to past work suggesting that fairness and altruism may not emerge until early to mid-childhood, 15-month-old infants are sensitive to fairness and can engage in altruistic sharing. Second, infants' degree of sensitivity to fairness as a third-party observer was related to whether they shared toys altruistically or selfishly, indicating that moral evaluations and prosocial behavior are heavily interconnected from early in development. Our results present the first evidence that the roots of a basic sense of fairness and altruism can be found in infancy, and that these other-regarding preferences develop in a parallel and interwoven fashion. These findings support arguments for an evolutionary basis of human egalitarianism – most likely in a dialectical manner including both biological and cultural mechanisms – given the rapidly developing nature of other-regarding preferences and their role in the evolution of human-specific forms of cooperation. Future work of this kind will help determine to what extent uniquely human sociality and morality depend on other-regarding preferences emerging early in life.

    Retest effects in operational selection settings: Development and test of a framework

    This study proposes a framework for examining the effects of retaking tests in operational selection settings. A central feature of this framework is the distinction between within-person and between-person retest effects. This framework is used to develop hypotheses about retest effects for exemplars of 3 types of tests (knowledge tests, cognitive ability tests, and situational judgment tests) and to test these hypotheses in a high-stakes selection setting (admission to medical studies in Belgium). Analyses of within-person retest effects showed that mean scores of repeat test takers were one-third of a standard deviation higher for the knowledge test and situational judgment test and one-half of a standard deviation higher for the cognitive ability test. The validity coefficients for the knowledge test differed significantly depending on whether examinees' test scores on the first versus second administration were used, with the latter being more valid. Analyses of between-person retest effects on the prediction of academic performance showed that the same test score led to higher levels of performance for those passing on the first attempt than for those passing on the second attempt. The implications of these results are discussed in light of extant retesting practice. In employment settings, the Uniform Guidelines on Employee Selection Procedures (1978) state that organizations should provide test takers a reasonable opportunity for retesting. Hence, most organizations in the private and public sector have installed retesting policies in promotion and hiring situations (e.g., Campbell, 2004; McElreath, Bayless, Reilly, & Hayes, 2004). In the educational domain, the Standards for Educational and Psychological Testing (APA/AERA/NCME, 1999) state that retest opportunities should be provided for tests used for promotion or graduation decisions. The opportunity for retesting is also mandated for tests used in making admission, licensing, or certification decisions. A previous version of this manuscript was presented at the Annual Convention of th

    Progress and prospects toward our understanding of the evolution of dosage compensation

    In many eukaryotic organisms, gender is determined by a pair of heteromorphic sex chromosomes. Degeneration of the non-recombining Y chromosome is a general facet of sex chromosome evolution. Selective pressure to restore expression levels of X-linked genes relative to autosomes accompanies Y-chromosome degeneration, thus driving the evolution of dosage compensation mechanisms. This review focuses on evolutionary aspects of dosage compensation, in light of recent advances in comparative and functional genomics that have substantially increased our understanding of the molecular mechanisms of dosage compensation and how it evolved. We review processes involved in sex chromosome evolution, and discuss the dynamic interaction between Y degeneration and the acquisition of dosage compensation. We compare mechanisms of dosage compensation and the origin of dosage compensation genes between different taxa and comment on sex chromosomes that apparently lack compensation mechanisms. Finally, we discuss how dosage compensation systems can also influence the evolution of well-established sex chromosomes.

    ExoClock Project. III. 450 New Exoplanet Ephemerides from Ground and Space Observations

    The ExoClock project has been created to increase the efficiency of the Ariel mission. It will achieve this by continuously monitoring and updating the ephemerides of Ariel candidates, in order to produce a consistent catalog of reliable and precise ephemerides. This work presents a homogeneous catalog of updated ephemerides for 450 planets, generated by the integration of ∼18,000 data points from multiple sources. These sources include observations from ground-based telescopes (the ExoClock network and the Exoplanet Transit Database), mid-time values from the literature, and light curves from space telescopes (Kepler, K2, and TESS). With all the above, we manage to collect observations covering half of the post-discovery years (median), with data that have a median uncertainty of less than 1 minute. In comparison with the literature, the ephemerides generated by the project are more precise and less biased. More than 40% of the initial literature ephemerides had to be updated to reach the goals of the project, as they were either of low precision or drifting. Moreover, the integrated approach of the project enables both the monitoring of the majority of the Ariel candidates (95%) and the identification of missing data. These results highlight the need for continuous monitoring to increase the observing coverage of the candidate planets. Finally, the extended observing coverage of planets allows us to detect trends (transit-timing variations) for a sample of 19 planets. All the products, data, and codes used in this work are open and accessible to the wider scientific community.
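    Why ephemerides drift without fresh observations follows from simple error propagation: the predicted mid-transit time is T(n) = T0 + nP, so the period uncertainty grows linearly with the number of elapsed orbits. A minimal sketch of this calculation (the planet parameters below are invented for illustration, not taken from the catalog):

    ```python
    from math import sqrt

    def predict_transit(t0, p, sigma_t0, sigma_p, epoch):
        """Predicted mid-transit time and its 1-sigma uncertainty after
        `epoch` orbits, assuming independent errors on T0 and P."""
        t_mid = t0 + epoch * p
        sigma = sqrt(sigma_t0**2 + (epoch * sigma_p)**2)
        return t_mid, sigma

    # Hypothetical planet: T0 in BJD, P in days, 1000 orbits after T0.
    t_mid, sigma = predict_transit(t0=2455000.0, p=3.524749,
                                   sigma_t0=2e-4, sigma_p=1e-6, epoch=1000)
    # After many epochs the n*sigma_P term dominates, which is why even a
    # tiny period uncertainty eventually pushes predictions past the
    # 1-minute level targeted by the project.
    print(t_mid, sigma * 24 * 60)  # uncertainty converted to minutes
    ```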

    Incorporating clinical guidelines through clinician decision-making

    Background: It is generally acknowledged that a disparity between knowledge and its implementation is adversely affecting quality of care. An example commonly cited is the failure of clinicians to follow clinical guidelines. A guiding assumption of this view is that adherence should be gauged by a standard of conformance. At least some guideline developers dispute this assumption and claim that their efforts are intended to inform and assist clinical practice, not to function as standards of performance. However, their ability to assist and inform will remain limited until an alternative to the conformance criterion is proposed that gauges how evidence-based guidelines are incorporated into clinical decisions.
    Methods: The proposed investigation has two specific aims: to identify the processes that affect decisions about incorporating clinical guidelines, and then to develop and test a strategy that promotes the utilization of evidence-based practices. This paper focuses on the first aim. It presents the rationale, introduces the clinical paradigm of treatment-resistant schizophrenia, and discusses an exemplar of clinician non-conformance to a clinical guideline. A modification of the original study is proposed that targets psychiatric trainees and draws on a cognitively rich theory of decision-making to formulate hypotheses about how the guideline is incorporated into treatment decisions. Twenty volunteer subjects recruited from an accredited psychiatry training program will respond to sixty-four vignettes that represent a fully crossed 2 × 2 × 2 × 4 within-subjects design. The variables consist of criteria contained in the clinical guideline and other relevant factors. Subjects will also respond to a subset of eight vignettes that assesses their overall impression of the guideline. Generalized estimating equation models will be used to test the study's principal hypothesis and perform secondary analyses.
    Implications: The original design of phase two of the proposed investigation will be changed in recognition of newly published literature on the relative effectiveness of treatments for schizophrenia. It is suggested that this literature supports the notion that guidelines serve a valuable function as decision tools, and substantiates the importance of decision-making as the means by which general principles are incorporated into clinical practice.
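    A fully crossed design of the kind described above can be enumerated directly. Note that 2 × 2 × 2 × 4 yields 32 unique factor-level combinations, so the sixty-four vignettes presumably present each combination more than once (the abstract does not say). The factor names below are placeholders, not the study's actual variables:

    ```python
    from itertools import product

    # Placeholder factors: three two-level guideline criteria and one
    # four-level contextual factor (the real variables are unspecified).
    factors = {
        "criterion_a": [0, 1],
        "criterion_b": [0, 1],
        "criterion_c": [0, 1],
        "context": [1, 2, 3, 4],
    }

    # A fully crossed design pairs every level of every factor.
    combinations = [dict(zip(factors, levels))
                    for levels in product(*factors.values())]
    print(len(combinations))  # 32 unique combinations
    ```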

    Do physician outcome judgments and judgment biases contribute to inappropriate use of treatments? Study protocol

    Background: There are many examples of physicians using treatments inappropriately, despite clear evidence about the circumstances under which the benefits of such treatments outweigh their harms. When such over- or under-use of treatments occurs for common diseases, the burden to the healthcare system and risks to patients can be substantial. We propose that a major contributor to inappropriate treatment may be how clinicians judge the likelihood of important treatment outcomes, and how these judgments influence their treatment decisions. The current study will examine the role of judged outcome probabilities and other cognitive factors in the context of two clinical treatment decisions: 1) prescription of antibiotics for sore throat, where we hypothesize that overestimation of benefit and underestimation of harm leads to over-prescription of antibiotics; and 2) initiation of anticoagulation for patients with atrial fibrillation (AF), where we hypothesize that underestimation of benefit and overestimation of harm leads to under-prescription of warfarin.
    Methods: For each of the two conditions, we will administer surveys of two types (Type 1 and Type 2) to different samples of Canadian physicians. The primary goal of the Type 1 survey is to assess physicians' perceived outcome probabilities (both good and bad outcomes) for the target treatment. Type 1 surveys will assess judged outcome probabilities in the context of a representative patient, and include questions about how physicians currently treat such cases, the recollection of rare or vivid outcomes, as well as practice and demographic details. The primary goal of the Type 2 surveys is to measure the specific factors that drive individual clinical judgments and treatment decisions, using a 'clinical judgment analysis' or 'lens modeling' approach. This survey will manipulate eight clinical variables across a series of sixteen realistic case vignettes. Based on the survey responses, we will be able to identify which variables have the greatest effect on physician judgments, and whether judgments are affected by inappropriate cues or incorrect weighting of appropriate cues. We will send antibiotics surveys to family physicians (300 per survey), and warfarin surveys to both family physicians and internal medicine specialists (300 per group per survey), for a total of 1,800 physicians. Each Type 1 survey will be two to four pages in length and take about fifteen minutes to complete, while each Type 2 survey will be eight to ten pages in length and take about thirty minutes to complete.
    Discussion: This work will provide insight into the extent to which clinicians' judgments about the likelihood of important treatment outcomes explain inappropriate treatment decisions. It will also provide information necessary for the development of an individualized feedback tool designed to improve treatment decisions. The techniques developed here have the potential to be applicable to a wide range of clinical areas where inappropriate utilization stems from biased judgments.
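    Lens modeling of the kind described above is typically operationalized by regressing each physician's judgments on the manipulated cue values, so that the fitted coefficients estimate the weight each cue carries in that physician's judgment policy. A minimal sketch with simulated data (the cue values, weights, and noise level are invented for illustration, not drawn from the study):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated responses: 16 vignettes, each defined by 8 binary cues.
    cues = rng.integers(0, 2, size=(16, 8)).astype(float)

    # A hypothetical "true" judgment policy: only some cues matter.
    true_weights = np.array([2.0, 1.5, 0.0, 0.5, 0.0, 0.0, 1.0, 0.0])
    judgments = cues @ true_weights + rng.normal(scale=0.1, size=16)

    # Ordinary least squares recovers the weight of each cue; near-zero
    # estimates flag cues the (simulated) physician effectively ignores.
    X = np.column_stack([np.ones(16), cues])  # intercept + 8 cue columns
    weights, *_ = np.linalg.lstsq(X, judgments, rcond=None)
    print(np.round(weights[1:], 1))  # estimated cue weights
    ```

    With real survey responses the same regression, run per physician, would reveal reliance on inappropriate cues or mis-weighting of appropriate ones.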

    Identification of gene targets against dormant phase Mycobacterium tuberculosis infections

    Background: Mycobacterium tuberculosis, the causative agent of tuberculosis (TB), infects approximately 2 billion people worldwide and is the leading cause of mortality due to infectious disease. Current TB therapy involves a regimen of four antibiotics taken over a six-month period. Patient compliance, cost of drugs, and the increasing incidence of drug-resistant M. tuberculosis strains have added urgency to the development of novel TB therapies. Eradication of TB is hindered by the ability of the bacterium to survive for up to decades in a dormant state, primarily in hypoxic granulomas in the lung, and to cause recurrent infections.
    Methods: The availability of M. tuberculosis genome-wide DNA microarrays has led to the publication of several gene expression studies under simulated dormancy conditions. However, no single model best replicates the conditions of human pathogenicity. In order to identify novel TB drug targets, we performed a meta-analysis of multiple published datasets from gene expression DNA microarray experiments that modeled infection leading to and including the dormant state, along with data from genome-wide insertional mutagenesis that examined gene essentiality.
    Results: Based on the analysis of these datasets following normalization, several genome-wide trends were identified and used to guide the selection of targets for therapeutic development. The trends included the significant up-regulation of genes controlled by devR, down-regulation of protein and ATP synthesis, and the adaptation of two-carbon metabolism to the hypoxic and nutrient-limited environment of the granuloma. Promising targets for drug discovery were several regulatory elements (devR/devS, relA, mprAB), enzymes involved in redox balance and respiration, sulfur transport and fixation, and pantothenate, isoprene, and NAD biosynthesis. The advantages and liabilities of each target are discussed in the context of enzymology, bacterial pathways, target tractability, and drug development.
    Conclusion: Based on our bioinformatics analysis and additional discussion of in-depth biological rationale, several novel anti-TB targets have been proposed as potential opportunities to improve present therapeutic treatments for this disease.