
    The impact of repeat hospitalizations on hospitalization rates for selected conditions among adults with and without diabetes, 12 US states, 2011

    Introduction: Hospitalization data typically cannot be used to estimate the number of individuals hospitalized annually because individuals are not tracked over time and may be hospitalized multiple times per year. We examined the impact of repeat hospitalizations on hospitalization rates for various conditions and on comparisons of rates by diabetes status.
    Methods: We analyzed hospitalization data in which repeat hospitalizations could be distinguished among adults aged 18 or older in 12 states, using the Agency for Healthcare Research and Quality's 2011 State Inpatient Databases. The Behavioral Risk Factor Surveillance System was used to estimate the number of adults with and without diagnosed diabetes in each state (the denominator). We calculated the percentage increase in rates due to repeat hospitalizations and compared the ratio of diabetes to non-diabetes rates both excluding and including repeat hospitalizations.
    Results: Regardless of diabetes status, hospitalization rates were considerably higher when repeat hospitalizations within a calendar year were included, and the magnitude of the difference varied by condition. Among adults with diabetes, rates ranged from 13.0% higher for stroke to 41.6% higher for heart failure; among adults without diabetes, rates ranged from 9.5% higher for stroke to 25.2% higher for heart failure. Ratios of diabetes to non-diabetes rates were similar with and without repeat hospitalizations.
    Conclusion: Hospitalization rates that include repeat hospitalizations overestimate the rate of individuals hospitalized, and this overestimation is especially pronounced for some conditions. However, including repeat hospitalizations for common diabetes-related conditions had little impact on comparisons of rates by diabetes status.
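    As a rough illustration of the rate comparison described above, the following Python sketch (with entirely hypothetical counts, not the study's data) computes a condition-specific rate excluding and including repeat hospitalizations, the resulting percentage increase, and the diabetes to non-diabetes rate ratio under both definitions.

        # Illustrative sketch, not the study's code or data: including repeat
        # hospitalizations inflates condition-specific rates but leaves the
        # diabetes vs. non-diabetes rate ratio roughly unchanged.

        def rate_per_1000(hospitalizations, population):
            """Hospitalizations per 1,000 adults."""
            return 1000.0 * hospitalizations / population

        # Hypothetical counts for one condition in one state.
        diabetes_pop, nondiabetes_pop = 250_000, 2_750_000
        unique_dm, repeat_dm = 4_000, 1_400   # first vs. repeat stays, adults with diabetes
        unique_nd, repeat_nd = 9_000, 2_000   # adults without diabetes

        rate_dm_excl = rate_per_1000(unique_dm, diabetes_pop)
        rate_dm_incl = rate_per_1000(unique_dm + repeat_dm, diabetes_pop)
        rate_nd_excl = rate_per_1000(unique_nd, nondiabetes_pop)
        rate_nd_incl = rate_per_1000(unique_nd + repeat_nd, nondiabetes_pop)

        print(f"Increase due to repeats, diabetes:    {100 * (rate_dm_incl / rate_dm_excl - 1):.1f}%")
        print(f"Increase due to repeats, no diabetes: {100 * (rate_nd_incl / rate_nd_excl - 1):.1f}%")
        print(f"Rate ratio excluding repeats: {rate_dm_excl / rate_nd_excl:.2f}")
        print(f"Rate ratio including repeats: {rate_dm_incl / rate_nd_incl:.2f}")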

    The operationalized psychodynamic diagnostics system. Clinical relevance, reliability and validity

    In this paper, we present a multiaxial system for psychodynamic diagnosis that has gained wide use in Germany over the last 10 years. We first discuss the four axes of the operationalized psychodynamic diagnostics (OPD) system: illness experience and treatment assumptions, relationships, mental conflicts, and structure. We then outline clinical applications; focal psychodynamic formulations can be employed with both inpatients and outpatients. Studies show good reliability in a research context and acceptable reliability for clinical purposes. Validity is summarized separately as content, criterion, and construct validity; validity studies indicate good validity for the individual axes. Numerous studies on the OPD also indicate areas of possible improvement; for example, for clinical purposes the OPD should be formulated more practically.

    Use of re-randomized data in meta-analysis

    BACKGROUND: Outcomes collected in randomized clinical trials are observations of random variables that should be independent and identically distributed. In some trials, however, patients are randomized more than once, violating both of these assumptions. The probability of an event is not always the same when a patient is re-randomized, and there is probably a non-zero covariance arising from observations on the same patient. This is of particular importance to meta-analysts.
    METHODS: We developed a method to estimate the relative error in the risk differences with and without re-randomization of patients. The relative error can be estimated by an expression depending on the percentage of patients who were re-randomized, multipliers (how many times more likely it is to repeat an event) for the probability of recurrences, and the ratio of the total events reported to the initial number of patients entering the trial.
    RESULTS: We illustrate our methods using two randomized trials testing growth factors in febrile neutropenia. We show that under some circumstances the relative error of taking re-randomized patients into account is sufficiently small to allow using the results in a meta-analysis. Our findings indicate that if the study in question is of similar size to the other studies included in the meta-analysis, the error introduced by re-randomization will only minimally affect the meta-analytic summary point estimate. We also show that in our model the risk ratio remains constant under re-randomization; therefore, if a meta-analyst is concerned about the effect of re-randomization on the meta-analysis, one way to sidestep the issue and still obtain reliable results is to use the risk ratio as the measure of interest.
    CONCLUSION: Our method should be helpful in understanding the results of clinical trials and particularly helpful to meta-analysts in assessing whether re-randomized patient data can be used in their analyses.
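    The toy calculation below (Python) is not the authors' derivation; it only illustrates the risk-ratio claim under one simple assumption: if the same fraction of entries in each arm come from re-randomized patients, and their per-entry event probability is scaled by the same multiplier in both arms, then each arm's observed risk is inflated by a common factor, so the risk ratio is unchanged while the risk difference is not.

        # Toy model (assumed, not taken from the paper): a fraction q of entries in
        # each arm are re-randomized patients whose per-entry event probability is
        # scaled by a multiplier m. The observed per-arm risk becomes
        # r * (1 - q + q * m), so the risk ratio is unchanged while the risk
        # difference is scaled by (1 - q + q * m).

        def observed_risk(base_risk, q_rerandomized, multiplier):
            """Per-entry event risk when a fraction q of entries are re-randomized."""
            return base_risk * (1 - q_rerandomized + q_rerandomized * multiplier)

        r_treatment, r_control = 0.20, 0.30   # hypothetical base risks
        q, m = 0.15, 2.0                      # 15% re-randomized entries, events twice as likely

        rt = observed_risk(r_treatment, q, m)
        rc = observed_risk(r_control, q, m)

        print(f"Risk ratio without re-randomization: {r_treatment / r_control:.3f}")
        print(f"Risk ratio with re-randomization:    {rt / rc:.3f}")        # identical
        print(f"Risk difference without re-randomization: {r_treatment - r_control:.3f}")
        print(f"Risk difference with re-randomization:    {rt - rc:.3f}")   # inflated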

    Force-dependent focal adhesion assembly and disassembly: A computational study

    Cells interact with the extracellular matrix (ECM) via cell–ECM adhesions. These physical interactions are transduced into biochemical signals inside the cell that influence cell behaviour. Although cell–ECM interactions have been studied extensively, it is not completely understood how immature (nascent) adhesions develop into mature (focal) adhesions and how mechanical forces influence this process. Given the small size, dynamic nature and short lifetimes of nascent adhesions, studying them using conventional microscopic and experimental techniques is challenging. Computational modelling provides a valuable resource for simulating and exploring various “what if?” scenarios in silico and identifying key molecular components and mechanisms for further investigation. Here, we present a simplified mechano-chemical model based on ordinary differential equations with three major proteins involved in adhesions: integrins, talin and vinculin. Additionally, we incorporate a hypothetical signal molecule that influences adhesion (dis)assembly rates. We find that assembly and disassembly rates need to vary dynamically to limit maturation of nascent adhesions. The model predicts biphasic variation of actin retrograde velocity and maturation fraction with substrate stiffness, with maturation fractions of 18–35%, optimal stiffness of ∼1 pN/nm, and a mechanosensitive range of 1–100 pN/nm, all corresponding to key experimental findings. Sensitivity analyses show robustness of outcomes to small changes in parameter values, allowing model tuning to reflect specific cell types and signaling cascades. The model proposes that signal-dependent disassembly rate variations play an underappreciated role in maturation fraction regulation, which should be investigated further. We also provide predictions on the changes in traction force generation under increased/decreased vinculin concentrations, complementing previous vinculin overexpression/knockout experiments in different cell types. In summary, this work proposes a model framework to robustly simulate the mechano-chemical processes underlying adhesion maturation and maintenance, thereby enhancing our fundamental knowledge of cell–ECM interactions.
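    The abstract does not give the authors' equations or parameter values; the sketch below (Python with SciPy) is only a hypothetical, minimal ODE system of the same general flavour (bound integrin, talin recruited to it, and vinculin binding at a stiffness-dependent rate), meant to illustrate what such a mechano-chemical adhesion model can look like.

        # Hypothetical minimal adhesion model, not the authors' equations or parameters:
        # fractions of bound integrin (I), talin recruited to bound integrin (T) and
        # vinculin bound to stretched talin (V), with an assumed saturating
        # stiffness dependence of the vinculin binding rate.
        import numpy as np
        from scipy.integrate import solve_ivp

        def adhesion_odes(t, y, k_on, k_off, k_t, k_v0, k_vd, stiffness):
            I, T, V = y
            k_v = k_v0 * stiffness / (1.0 + stiffness)   # assumed force dependence
            dI = k_on * (1.0 - I) - k_off * I            # integrin binding/unbinding
            dT = k_t * (I - T)                           # talin tracks bound integrin
            dV = k_v * T * (1.0 - V) - k_vd * V          # vinculin binds stretched talin
            return [dI, dT, dV]

        params = (0.5, 0.2, 1.0, 2.0, 0.5, 1.0)          # illustrative rates, stiffness ~1 pN/nm
        sol = solve_ivp(adhesion_odes, (0.0, 60.0), [0.0, 0.0, 0.0], args=params,
                        t_eval=np.linspace(0.0, 60.0, 200))
        print("Approximate steady-state fractions (I, T, V):", np.round(sol.y[:, -1], 3))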

    Increased insolation threshold for runaway greenhouse processes on Earth-like planets

    Because the solar luminosity increases over geological timescales, Earth's climate is expected to warm, increasing water evaporation, which, in turn, enhances the atmospheric greenhouse effect. Above a certain critical insolation, this destabilizing greenhouse feedback can "runaway" until all the oceans are evaporated. Through increases in stratospheric humidity, warming may also cause ocean water to escape to space before the runaway greenhouse occurs. The critical insolation thresholds for these processes, however, remain uncertain because they have so far been evaluated with unidimensional models that cannot account for the dynamical and cloud feedback effects that are key stabilizing features of Earth's climate. Here we use a 3D global climate model to show that the threshold for the runaway greenhouse is about 375 W/m², significantly higher than previously thought. Our model is specifically developed to quantify the climate response of Earth-like planets to increased insolation in hot and extremely moist atmospheres. In contrast with previous studies, we find that clouds have a destabilizing feedback on the long-term warming. However, subsident, unsaturated regions created by the Hadley circulation have a stabilizing effect that is strong enough to defer the runaway greenhouse limit to higher insolation than inferred from 1D models. Furthermore, because of wavelength-dependent radiative effects, the stratosphere remains cold and dry enough to hamper atmospheric water escape, even at large fluxes. This has strong implications for Venus's early water history and extends the size of the habitable zone around other stars. Comment: Published in Nature. Online publication date: December 12, 2013. Accepted version before journal editing and with Supplementary Information.

    Evolutionary distances in the twilight zone -- a rational kernel approach

    Phylogenetic tree reconstruction is traditionally based on multiple sequence alignments (MSAs) and heavily depends on the validity of this information bottleneck. With increasing sequence divergence, the quality of MSAs decays quickly. Alignment-free methods, on the other hand, are based on abstract string comparisons and avoid potential alignment problems. However, in general they are not biologically motivated and ignore our knowledge about the evolution of sequences. Thus, it is still a major open question how to define an evolutionary distance metric between divergent sequences that makes use of indel information and known substitution models without the need for a multiple alignment. Here we propose a new evolutionary distance metric to close this gap. It uses finite-state transducers to create a biologically motivated similarity score which models substitutions and indels, and does not depend on a multiple sequence alignment. The sequence similarity score is defined in analogy to pairwise alignments and additionally has the positive semi-definite property. We describe its derivation and show in simulation studies and real-world examples that it is more accurate in reconstructing phylogenies than competing methods. The result is a new and accurate way of determining evolutionary distances in and beyond the twilight zone of sequence alignments that is suitable for large datasets. Comment: To appear in PLoS ONE.
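    The transducer-based score itself is not reproduced here, but a standard way to turn any positive semi-definite pairwise similarity k into a distance (not necessarily the authors' exact construction) is the induced feature-space distance d(x, y) = sqrt(k(x, x) + k(y, y) - 2 k(x, y)). The Python sketch below uses a crude shared k-mer count as a stand-in kernel so that it runs.

        # Generic kernel-to-distance construction; the similarity function is a
        # placeholder (shared k-mer count), not the paper's transducer-based score.
        import math

        def similarity(seq_a: str, seq_b: str) -> float:
            """Stand-in positive semi-definite similarity: number of shared 3-mers."""
            k = 3
            kmers_a = {seq_a[i:i + k] for i in range(len(seq_a) - k + 1)}
            kmers_b = {seq_b[i:i + k] for i in range(len(seq_b) - k + 1)}
            return float(len(kmers_a & kmers_b))

        def kernel_distance(seq_a: str, seq_b: str) -> float:
            """Distance induced by the kernel in its feature space."""
            return math.sqrt(similarity(seq_a, seq_a) + similarity(seq_b, seq_b)
                             - 2.0 * similarity(seq_a, seq_b))

        seqs = ["ACGTTGCA", "ACGTAGCA", "TTTTCCCC"]
        for i, a in enumerate(seqs):
            for b in seqs[i + 1:]:
                print(a, b, round(kernel_distance(a, b), 3))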

    Radiation management and credentialing of fluoroscopy users

    During the last 15 years, developments in X-ray technologies have substantially improved the ability of practitioners to treat patients using fluoroscopically guided interventional techniques. Many of these procedures require a greater use of fluoroscopy and more recording of images. This increases the potential for radiation-induced dermatitis and epilation, as well as severe radiation-induced burns to patients. Many fluoroscope operators are untrained in radiation management and do not realize that these procedures increase the risk of radiation injury and radiation-induced cancer in personnel as well as patients. The hands of long-time fluoroscope operators in some cases exhibit radiation damage—especially when sound radiation protection practices have not been followed. In response, the Center for Devices and Radiological Health of the United States Food and Drug Administration has issued an Advisory calling for proper training of operators. Hospitals and administrators need to support and enforce the need for this training by requiring documentation of credentials in radiation management as a prerequisite for obtaining fluoroscopy privileges. A concerted effort on the part of professional medical organizations and regulatory agencies will be required to train fluoroscopy users and to prevent physicians from unwittingly inflicting serious radiation injuries on their patients.

    Comparison of Pittsburgh compound B and florbetapir in cross-sectional and longitudinal studies.

    Introduction: Quantitative in vivo measurement of brain amyloid burden is important for both research and clinical purposes. However, the existence of multiple imaging tracers presents challenges to the interpretation of such measurements. This study presents a direct comparison of Pittsburgh compound B-based and florbetapir-based amyloid imaging in the same participants from two independent cohorts using a crossover design.
    Methods: Pittsburgh compound B and florbetapir amyloid PET imaging data from three different cohorts were analyzed using previously established pipelines to obtain global amyloid burden measurements. These measurements were converted to the Centiloid scale to allow fair comparison between the two tracers. The mean and inter-individual variability of the two tracers were compared using multivariate linear models, both cross-sectionally and longitudinally.
    Results: Global amyloid burden measurements from the two tracers were strongly correlated in both cohorts. However, higher variability was observed when florbetapir was used as the imaging tracer. This variability may be partially caused by white matter signal, as partial volume correction reduces the variability and improves the correlations between the two tracers. Amyloid burden measured with both tracers was associated with clinical and psychometric measurements. Longitudinal comparison of the two tracers was also performed in similar but separate cohorts whose baseline amyloid load was considered elevated (i.e., amyloid positive). No significant difference was detected in the average annualized rate of change measured with the two tracers.
    Discussion: Although amyloid burden measurements were, as expected, quite similar with the two tracers, differences were observable even after conversion to the Centiloid scale. Further investigation is warranted to identify optimal strategies to harmonize amyloid imaging data acquired using different tracers.
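    For readers unfamiliar with the Centiloid scale mentioned above: it is defined by a linear rescaling that anchors a young-control mean at 0 and a typical Alzheimer's disease mean at 100 for each tracer and processing pipeline. The Python sketch below shows that generic conversion; the anchor SUVR values are placeholders, not the calibration constants used in this study.

        # Generic Centiloid-style linear conversion; the anchor values below are
        # illustrative placeholders, not this study's (or the official) calibrations.

        def to_centiloid(suvr, suvr_young_control, suvr_typical_ad):
            """Linearly map a tracer-specific SUVR onto the Centiloid scale."""
            return 100.0 * (suvr - suvr_young_control) / (suvr_typical_ad - suvr_young_control)

        # Hypothetical anchors (young-control mean, typical-AD mean) per tracer.
        pib_anchors = (1.05, 2.05)
        florbetapir_anchors = (1.00, 1.55)

        print(round(to_centiloid(1.55, *pib_anchors), 1))          # ~50 CL via the PiB pipeline
        print(round(to_centiloid(1.28, *florbetapir_anchors), 1))  # ~51 CL via the florbetapir pipeline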

    A transient homotypic interaction model for the influenza A virus NS1 protein effector domain

    Influenza A virus NS1 protein is a multifunctional virulence factor consisting of an RNA binding domain (RBD), a short linker, an effector domain (ED), and a C-terminal 'tail'. Although poorly understood, NS1 multimerization may autoregulate its actions. While RBD dimerization seems functionally conserved, two possible apo ED dimers have been proposed (helix-helix and strand-strand). Here, we analyze all available RBD, ED, and full-length NS1 structures, including four novel crystal structures obtained using EDs from divergent human and avian viruses, as well as two forms of a monomeric ED mutant. The data reveal the helix-helix interface as the only strictly conserved ED homodimeric contact. Furthermore, a mutant NS1 unable to form the helix-helix dimer is compromised in its ability to bind dsRNA efficiently, implying that ED multimerization influences RBD activity. Our bioinformatic analyses also suggest that the helix-helix interface is variable and transient, thereby allowing two ED monomers to twist relative to one another and possibly separate. In this regard, we identified a monoclonal antibody (mAb) that recognizes NS1 via a residue completely buried within the ED helix-helix interface, and which may help highlight potential different conformational populations of NS1 (putatively termed 'helix-closed' and 'helix-open') in virus-infected cells. 'Helix-closed' conformations appear to enhance dsRNA binding, and 'helix-open' conformations allow otherwise inaccessible interactions with host factors. Our data support a new model of NS1 regulation in which the RBD remains dimeric throughout infection, while the ED switches between several quaternary states in order to expand its functional space. Such a concept may be applicable to other small multifunctional proteins.