
    Random survival forests

    We introduce random survival forests, a random forests method for the analysis of right-censored survival data. New survival splitting rules for growing survival trees are introduced, as is a new missing data algorithm for imputing missing data. A conservation-of-events principle for survival forests is introduced and used to define ensemble mortality, a simple interpretable measure of mortality that can be used as a predicted outcome. Several illustrative examples are given, including a case study of the prognostic implications of body mass for individuals with coronary artery disease. Computations for all examples were implemented using the freely available R software package, randomSurvivalForest. Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org), http://dx.doi.org/10.1214/08-AOAS169.
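    The paper's own software is the R package randomSurvivalForest; purely as an illustration of the same idea, the sketch below fits a random survival forest on toy right-censored data with the scikit-survival library (an independent Python implementation, assumed to be installed) and uses its predicted risk score as a rough stand-in for the ensemble mortality described above.

```python
# Minimal sketch (not the authors' implementation): a random survival forest
# on toy right-censored data using scikit-survival, assumed to be installed.
import numpy as np
from sksurv.ensemble import RandomSurvivalForest
from sksurv.util import Surv

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                      # covariates
time = rng.exponential(scale=10.0, size=200)       # follow-up times
event = rng.random(200) < 0.7                      # True = event observed, False = censored

y = Surv.from_arrays(event=event, time=time)       # structured (event, time) outcome array

rsf = RandomSurvivalForest(n_estimators=100, min_samples_leaf=10, random_state=0)
rsf.fit(X, y)

# Per-subject risk score (sum of the ensemble cumulative hazard); higher values
# indicate a worse expected outcome, analogous in spirit to ensemble mortality.
risk = rsf.predict(X)
surv_curves = rsf.predict_survival_function(X[:5])  # ensemble survival curves
```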

    Analysis of morbid events and risk factors for death after cardiac transplantation

    Risk factors for death after cardiac transplantation performed at the University of Alabama at Birmingham from January 1981 to July 1985 included (by multivariate analysis) higher calculated preoperative pulmonary vascular resistance (early and constant phases), morphology of cardiomyopathy (versus ischemic heart disease) (constant phase only), and black race (constant phase). Overall actuarial survival was 71% at 1 year and 48% at 3 years (including the azathioprine and cyclosporine eras). The hazard function for death was highest immediately after operation and declined rapidly thereafter, merging with a constant phase of risk at about 3 months. The most favorable group for long-term survival was white patients with ischemic heart disease and low pulmonary vascular resistance. When such patients had a pulmonary vascular resistance < 3 units·m2, the 3-year survival rate exceeded 85%. The most common causes of death were acute rejection (24%) and infection (17%). The risk of infection remained highest during the first several months after any period of augmented immunosuppression.
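    The shape described above, an early hazard that is highest immediately after operation and merges into a constant late risk, can be written, purely as an illustration and not as the study's exact parametrization, as an additive two-phase hazard; the symbols lambda_0, rho, and c below are hypothetical parameters, not values reported in the study.

```latex
% Illustrative two-phase hazard: an early phase decaying at rate \rho plus a
% constant late phase c (parameters are hypothetical, not study values).
\[
  h(t) = \underbrace{\lambda_0\, e^{-\rho t}}_{\text{early phase}}
       + \underbrace{c}_{\text{constant phase}}, \qquad \rho > 0,
\]
\[
  S(t) = \exp\!\left(-\int_0^t h(u)\,du\right)
       = \exp\!\left(-\frac{\lambda_0}{\rho}\bigl(1 - e^{-\rho t}\bigr) - c\,t\right).
\]
```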

    Geriatric Hip Fracture Quality Initiative

    Introduction: Multiple studies demonstrate increased morbidity, mortality, and loss of independence after hip fractures in geriatric patients. The 1-year mortality rate after a hip fracture has been estimated at anywhere from 14% to 58%. Hip fractures are one of the most common injuries evaluated by the UNM Orthopedic department. Geriatric hip fracture protocols at many other centers have shown improved outcomes with regard to functionality and decreased morbidity. The goal of this initiative is to improve outcomes with regard to length of hospital stay and functionality after surgery and, as a result, to decrease morbidity and mortality. Materials/methods: All deaths in the orthopedic department from June 2009 to July 2019 were reviewed and analyzed. Deaths were identified from morbidity and mortality submissions and NSQIP data. The geriatric hip fracture protocol was developed and implemented in Fall 2019, with non-critical care patients admitted primarily to orthopedics with hospitalist co-management. Specific postoperative and pain order sets were developed for efficiency and an improved standard of care. Results: Early results of the newly developed geriatric hip fracture protocol demonstrate decreased length of hospital stay and earlier time to surgical intervention. It is too early to determine whether morbidity and mortality have decreased; however, this can be anticipated given earlier time to surgery and decreased time in the hospital. Conclusions: We identified a need and successfully developed an initiative to improve care for geriatric patients with hip fractures. Implementation of this protocol decreased length of hospital stay as well as time to surgery. Analysis of the effect of this protocol on overall morbidity and mortality is ongoing.

    Does right thoracotomy increase the risk of mitral valve reoperation?

    Objective: The study objective was to determine whether a right thoracotomy approach increases the risk of mitral valve reoperation. Methods: Between January of 1993 and January of 2004, 2469 patients with mitral valve disease underwent 2570 reoperations (1508 replacements, 1062 repairs). The approach was median sternotomy in 2444 patients, right thoracotomy in 80 patients, and other in 46 patients. Multivariable logistic regression was used to identify factors associated with median sternotomy versus right thoracotomy, mitral valve repair versus replacement, hospital death, and stroke. Factors favoring median sternotomy (P < .03) included coronary artery bypass grafting (30% vs 2%), aortic valve replacement (39% vs 2%), tricuspid valve repair (27% vs 13%), fewer previous cardiac operations, more recent reoperation, and no prior left internal thoracic artery graft. These factors were used to construct a propensity score for risk-adjusting outcomes. Results: Hospital mortality was 6.7% (163/2444) for the median sternotomy approach and 6.3% (5/80) for the thoracotomy approach (P = .9). Risk factors (P < .04) included earlier surgery date, higher New York Heart Association class, emergency operation, multiple reoperations, and mitral valve replacement. Stroke occurred in 66 patients (2.7%) who underwent a median sternotomy and in 6 patients (7.5%) who underwent a thoracotomy (P = .006). Mitral valve replacement (vs repair) was more common in those receiving a thoracotomy (P < .04). Conclusions: Compared with median sternotomy, right thoracotomy is associated with a higher occurrence of stroke and less frequent mitral valve repair. Specific strategies for conducting the operation should be used to reduce the risk of stroke when right thoracotomy is used for mitral valve reoperation. In most instances, repeat median sternotomy, with its better exposure and greater latitude for concomitant procedures, is preferred.
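    The propensity score mentioned above is, in general, the modeled probability of receiving one approach (here, median sternotomy) given baseline factors. The sketch below shows the usual logistic-regression construction of such a score with scikit-learn on a hypothetical toy data set; the column names and values are illustrative, not study data.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical data: one row per reoperation, with the factors the abstract
# lists as favoring median sternotomy (names and values are illustrative).
df = pd.DataFrame({
    "sternotomy":  [1, 1, 0, 1, 0, 1],   # 1 = median sternotomy, 0 = right thoracotomy
    "cabg":        [1, 0, 0, 1, 0, 1],   # concomitant coronary artery bypass grafting
    "avr":         [1, 1, 0, 0, 0, 1],   # concomitant aortic valve replacement
    "tv_repair":   [0, 1, 0, 1, 0, 0],   # concomitant tricuspid valve repair
    "n_prior_ops": [1, 2, 1, 1, 3, 2],   # number of previous cardiac operations
    "prior_lita":  [0, 0, 1, 0, 1, 0],   # prior left internal thoracic artery graft
})

X = df.drop(columns="sternotomy")
model = LogisticRegression(max_iter=1000).fit(X, df["sternotomy"])

# Propensity score: modeled probability of receiving median sternotomy.
# In practice this score is then used to risk-adjust (match, stratify, or
# include as a covariate) when comparing outcomes between the approaches.
df["propensity"] = model.predict_proba(X)[:, 1]
print(df[["sternotomy", "propensity"]])
```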

    Adverse events during reoperative cardiac surgery: Frequency, characterization, and rescue

    Objectives: To (1) determine the frequency of and risk factors for intraoperative adverse events (IAE) during reoperative cardiac surgery, (2) characterize them with respect to structure injured, timing, and use of preventive strategies, and (3) identify their impact on outcome in terms of successful and unsuccessful rescue and cost. Methods: Operative notes of 1847 patients undergoing reoperative cardiac surgery were reviewed to identify and characterize documented intraoperative adverse events. Logistic regression modeling was used to identify risk factors for intraoperative adverse events and outcomes. Expected versus observed poor outcomes (stroke, myocardial infarction, death) were compared to measure rescue. Results: Intraoperative adverse events occurred in 127 patients (7%), 145 events in total. These included injuries to bypass grafts (n = 47), heart (n = 38), and great vessels (n = 28), and ischemia without graft injury (n = 22). Most occurred on opening (n = 34, 23%) and during prebypass dissection (n = 57, 39%). Risk incremented as the number of reoperations increased. Seventy-seven patients experienced 1 or more lapses in preventive strategies. Patients with intraoperative adverse events had a greater number of poor outcomes (n = 24 [19%] vs n = 107 [6.2%]; P < .0001) and incurred higher direct technical intraoperative and postoperative costs (ratio 1.3). Twelve patients with intraoperative adverse events were predicted to have poor outcomes versus 24 who did (P < .0001), indicating 12 “failures to rescue.” Conclusions: Adverse events still occur regularly during cardiac reoperation, are related to the complexity of the procedure, occur particularly during dissection, and often occur when preventive strategies have not been used. Compensatory rescue measures are not always successful. Adverse events lead to poor patient outcomes and higher cost.
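    One simple way to express the expected-versus-observed comparison used above to quantify rescue is to sum model-predicted probabilities into an expected event count and compare it with the observed count. The sketch below uses hypothetical numbers, not study data, and is only one way to frame "failure to rescue", not necessarily the study's exact calculation.

```python
import numpy as np

# Hypothetical predicted probabilities of a poor outcome (stroke, MI, or death)
# from a logistic regression model, for patients who had an intraoperative
# adverse event, alongside what actually happened (1 = poor outcome occurred).
predicted_prob = np.array([0.05, 0.10, 0.20, 0.08, 0.30, 0.15])
observed       = np.array([0,    1,    1,    0,    1,    1])

expected_events = predicted_prob.sum()   # model-expected number of poor outcomes
observed_events = observed.sum()         # poor outcomes that actually occurred

# Excess beyond expectation: events over and above what baseline risk predicted.
excess = observed_events - expected_events
print(f"expected {expected_events:.1f}, observed {observed_events}, excess {excess:.1f}")
```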

    Clostridium difficile infection after cardiac surgery: Prevalence, morbidity, mortality, and resource utilization

    Objective: Despite increasing efforts to prevent infection, the prevalence of hospital-associated Clostridium difficile infection (CDI) is increasing. Heightened awareness prompted this study of the prevalence and morbidity associated with CDI after cardiac surgery. Methods: A total of 22,952 patients underwent cardiac surgery at Cleveland Clinic from January 2005 to January 2011. CDI was diagnosed by enzyme immunoassay for toxins and, more recently, polymerase chain reaction (PCR) testing. Hospital outcomes and long-term survival were compared with those of the remaining population in propensity-matched groups. Results: One hundred forty-five patients (0.63%) tested positive for CDI at a median of 9 days postoperatively, 135 by enzyme immunoassay and 11 by PCR. Its prevalence more than doubled over the study period. Seventy-seven patients (48%) were transfers from outside hospitals. Seventy-three patients (50%) were exposed preoperatively to antibiotics and 79 (56%) to proton-pump inhibitors. Patients with CDI had more baseline comorbidities, had more reoperations, and received more blood products than patients without CDI. Presenting symptoms included diarrhea (107; 75%), distended abdomen (48; 34%), and abdominal pain (27; 19%). All were treated with metronidazole or vancomycin. Sixteen patients (11%) died in hospital, including 5 of 10 who developed toxic colitis; 3 of 4 undergoing total colectomy survived. Among matched patients, those with CDI had more septicemia (P < .0001), renal failure (P = .0002), reoperations (P < .0001), and prolonged postoperative ventilation (P < .0001), a longer hospital stay (P < .0001), and lower 3-year survival (52% vs 64%, P = .03) than patients without CDI. Conclusions: Although CDI remains rare, its prevalence is increasing, and it contributes importantly to morbidity and mortality after cardiac surgery. If toxic colitis develops, mortality is high, but colectomy may be lifesaving.
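    Propensity-matched comparison, as used above, pairs each CDI patient with a non-CDI patient who had a similar modeled probability of developing CDI. The sketch below shows a minimal greedy 1:1 nearest-neighbor matching on hypothetical scores; it is an illustration of the general technique, not the study's matching algorithm or data.

```python
import numpy as np

# Hypothetical propensity scores (modeled probability of CDI given baseline
# covariates) for CDI cases and for the remaining surgical population.
case_ps    = np.array([0.12, 0.30, 0.55])
control_ps = np.array([0.05, 0.11, 0.29, 0.33, 0.52, 0.60, 0.58])

# Greedy 1:1 nearest-neighbor matching without replacement on the propensity
# score; each case is paired with the closest still-unmatched control.
available = list(range(len(control_ps)))
pairs = []
for i, ps in enumerate(case_ps):
    j = min(available, key=lambda k: abs(control_ps[k] - ps))
    pairs.append((i, j))
    available.remove(j)

print(pairs)   # [(0, 1), (1, 2), (2, 4)] for the toy scores above
```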

    An improved constraint satisfaction adaptive neural network for job-shop scheduling

    This paper presents an improved constraint satisfaction adaptive neural network for job-shop scheduling problems. The neural network is constructed based on the constraint conditions of a job-shop scheduling problem. Its structure and neuron connections can change adaptively according to the real-time constraint satisfaction situations that arise during the solving process. Several heuristics are also integrated within the neural network to accelerate its convergence and improve the quality of the solutions produced. An experimental study on a set of benchmark job-shop scheduling problems shows that the improved constraint satisfaction adaptive neural network outperforms the original constraint satisfaction adaptive neural network in terms of computational time and the quality of the schedules it produces. The neural network approach is also experimentally shown to outperform three classical heuristic algorithms that are widely used as the basis of many state-of-the-art scheduling systems; hence, it may also be used to construct advanced job-shop scheduling systems. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant EP/E060722/01, in part by the National Natural Science Foundation of China under Grant 60821063, and in part by the National Basic Research Program of China under Grant 2009CB320601.
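    The constraint conditions such a network encodes are the standard job-shop constraints: each job's operations must run in their given order, and operations assigned to the same machine must not overlap. The sketch below is only a plain feasibility check for these two constraint types on a toy instance (data layout and helper names are illustrative); it is not the paper's adaptive neural network.

```python
# Minimal job-shop feasibility check: counts violations of (1) precedence of
# operations within each job and (2) no overlap of operations on a machine.

# Each job is an ordered list of (machine, processing_time).
jobs = [
    [(0, 3), (1, 2)],   # job 0: machine 0 for 3 time units, then machine 1 for 2
    [(1, 4), (0, 1)],   # job 1: machine 1 for 4 time units, then machine 0 for 1
]

# A candidate schedule: start[(job, op_index)] = start time of that operation.
start = {(0, 0): 0, (0, 1): 4, (1, 0): 0, (1, 1): 4}

def violations(jobs, start):
    count = 0
    # Precedence constraints: each operation starts after the previous one ends.
    for j, ops in enumerate(jobs):
        for k in range(1, len(ops)):
            if start[(j, k)] < start[(j, k - 1)] + ops[k - 1][1]:
                count += 1
    # Resource constraints: operations on the same machine must not overlap.
    on_machine = {}
    for j, ops in enumerate(jobs):
        for k, (m, p) in enumerate(ops):
            on_machine.setdefault(m, []).append((start[(j, k)], start[(j, k)] + p))
    for intervals in on_machine.values():
        intervals.sort()
        for (s1, e1), (s2, e2) in zip(intervals, intervals[1:]):
            if s2 < e1:
                count += 1
    return count

print(violations(jobs, start))   # 0 for a feasible schedule
```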

    Pretransplant gastroesophageal reflux compromises early outcomes after lung transplantation

    Objectives: Gastroesophageal reflux disease (GERD) is implicated as a risk factor for bronchiolitis obliterans syndrome after lung transplantation, but its effects on acute rejection, early allograft function, and survival are unclear. We therefore sought to systematically understand the time-related impact of pretransplant GERD on graft function (spirometry), mortality, and acute rejection early after lung transplantation. Methods: From January 2005 to July 2008, 215 patients underwent lung transplantation; 114 had preoperative pH testing, and 32 (28%) had objective evidence of GERD. Lung function was assessed by forced expiratory volume in 1 second (FEV1, percent of predicted) in 97 patients, mortality by follow-up (median, 2.2 years), and acute rejection by transbronchial biopsy. Results: Pretransplant GERD was associated with decreased FEV1 early after lung transplantation (P = .01), such that by 18 months FEV1 was 70% of predicted in double lung transplant patients with GERD versus 83% in non-GERD patients (P = .05). A similar decrease was observed after single lung transplantation (50% vs 60%, respectively; P = .09). GERD patients had lower survival early after transplant (75% vs 90%, P = .02). Presence of GERD did not affect acute rejection (P = .6). Conclusions: For lung transplant recipients, pretransplant GERD is associated with worse early allograft function and survival, but not with increased acute rejection. The compromise in lung function is substantial, such that FEV1 after double lung transplant in GERD patients approaches that after single lung transplant in non-GERD patients. We advocate thorough testing for GERD before lung transplantation; if GERD is identified, aggressive therapy early after transplant, including fundoplication, may prove efficacious.

    Lysosomal abnormalities in hereditary spastic paraplegia types SPG15 and SPG11

    Objective: Hereditary spastic paraplegias (HSPs) are among the most genetically diverse inherited neurological disorders, with over 70 disease loci identified (SPG1-71) to date. SPG15 and SPG11 are clinically similar, autosomal recessive disorders characterized by progressive spastic paraplegia along with thin corpus callosum, white matter abnormalities, cognitive impairment, and ophthalmologic abnormalities. Furthermore, both have been linked to early-onset parkinsonism. Methods: We describe two new cases of SPG15 and investigate cellular changes in SPG15 and SPG11 patient-derived fibroblasts, seeking to identify shared pathogenic themes. Cells were evaluated for any abnormalities in cell division, DNA repair, endoplasmic reticulum, endosomes, and lysosomes. Results: Fibroblasts prepared from patients with SPG15 have selective enlargement of LAMP1-positive structures, and they consistently exhibited abnormal lysosomal storage by electron microscopy. A similar enlargement of LAMP1-positive structures was also observed in cells from multiple SPG11 patients, though prominent abnormal lysosomal storage was not evident. The stabilities of the SPG15 protein spastizin/ZFYVE26 and the SPG11 protein spatacsin were interdependent. Interpretation: Emerging studies implicating these two proteins in interactions with the late endosomal/lysosomal adaptor protein complex AP-5 are consistent with shared abnormalities in lysosomes, supporting a converging mechanism for these two disorders. Recent work with Zfyve26−/− mice revealed a similar phenotype to human SPG15, and cells in these mice had endolysosomal abnormalities. SPG15 and SPG11 are particularly notable among HSPs because they can also present with juvenile parkinsonism, and this lysosomal trafficking or storage defect may be relevant for other forms of parkinsonism associated with lysosomal dysfunction.

    Ethical issues in implementation research: a discussion of the problems in achieving informed consent

    Background: Improved quality of care is a policy objective of health care systems around the world. Implementation research is the scientific study of methods to promote the systematic uptake of clinical research findings into routine clinical practice, and hence to reduce inappropriate care. It includes the study of influences on healthcare professionals' behaviour and of methods to enable them to use research findings more effectively. Cluster randomized trials represent the optimal design for evaluating the effectiveness of implementation strategies. Various codes of medical ethics, such as the Nuremberg Code and the Declaration of Helsinki, inform medical research, but their relevance to cluster randomized trials in implementation research is unclear. This paper discusses the applicability of various ethical codes to obtaining consent in cluster trials in implementation research. Discussion: The appropriate application of biomedical codes to implementation research is not obvious. Discussion of the nature and practice of informed consent in implementation research cluster trials must consider the levels at which consent can be sought, and for what purpose it can be sought. The level at which an intervention is delivered can render the idea of patient-level consent meaningless. Careful consideration of the ownership of information, and of rights of access to and exploitation of data, is required. For health care professionals and organizations, there is a balance between clinical freedom and the responsibility to participate in research. Summary: While the ethical justification for clinical trials relies heavily on individual consent, for implementation research, aspects of distributive justice, economics, and political philosophy underlie the debate. Societies may need to trade off individualized consent against valid implementation research. We suggest that social sciences codes could usefully inform the consideration of implementation research by members of Research Ethics Committees.