51 research outputs found

    Recurrence after correction of acquired ankle equinus deformity in children using Ilizarov technique

    The aim of this study was to describe our Ilizarov technique for the treatment of acquired equinus deformity in children and to determine whether compliance with continuous use of an ankle foot orthosis (after removal of the fixator and until skeletal maturity) can influence the severity of recurrence. A cohort of 26 children with post-traumatic or post-burn contractures producing an equinus deformity was followed up for a minimum of 2 years after skeletal maturity. Cases with a bony deformity and/or nerve injury were excluded from this study. All patients were managed by percutaneous tendo-Achilles lengthening followed by application of an Ilizarov external fixator. Post-operative treatment took the form of gradual correction at a rate of 0.5 mm per day, starting from the second postoperative day until an over-correction of 15 degrees of dorsiflexion was achieved. Ankle range of movement was encouraged from 4 weeks prior to removal of the external fixator. On removal of the fixator, a posterior splint was applied until substituted by an ankle foot orthosis (AFO). The AFO was used continuously during the first 2–3 months and at night thereafter until skeletal maturity. Fifteen children were compliant with the use of the AFO until skeletal maturity and 11 were not. We compared the rate of recurrence and the size of deformity between the two groups. The rate of recurrence, degree of equinus at recurrence, and number of episodes of external fixation surgery showed statistically significant differences (P < 0.01) between the groups. The Ilizarov technique for treatment of acquired equinus deformity secondary to soft tissue scarring is safe and effective, and the use of an AFO until skeletal maturity can decrease the risk and degree of recurrence.

    Behçet’s disease: Spectrum of MDCT chest and pulmonary angiography findings in patients with chest complaints

    Objective: The aim of this work was to evaluate the value of multi-detector computed tomography (MDCT) pulmonary angiography in the evaluation of patients with known Behçet's disease. Materials and methods: This retrospective study included eighteen patients with known Behçet's disease who were referred for MDCT pulmonary angiography. Results: Pulmonary artery aneurysm was the most common finding, seen in 16 patients, followed by pulmonary embolism in 14 patients, pulmonary hypertension in 12 patients, right ventricular strain in 6 patients, intracardiac thrombus in 4 patients, dilated bronchial arteries in 8 patients, venous occlusion in 4 patients, mosaic attenuation of the lung in 12 patients, pulmonary infarcts in 4 patients, and pleural effusion in 4 patients. Conclusion: MDCT pulmonary angiography is an important imaging tool for the diagnosis of vascular complications in patients with Behçet's disease.

    Efficient framework for brain tumor detection using different deep learning techniques

    A brain tumor is an aggressive malignancy caused by unregulated cell division. Tumors are classified using a biopsy, which is normally performed after the final brain surgery. Advances in deep learning have assisted health professionals in medical imaging for the diagnosis of several conditions. In this paper, transfer-learning-based models, in addition to a Convolutional Neural Network (CNN) called BRAIN-TUMOR-net trained from scratch, are introduced to classify brain magnetic resonance images into tumor or normal cases. A comparison between the pre-trained InceptionResNetv2, Inceptionv3, and ResNet50 models and the proposed BRAIN-TUMOR-net is presented. The performance of the proposed model is tested on three publicly available Magnetic Resonance Imaging (MRI) datasets. The simulation results show that BRAIN-TUMOR-net achieves the highest accuracy of all the models, reaching 100%, 97%, and 84.78% accuracy on the three MRI datasets. In addition, the k-fold cross-validation technique is used to ensure robust classification. Moreover, three different unsupervised clustering techniques are utilized for segmentation.
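    The k-fold cross-validation step mentioned above can be illustrated with a minimal, pure-Python sketch of how a dataset is partitioned into folds so that every scan serves exactly once as test data; the function names and the 5-fold/10-sample example are illustrative assumptions, not the authors' implementation.

```python
def k_fold_indices(n_samples, k):
    """Split sample indices 0..n_samples-1 into k roughly equal, non-overlapping folds."""
    sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def k_fold_splits(n_samples, k):
    """Yield (train_indices, test_indices) pairs, one per fold."""
    folds = k_fold_indices(n_samples, k)
    for i, test in enumerate(folds):
        train = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        yield train, test

# Example: 10 MRI scans, 5 folds; each scan appears in exactly one test fold.
for train, test in k_fold_splits(10, 5):
    assert len(test) == 2 and not set(train) & set(test)
```

    In practice the classifier (BRAIN-TUMOR-net or a pre-trained network) would be re-trained on each train split, evaluated on the held-out fold, and the k accuracies averaged.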

    GWAS revealed effect of genotype × environment interactions for grain yield of Nebraska winter wheat

    Background: Improving grain yield in cereals, especially in wheat, is a main objective for plant breeders. One of the main constraints on improving this trait is the G × E interaction (GEI), which affects the performance of wheat genotypes in different environments. Selecting high-yielding genotypes that can be used for a target set of environments is needed. Phenotypic selection can be misleading due to environmental conditions; incorporating information from both phenotypic and genomic analyses can help in selecting the higher-yielding genotypes for a group of environments. Results: A set of 270 F3:6 wheat genotypes from the Nebraska winter wheat breeding program was tested for grain yield in nine environments. High genetic variation for grain yield was found among the genotypes, and the G × E interaction was also highly significant. The highest-yielding genotype differed in each environment, and the correlation for grain yield among the nine environments was low (0 to 0.43). A genome-wide association study revealed 70 marker-trait associations (MTAs) associated with increased grain yield. The analysis of linkage disequilibrium revealed 16 genomic regions with highly significant linkage disequilibrium (LD). Candidate parent genotypes for improving grain yield in a group of environments were selected based on three criteria: the number of alleles associated with increased grain yield in each selected genotype, the genetic distance among the selected genotypes, and the number of alleles differing between each pair of selected parents. Conclusion: Although G × E interaction was present, advances in DNA technology provided very useful tools and analyses that helped to select, on a genetic basis, the highest-yielding genotypes that can be crossed to improve grain production across a group of environments.
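    The parent-selection criteria reduce to simple counting over SNP calls. The sketch below, with hypothetical marker names and allele calls, illustrates the first and third criteria (favorable-allele counts per genotype, and allele differences between a candidate parent pair); it is a minimal illustration, not the authors' pipeline.

```python
def favorable_allele_count(genotype, favorable):
    """Count loci at which a genotype carries the yield-increasing allele."""
    return sum(1 for locus, allele in genotype.items()
               if favorable.get(locus) == allele)

def allele_differences(g1, g2):
    """Number of loci at which two genotypes carry different alleles."""
    return sum(1 for locus in g1 if g1[locus] != g2[locus])

# Hypothetical biallelic SNP calls (alleles "A"/"B") at four marker loci.
favorable = {"m1": "A", "m2": "B", "m3": "A", "m4": "B"}
geno1 = {"m1": "A", "m2": "B", "m3": "B", "m4": "B"}  # carries 3 favorable alleles
geno2 = {"m1": "B", "m2": "B", "m3": "A", "m4": "A"}  # carries 2 favorable alleles
```

    A pair such as geno1 and geno2, which differ at three of the four loci, would be favored over a pair that differs at few loci, since crossing complementary parents can pyramid more of the yield-associated alleles.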

    Genetic Diversity and Population Structure of F3:6 Nebraska Winter Wheat Genotypes Using Genotyping-By-Sequencing

    The availability of information on genetic diversity and population structure in wheat (Triticum aestivum L.) breeding lines helps wheat breeders to better use their genetic resources and manage genetic variation in their breeding programs. Recent advances in sequencing technology provide the opportunity to identify tens or hundreds of thousands of single nucleotide polymorphisms (SNPs) in large-genome species such as wheat. These SNPs can be utilized for understanding genetic diversity and for performing genome-wide association studies (GWAS) for complex traits. In this study, genetic diversity and population structure were investigated in a set of 230 genotypes (F3:6) derived from various crosses, as a prerequisite for GWAS and genomic selection. Genotyping-by-sequencing provided 25,566 high-quality SNPs. The polymorphism information content (PIC) across chromosomes ranged from 0.09 to 0.37, with an average of 0.23. The number of SNP markers on the 21 chromosomes ranged from 319 on chromosome 3D to 2,370 on chromosome 3B. The analysis of population structure revealed three subpopulations (G1, G2, and G3). Analysis of molecular variance identified 8% of the variance among and 92% within subpopulations. Of the three subpopulations, G2 had the highest level of genetic diversity based on three genetic diversity indices (Shannon's information index (I) = 0.494, diversity index (h) = 0.328, and unbiased diversity index (uh) = 0.331), while G3 had the lowest (I = 0.348, h = 0.226, and uh = 0.236). The high genetic diversity identified among the subpopulations can be used to develop new wheat cultivars.
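    The diversity statistics reported here follow standard formulas; the sketch below (assuming the usual PIC formula for a biallelic SNP and Shannon's information index I = -Σ p ln p) shows how such values are computed from allele frequencies. It is an illustration of the statistics, not the authors' analysis code.

```python
import math

def pic_biallelic(p):
    """Polymorphism information content for a biallelic SNP with
    allele frequencies p and q = 1 - p:  PIC = 1 - (p^2 + q^2) - 2 p^2 q^2."""
    q = 1.0 - p
    return 1.0 - (p**2 + q**2) - 2.0 * p**2 * q**2

def shannon_index(freqs):
    """Shannon's information index I = -sum(p_i * ln p_i) over allele frequencies."""
    return -sum(p * math.log(p) for p in freqs if p > 0)

# A SNP with equal allele frequencies gives the maximum biallelic PIC of 0.375,
# consistent with the upper end (0.37) of the range reported above.
print(round(pic_biallelic(0.5), 3))  # 0.375
```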

    Identification of Prognostic Metabolomic Biomarkers at the Interface of Mortality and Morbidity in Pre-Existing TB Cases Infected With SARS-CoV-2

    Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection currently remains one of the biggest global challenges and can lead to COVID-19-associated acute respiratory distress syndrome (CARDS) in severe cases. In line with this, prior pulmonary tuberculosis (TB) is a risk factor for long-term respiratory impairment, and post-TB lung dysfunction often goes unrecognized despite its relatively high prevalence and its association with reduced quality of life. In this study, we used metabolomics analysis to identify potential biomarkers that aid in the prognosis of COVID-19 morbidity and mortality in post-TB infected patients. The analysis involved blood samples from 155 SARS-CoV-2-infected adults, of whom 23 had a previous diagnosis of TB (post-TB), while 132 did not have a prior or current TB infection. Our analysis indicated that the vast majority (~92%) of post-TB individuals showed severe SARS-CoV-2 infection and required intensive oxygen support, with a significantly high mortality rate (52.2%). Among individuals with severe COVID-19 symptoms, we report a significant decline in the levels of amino acids, notably the branched-chain amino acids (BCAAs), more so in the post-TB cohort (FDR <= 0.05), in comparison to mild and asymptomatic cases. We identified betaine and BCAAs as potential prognostic metabolic biomarkers of severity and mortality, respectively, in COVID-19 patients who have been exposed to TB, and serum alanine as an important metabolite at the interface of severity and mortality. Hence, our data associate COVID-19 mortality and morbidity with a long-term, metabolically driven consequence of TB infection. In summary, our study provides evidence for a higher mortality rate among COVID-19 patients who have a history of prior TB diagnosis, which mandates validation in larger population cohorts.

    Global economic burden of unmet surgical need for appendicitis

    Background: There is a substantial gap in the provision of adequate surgical care in many low- and middle-income countries. This study aimed to identify the economic burden of unmet surgical need for the common condition of appendicitis. Methods: Data on the incidence of appendicitis from 170 countries and two different approaches were used to estimate the numbers of patients who do not receive surgery: as a fixed proportion of the total unmet surgical need per country (approach 1); and based on country income status (approach 2). Indirect costs with current levels of access and local quality, and those if quality were at the standards of high-income countries, were estimated. A human capital approach was applied, focusing on the economic burden resulting from premature death and absenteeism. Results: Excess mortality was 4185 per 100 000 cases of appendicitis using approach 1 and 3448 per 100 000 using approach 2. The economic burden of continuing current levels of access and local quality was US $92 492 million using approach 1 and $73 141 million using approach 2. The economic burden of not providing surgical care to the standards of high-income countries was $95 004 million using approach 1 and $75 666 million using approach 2. The largest share of these costs resulted from premature death (97.7 per cent) and lack of access (97.0 per cent), in contrast to lack of quality. Conclusion: For a comparatively non-complex emergency condition such as appendicitis, increasing access to care should be prioritized. Although improving quality of care should not be neglected, increasing provision of care at current standards could reduce societal costs substantially.

    Effects of hospital facilities on patient outcomes after cancer surgery: an international, prospective, observational study

    Background: Early death after cancer surgery is higher in low-income and middle-income countries (LMICs) than in high-income countries, yet the impact of facility characteristics on early postoperative outcomes is unknown. The aim of this study was to examine the association between hospital infrastructure, resource availability, and processes and early outcomes after cancer surgery worldwide. Methods: A multimethods analysis was performed as part of the GlobalSurg 3 study, a multicentre, international, prospective cohort study of patients who had surgery for breast, colorectal, or gastric cancer. The primary outcomes were 30-day mortality and 30-day major complication rates. Potentially beneficial hospital facilities were identified by variable selection of those associated with 30-day mortality. Adjusted outcomes were determined using generalised estimating equations to account for patient characteristics and country income group, with population stratification by hospital. Findings: Between April 1, 2018, and April 23, 2019, facility-level data were collected for 9685 patients across 238 hospitals in 66 countries (91 hospitals in 20 high-income countries; 57 hospitals in 19 upper-middle-income countries; and 90 hospitals in 27 low-income to lower-middle-income countries). The availability of five hospital facilities was inversely associated with mortality: ultrasound, CT scanner, critical care unit, opioid analgesia, and oncologist. After adjustment for case mix and country income group, hospitals with three or fewer of these facilities (62 hospitals, 1294 patients) had higher mortality than those with four or five (adjusted odds ratio [OR] 3.85 [95% CI 2.58-5.75]; p<0.0001), with the excess mortality predominantly explained by a limited capacity to rescue following the development of major complications (63.0% vs 82.7%; OR 0.35 [0.23-0.53]; p<0.0001). Across LMICs, improvements in hospital facilities would prevent one to three deaths for every 100 patients undergoing surgery for cancer. Interpretation: Hospitals with higher levels of infrastructure and resources have better outcomes after cancer surgery, independent of country income. Without urgent strengthening of hospital infrastructure and resources, the reductions in cancer-associated mortality associated with improved access will not be realised.

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background: The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods: In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results: Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) than in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in emergency laparotomy than in elective surgery in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion: Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.

    Epidemiology of surgery associated acute kidney injury (EPIS-AKI): a prospective international observational multi-center clinical study

    Purpose: The incidence, patient features, risk factors, and outcomes of surgery-associated postoperative acute kidney injury (PO-AKI) across different countries and health care systems are unclear. Methods: We conducted an international, prospective, observational, multi-center study in 30 countries in patients undergoing major surgery (> 2 h duration and postoperative intensive care unit (ICU) or high dependency unit admission). The primary endpoint was the occurrence of PO-AKI within 72 h of surgery, defined by the Kidney Disease: Improving Global Outcomes (KDIGO) criteria. Secondary endpoints included PO-AKI severity and duration, use of renal replacement therapy (RRT), mortality, and ICU and hospital length of stay. Results: We studied 10,568 patients, of whom 1945 (18.4%) developed PO-AKI (1236 (63.5%) KDIGO stage 1; 500 (25.7%) KDIGO stage 2; 209 (10.7%) KDIGO stage 3). In 33.8% of cases PO-AKI was persistent, and 170/1945 (8.7%) of patients with PO-AKI received RRT in the ICU. Patients with PO-AKI had greater ICU (6.3% vs. 0.7%) and hospital (8.6% vs. 1.4%) mortality, and longer ICU (median 3 (Q1-Q3, 1-6) days vs. 2 (Q1-Q3, 1-3) days) and hospital length of stay (median 14 (Q1-Q3, 9-24) days vs. 10 (Q1-Q3, 7-17) days). Risk factors for PO-AKI included older age, comorbidities (hypertension, diabetes, chronic kidney disease), the type, duration, and urgency of surgery, as well as intraoperative vasopressor and aminoglycoside administration. Conclusion: In a comprehensive multinational study, approximately one in five patients developed PO-AKI after major surgery. Increasing severity of PO-AKI was associated with a progressive increase in adverse outcomes. Our findings indicate that PO-AKI represents a significant burden for health care worldwide.
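    The KDIGO definition used for the primary endpoint stages AKI by serum creatinine and urine output. Below is a simplified sketch of the creatinine arm only, ignoring the urine-output criteria and the 48-hour/7-day timing windows, so it is not a complete implementation of the guideline; the function name and thresholds shown are a hedged summary of the published staging rules.

```python
def kdigo_stage_creatinine(baseline, current, on_rrt=False):
    """Simplified KDIGO AKI staging from serum creatinine (mg/dl) only.

    Stage 3: RRT initiated, creatinine >= 3.0x baseline, or creatinine >= 4.0 mg/dl.
    Stage 2: creatinine 2.0-2.9x baseline.
    Stage 1: creatinine 1.5-1.9x baseline, or an absolute rise >= 0.3 mg/dl.
    Returns 0 when no creatinine criterion is met.
    """
    if on_rrt:
        return 3
    ratio = current / baseline
    if ratio >= 3.0 or current >= 4.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5 or current - baseline >= 0.3:
        return 1
    return 0
```

    For example, a rise from a baseline of 1.0 mg/dl to 1.6 mg/dl meets the stage 1 ratio criterion, while initiation of RRT classifies a patient as stage 3 regardless of creatinine.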