50 research outputs found

    Mini-Mental State Examination (MMSE) for the detection of dementia in clinically unevaluated people aged 65 and over in community and primary care populations

    BACKGROUND: The Mini‐Mental State Examination (MMSE) is a cognitive test that is commonly used as part of the evaluation for possible dementia. OBJECTIVES: To determine the diagnostic accuracy of the Mini‐Mental State Examination (MMSE) at various cut points for dementia in people aged 65 years and over in community and primary care settings who had not undergone prior testing for dementia. SEARCH METHODS: We searched the specialised register of the Cochrane Dementia and Cognitive Improvement Group, MEDLINE (OvidSP), EMBASE (OvidSP), PsycINFO (OvidSP), LILACS (BIREME), ALOIS, BIOSIS Previews (Thomson Reuters Web of Science), and Web of Science Core Collection, including the Science Citation Index and the Conference Proceedings Citation Index (Thomson Reuters Web of Science). We also searched specialised sources of diagnostic test accuracy studies and reviews: MEDION (Universities of Maastricht and Leuven, www.mediondatabase.nl), DARE (Database of Abstracts of Reviews of Effects, via the Cochrane Library), HTA Database (Health Technology Assessment Database, via the Cochrane Library), and ARIF (University of Birmingham, UK, www.arif.bham.ac.uk). We attempted to locate possibly relevant but unpublished data by contacting researchers in this field. We first performed the searches in November 2012 and then fully updated them in May 2014. We did not apply any language or date restrictions to the electronic searches, and we did not use any methodological filters to restrict the search overall. SELECTION CRITERIA: We included studies that compared the 11‐item (maximum score 30) MMSE test (at any cut point) in people who had not undergone prior testing versus a commonly accepted clinical reference standard for all‐cause dementia and subtypes (Alzheimer disease dementia, Lewy body dementia, vascular dementia, frontotemporal dementia).
Clinical diagnosis included all‐cause (unspecified) dementia, as defined by any version of the Diagnostic and Statistical Manual of Mental Disorders (DSM), the International Classification of Diseases (ICD), or the Clinical Dementia Rating. DATA COLLECTION AND ANALYSIS: At least three authors screened all citations. Two authors handled data extraction and quality assessment. We performed meta‐analysis using the hierarchical summary receiver operating characteristic (HSROC) method and the bivariate method. MAIN RESULTS: We retrieved 24,310 citations after removal of duplicates. We reviewed 317 full‐text articles and finally included 70 records, referring to 48 studies, in our synthesis. We were able to perform meta‐analysis on 28 studies in the community setting (44 articles) and on 6 studies in primary care (8 articles), but we could not extract usable 2 × 2 data for the remaining 14 community studies, which we did not include in the meta‐analysis. All of the studies in the community were in asymptomatic people, whereas two of the six studies in primary care were conducted in people who had symptoms of possible dementia. We judged two studies to be at high risk of bias in the patient selection domain, three studies to be at high risk of bias in the index test domain, and nine studies to be at high risk of bias regarding flow and timing. We assessed most studies as being applicable to the review question, though we had concerns about selection of participants in six studies and target condition in one study. The accuracy of the MMSE for diagnosing dementia was reported at 18 cut points in the community (MMSE score 10, 14‐30 inclusive) and 10 cut points in primary care (MMSE score 17‐26 inclusive). The total number of participants in studies included in the meta‐analyses ranged from 37 to 2727, median 314 (interquartile range (IQR) 160 to 647).
In the community, the pooled accuracy at a cut point of 24 (15 studies) was sensitivity 0.85 (95% confidence interval (CI) 0.74 to 0.92), specificity 0.90 (95% CI 0.82 to 0.95); at a cut point of 25 (10 studies), sensitivity 0.87 (95% CI 0.78 to 0.93), specificity 0.82 (95% CI 0.65 to 0.92); and in seven studies that adjusted accuracy estimates for level of education, sensitivity 0.97 (95% CI 0.83 to 1.00), specificity 0.70 (95% CI 0.50 to 0.85). There were insufficient data to evaluate the accuracy of the MMSE for diagnosing dementia subtypes. We could not estimate summary diagnostic accuracy in primary care due to insufficient data. AUTHORS' CONCLUSIONS: The MMSE contributes to a diagnosis of dementia in low-prevalence settings, but should not be used in isolation to confirm or exclude disease. We recommend that future work evaluate the diagnostic accuracy of tests in the context of the diagnostic pathway experienced by the patient, and that investigators report how undergoing the MMSE changes patient‐relevant outcomes.
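As a rough illustration (not part of the review), each sensitivity/specificity pair above is derived from a 2 × 2 table of MMSE results against the reference-standard diagnosis. A minimal sketch in Python, using made-up counts rather than data from any included study:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    # Sensitivity: proportion of true cases the test detects.
    sensitivity = tp / (tp + fn)
    # Specificity: proportion of non-cases the test correctly rules out.
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Illustrative counts only (not from the review):
sens, spec = diagnostic_accuracy(tp=85, fp=10, fn=15, tn=90)
print(sens, spec)  # 0.85 0.9
```

Pooled estimates such as those from the HSROC and bivariate models combine many such tables while accounting for between-study variation, so they are not a simple average of per-study values.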

    Inheritance and relationships of flowering time and seed size in kabuli chickpea

    Flowering time and seed size are important traits for adaptation in chickpea. Early phenology (time of flowering, podding and maturity) enhances chickpea adaptation to short-season environments. Besides being a trait of consumer preference, seed size is also considered an important factor for subsequent plant growth, including germination, seedling vigour and seedling mass. The small-seeded kabuli genotype ICC 16644 was crossed with four genotypes (JGK 2, KAK 2, KRIPA and ICC 17109) to study the inheritance of flowering time and seed size. The relationships of phenology with seed size, grain yield and its component traits were studied. The study included the parents and the F1, F2 and F3 generations of four crosses. The F2 segregation data indicated that flowering time in chickpea was governed by two genes with duplicate recessive epistasis, with lateness dominant to earliness. Two genes controlled 100-seed weight, with small seed size dominant over large seed size. Early phenology had a significant negative association, or no association (ICC 16644 × ICC 17109), with 100-seed weight. Yield per plant had significant positive associations with number of seeds per plant, number of pods per plant, biological yield per plant, 100-seed weight, harvest index and plant height, which could therefore be considered factors for seed yield improvement. Phenology had no correlation with yield per se (seed yield per plant) in any of the crosses studied. Thus, the present study shows that in certain genetic backgrounds it may be possible to breed early-flowering chickpea genotypes with large seed size, and that selection of early-flowering genotypes need not carry a yield penalty.

    Genomic-Assisted Enhancement in Stress Tolerance for Productivity Improvement in Sorghum

    Sorghum [Sorghum bicolor (L.) Moench], the fifth most important cereal crop in the world after wheat, rice, maize, and barley, is a multipurpose crop widely grown for food, feed, fodder, forage, and fuel, and is vital to the food security of many of the world’s poorest people living in fragile agroecological zones. Globally, sorghum is grown on ~42 million hectares in ~100 countries of Africa, Asia, Oceania, and the Americas. Sorghum grain is used mostly as food (~55%), in the form of flat breads and porridges in Asia and Africa, and as feed (~33%) in the Americas. Sorghum stover is an increasingly important source of dry-season fodder for livestock, especially in South Asia. In India, the area under sorghum cultivation has declined drastically, to less than one third of what it was six decades ago, but with only a limited reduction in total production, suggesting the high yield potential of this crop. Sorghum productivity nevertheless remains far below its genetic potential, owing to limited exploitation of the genetic and genomic resources developed in the recent past. Sorghum production is also challenged by various abiotic and biotic stresses that lead to significant yield reductions. Advances in modern genetics and genomics resources and tools could help further strengthen sorghum production by accelerating the rate of genetic gain and expediting the breeding cycle to develop cultivars with enhanced yield stability under stress. This chapter reviews the advances made in generating genetic and genomics resources in sorghum and their application in improving yield stability under abiotic and biotic stresses, to improve the productivity of this climate-smart cereal.

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) than in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
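The odds ratios quoted above come from adjusted multivariable models, but the crude odds ratio underlying such comparisons is a simple 2 × 2 calculation. A minimal sketch in Python using the raw checklist-use counts from the abstract (the crude value will differ from the adjusted OR the study reports):

```python
def odds_ratio(a, b, c, d):
    # a/b: events and non-events in group 1; c/d: events and non-events in group 2.
    return (a * d) / (b * c)

# Checklist use before emergency laparotomy, counts from the abstract:
# middle-HDI: 753 of 1242; high-HDI (reference): 2455 of 2741.
crude_or = odds_ratio(753, 1242 - 753, 2455, 2741 - 2455)
print(round(crude_or, 2))  # crude OR for middle- vs high-HDI countries
```

The abstract's adjusted OR of 0.17 accounts for patient and disease factors, so it need not equal this crude figure.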

    Chickpea

    The narrow genetic base of cultivated chickpea warrants systematic collection, documentation and evaluation of chickpea germplasm, and particularly of wild Cicer species, for effective and efficient use in chickpea breeding programmes. Limiting factors to crop production and possible solutions, the importance of wild relatives, barriers to alien gene introgression and strategies to overcome them, and traits for base broadening are discussed. It has been clearly demonstrated that resistance to major biotic and abiotic stresses can be successfully introgressed from the primary gene pool comprising the progenitor species. Many desirable traits, including a high degree of resistance to multiple stresses, present in species belonging to the secondary and tertiary gene pools can also be introgressed by using special techniques to overcome pre- and post-fertilization barriers. Besides resistance to various biotic and abiotic stresses, yield QTLs have also been introgressed from wild Cicer species into cultivated varieties. The status and importance of molecular markers, genome mapping and genomic tools for chickpea improvement are elaborated. Because resistance to various biotic and abiotic stresses is often controlled by major genes, the transfer of agronomically important traits into elite cultivars has become practical through marker-assisted selection and marker-assisted backcrossing. The usefulness of molecular markers such as SSRs and SNPs for the construction of high-density genetic maps of chickpea, and for the identification of genes/QTLs for stress resistance, quality and yield-contributing traits, is also discussed.

    Neurofibromatosis: chronological history and current issues


    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.

    Exploring the somatic NF1 mutational spectrum associated with NF1 cutaneous neurofibromas

    Neurofibromatosis type-1 (NF1), caused by heterozygous inactivation of the NF1 tumour suppressor gene, is associated with the development of benign and malignant peripheral nerve sheath tumours (MPNSTs). Although numerous germline NF1 mutations have been identified, relatively few somatic NF1 mutations have been described in neurofibromas. Here we have screened 109 cutaneous neurofibromas, excised from 46 unrelated NF1 patients, for somatic NF1 mutations. NF1 mutation screening (involving loss-of-heterozygosity (LOH) analysis, multiplex ligation-dependent probe amplification and DNA sequencing) identified 77 somatic NF1 point mutations, of which 53 were novel. LOH spanning the NF1 gene region was evident in 25 neurofibromas but, in contrast to previous data from MPNSTs, it was absent at the TP53, CDKN2A and RB1 gene loci. Analysis of DNA/RNA from neurofibroma-derived Schwann cell cultures revealed NF1 mutations in four tumours whose presence had been overlooked in the tumour DNA. Bioinformatics analysis suggested that four of seven novel somatic NF1 missense mutations (p.A330T, p.Q519P, p.A776T, p.S1463F) could be of functional/clinical significance. Functional analysis confirmed this prediction for p.S1463F, located within the GTPase-activating protein-related domain, as this mutation resulted in a 150-fold increase in activated GTP-bound Ras. Comparison of the relative frequencies of the different types of somatic NF1 mutation observed with those of their previously reported germline counterparts revealed significant (P=0.001) differences. Although non-identical somatic mutations involving either the same or adjacent nucleotides were identified in three pairs of tumours from the same patients (P<0.0002), no association was noted between the type of germline and somatic NF1 lesion within the same individual.