
    Genotype-informed estimation of risk of coronary heart disease based on genome-wide association data linked to the electronic medical record

    <p>Abstract</p> <p>Background</p> <p>Susceptibility variants identified by genome-wide association studies (GWAS) have modest effect sizes. Whether such variants provide incremental information in assessing risk for common 'complex' diseases is unclear. We investigated whether measured and imputed genotypes from a GWAS dataset linked to the electronic medical record alter estimates of coronary heart disease (CHD) risk.</p> <p>Methods</p> <p>Study participants (<it>n </it>= 1243) had no known cardiovascular disease and were considered to be at high, intermediate, or low 10-year risk of CHD based on the Framingham risk score (FRS), which includes age, sex, total and HDL cholesterol, blood pressure, diabetes, and smoking status. Of twelve SNPs identified in prior GWAS as associated with CHD, four were genotyped in the participants as part of a GWAS. Genotypes for seven SNPs were imputed from the HapMap CEU population using the program MACH. We calculated a weighted genetic risk score for each patient based on the odds ratios of the susceptibility SNPs and incorporated this into the FRS.</p> <p>Results</p> <p>The mean (SD) number of risk alleles was 12.31 (1.95), range 6-18. The mean (SD) of the weighted genetic risk score was 12.64 (2.05), range 5.75-18.20. The CHD genetic risk score was not correlated with the FRS (<it>P </it>= 0.78). After incorporating the genetic risk score into the FRS, a total of 380 individuals (30.6%) were reclassified into higher- (188) or lower-risk (192) groups.</p> <p>Conclusion</p> <p>A genetic risk score based on measured/imputed genotypes at 11 susceptibility SNPs led to significant reclassification of the 10-year CHD risk categories. Additional prospective studies are needed to assess the accuracy and clinical utility of such reclassification.</p>
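The weighted risk-score construction described in the Methods (risk-allele counts weighted by the log odds ratio of each SNP, rescaled back to the allele-count scale) can be sketched as follows. The SNP identifiers and odds ratios below are illustrative placeholders, not the values used in the study:

```python
import math

# Hypothetical per-SNP odds ratios, for illustration only; the study
# used published ORs for 11 CHD susceptibility SNPs.
snp_odds_ratios = {"rs1333049": 1.29, "rs10757278": 1.25, "rs2383206": 1.21}

def weighted_genetic_risk_score(risk_allele_counts, odds_ratios):
    """Sum risk-allele counts (0, 1, or 2 per SNP) weighted by ln(OR),
    then divide by the mean ln(OR) so the score stays on a scale
    comparable to a simple count of risk alleles."""
    weighted = sum(risk_allele_counts[snp] * math.log(odds_ratios[snp])
                   for snp in odds_ratios)
    mean_log_or = (sum(math.log(v) for v in odds_ratios.values())
                   / len(odds_ratios))
    return weighted / mean_log_or

# Toy genotype: 2, 1, and 0 copies of the risk allele at the three SNPs.
genotype = {"rs1333049": 2, "rs10757278": 1, "rs2383206": 0}
score = weighted_genetic_risk_score(genotype, snp_odds_ratios)
```

Rescaling by the mean log-OR is one common convention; it keeps the weighted score and the raw allele count on comparable ranges, as in the means reported above (12.31 vs. 12.64).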

    The learners' perspective on internal medicine ward rounds: a cross-sectional study

    <p>Abstract</p> <p>Background</p> <p>Ward rounds form an integral part of Internal Medicine teaching. This study aimed to determine trainees' opinions regarding various aspects of their ward rounds, including how well the rounds cover their learning needs, how they would like the rounds to be conducted, and differences of opinion between medical students and postgraduates.</p> <p>Methods</p> <p>A cross-sectional study was conducted on a total of 134 trainees in Internal Medicine, comprising medical students, interns, residents and fellows, who were asked to fill in a structured, self-designed questionnaire. Most of the responses required a rating on a scale of 1-5 (1 being highly unsatisfactory and 5 being highly satisfactory).</p> <p>Results</p> <p>Teaching of clinical skills and bedside teaching received the lowest overall mean scores (Mean ± SD 2.48 ± 1.02 and 2.49 ± 1.12, respectively). They were rated much lower by postgraduates than by students (p < 0.001). All respondents felt that management of patients was the aspect best covered by the current ward rounds (Mean ± SD 3.71 ± 0.72). For their desired ward rounds, management of patients received the highest score (Mean ± SD 4.64 ± 0.55), followed by bedside examinations (Mean ± SD 4.60 ± 0.61) and clinical skills teaching (Mean ± SD 4.50 ± 0.68). The postgraduates desired far more focus on communication skills, counselling and medical ethics than the students, whose primary focus was teaching of bedside examination and management. A majority of the respondents (87%) preferred bedside rounds over conference room rounds. Even though the duration of rounds was found to be adequate, a majority of the trainees (68%) felt there was a lack of individual attention during ward rounds.</p> <p>Conclusions</p> <p>This study highlights important areas where ward rounds need improvement in order to maximize their benefit to the learners. There is a need to modify the current state of ward rounds in order to address the needs and expectations of trainees.</p>

    Improved cardiovascular diagnostic accuracy by pocket size imaging device in non-cardiologic outpatients: the NaUSiCa (Naples Ultrasound Stethoscope in Cardiology) study

    Miniaturization has culminated in the creation of a pocket-size imaging device that can be used as an ultrasound stethoscope. This study assessed the additional diagnostic power of the pocket-size device, in the hands of both expert operators and trainees, compared with physical examination, and evaluated its appropriateness of use compared with a standard echo machine in a non-cardiologic population.

    Quality assurance in psychiatry: quality indicators and guideline implementation

    On many occasions, routine mental health care does not meet the standards that the medical profession itself puts forward. There is hope of improving the outcome of severe mental illness by improving the quality of mental health care and by implementing evidence-based consensus guidelines. Adherence to guideline recommendations should reduce costly complications and unnecessary procedures. To measure the quality of mental health care and disease outcomes reliably and validly, quality indicators have to be available. These indicators of process and outcome quality should be easily measurable with routine data, should have a strong evidence base, and should be able to describe quality aspects across all sectors over the whole disease course. Measurement-based quality improvement will not succeed if it results in overwhelming documentation that reduces the time clinicians have for active treatment interventions. To overcome difficulties in the implementation of guidelines and to reduce guideline non-adherence, guideline implementation and quality assurance should be embedded in a complex programme consisting of multifaceted interventions: specific psychological methods for implementation, consultation by experts, and reimbursement of documentation efforts. There are a number of challenges in selecting appropriate quality indicators that allow a fair comparison across different approaches to care. Used carefully, quality indicators and improved guideline adherence can address suboptimal clinical outcomes, reduce practice variations, and narrow the gap between optimal and routine care.

    From Disease Association to Risk Assessment: An Optimistic View from Genome-Wide Association Studies on Type 1 Diabetes

    Genome-wide association studies (GWAS) have been fruitful in identifying disease susceptibility loci for common and complex diseases. A remaining question is whether we can quantify individual disease risk based on genotype data, in order to facilitate personalized prevention and treatment of complex diseases. Previous studies have typically failed to achieve satisfactory performance, primarily because they used only a limited number of confirmed susceptibility loci. Here we propose that sophisticated machine-learning approaches applied to a large ensemble of markers may improve the performance of disease risk assessment. We applied a Support Vector Machine (SVM) algorithm to a GWAS dataset generated on the Affymetrix genotyping platform for type 1 diabetes (T1D) and optimized a risk assessment model with hundreds of markers. We subsequently tested this model on an independent Illumina-genotyped dataset with imputed genotypes (1,008 cases and 1,000 controls), as well as a separate Affymetrix-genotyped dataset (1,529 cases and 1,458 controls), obtaining an area under the ROC curve (AUC) of ∼0.84 in both datasets. In contrast, performance was poor when the SVM or logistic regression model was limited to dozens of known susceptibility loci. Our study suggests that improved disease risk assessment can be achieved by using algorithms that take into account interactions between a large ensemble of markers. We are optimistic that genotype-based disease risk assessment may be feasible for diseases where a notable proportion of the risk has already been captured by SNP arrays.
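As a rough illustration of the approach (not the authors' pipeline), an SVM trained on a large panel of genotype-coded markers and evaluated by AUC might look like the sketch below, with entirely synthetic genotypes and scikit-learn assumed to be available:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_samples, n_markers = 1000, 200

# Synthetic genotypes coded as risk-allele counts (0/1/2 per marker).
X = rng.integers(0, 3, size=(n_samples, n_markers)).astype(float)

# Make the first 20 markers weakly predictive of case status;
# the remaining 180 are pure noise, mimicking a large marker ensemble.
signal = X[:, :20].sum(axis=1) - 20.0
y = (signal + rng.normal(0, 4, size=n_samples) > 0).astype(int)

# Train on 700 samples, score the held-out 300 by decision value.
clf = SVC(kernel="linear").fit(X[:700], y[:700])
scores = clf.decision_function(X[700:])
auc = roc_auc_score(y[700:], scores)
```

With real GWAS data the feature set would be hundreds of selected SNPs and the held-out set an independently genotyped cohort, as in the study; the AUC computation itself is unchanged.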

    Genetic Signatures of Exceptional Longevity in Humans

    Like most complex phenotypes, exceptional longevity is thought to reflect a combined influence of environmental factors (e.g., lifestyle choices, where we live) and genetic factors. To explore the genetic contribution, we undertook a genome-wide association study of exceptional longevity in 801 centenarians (median age at death 104 years) and 914 genetically matched healthy controls. Using these data, we built a genetic model that includes 281 single nucleotide polymorphisms (SNPs) and discriminated between cases and controls of the discovery set with 89% sensitivity and specificity, and with 58% specificity and 60% sensitivity in an independent cohort of 341 controls and 253 genetically matched nonagenarians and centenarians (median age 100 years). Consistent with the hypothesis that the genetic contribution is largest at the oldest ages, the sensitivity of the model increased in the independent cohort with increasing age (71% for subjects with an age at death > 102 and 85% for subjects with an age at death > 105). For further validation, we applied the model to an additional, unmatched 60 centenarians (median age 107 years), yielding 78% sensitivity, and to 2863 unmatched controls, yielding 61% specificity. The 281 SNPs include the SNP rs2075650 in TOMM40/APOE, which reached irrefutable genome-wide significance (posterior probability of association = 1) and replicated in the independent cohort. Removal of this SNP from the model reduced accuracy by only 1%. Further in-silico analysis suggests that 90% of centenarians can be grouped into clusters characterized by different "genetic signatures" of varying predictive value for exceptional longevity. The correlation between 3 signatures and 3 different life spans was replicated in the combined replication sets. The different signatures may help dissect this complex phenotype into sub-phenotypes of exceptional longevity.
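For reference, the sensitivity and specificity figures quoted throughout these abstracts are simply the fractions of correctly classified cases and controls. A minimal sketch with toy labels (1 = case, e.g. centenarian; 0 = control):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 3 cases, 2 controls; one case missed, one control misclassified.
sens, spec = sensitivity_specificity([1, 1, 1, 0, 0], [1, 1, 0, 0, 1])
# sens = 2/3, spec = 1/2
```

Evaluating these two quantities separately, as the study does, matters because a classifier can trade one against the other by shifting its decision threshold.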

    ‘‘Beet-ing’’ the Mountain: A Review of the Physiological and Performance Effects of Dietary Nitrate Supplementation at Simulated and Terrestrial Altitude

    Exposure to altitude results in multiple physiological consequences. These include, but are not limited to, a reduced maximal oxygen consumption, a drop in arterial oxygen saturation, and an increase in muscle metabolic perturbations at a fixed sub-maximal work rate. Exercise capacity during fixed work rate or incremental exercise, and time-trial performance, are also impaired at altitude relative to sea level. Recently, dietary nitrate (NO3-) supplementation has attracted considerable interest as a nutritional aid during altitude exposure. In this review, we summarise and critically evaluate the physiological and performance effects of dietary NO3- supplementation during exposure to simulated and terrestrial altitude. Previous investigations at simulated altitude indicate that NO3- supplementation may reduce the oxygen cost of exercise, elevate arterial and tissue oxygen saturation, improve muscle metabolic function, and enhance exercise capacity/performance. Conversely, current evidence suggests that NO3- supplementation does not augment the training response at simulated altitude. Few studies have evaluated the effects of NO3- at terrestrial altitude. Current evidence indicates potential improvements in endothelial function at terrestrial altitude following NO3- supplementation. No effects of NO3- supplementation have been observed on oxygen consumption or arterial oxygen saturation at terrestrial altitude, although further research is warranted. Limitations of the present body of literature are discussed, and directions for future research are provided.