
    The standard error of measurement is a more appropriate measure of quality for postgraduate medical assessments than is reliability: an analysis of MRCP(UK) examinations

    Background: Cronbach's alpha is widely used as the preferred index of reliability for medical postgraduate examinations. A value of 0.8-0.9 is seen by providers and regulators alike as an adequate demonstration of acceptable reliability for any assessment. Of the other statistical parameters, the Standard Error of Measurement (SEM) is mainly seen as useful only in determining the accuracy of a pass mark. However, the alpha coefficient depends both on the SEM and on the ability range (standard deviation, SD) of candidates taking an exam. This study investigated the extent to which the necessarily narrower ability range of candidates taking the second of the three parts of the MRCP(UK) diploma examination biases the assessment of reliability and SEM.

    Methods: (a) The interrelationships of standard deviation (SD), SEM and reliability were investigated in a Monte Carlo simulation of 10,000 candidates taking a postgraduate examination. (b) Reliability and SEM were studied in the MRCP(UK) Part 1 and Part 2 Written Examinations from 2002 to 2008. (c) Reliability and SEM were studied in eight Specialty Certificate Examinations introduced in 2008-9.

    Results: The Monte Carlo simulation showed, as expected, that restricting the range of an assessment only to those who had already passed it dramatically reduced the reliability but did not affect the SEM of a simulated assessment. The analysis of the MRCP(UK) Part 1 and Part 2 written examinations showed that the Part 2 examination had a lower reliability than the Part 1 examination but, despite that lower reliability, also had a smaller SEM (indicating a more accurate assessment). The Specialty Certificate Examinations had small Ns and, as a result, wide variability in their reliabilities, but their SEMs were comparable with MRCP(UK) Part 2.

    Conclusions: An emphasis upon assessing the quality of assessments primarily in terms of reliability alone can produce a paradoxical and distorted picture, particularly where a narrower range of candidate ability is an inevitable consequence of being able to take a second part examination only after passing the first part. Reliability is also problematic when candidate numbers are low and sampling error affects the range of candidate ability. SEM is not subject to such problems; it is therefore a better measure of the quality of an assessment and is recommended for routine use.
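
    The interplay described in the Results, reliability collapsing when the ability range narrows while the SEM stays put, follows directly from classical test theory and is easy to reproduce. The following is a minimal sketch (not the authors' code) of such a Monte Carlo simulation, using a parallel-forms correlation in place of Cronbach's alpha; the cohort size, ability SD, SEM, and pass mark are illustrative assumptions, not the study's parameters.

        import numpy as np

        rng = np.random.default_rng(0)
        N, TRUE_SD, SEM, PASS_MARK = 10_000, 10.0, 4.0, 60.0

        true = rng.normal(60.0, TRUE_SD, N)      # candidates' true ability
        form1 = true + rng.normal(0.0, SEM, N)   # observed score, first sitting
        form2 = true + rng.normal(0.0, SEM, N)   # parallel form, second sitting

        def quality(a, b):
            reliability = np.corrcoef(a, b)[0, 1]      # parallel-forms reliability
            sem = np.std(a - b, ddof=1) / np.sqrt(2)   # SD(a - b) = SEM * sqrt(2)
            return reliability, sem

        print("full cohort : reliability=%.2f, SEM=%.2f" % quality(form1, form2))

        kept = form1 >= PASS_MARK                # only passers sit the next part
        print("passed only : reliability=%.2f, SEM=%.2f"
              % quality(form1[kept], form2[kept]))

    Run as written, the restricted cohort shows a markedly lower reliability than the full cohort while the two SEM estimates are nearly identical, which is the paradox the study documents.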

    The importance of iron in long-term survival of maintenance hemodialysis patients treated with epoetin-alfa and intravenous iron: analysis of 9.5 years of prospectively collected data

    Background: In patients treated by maintenance hemodialysis, the relationship of survival to hemoglobin level and to administered epoetin-alfa and intravenous iron is controversial. The study aim was to determine the effects on patient survival of administered epoetin-alfa and intravenous iron, and of hemoglobin and variables related to iron status.

    Methods: The study included 1774 patients treated by maintenance hemodialysis in 3 dialysis units in New York, NY, from January 1998 to June 2007. A patient-centered, coded, electronic patient record used in patient care enabled retrospective analysis of data collected prospectively. For survival analysis, patients were censored at transplantation or at transfer to home hemodialysis, to hemodialysis elsewhere, or to peritoneal dialysis. Univariate Kaplan-Meier analysis was followed by multivariate analysis with Cox's regression, using as variables age, race, gender, major co-morbid conditions, epoetin-alfa and intravenous iron administered, and 15 laboratory tests.

    Results: Median age was 59 years; median (interquartile range) epoetin-alfa was 18,162 (12,099, 27,741) units/week, intravenous iron 301 (202, 455) mg/month, and survival 789 (354, 1489) days. Median hemoglobin was 116 (110, 120) g/L, transferrin saturation 29.7 (24.9, 35.1)%, serum ferritin 526 (247, 833) μg/L, and serum albumin 39.0 (36.3, 41.5) g/L. Survival was better the higher the hemoglobin, and best with > 120 g/L. The effect of epoetin-alfa on survival was weak but had a statistically significant interaction with intravenous iron. For intravenous iron, survival was best with 1–202 mg/month and slightly worse with 202–455 mg/month; it was worst with no intravenous iron, and only slightly better with > 455 mg/month. Survival was worst with transferrin saturation ≤ 16% or serum ferritin ≤ 100 μg/L, and best with transferrin saturation > 25% or serum ferritin > 600 μg/L. The effects of each of hemoglobin, intravenous iron, transferrin saturation, and serum ferritin on survival were independently significant and not mediated by other predictors in the model.

    Conclusion: Long-term survival of maintenance hemodialysis patients was favorably affected by a relatively high hemoglobin level, by moderate intravenous iron administration, and by indicators of iron sufficiency. It was unfavorably influenced by a low hemoglobin level and by indicators of iron deficiency.
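
    As a rough illustration of the analysis pipeline described above (univariate Kaplan-Meier followed by multivariate Cox regression, with censoring), the sketch below uses the Python lifelines package; the data frame, column names, and distributions are fabricated placeholders rather than study data.

        import numpy as np
        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter

        rng = np.random.default_rng(0)
        n = 500
        df = pd.DataFrame({
            "days": rng.exponential(800, n).round() + 1,  # follow-up time
            "died": rng.integers(0, 2, n),                # 0 = censored (transplant, transfer, ...)
            "age": rng.normal(59, 12, n),
            "hemoglobin": rng.normal(116, 8, n),          # g/L
            "iv_iron": rng.normal(300, 120, n).clip(0),   # mg/month
        })

        # Univariate step: Kaplan-Meier estimate for one subgroup
        km = KaplanMeierFitter()
        high = df["hemoglobin"] > 120
        km.fit(df.loc[high, "days"], df.loc[high, "died"], label="Hb > 120 g/L")
        print(km.median_survival_time_)

        # Multivariate step: Cox proportional hazards over all covariates
        cph = CoxPHFitter()
        cph.fit(df, duration_col="days", event_col="died")
        cph.print_summary()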

    Questions and answers on iron deficiency treatment selection and the use of intravenous iron in routine clinical practice.

    Iron deficiency is a common cause of morbidity and can arise as a consequence or complication of many diseases. The use of intravenous iron has increased significantly in the last decade, but concerns remain about indications and administration. Modern intravenous iron preparations can facilitate rapid iron repletion in one or two doses, both for absolute iron deficiency and, in the presence of inflammation, for functional iron deficiency, where oral iron therapy is ineffective or has not worked. A multidisciplinary team of experts experienced in iron deficiency undertook a consensus review to support healthcare professionals with practical advice on managing iron deficiency in gastrointestinal, renal and cardiac disease, as well as in pregnancy, heavy menstrual bleeding, and surgery. We explain how intravenous iron may work where oral iron has not. We provide context on how and when intravenous iron should be administered, and informed opinion on potential benefits balanced against potential side-effects. We propose how intravenous iron side-effects can be anticipated in terms of what they may be and when they may occur. The aim of this consensus is to provide a practical basis for educating and preparing staff and patients on when and how iron infusions can be administered safely and efficiently.

    Key messages: (1) Iron deficiency treatment selection is driven by several factors, including the presence of inflammation, the time available for iron replenishment, and the anticipated risk of side-effects or intolerance. (2) Intravenous iron preparations are indicated for the treatment of iron deficiency when oral preparations are ineffective or cannot be used, and therefore have applicability in a wide range of clinical contexts, including chronic inflammatory conditions, perioperative settings, and disorders associated with chronic blood loss. (3) Adverse events occurring with intravenous iron can be anticipated according to when they typically occur, which provides a basis for educating and preparing staff and patients on how iron infusions can be administered safely and efficiently.

    Aptamer-based multiplexed proteomic technology for biomarker discovery

    Interrogation of the human proteome in a highly multiplexed and efficient manner remains a coveted and challenging goal in biology. We present a new aptamer-based proteomic technology for biomarker discovery capable of simultaneously measuring thousands of proteins from small sample volumes (15 μL of serum or plasma). Our current assay allows us to measure ~800 proteins with very low limits of detection (1 pM on average), 7 logs of overall dynamic range, and 5% average coefficient of variation. This technology is enabled by a new generation of aptamers that contain chemically modified nucleotides, which greatly expand the physicochemical diversity of the large randomized nucleic acid libraries from which the aptamers are selected. Proteins in complex matrices such as plasma are measured with a process that transforms a signature of protein concentrations into a corresponding DNA aptamer concentration signature, which is then quantified with a DNA microarray. In essence, our assay takes advantage of the dual nature of aptamers as both folded binding entities with defined shapes and unique sequences recognizable by specific hybridization probes. To demonstrate the utility of our proteomic biomarker discovery technology, we applied it to a clinical study of chronic kidney disease (CKD). We identified two well-known CKD biomarkers as well as an additional 58 potential CKD biomarkers. These results demonstrate the potential utility of our technology to discover unique protein signatures characteristic of various disease states. More generally, we describe a versatile and powerful tool that allows large-scale comparison of proteome profiles among discrete populations. This unbiased and highly multiplexed search engine will enable the discovery of novel biomarkers in a manner that is unencumbered by our incomplete knowledge of biology, thereby helping to advance the next generation of evidence-based medicine.
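
    For readers less familiar with the assay metrics quoted above, the short sketch below shows how an average coefficient of variation and an overall dynamic range are conventionally computed from replicate measurements; the simulated readouts are fabricated stand-ins for real microarray data, not the assay's actual output.

        import numpy as np

        rng = np.random.default_rng(0)

        # Fabricated example: 800 analytes x 3 replicates, true concentrations
        # log-uniform over ~7 orders of magnitude, 5% multiplicative noise
        true_conc = 10 ** rng.uniform(-12, -5, 800)              # molar
        replicates = true_conc[:, None] * rng.normal(1.0, 0.05, (800, 3))

        # Per-analyte CV = replicate SD / replicate mean, then averaged
        cv = replicates.std(axis=1, ddof=1) / replicates.mean(axis=1)
        print(f"average CV: {cv.mean():.1%}")                    # ~5% by construction

        # Dynamic range in logs = log10(highest / lowest measurable level)
        dynamic_range = np.log10(true_conc.max() / true_conc.min())
        print(f"dynamic range: {dynamic_range:.1f} logs")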

    Perspectives on the Trypanosoma cruzi-host cell receptor interaction

    Chagas disease is caused by the parasite Trypanosoma cruzi. The critical initial event is the interaction of the trypomastigote form of the parasite with host receptors. This review highlights recent observations concerning these interactions. Some of the key receptors considered are those for thromboxane and bradykinin, and the nerve growth factor receptor TrkA. Other important receptors such as galectin-3, thrombospondin, and laminin are also discussed. Investigation into the molecular biology and cell biology of host receptors for T. cruzi may provide novel therapeutic targets.

    International Veterinary Epilepsy Task Force consensus proposal: Medical treatment of canine epilepsy in Europe

    In Europe, the number of antiepileptic drugs (AEDs) licensed for dogs has grown considerably in recent years. Nevertheless, the same questions remain, including: (1) when to start treatment; (2) which drug is best used initially; (3) which adjunctive AED can be advised if treatment with the initial drug is unsatisfactory; and (4) when treatment changes should be considered. In this consensus proposal, an overview is given of the aim of AED treatment, when to start long-term treatment in canine epilepsy, and which veterinary AEDs are currently in use for dogs. The consensus proposal for drug treatment protocols (1) is based on current published evidence-based literature, (2) considers the current legal framework of the cascade regulation for the prescription of veterinary drugs in Europe, and (3) reflects the authors' experience. This paper aims to provide a consensus for the management of canine idiopathic epilepsy. Furthermore, in the management of structural epilepsy, AEDs are indispensable in addition to treating the underlying cause, where possible.

    DNA methylation, the early-life social environment and behavioral disorders

    One of the outstanding questions in behavioral disorders is untangling the complex relationship between nurture and nature. Although epidemiological data provide evidence of an interaction between genetics (nature) and the social and physical environments (nurture) in a spectrum of behavioral disorders, the main open question remains the mechanism. Emerging data support the hypothesis that DNA methylation, a covalent modification of the DNA molecule that is a component of its chemical structure, serves as an interface between the dynamic environment and the fixed genome. We propose that modulation of DNA methylation in response to environmental cues early in life serves as a mechanism of life-long genome adaptation. Under certain contexts, this adaptation can turn maladaptive, resulting in behavioral disorders. This hypothesis has important implications for understanding, predicting, preventing, and treating behavioral disorders, including autism, as discussed here.

    Goal Priming in Dieters: Recent Insights and Applications

    What are the psychological mechanisms that make dieting so challenging in our food-rich living environment? Social psychological research on goal priming provides a useful framework for answering this question, as well as implications for how to enhance dieting success. This review presents and discusses recent research showing that attractive food cues prime the hedonic eating goal in dieters and thus facilitate overeating. However, external cues priming the goal of weight control can be used to offset these effects and thus to facilitate dieting success, as demonstrated in both field and laboratory experiments. In addition, recent strategies to prevent the hedonic effects of attractive food, such as mindful attention, can facilitate self-regulation. These recent advances in our understanding of dieting behavior have theoretical and practical implications for how successful dieting can be facilitated, both by individual strategies and by environmental changes.