
    In-training assessment using direct observation of single-patient encounters: a literature review

    We reviewed the literature on instruments for work-based assessment in single clinical encounters, such as the mini-clinical evaluation exercise (mini-CEX), and examined differences between these instruments in characteristics and feasibility, reliability, validity and educational effect. A PubMed search of the literature published before 8 January 2009 yielded 39 articles dealing with 18 different assessment instruments. One researcher extracted data on the characteristics of the instruments and two researchers extracted data on feasibility, reliability, validity and educational effect. Instruments are predominantly formative. Feasibility is generally deemed good; assessor training is rarely provided but is considered crucial for successful implementation. Acceptable reliability can be achieved with 10 encounters. The validity of many instruments has not been investigated, but the validity of the mini-CEX and the ‘clinical evaluation exercise’ is supported by strong and significant correlations with other valid assessment instruments. The evidence from the few studies on educational effects is not very convincing. The reports on clinical assessment instruments for single work-based encounters are generally positive, but supporting evidence is sparse. Feasibility of instruments seems to be good and reliability requires a minimum of 10 encounters, but no clear conclusions emerge on other aspects. Studies on assessor and learner training and studies examining effects beyond ‘happiness data’ are badly needed.

    The Influence of Markov Decision Process Structure on the Possible Strategic Use of Working Memory and Episodic Memory

    Researchers use a variety of behavioral tasks to analyze the effect of biological manipulations on memory function. This research will benefit from a systematic mathematical method for analyzing memory demands in behavioral tasks. In the framework of reinforcement learning theory, these tasks can be mathematically described as partially observable Markov decision processes. While a wealth of evidence collected over the past 15 years relates the basal ganglia to the reinforcement learning framework, only recently has much attention been paid to including psychological concepts such as working memory or episodic memory in these models. This paper presents an analysis that provides a quantitative description of memory states sufficient for correct choices at specific decision points. Using information from the mathematical structure of the task descriptions, we derive measures that indicate whether working memory (for one or more cues) or episodic memory can provide strategically useful information to an agent. In particular, the analysis determines which observed states must be maintained in or retrieved from memory to perform these specific tasks. We demonstrate the analysis on three simplified tasks as well as eight more complex memory tasks drawn from the animal and human literature (two alternation tasks, two sequence disambiguation tasks, two non-matching tasks, the 2-back task, and the 1-2-AX task). The results of these analyses agree with results from quantitative simulations of the task reported in previous publications and provide simple indications of the memory demands of the tasks which can require far less computation than a full simulation of the task. This may provide a basis for a quantitative behavioral stoichiometry of memory tasks.
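    The kind of memory-demand reasoning this abstract describes can be illustrated with a toy alternation task (a minimal sketch, not the paper's algorithm; all names and the reward rule below are illustrative). In spatial alternation, reward goes to whichever arm was not chosen on the previous trial. The observation at the choice point is identical on every trial, so a deterministic memoryless policy always repeats itself and is never rewarded, while an agent maintaining a single working-memory cue (its last choice) solves the task perfectly:

    ```python
    # Toy illustration of memory demands in spatial alternation (hypothetical sketch).
    # Reward rule: choosing the arm NOT chosen on the previous trial is rewarded.

    def run(policy, trials=100):
        """policy(last_choice) -> 'L' or 'R'; returns fraction of rewarded trials."""
        last, rewarded = 'L', 0
        for _ in range(trials):
            choice = policy(last)
            if choice != last:       # alternation rule: a switch is rewarded
                rewarded += 1
            last = choice
        return rewarded / trials

    # Memoryless agent: the choice-point observation never changes, so its policy
    # is a fixed choice. A fixed choice repeats itself and is never rewarded.
    memoryless = run(lambda last: 'L')

    # Working-memory agent: maintains one cue (the previous choice) and switches.
    working_memory = run(lambda last: 'R' if last == 'L' else 'L')

    print(memoryless, working_memory)   # 0.0 1.0
    ```

    The gap between the two agents (0% versus 100% reward) is exactly the strategic value of one working-memory cue in this task, which is the quantity the paper's structural analysis extracts without simulation.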

    Gene therapy for monogenic liver diseases: clinical successes, current challenges and future prospects

    Over the last decade, pioneering liver-directed gene therapy trials for haemophilia B have achieved sustained clinical improvement after a single systemic injection of adeno-associated virus (AAV) derived vectors encoding the human factor IX cDNA. These trials demonstrate the potential of AAV technology to provide long-lasting clinical benefit in the treatment of monogenic liver disorders. Indeed, with more than ten ongoing or planned clinical trials for haemophilia A and B and dozens of trials planned for other inherited genetic/metabolic liver diseases, clinical translation is expanding rapidly. Gene therapy is likely to become an option for routine care of a subset of severe inherited genetic/metabolic liver diseases in the relatively near term. In this review, we aim to summarise the milestones in the development of gene therapy and to present the different vector tools and their clinical applications for liver-directed gene therapy. AAV-derived vectors are emerging as the leading candidates for clinical translation of gene delivery to the liver. We therefore focus on clinical applications of AAV vectors, providing the most recent update on clinical outcomes of completed and ongoing gene therapy trials, and comment on the current challenges that the field faces for large-scale clinical translation. There is clearly an urgent need for more efficient therapies in many severe monogenic liver disorders, which will require careful risk-benefit analysis for each indication, especially in paediatrics.

    Syndromics: A Bioinformatics Approach for Neurotrauma Research

    Substantial scientific progress has been made in the past 50 years in delineating many of the biological mechanisms involved in the primary and secondary injuries following trauma to the spinal cord and brain. These advances have highlighted numerous potential therapeutic approaches that may help restore function after injury. Despite these advances, bench-to-bedside translation has remained elusive. Translational testing of novel therapies requires standardized measures of function for comparison across different laboratories, paradigms, and species. Although numerous functional assessments have been developed in animal models, it remains unclear how to best integrate this information to describe the complete translational “syndrome” produced by neurotrauma. The present paper describes a multivariate statistical framework for integrating diverse neurotrauma data and reviews the few papers to date that have taken an information-intensive approach for basic neurotrauma research. We argue that these papers can be described as the seminal works of a new field that we call “syndromics”, which aims to apply informatics tools to disease models to characterize the full set of mechanistic inter-relationships from multi-scale data. In the future, centralized databases of raw neurotrauma data will enable better syndromic approaches and aid future translational research, leading to more efficient testing regimens and more clinically relevant findings.

    The genetic architecture of the human cerebral cortex

    The cerebral cortex underlies our complex cognitive capabilities, yet little is known about the specific genetic loci that influence human cortical structure. To identify genetic variants that affect cortical structure, we conducted a genome-wide association meta-analysis of brain magnetic resonance imaging data from 51,665 individuals. We analyzed the surface area and average thickness of the whole cortex and 34 regions with known functional specializations. We identified 199 significant loci and found significant enrichment for loci influencing total surface area within regulatory elements that are active during prenatal cortical development, supporting the radial unit hypothesis. Loci that affect regional surface area cluster near genes in Wnt signaling pathways, which influence progenitor expansion and areal identity. Variation in cortical structure is genetically correlated with cognitive function, Parkinson's disease, insomnia, depression, neuroticism, and attention deficit hyperactivity disorder.

    Integral control for population management

    We present a novel management methodology for restocking a declining population. The strategy uses integral control, a concept ubiquitous in control theory that has not previously been applied to population dynamics. Integral control is based on dynamic feedback, using measurements of the population to inform management strategies, and is robust to model uncertainty, an important consideration for ecological models. We demonstrate from first principles, via theory and examples, why such an approach to population management is suitable.
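    The feedback loop described in this abstract can be sketched in a few lines (a minimal illustration under assumed dynamics; the growth rate, target level, and gain below are hypothetical, not taken from the paper). The manager measures the population, accumulates the shortfall from a target level, and restocks in proportion to that accumulated error; the integrator drives the steady-state error to zero even though the growth rate is not known exactly:

    ```python
    # Integral control for restocking a declining population (illustrative sketch;
    # r, target, and gain are hypothetical parameters, not from the paper).

    def simulate(r=0.8, target=100.0, gain=0.3, x0=40.0, steps=50):
        """Population x declines geometrically (r < 1 without intervention);
        an integral controller accumulates the measured shortfall from the
        target and restocks in proportion to that integral."""
        x, z = x0, 0.0               # z is the integrator state
        for _ in range(steps):
            z += target - x          # accumulate measured error (feedback)
            u = max(0.0, gain * z)   # restocking effort cannot be negative
            x = r * x + u            # population update with restocking
        return x

    print(simulate())                # settles near the target of 100.0
    ```

    The robustness claimed in the abstract shows up here as well: perturbing `r` (model uncertainty) changes the transient, but the integrator still steers the population to the target, because at equilibrium the accumulated error stops changing only when the measured population equals the target.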

    A user's guide to the Encyclopedia of DNA elements (ENCODE)

    The mission of the Encyclopedia of DNA Elements (ENCODE) Project is to enable the scientific and medical communities to interpret the human genome sequence and apply it to understand human biology and improve health. The ENCODE Consortium is integrating multiple technologies and approaches in a collective effort to discover and define the functional elements encoded in the human genome, including genes, transcripts, and transcriptional regulatory regions, together with their attendant chromatin states and DNA methylation patterns. In the process, standards to ensure high-quality data have been implemented, and novel algorithms have been developed to facilitate analysis. Data and derived results are made available through a freely accessible database. Here we provide an overview of the project and the resources it is generating and illustrate the application of ENCODE data to interpret the human genome.

    The role of nuclear technologies in the diagnosis and control of livestock diseases—a review
