
    A Machine Learning Trainable Model to Assess the Accuracy of Probabilistic Record Linkage

    Record linkage (RL) is the process of identifying and linking data that relate to the same physical entity across multiple heterogeneous data sources. Deterministic linkage methods rely on the presence of common, uniquely identifying attributes across all sources, while probabilistic approaches use non-unique attributes and calculate similarity indices for pairwise comparisons. A key component of record linkage is accuracy assessment: the process of manually verifying and validating matched pairs to further refine linkage parameters and increase overall effectiveness. This process, however, is time-consuming and impractical when applied to large administrative data sources where millions of records must be linked. It is also potentially biased, as the gold standard used is often the reviewer's intuition. In this paper, we present an approach for assessing and refining the accuracy of probabilistic linkage based on different supervised machine learning methods (decision trees, naïve Bayes, logistic regression, random forest, linear support vector machines and gradient boosted trees). We used data sets extracted from large Brazilian socioeconomic and public health care data sources. The models were evaluated using receiver operating characteristic curves and the sensitivity, specificity and positive predictive values obtained from 10-fold cross-validation. Results show that logistic regression outperforms the other classifiers and enables the creation of a generalized, highly accurate model to validate linkage results.
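
    The evaluation protocol described above (several classifiers compared by 10-fold cross-validation on manually labelled pairs) maps directly onto standard machine learning tooling. The following is a minimal sketch of that protocol only, not the authors' code: the feature matrix X of pairwise similarity scores and the match labels y are placeholders.

```python
# Sketch: compare candidate classifiers on labelled record pairs with 10-fold
# cross-validation. X is assumed to hold per-pair similarity scores and y the
# manually reviewed match labels; both are synthetic placeholders here.
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = rng.random((1000, 5))                                       # placeholder features
y = (X.mean(axis=1) + 0.1 * rng.standard_normal(1000) > 0.5).astype(int)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "naive Bayes": GaussianNB(),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "linear SVM": LinearSVC(),
    "gradient boosted trees": GradientBoostingClassifier(random_state=0),
}

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
    print(f"{name:>22s}: mean ROC AUC = {scores.mean():.3f} (+/- {scores.std():.3f})")
```

    Swapping the scoring argument for "recall" or "precision" recovers the sensitivity and positive predictive value reported in the abstract.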

    Cognitive behaviour therapy versus counselling intervention for anxiety in young people with high-functioning autism spectrum disorders: a pilot randomised controlled trial

    The use of cognitive-behavioural therapy (CBT) as a treatment for children and adolescents with autism spectrum disorder (ASD) has been explored in a number of trials. Whilst CBT appears superior to no treatment or treatment as usual, few studies have assessed CBT against a control group receiving an alternative therapy. Our randomised controlled trial compared CBT against person-centred counselling for anxiety in 36 young people with ASD aged 12–18. Outcome measures included parent-, teacher- and self-reports of anxiety and social disability. Whilst each therapy produced improvements in participants, neither therapy was significantly superior to the other on any measure. This is consistent with findings for adults.

    Design and Culture in the Making of Happiness

    Design responds to the needs of individuals, and happiness and well-being have become the subject of a growing number of studies, giving rise to a new discipline, Positive Psychology. From these new approaches and concerns with subjective well-being comes Positive Design, whose objective is to promote the well-being of individuals and communities in connection with a culture of innovation. Cultural routes made accessible through wayfinding systems bring Heritage into dialogue and emphasize the culture, memory and history of communities, providing citizens with meaningful experiences with both short- and long-term impact, and thus become agents for the happiness of individuals. This article discusses the concept of Positive Design based on Positive Psychology, analyzes the evolution and importance of Heritage in the Culture of peoples and communities, and questions how wayfinding systems developed for cultural promotion can integrate the practice of Positive Design and how this contributes to the subjective well-being of individuals.

    The use of a physiologically based pharmacokinetic model to evaluate deconvolution measurements of systemic absorption

    BACKGROUND: An unknown input function can be determined by deconvolution using the systemic bolus input function (r), which is itself determined using an experimental input of duration ranging from a few seconds to many minutes. The quantitative relation between the duration of the input and the accuracy of r is unknown. Although a large number of deconvolution procedures have been described, these routines are not available in a convenient software package. METHODS: Four deconvolution methods are implemented in a new, user-friendly software program (PKQuest). Three of these methods are characterized by input parameters that are adjusted by the user to provide the "best" fit. A new approach is used to determine these parameters, based on the assumption that the input can be approximated by a gamma distribution. The deconvolution methodologies are evaluated using data generated from a physiologically based pharmacokinetic (PBPK) model. RESULTS AND CONCLUSIONS: The 11-compartment PBPK model is accurately described by either a 2- or 3-exponential function, depending on whether or not there is significant tissue binding. For an accurate estimate of r, the first venous sample should be taken at or before the end of the constant infusion, and a long (10-minute) constant infusion is preferable to a bolus injection. For noisy data, gamma-distribution deconvolution provides the best result if the input has the form of a gamma distribution. For other input functions, good results are obtained using deconvolution methods based on modeling the input with either a B-spline or a uniform dense set of time points.
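
    The gamma-distribution idea above lends itself to a compact least-squares formulation: parameterize the unknown input as a scaled gamma density, convolve it with the measured bolus response r(t), and adjust the parameters until the predicted curve matches the observed concentrations. The sketch below illustrates that formulation only; it is not the PKQuest implementation, and the response function, sampling grid and synthetic data are all assumptions.

```python
# Illustrative gamma-distribution deconvolution: the unknown input is modelled
# as a scaled gamma density, convolved with the bolus response r(t), and fitted
# to the observed concentration curve by least squares. All data are synthetic.
import numpy as np
from scipy.optimize import least_squares
from scipy.stats import gamma

dt = 0.5                                          # sampling interval (min), assumed
t = np.arange(0, 120, dt)
r = np.exp(-t / 20) / 20                          # placeholder bolus response function
true_input = 5.0 * gamma.pdf(t, a=3.0, scale=4.0)
c_obs = np.convolve(true_input, r)[:t.size] * dt  # synthetic "measured" concentrations

def residuals(params):
    dose, shape, scale = params
    u = dose * gamma.pdf(t, a=shape, scale=scale)  # candidate gamma-shaped input
    c_model = np.convolve(u, r)[:t.size] * dt      # predicted concentration curve
    return c_model - c_obs

fit = least_squares(residuals, x0=[1.0, 2.0, 2.0],
                    bounds=([0, 0.1, 0.1], [np.inf, 50, 50]))
print("fitted dose, shape, scale:", fit.x)
```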

    Physiologically based pharmacokinetic modeling of arterial – antecubital vein concentration difference

    BACKGROUND: Modeling of pharmacokinetic parameters and pharmacodynamic actions requires knowledge of the arterial blood concentration. In most cases, experimental measurements are only available for a peripheral vein (usually antecubital) whose concentration may differ significantly from both the arterial and central vein concentration. METHODS: A physiologically based pharmacokinetic (PBPK) model for the tissues drained by the antecubital vein (referred to as the "arm") is developed. It is assumed that the "arm" is composed of tissues with the same properties (partition coefficient, blood flow per gram) as the whole-body tissues, plus a new "tissue" representing skin arteriovenous shunts. The antecubital vein concentration depends on the following parameters: the fractions of "arm" blood flow contributed by muscle, skin, adipose, connective tissue and arteriovenous shunts, and the flow per gram of the arteriovenous shunt. The values of these parameters were investigated using simultaneous experimental measurements of arterial and antecubital concentrations for eight solutes: ethanol, thiopental, (99)Tc(m)-diethylene triamine pentaacetate (DTPA), ketamine, D(2)O, acetone, methylene chloride and toluene. A new procedure is described that can be used to determine the arterial concentration for an arbitrary solute by deconvolution of the antecubital concentration. These procedures are implemented in PKQuest, a general PBPK program that is freely distributed. RESULTS: One set of "standard arm" parameters provides an adequate description of the arterial/antecubital vein concentration for ethanol, DTPA, thiopental and ketamine. A significantly different set of "arm" parameters was required to describe the data for D(2)O, acetone, methylene chloride and toluene – probably because the "arm" is in a different physiological state. CONCLUSIONS: Using the set of "standard arm" parameters, the antecubital vein concentration can be used to determine the whole-body PBPK model parameters for an arbitrary solute without any additional adjustable parameters. The antecubital vein concentration can also be used to estimate the arterial concentration for an arbitrary input for solutes for which no arterial concentration data are available.
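
    One plausible flow-weighted reading of this "arm" model (our paraphrase, not necessarily the authors' exact formulation) is that the antecubital vein concentration mixes the venous outflow of each tissue with arterial blood carried through the shunts:

    $C_{\mathrm{antecubital}}(t) \approx \sum_i f_i\, C_{v,i}(t) + f_{\mathrm{shunt}}\, C_{\mathrm{arterial}}(t)$, with $\sum_i f_i + f_{\mathrm{shunt}} = 1$,

    where i runs over muscle, skin, adipose and connective tissue, $f_i$ are the fractions of "arm" blood flow and $C_{v,i}(t)$ are the tissue venous concentrations predicted by the PBPK model.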

    Transition probabilities for general birth-death processes with applications in ecology, genetics, and evolution

    A birth-death process is a continuous-time Markov chain that counts the number of particles in a system over time. In the general process with $n$ current particles, a new particle is born with instantaneous rate $\lambda_n$ and a particle dies with instantaneous rate $\mu_n$. Currently no robust and efficient method exists to evaluate the finite-time transition probabilities in a general birth-death process with arbitrary birth and death rates. In this paper, we first revisit the theory of continued fractions to obtain expressions for the Laplace transforms of these transition probabilities and make explicit an important derivation connecting transition probabilities and continued fractions. We then develop an efficient algorithm for computing these probabilities that analyzes the error associated with the approximations in the method. We demonstrate that this error-controlled method agrees with known solutions and outperforms previous approaches to computing these probabilities. Finally, we apply our novel method to several important problems in ecology, evolution, and genetics.
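
    As a point of reference for what is being computed, the finite-time transition probabilities of a birth-death chain can be approximated by truncating the state space and exponentiating the generator matrix; the continued-fraction method of the paper avoids exactly this truncation. The sketch below shows only that naive baseline, with illustrative linear rates and an arbitrarily chosen cutoff N.

```python
# Baseline illustration: transition probabilities of a birth-death process via
# the matrix exponential of a truncated generator. This is NOT the paper's
# continued-fraction method; rates and the cutoff N are illustrative assumptions.
import numpy as np
from scipy.linalg import expm

N = 200                      # truncation level, assumed large enough

def birth_rate(n):           # lambda_n, linear birth rate (illustrative)
    return 0.5 * n

def death_rate(n):           # mu_n, linear death rate (illustrative)
    return 0.3 * n

Q = np.zeros((N + 1, N + 1)) # generator of the truncated chain
for n in range(N + 1):
    if n < N:
        Q[n, n + 1] = birth_rate(n)
    if n > 0:
        Q[n, n - 1] = death_rate(n)
    Q[n, n] = -Q[n].sum()    # rows of a generator sum to zero

t = 2.0
P = expm(Q * t)              # P[i, j] approximates Pr(X_t = j | X_0 = i)
print("P(X_t = 12 | X_0 = 10) ~", P[10, 12])
```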

    The academic backbone: longitudinal continuities in educational achievement from secondary school and medical school to MRCP(UK) and the specialist register in UK medical students and doctors

    Background: Selection of medical students in the UK is still largely based on prior academic achievement, although doubts have been expressed as to whether performance in earlier life is predictive of outcomes later in medical school or post-graduate education. This study analyses data from five longitudinal studies of UK medical students and doctors from the early 1970s until the early 2000s. Two of the studies used the AH5, a group test of general intelligence (that is, intellectual aptitude). Sex and ethnic differences were also analysed in light of the changing demographics of medical students over the past decades. Methods: Data from five cohort studies were available: the Westminster Study (began clinical studies from 1975 to 1982), the 1980, 1985, and 1990 cohort studies (entered medical school in 1981, 1986, and 1991), and the University College London Medical School (UCLMS) Cohort Study (entered clinical studies in 2005 and 2006). Different studies had different outcome measures, but most included performance in basic medical sciences and clinical examinations at medical school, performance in Membership of the Royal Colleges of Physicians (MRCP(UK)) examinations, and presence on the General Medical Council Specialist Register. Results: Correlation matrices and path analyses are presented. There were robust correlations across different years at medical school, and medical school performance also predicted MRCP(UK) performance and presence on the GMC Specialist Register. A-levels correlated somewhat less with undergraduate and post-graduate performance, but there was restriction of range in entrants. General Certificate of Secondary Education (GCSE)/O-level results also predicted undergraduate and post-graduate outcomes, although less strongly than A-level results did, and may have incremental validity for clinical and post-graduate performance. The AH5 had some significant correlations with outcome, but they were inconsistent. Sex and ethnicity also had predictive effects on measures of educational attainment and on undergraduate and post-graduate performance. Women performed better in assessments but were less likely to be on the Specialist Register. Non-white participants generally underperformed in undergraduate and post-graduate assessments, but were equally likely to be on the Specialist Register. There was a suggestion of smaller ethnicity effects in earlier studies. Conclusions: The existence of the academic backbone concept is strongly supported, with attainment at secondary school predicting performance in undergraduate and post-graduate medical assessments, the effects spanning many years. The academic backbone is conceptualized in terms of the development of more sophisticated underlying structures of knowledge ('cognitive capital' and 'medical capital'). It provides strong support for using measures of educational attainment, particularly A-levels, in student selection.

    Rituximab in B-Cell Hematologic Malignancies: A Review of 20 Years of Clinical Experience

    Rituximab is a human/murine chimeric anti-CD20 monoclonal antibody with established efficacy and a favorable, well-defined safety profile in patients with various CD20-expressing lymphoid malignancies, including indolent and aggressive forms of B-cell non-Hodgkin lymphoma. Since its first approval 20 years ago, intravenously administered rituximab has revolutionized the treatment of B-cell malignancies and has become a standard component of care for follicular lymphoma, diffuse large B-cell lymphoma, chronic lymphocytic leukemia, and mantle cell lymphoma. For all of these diseases, clinical trials have demonstrated that rituximab not only prolongs the time to disease progression but also extends overall survival. Efficacy benefits have also been shown in patients with marginal zone lymphoma and in more aggressive diseases such as Burkitt lymphoma. Although the proven clinical efficacy and success of rituximab have led to the development of other anti-CD20 monoclonal antibodies in recent years (e.g., obinutuzumab, ofatumumab, veltuzumab, and ocrelizumab), rituximab is likely to maintain a position within the therapeutic armamentarium because it is well established with a long history of successful clinical use. Furthermore, a subcutaneous formulation of the drug has been approved in both the EU and the USA for the treatment of B-cell malignancies. Using the wealth of data published on rituximab during the last two decades, we review the preclinical development of rituximab and the clinical experience gained in the treatment of hematologic B-cell malignancies, with a focus on the well-established intravenous route of administration. This article is a companion paper to A. Davies, et al., which is also published in this issue.

    Quantum Computing

    Quantum mechanics---the theory describing the fundamental workings of nature---is famously counterintuitive: it predicts that a particle can be in two places at the same time, and that two remote particles can be inextricably and instantaneously linked. These predictions have been the topic of intense metaphysical debate ever since the theory's inception early last century. However, supreme predictive power combined with direct experimental observation of some of these unusual phenomena leaves little doubt as to its fundamental correctness. In fact, without quantum mechanics we could not explain the workings of a laser, nor indeed how a fridge magnet operates. Over the last several decades quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit these unique quantum properties? Today it is understood that the answer is yes. Many research groups around the world are working towards one of the most ambitious goals humankind has ever embarked upon: a quantum computer that promises to exponentially improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for this task---ranging from single particles of light to superconducting circuits---and it is not yet clear which, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain what the major challenges are for the future. Comment: 26 pages, 7 figures, 291 references. Early draft of Nature 464, 45-53 (4 March 2010). The published version is more up-to-date and has several corrections, but is half the length with far fewer references.

    Estimating uncertainty in ecosystem budget calculations

    © The Authors, 2010. This article is distributed under the terms of the Creative Commons Attribution-Noncommercial License. The definitive version was published in Ecosystems 13 (2010): 239-248, doi:10.1007/s10021-010-9315-8. Ecosystem nutrient budgets often report values for pools and fluxes without any indication of uncertainty, which makes it difficult to evaluate the significance of findings or make comparisons across systems. We present an example, implemented in Excel, of a Monte Carlo approach to estimating error in calculating the N content of vegetation at the Hubbard Brook Experimental Forest in New Hampshire. The total N content of trees was estimated at 847 kg ha⁻¹ with an uncertainty of 8%, expressed as the standard deviation divided by the mean (the coefficient of variation). The individual sources of uncertainty were as follows: uncertainty in allometric equations (5%), uncertainty in tissue N concentrations (3%), uncertainty due to plot variability (6%, based on a sample of 15 plots of 0.05 ha), and uncertainty due to tree diameter measurement error (0.02%). In addition to allowing estimation of uncertainty in budget estimates, this approach can be used to assess which measurements should be improved to reduce uncertainty in the calculated values. This exercise was possible because the uncertainty in the parameters and equations that we used was made available by previous researchers. It is important to provide the error statistics with regression results if they are to be used in later calculations; archiving the data makes resampling analyses possible for future researchers. When conducted using a Monte Carlo framework, the analysis of uncertainty in complex calculations does not have to be difficult and should be standard practice when constructing ecosystem budgets.
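
    The structure of this kind of Monte Carlo propagation is simple enough to sketch outside Excel. The following illustration (not the authors' workbook) draws independent, normally distributed relative errors for each source listed above, using the coefficients of variation reported in the abstract, and recovers a combined CV of roughly 8%; the independence and normality assumptions are ours.

```python
# Sketch of the Monte Carlo idea: independent relative errors are drawn for each
# uncertainty source and applied multiplicatively to the mean N content, and the
# coefficient of variation of the simulated totals is reported. The CVs come
# from the abstract; independence and normality are our assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_draws = 100_000
mean_n_content = 847.0        # kg N per ha (from the abstract)

cv_sources = {                # relative standard deviations from the abstract
    "allometric_equations": 0.05,
    "tissue_n_concentration": 0.03,
    "plot_variability": 0.06,
    "diameter_measurement": 0.0002,
}

totals = np.full(n_draws, mean_n_content)
for cv in cv_sources.values():
    totals *= rng.normal(1.0, cv, n_draws)   # apply each error source in turn

print(f"mean = {totals.mean():.0f} kg/ha, CV = {totals.std() / totals.mean():.1%}")
```

    Dropping one source at a time from the loop shows how much each measurement contributes to the overall uncertainty, which is the "which measurements should be improved" question raised above.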