
    Estimating parameters for probabilistic linkage of privacy-preserved datasets.

    Background: Probabilistic record linkage is a process used to bring together person-based records from within the same dataset (de-duplication) or from disparate datasets using pairwise comparisons and matching probabilities. The linkage strategy and associated match probabilities are often estimated through investigations into data quality and manual inspection. However, as privacy-preserved datasets comprise encrypted data, such methods are not possible. In this paper, we present a method for estimating the probabilities and threshold values for probabilistic privacy-preserved record linkage using Bloom filters. Methods: Our method was tested through a simulation study using synthetic data, followed by an application using real-world administrative data. Synthetic datasets were generated with error rates from zero to 20%. Our method was used to estimate parameters (probabilities and thresholds) for de-duplication linkages. Linkage quality was determined by F-measure. Each dataset was privacy-preserved using separate Bloom filters for each field. Match probabilities were estimated using the expectation-maximisation (EM) algorithm on the privacy-preserved data. Threshold cut-off values were determined by an extension to the EM algorithm allowing linkage quality to be estimated for each possible threshold. De-duplication linkages of each privacy-preserved dataset were performed using both estimated and calculated probabilities. Linkage quality using the F-measure at the estimated threshold values was also compared to the highest F-measure. Three large administrative datasets were used to demonstrate the applicability of the probability and threshold estimation technique on real-world data. Results: Linkage of the synthetic datasets using the estimated probabilities produced an F-measure comparable to the F-measure using calculated probabilities, even with up to 20% error. Linkage of the administrative datasets using estimated probabilities produced an F-measure higher than the F-measure using calculated probabilities. Further, the threshold estimation yielded F-measure results only slightly below the highest possible for those probabilities. Conclusions: The method appears highly accurate across a spectrum of datasets with varying degrees of error. As there are few alternatives for parameter estimation, this method is a major step towards a complete operational approach for probabilistic linkage of privacy-preserved datasets.
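
    The two estimation ingredients described above, field-level Bloom-filter encoding and EM estimation of match probabilities, can be illustrated with a minimal sketch. The snippet below is not the paper's implementation: the bigram hashing scheme, the parameter values, and the single-field Fellegi-Sunter EM step are simplifying assumptions chosen for brevity.

```python
import hashlib

def bloom_encode(value, num_hashes=10, filter_len=100):
    """Encode one string field as the set of set-bit positions of a Bloom
    filter over character bigrams (an illustrative privacy-preserving encoding)."""
    bigrams = [value[i:i + 2] for i in range(len(value) - 1)]
    bits = set()
    for gram in bigrams:
        for k in range(num_hashes):
            digest = hashlib.sha1(f"{k}:{gram}".encode()).hexdigest()
            bits.add(int(digest, 16) % filter_len)
    return bits

def dice_similarity(a, b):
    """Dice coefficient between two encoded fields; thresholding it yields a
    binary field agreement for a candidate record pair."""
    return 2 * len(a & b) / (len(a) + len(b)) if (a or b) else 1.0

def em_match_probabilities(agreements, p=0.1, m=0.9, u=0.1, iters=50):
    """Single-field Fellegi-Sunter EM: from 0/1 field agreements over candidate
    pairs, estimate the match proportion p and the m- and u-probabilities."""
    for _ in range(iters):
        # E-step: posterior probability that each pair is a true match.
        g = [p * (m if a else 1 - m) /
             (p * (m if a else 1 - m) + (1 - p) * (u if a else 1 - u))
             for a in agreements]
        # M-step: re-estimate the parameters from the posteriors.
        total = sum(g)
        p = total / len(g)
        m = sum(gi for gi, a in zip(g, agreements) if a) / total
        u = sum(1 - gi for gi, a in zip(g, agreements) if a) / (len(g) - total)
    return p, m, u
```

    The paper extends the same EM idea so that linkage quality can be estimated at each candidate threshold; the sketch above stops at the probability estimates.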

    Applying human factors methods to explore ‘Work as Imagined’ and ‘Work as Done’ in the Emergency Department’s response to chemical, biological, radiological, and nuclear events

    The Emergency Department (ED) is a complex, hectic, and high-pressured environment. Chemical, Biological, Radiological, and Nuclear (CBRN) events are multi-faceted emergencies that present numerous challenges to ED staff (first receivers), involving large-scale trauma and consequently requiring a combination of complex responses. Human Factors and Ergonomics (HF/E) methods such as Hierarchical Task Analysis (HTA) have been used in healthcare research. However, HF/E methods and theory have not been combined to understand how the ED responds to CBRN events. This study aimed to compare Work as Imagined (WAI) and Work as Done (WAD) in the ED CBRN response in a UK-based hospital. WAI was established by carrying out document analyses of a CBRN plan, and WAD by exploring first receivers' responses to CBRN scenario cards. The responses were converted to HTAs and compared. The WAI HTAs showed 4-8 phases of general organizational responsibilities during a CBRN event, whereas the WAD HTAs placed emphasis on diagnosing and treating presenting conditions. A comparison of WAI and WAD HTAs highlighted common actions and tasks. This study identified three key differences between WAI and WAD in the ED CBRN response: 1) documentation of the CBRN event, 2) treating the patient, and 3) diagnosing the presenting complaint. Findings from this study provide an evidence base which can be used to inform future clinical policy and practice in providing safe and high-quality care during CBRN events in the ED.

    A genome-wide linkage study of mammographic density, a risk factor for breast cancer

    Abstract Introduction Mammographic breast density is a highly heritable (h² > 0.6) and strong risk factor for breast cancer. We conducted a genome-wide linkage study to identify loci influencing mammographic breast density (MD). Methods Epidemiological data were assembled on 1,415 families from the Australia, Northern California and Ontario sites of the Breast Cancer Family Registry, and additional families recruited in Australia and Ontario. Families consisted of sister pairs with age-matched mammograms and data on factors known to influence MD. Single nucleotide polymorphism (SNP) genotyping was performed on 3,952 individuals using the Illumina Infinium 6K linkage panel. Results Using a variance components method, genome-wide linkage analysis was performed using quantitative traits obtained by adjusting MD measurements for known covariates. Our primary trait was formed by fitting a linear model to the square root of the percentage of the breast area that was dense (PMD), adjusting for age at mammogram, number of live births, menopausal status, weight, height, weight squared, and menopausal hormone therapy. The maximum logarithm of odds (LOD) score from the genome-wide scan was on chromosome 7p14.1-p13 (LOD = 2.69; 63.5 cM) for covariate-adjusted PMD, with a 1-LOD interval spanning 8.6 cM. A similar signal was seen for the covariate-adjusted dense area of the breast (DA) phenotype. Simulations showed that the complete sample had adequate power to detect LOD scores of 3 or 3.5 for a locus accounting for 20% of phenotypic variance. A modest peak initially seen on chromosome 7q32.3-q34 increased in strength when only the 513 families with at least two sisters below 50 years of age were included in the analysis (LOD 3.2; 140.7 cM, 1-LOD interval spanning 9.6 cM). In a subgroup analysis, we also found a LOD score of 3.3 for the DA phenotype on chromosome 12p11.22-q13.11 (60.8 cM, 1-LOD interval spanning 9.3 cM), overlapping a region identified in a previous study. Conclusions The suggestive peaks and the larger linkage signal seen in the subset of pedigrees with younger participants highlight regions of interest for further study to identify genes that determine MD, with the goal of understanding mammographic density and its involvement in susceptibility to breast cancer.
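
    The covariate adjustment described for the primary trait amounts to taking residuals from a linear model for the square root of PMD. A minimal sketch of that step is shown below; the file and column names are hypothetical, and the study's actual variable coding may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical input: one row per mammogram with the covariates named in the
# abstract (age at mammogram, parity, menopausal status, weight, height, MHT).
df = pd.read_csv("mammographic_density.csv")
df["sqrt_pmd"] = np.sqrt(df["percent_dense"])

model = smf.ols(
    "sqrt_pmd ~ age_at_mammogram + live_births + menopausal_status"
    " + weight + I(weight ** 2) + height + hormone_therapy",
    data=df,
).fit()

# The residuals serve as the covariate-adjusted quantitative trait that is
# passed to the variance-components linkage analysis.
df["adjusted_pmd"] = model.resid
```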

    Fixed points for cyclic R-contractions and solution of nonlinear Volterra integro-differential equations

    In this paper, we introduce the notion of a cyclic R-contraction mapping and then study the existence of fixed points for such mappings in the framework of metric spaces. Examples and an application are presented to support the main result. Our results unify, complement, and generalize various comparable results in the existing literature. http://link.springer.com/journal/11784
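
    For readers unfamiliar with cyclic mappings, the classical cyclic contraction condition of Kirk, Srinivasan and Veeramani is recalled below as background; results for cyclic R-contractions of the kind studied here typically weaken this contractive inequality, and the precise R-contraction definition is given in the paper itself.

```latex
% Classical cyclic contraction: for nonempty closed subsets A, B of a complete
% metric space (X, d), a map T : A \cup B \to A \cup B with T(A) \subseteq B
% and T(B) \subseteq A satisfying
\[
  d(Tx, Ty) \le k\, d(x, y), \qquad 0 \le k < 1, \quad x \in A,\ y \in B,
\]
% has a unique fixed point in A \cap B.
```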

    The semi-classical expansion and resurgence in gauge theories: new perturbative, instanton, bion, and renormalon effects

    We study the dynamics of four-dimensional gauge theories with adjoint fermions for all gauge groups, both in perturbation theory and non-perturbatively, by using circle compactification with periodic boundary conditions for the fermions. There are new gauge phenomena. We show that, to all orders in perturbation theory, many gauge groups are Higgsed by the gauge holonomy around the circle to a product of both abelian and nonabelian gauge group factors. Non-perturbatively there are monopole-instantons with fermion zero modes and two types of monopole-anti-monopole molecules, called bions. One type is the "magnetic bion", which carries net magnetic charge and induces a mass gap for gauge fluctuations. The other type is the "neutral bion", which is magnetically neutral, and its understanding requires a generalization of multi-instanton techniques in quantum mechanics - which we refer to as the Bogomolny-Zinn-Justin (BZJ) prescription - to compactified field theory. The BZJ prescription applied to bion-anti-bion topological molecules predicts a singularity on the positive real axis of the Borel plane (i.e., a divergence from summing large orders in perturbation theory) which is of order N times closer to the origin than the leading 4-d BPST instanton-anti-instanton singularity, where N is the rank of the gauge group. The position of the bion-anti-bion singularity is thus qualitatively similar to that of the 4-d IR renormalon singularity, and we conjecture that they are continuously related as the compactification radius is changed. By making use of transseries and Ecalle's resurgence theory we argue that a non-perturbative continuum definition of a class of field theories which admit semi-classical expansions may be possible. Comment: 112 pages, 7 figures; v2: typos corrected, discussion of supersymmetric models added at the end of section 8.1, reference added.
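
    The relative locations of the singularities discussed above can be summarized schematically. The expressions below combine the standard 4-d instanton action with the abstract's statement that the bion-anti-bion singularity lies a factor of order N closer to the origin; normalization conventions may differ from the paper's.

```latex
% Standard 4-d BPST instanton action and the Borel-plane singularity from
% instanton--anti-instanton events:
\[
  S_I = \frac{8\pi^2}{g^2}, \qquad t_{I\bar{I}} = 2 S_I = \frac{16\pi^2}{g^2}.
\]
% The bion--anti-bion singularity predicted by the BZJ prescription then sits
% roughly a factor N closer to the origin,
\[
  t_{\mathcal{B}\bar{\mathcal{B}}} \sim \frac{2 S_I}{N} = \frac{16\pi^2}{g^2 N},
\]
% parametrically where the 4-d IR renormalon singularity is expected.
```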

    New algorithm improves fine structure of the barley consensus SNP map

    Background: The need to integrate information from multiple linkage maps is a long-standing problem in genetics. One way to visualize the complex ordinal relationships is with a directed graph, where each vertex in the graph is a bin of markers. When there are no ordering conflicts between the linkage maps, the result is a directed acyclic graph, or DAG, which can then be linearized to produce a consensus map. Results: New algorithms for the simplification and linearization of consensus graphs have been implemented as a package for the R computing environment called DAGGER. The simplified consensus graphs produced by DAGGER exactly capture the ordinal relationships present in a series of linkage maps. Using either linear or quadratic programming, DAGGER generates a consensus map with minimum error relative to the linkage maps while remaining ordinally consistent with them. Both linearization methods produce consensus maps that are compressed relative to the mean of the linkage maps. After rescaling, however, the consensus maps had higher accuracy (and higher marker density) than the individual linkage maps in genetic simulations. When applied to four barley linkage maps genotyped at nearly 3000 SNP markers, DAGGER produced a consensus map with improved fine structure compared to the existing barley consensus SNP map. The root-mean-squared error between the linkage maps and the DAGGER map was 0.82 cM per marker interval compared to 2.28 cM for the existing consensus map. Examination of the barley hardness locus at the 5HS telomere, for which there is a physical map, confirmed that the DAGGER output was more accurate for fine structure analysis. Conclusions: The R package DAGGER is an effective, freely available resource for integrating the information from a set of consistent linkage maps.
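
    The ordinal-consistency idea behind DAGGER is easy to illustrate: each linkage map contributes "appears before" edges to a directed graph, and a conflict-free set of maps yields a DAG that can be linearized. The sketch below uses Python with hypothetical marker names rather than the DAGGER R package, and plain topological sorting in place of the linear or quadratic programming used to fit consensus distances.

```python
import networkx as nx

# Two toy linkage maps, each an ordered list of marker bins.
maps = [
    ["m1", "m2", "m3", "m5"],
    ["m1", "m3", "m4", "m5"],
]

# Each adjacent pair within a map contributes an "appears before" edge.
graph = nx.DiGraph()
for linkage_map in maps:
    graph.add_edges_from(zip(linkage_map, linkage_map[1:]))

if nx.is_directed_acyclic_graph(graph):
    # Conflict-free maps give a DAG; any topological order is ordinally
    # consistent with every input map.
    print(list(nx.topological_sort(graph)))
else:
    print("Ordering conflict between maps: the graph contains a cycle.")
```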

    The computational therapeutic: exploring Weizenbaum's ELIZA as a history of the present

    This paper explores the history of ELIZA, a computer programme approximating a Rogerian therapist, developed by Joseph Weizenbaum at MIT in the 1960s, as an early AI experiment. ELIZA’s reception provoked Weizenbaum to re-appraise the relationship between ‘computer power and human reason’ and to attack the ‘powerful delusional thinking’ about computers and their intelligence that he understood to be widespread in the general public and also amongst experts. The root issue for Weizenbaum was whether human thought could be ‘entirely computable’ (reducible to logical formalism). This also provoked him to re-consider the nature of machine intelligence and to question the instantiation of its logics in the social world, which would come to operate, he said, as a ‘slow acting poison’. Exploring Weizenbaum’s 20th-century apostasy, in the light of ELIZA, illustrates ways in which contemporary anxieties and debates over machine smartness connect to earlier formations. In particular, this article argues that it is in its designation as a computational therapist that ELIZA is most significant today. ELIZA points towards a form of human–machine relationship now pervasive, a precursor of the ‘machinic therapeutic’ condition we find ourselves in, and thus speaks very directly to questions concerning modulation, autonomy, and the new behaviorism that are currently arising.

    Lithic technological responses to Late Pleistocene glacial cycling at Pinnacle Point Site 5-6, South Africa

    There are multiple hypotheses for human responses to glacial cycling in the Late Pleistocene, including changes in population size, interconnectedness, and mobility. Lithic technological analysis informs us of human responses to environmental change because lithic assemblage characteristics are a reflection of raw material transport, reduction, and discard behaviors that depend on hunter-gatherer social and economic decisions. Pinnacle Point Site 5-6 (PP5-6), Western Cape, South Africa is an ideal locality for examining the influence of glacial cycling on early modern human behaviors because it preserves a long sequence spanning marine isotope stages (MIS) 5, 4, and 3 and is associated with robust records of paleoenvironmental change. The analysis presented here addresses the question: what lithic assemblage traits at PP5-6, if any, represent changing behavioral responses to the MIS 5-4-3 interglacial-glacial cycle? It statistically evaluates changes in 93 traits with no a priori assumptions about which traits may significantly associate with MIS. In contrast to other studies that claim that there is little relationship between broad-scale patterns of climate change and lithic technology, we identified the following characteristics that are associated with MIS 4: increased use of quartz, increased evidence for outcrop sources of quartzite and silcrete, increased evidence for earlier stages of reduction in silcrete, evidence for increased flaking efficiency in all raw material types, and changes in tool types and function for silcrete. Based on these results, we suggest that foragers responded to MIS 4 glacial environmental conditions at PP5-6 with increased population or group sizes, 'place provisioning', longer and/or more intense site occupations, and decreased residential mobility. Several other traits, including silcrete frequency, do not exhibit an association with MIS. Backed pieces, once they appear in the PP5-6 record during MIS 4, persist through MIS 3. Changing paleoenvironments explain some, but not all, temporal technological variability at PP5-6. Funding: Social Science and Humanities Research Council of Canada; NORAM; American-Scandinavian Foundation; Fundacao para a Ciencia e Tecnologia [SFRH/BPD/73598/2010]; IGERT [DGE 0801634]; Hyde Family Foundations; Institute of Human Origins; National Science Foundation [BCS-9912465, BCS-0130713, BCS-0524087, BCS-1138073]; John Templeton Foundation to the Institute of Human Origins at Arizona State University.
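
    The trait-by-trait screening described above can be sketched as a series of association tests followed by a multiple-comparison correction. The snippet below is illustrative only: the file name, column names, choice of chi-square tests, and FDR correction are assumptions rather than the statistical procedure reported in the paper.

```python
import pandas as pd
from scipy.stats import chi2_contingency
from statsmodels.stats.multitest import multipletests

# Hypothetical table: one row per artifact, a categorical MIS column
# (5, 4 or 3), and one column per recorded lithic trait.
df = pd.read_csv("pp5_6_lithics.csv")
trait_columns = [c for c in df.columns if c != "mis"]

pvalues = []
for trait in trait_columns:
    table = pd.crosstab(df["mis"], df[trait])
    _, p, _, _ = chi2_contingency(table)
    pvalues.append(p)

# Control the false discovery rate across all traits tested.
rejected, _, _, _ = multipletests(pvalues, alpha=0.05, method="fdr_bh")
associated_traits = [t for t, keep in zip(trait_columns, rejected) if keep]
print(associated_traits)
```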

    Quantifying Age-Related Differences in Information Processing Behaviors When Viewing Prescription Drug Labels

    Adverse drug events (ADEs) are a significant problem in health care. While effective warnings have the potential to reduce the prevalence of ADEs, little is known about how patients access and use prescription labeling. We investigated the effectiveness of prescription warning labels (PWLs: small, colorful stickers applied at the pharmacy) in conveying warning information to two groups of patients (young adults and adults aged 50 and over). We evaluated the early stages of information processing by tracking eye movements while participants interacted with prescription vials that had PWLs affixed to them. We later tested participants’ recognition memory for the PWLs. During viewing, participants often failed to attend to the PWLs; this effect was more pronounced for older than younger participants. Older participants also performed worse on the subsequent memory test. However, when memory performance was conditionalized on whether or not the participant had fixated the PWL, these age-related differences in memory were no longer significant, suggesting that the difference in memory performance between groups was attributable to differences in attention rather than to differences in memory encoding or recall. This is important because older adults are recognized to be at greater risk for ADEs. These data provide a compelling case that understanding consumers’ attentive behavior is crucial to developing an effective labeling standard for prescription drugs.
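
    The key analytic move, conditionalizing recognition memory on whether the warning label was fixated, reduces to comparing a conditional proportion with an unconditional one. The sketch below is illustrative; the file and column names are hypothetical and do not reflect the study's actual data structure.

```python
import pandas as pd

# Hypothetical trial-level data: one row per participant-label pair, with
# columns age_group ("younger"/"older"), fixated (bool), recognized (bool).
trials = pd.read_csv("pwl_eye_tracking.csv")

# Unconditional recognition accuracy by age group.
overall = trials.groupby("age_group")["recognized"].mean()

# Accuracy conditionalized on having fixated the warning label during viewing;
# the abstract reports that the age difference is no longer significant here.
fixated_only = trials[trials["fixated"]].groupby("age_group")["recognized"].mean()

print(overall)
print(fixated_only)
```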