
    A Two-Light Version of the Classical Hundred Prisoners and a Light Bulb Problem: Optimizing Experimental Design through Simulations

    We propose five original strategies of successively increasing complexity and efficiency that address a novel version of a classical mathematical problem which, in essence, concerns the determination of an optimal protocol for exchanging limited amounts of information among a group of subjects with various prerogatives. The inherent intricacy of the problem-solving protocols precludes an analytical solution, so we implemented a large-scale simulation study to search exhaustively through an extensive list of competing algorithms associated with the five generally defined protocols. Our results show that the consecutive improvements in the average time to strategy-specific problem-solving completion over the previous, simpler and less advantageously structured designs were 18%, 30%, 12%, and 9%, respectively. The optimal multi-stage information exchange strategy allows for successful execution of the task of interest in 1722 days (4.7 years) on average, with a standard deviation of 385 days. Execution of this protocol took as few as 1004 days and as many as 4965 days, with a median of 1616 days.
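
    The abstract does not spell out the five two-light protocols, so as a grounding reference, here is a minimal Monte Carlo sketch in Python of the classical single-light "counter" strategy that studies of this kind benchmark against; the function and parameter names are illustrative, and nothing below reproduces the authors' optimized designs.

```python
import random

def simulate_counter_strategy(n=100, trials=500, seed=0):
    """Estimate the average completion time (in days) of the classical
    one-light 'counter' strategy by Monte Carlo simulation."""
    rng = random.Random(seed)
    total_days = 0
    for _ in range(trials):
        signalled = [False] * n   # True once prisoner i has turned the light on
        light_on = False
        count = 0                 # signals collected by the counter (prisoner 0)
        day = 0
        while count < n - 1:      # the counter must collect n - 1 signals
            day += 1
            visitor = rng.randrange(n)
            if visitor == 0:      # the designated counter
                if light_on:
                    count += 1
                    light_on = False
            elif not light_on and not signalled[visitor]:
                light_on = True   # each non-counter signals exactly once
                signalled[visitor] = True
        total_days += day
    return total_days / trials

print(f"mean completion time: {simulate_counter_strategy():.0f} days")
```

    The same harness extends naturally to richer designs, such as the two-light protocols studied here, by changing the per-visit rules.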

    A Novel Correction for the Adjusted Box-Pierce Test

    The classical Box-Pierce and Ljung-Box tests for autocorrelation of residuals exhibit severe deviations from nominal type I error rates. Previous studies have attempted to address this issue by either revising existing tests or designing new techniques. The adjusted Box-Pierce test achieves the best results with respect to attaining type I error rates close to nominal values. This paper proposes a further correction to the adjusted Box-Pierce test that possesses near-perfect type I error rates. The approach is based on an inflation of the rejection region for all sample sizes and lags, calculated via a linear model applied to simulated data encompassing a large range of data scenarios. Our results show that the new approach possesses the best type I error rates of all goodness-of-fit time series statistics.
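
    For readers who want the uncorrected baselines in hand, the sketch below computes the classical Box-Pierce and Ljung-Box statistics with their usual chi-square p-values; it is a textbook Python implementation, not the authors' adjusted test or its new rejection-region inflation.

```python
import numpy as np
from scipy import stats

def portmanteau_tests(resid, lags=10, fitted_params=0):
    """Classical Box-Pierce and Ljung-Box statistics for residual
    autocorrelation, with chi-square p-values (df = lags minus the
    number of fitted ARMA parameters)."""
    r = np.asarray(resid, dtype=float)
    n = len(r)
    r = r - r.mean()
    denom = np.sum(r ** 2)
    ks = np.arange(1, lags + 1)
    # sample autocorrelations rho_1 .. rho_lags
    rho = np.array([np.sum(r[k:] * r[:-k]) for k in ks]) / denom
    q_bp = n * np.sum(rho ** 2)                       # Box-Pierce
    q_lb = n * (n + 2) * np.sum(rho ** 2 / (n - ks))  # Ljung-Box
    df = lags - fitted_params
    return {"box_pierce": (q_bp, stats.chi2.sf(q_bp, df)),
            "ljung_box": (q_lb, stats.chi2.sf(q_lb, df))}

# white noise should be rejected at roughly the nominal rate
white = np.random.default_rng(1).standard_normal(200)
print(portmanteau_tests(white))
```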

    Deinstitutionalized patients, homelessness and imprisonment: A systematic review

    Background: Reports linking the deinstitutionalisation of psychiatric care with homelessness and imprisonment have been published widely. Aims: To identify cohort studies that followed up or traced back long-term psychiatric hospital residents who had been discharged as a consequence of deinstitutionalisation. Method: A broad search strategy was used; 9435 titles and abstracts were screened, 416 full articles reviewed, and 171 articles from cohort studies of deinstitutionalised patients examined in detail. Results: Twenty-three studies of unique populations assessed homelessness and imprisonment among patients discharged from long-term care. Homelessness and imprisonment occurred sporadically; in the majority of studies not a single case of homelessness or imprisonment was reported. Conclusions: Our results contradict the findings of ecological studies, which indicated a strong correlation between the decreasing number of psychiatric beds and an increasing number of people with mental health problems who were homeless or in prison.

    The Chapman Bone Algorithm: A Diagnostic Alternative for the Evaluation of Osteoporosis

    Osteoporosis is the most common metabolic bone disease and goes largely undiagnosed throughout the world due to the inaccessibility of DXA machines. Multivariate analyses of serum bone turnover markers were evaluated in 226 Orange County, California, residents with the intent to determine whether serum osteocalcin and serum pyridinoline cross-links could be used to detect the onset of osteoporosis as effectively as a DXA scan. Descriptive analyses of the demographic and lab characteristics of the participants were performed through frequency, mean, and standard deviation estimations. We implemented logistic regression modeling to find the best classification algorithm for osteoporosis. All calculations and model-building steps were carried out using the R statistical language. Through these analyses, a mathematical algorithm with diagnostic potential was created. This algorithm showed a sensitivity of 1.0 and a specificity of 0.83, with an area under the receiver operating characteristic curve of 0.93, thus demonstrating high predictability for osteoporosis. Our intention is for this algorithm to be used to evaluate osteoporosis in locations where access to DXA scanning is scarce.
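
    As a rough illustration of the modeling pipeline described above, here is a hedged Python sketch of a logistic regression classifier evaluated by sensitivity, specificity, and AUC. The paper used the R language and real serum marker data; the feature names, units, and generated data below are stand-ins, not the published Chapman algorithm or its coefficients.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, confusion_matrix

# Toy stand-in data: the actual predictors were serum bone turnover
# markers (osteocalcin, pyridinoline cross-links) plus demographics;
# these distributions and coefficients are fabricated for illustration.
rng = np.random.default_rng(0)
n = 226
X = np.column_stack([
    rng.normal(20, 6, n),   # serum osteocalcin (illustrative scale)
    rng.normal(5, 1.5, n),  # pyridinoline cross-links (illustrative)
    rng.normal(60, 12, n),  # age in years
])
logit = -8 + 0.12 * X[:, 0] + 0.4 * X[:, 1] + 0.05 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))  # DXA-defined label (synthetic)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

prob = model.predict_proba(X_te)[:, 1]
tn, fp, fn, tp = confusion_matrix(y_te, prob >= 0.5).ravel()
print(f"AUC={roc_auc_score(y_te, prob):.2f}  "
      f"sensitivity={tp / (tp + fn):.2f}  specificity={tn / (tn + fp):.2f}")
```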

    Predicting Suicidal and Self-Injurious Events in a Correctional Setting Using AI Algorithms on Unstructured Medical Notes and Structured Data

    Suicidal and self-injurious incidents in correctional settings deplete institutional and healthcare resources and create disorder and stress for staff and other inmates. Traditional statistical analyses provide some guidance, but they can only be applied to structured data that are often difficult to collect, and their recommendations are often expensive to act upon. This study aims to extract information from medical and mental health progress notes using AI algorithms to make actionable predictions of suicidal and self-injurious events, improving the efficiency of triage for health care services and preventing suicidal and injurious events at California's Orange County Jails. The results showed that the notes data contain more information with respect to suicidal or injurious behaviors than the structured data available in the EHR database at the Orange County Jails. Using the notes data alone (under-sampled to 50%) in a Transformer Encoder model produced an AUC-ROC of 0.862, a sensitivity of 0.816, and a specificity of 0.738. Incorporating the information extracted from the notes data into traditional machine learning models as a feature alongside structured data (under-sampled to 50%) yielded better performance in terms of sensitivity (AUC-ROC: 0.77, sensitivity: 0.89, specificity: 0.65). In addition, under-sampling proved an effective approach to mitigating the impact of the extremely imbalanced classes.
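
    The under-sampling step is the most readily reproducible part of this pipeline. Below is a minimal Python sketch of random majority-class under-sampling to a 50/50 balance, as described above; the function name and toy data are illustrative, and the Transformer Encoder itself is omitted.

```python
import numpy as np

def undersample_to_balance(X, y, rng=None):
    """Randomly under-sample the majority class until the two classes
    are the same size (the '50%' balance referenced above). X is a
    NumPy feature array; y is a binary label array."""
    rng = rng or np.random.default_rng(0)
    y = np.asarray(y)
    pos, neg = np.flatnonzero(y == 1), np.flatnonzero(y == 0)
    minority, majority = (pos, neg) if len(pos) < len(neg) else (neg, pos)
    keep_majority = rng.choice(majority, size=len(minority), replace=False)
    idx = rng.permutation(np.concatenate([minority, keep_majority]))
    return X[idx], y[idx]

# toy usage: 1,000 samples with ~5% positives, mimicking a heavily
# imbalanced EHR-style outcome
rng = np.random.default_rng(42)
X = rng.standard_normal((1000, 8))
y = (rng.random(1000) < 0.05).astype(int)
Xb, yb = undersample_to_balance(X, y, rng)
print(yb.mean())  # ~0.5 after balancing
```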

    Investigative power of Genomic Informational Field Theory (GIFT) relative to GWAS for genotype-phenotype mapping

    Identifying associations between phenotype and genotype is the fundamental basis of genetic analyses. Inspired by frequentist probability and the work of R.A. Fisher, genome-wide association studies (GWAS) extract information using averages and variances from genotype-phenotype datasets. Averages and variances are legitimated by creating distribution density functions obtained through the grouping of data into categories. However, as data from within a given category cannot be differentiated, the investigative power of such methodologies is limited. Genomic Informational Field Theory (GIFT) is a method specifically designed to circumvent this issue. The way GIFT proceeds is opposite to that of GWAS. Whilst GWAS determines the extent to which genes are involved in phenotype formation (a bottom-up approach), GIFT determines the degree to which the phenotype can select microstates (genes) for its subsistence (a top-down approach). Doing so requires dealing with new genetic concepts, namely genetic paths, upon which significance levels for genotype-phenotype associations can be determined. By using different datasets obtained in Ovis aries related to bone growth (Dataset-1) and to a series of linked metabolic and epigenetic pathways (Dataset-2), we demonstrate that removing the informational barrier linked to categories enhances the investigative and discriminative powers of GIFT, namely that GIFT extracts more information than GWAS. We conclude by suggesting that GIFT is an adequate tool to study how phenotypic plasticity and genetic assimilation are linked.
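
    On one reading of the genetic-path idea (a simplified interpretation, not the authors' significance machinery): individuals are ranked by phenotype, the genotype at a locus is encoded numerically, and the cumulative sum along the phenotype ordering forms a path whose excursions signal association. The Python sketch below illustrates that reading with a permutation test; all names and thresholds are illustrative assumptions.

```python
import numpy as np

def genetic_path(phenotype, genotype):
    """Rank individuals by phenotype, centre the numeric genotype
    coding (e.g. 0/1/2 allele counts), and take the cumulative sum
    along the phenotype ordering. An unassociated locus yields a path
    close to a random bridge; association appears as an excursion."""
    order = np.argsort(phenotype)
    coded = genotype[order].astype(float)
    coded -= coded.mean()          # path starts and ends at zero
    return np.cumsum(coded)

def path_excursion_pvalue(phenotype, genotype, n_perm=2000, seed=0):
    """Permutation p-value for the path's maximum absolute excursion."""
    rng = np.random.default_rng(seed)
    obs = np.max(np.abs(genetic_path(phenotype, genotype)))
    null = np.array([
        np.max(np.abs(genetic_path(rng.permutation(phenotype), genotype)))
        for _ in range(n_perm)
    ])
    return (1 + np.sum(null >= obs)) / (1 + n_perm)

# toy example: 0/1/2 allele counts with a mild additive effect
rng = np.random.default_rng(1)
g = rng.integers(0, 3, 500)
p = 0.3 * g + rng.standard_normal(500)
print(path_excursion_pvalue(p, g))
```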

    Optimal Multi-Stage Arrhythmia Classification Approach

    Arrhythmia constitutes a problem with the rate or rhythm of the heartbeat, and an early diagnosis is essential for the timely inception of successful treatment. We have jointly optimized an entire multi-stage arrhythmia classification scheme based on 12-lead surface ECGs that attains the accuracy of professional cardiologists. The new approach comprises a three-step noise reduction stage, a novel feature extraction method, and an optimal classification model with finely tuned hyperparameters. We carried out an exhaustive study comparing thousands of competing classification algorithms trained on our proprietary, large, and expertly labeled dataset of 12-lead ECGs from 40,258 patients with four arrhythmia classes: atrial fibrillation, general supraventricular tachycardia, sinus bradycardia, and sinus rhythm (including sinus irregularity). Our results show that the optimal approach, consisting of a low band-pass filter, robust LOESS, non-local means smoothing, a proprietary feature extraction method based on percentiles of the empirical distribution of ratios of interval lengths and magnitudes of peaks and valleys, and an extreme gradient boosting tree classifier, achieved an F1-score of 0.988 on patients without additional cardiac conditions. The same noise reduction and feature extraction methods combined with a gradient boosting tree classifier achieved an F1-score of 0.97 on patients with additional cardiac conditions. Our method achieved the highest classification accuracy (average 10-fold cross-validation F1-score of 0.992) on external validation data, the MIT-BIH arrhythmia database. The proposed optimal multi-stage arrhythmia classification approach can dramatically benefit automatic ECG data analysis by providing cardiologist-level accuracy and robust compatibility with various ECG data sources.
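
    As a hedged illustration of the kind of feature described above (the paper's extractor is proprietary and more elaborate), the Python sketch below detects peaks and valleys in a single lead and summarises the distributions of consecutive interval-length ratios and peak/valley magnitude ratios by their percentiles; features like these could then feed a gradient boosting classifier. Thresholds and names are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def beat_ratio_features(ecg, fs=500, percentiles=(5, 25, 50, 75, 95)):
    """Summarise one ECG lead by percentiles of ratio distributions:
    consecutive peak-to-peak interval ratios and consecutive peak and
    valley magnitude ratios."""
    min_gap = int(0.25 * fs)               # refractory gap of at least 250 ms
    peaks, _ = find_peaks(ecg, distance=min_gap, height=0.3)
    valleys, _ = find_peaks(-ecg, distance=min_gap, height=0.3)
    rr = np.diff(peaks).astype(float)      # interval lengths in samples
    feats = []
    for series in (rr[1:] / rr[:-1],                        # interval ratios
                   ecg[peaks][1:] / ecg[peaks][:-1],        # peak magnitude ratios
                   ecg[valleys][1:] / ecg[valleys][:-1]):   # valley magnitude ratios
        feats.extend(np.percentile(series, percentiles))
    return np.array(feats)

# toy quasi-periodic signal standing in for a single ECG lead (10 s at 500 Hz)
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 500)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 15 + 0.05 * rng.standard_normal(t.size)
print(beat_ratio_features(ecg).round(3))
```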

    John Latham’s cosmos and mid-century representation

    The conceptual artist John Latham (1921-2006) is sometimes cast as disconnected from the currents of British visual culture. Latham's idiosyncratic cosmology, based upon time and events and incorporating human creativity rather than matter and energy, is often used to mark this disconnection. However, this paper argues that his work can be seen as closely related to that of other mid-century cultural producers who were engaged with alternative cosmic speculations, and as part of a broader shift in the register of representation. Papers from the Latham digital archive help make this case.