58 research outputs found

    Facts and Fabrications about Ebola: A Twitter Based Study

    Full text link
    Microblogging websites like Twitter have been shown to be immensely useful for spreading information on a global scale within seconds. The detrimental effect of such platforms, however, is that misinformation and rumors are as likely to spread on the network as credible, verified information. From a public health standpoint, the spread of misinformation creates unnecessary panic for the public. We recently witnessed several such scenarios during the outbreak of Ebola in 2014 [14, 1]. In order to counter medical misinformation effectively and in a timely manner, our goal here is to study the nature of such misinformation and rumors in the United States during fall 2014, when a handful of Ebola cases were confirmed in North America. It is a well-known convention on Twitter to use hashtags to give context to a Twitter message (a tweet). In this study, we collected approximately 47M tweets related to Ebola from the Twitter streaming API. Based on hashtags, we propose a method to classify the tweets into two sets: credible and speculative. We analyze these two sets, study how they differ in terms of a number of features extracted from the Twitter API, and infer several interesting differences between them. We outline potential directions for using this material to monitor and separate speculative tweets from credible ones, enabling improved public health information. Comment: Appears in SIGKDD BigCHat Workshop 201
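As a rough illustration of the hashtag-based partition this abstract describes, the sketch below sorts tweets into credible and speculative sets by the hashtags they contain. The hashtag lists and the tie-breaking rule are illustrative assumptions, not the study's actual criteria.

```python
# Hypothetical hashtag vocabularies; the paper derives its own sets from the data.
CREDIBLE_TAGS = {"#cdc", "#who", "#ebolafacts"}
SPECULATIVE_TAGS = {"#ebolahoax", "#ebolaconspiracy"}

def classify_tweet(text: str) -> str:
    """Label a tweet 'credible', 'speculative', or 'unlabeled' by its hashtags."""
    tags = {tok.lower() for tok in text.split() if tok.startswith("#")}
    if tags & CREDIBLE_TAGS and not tags & SPECULATIVE_TAGS:
        return "credible"
    if tags & SPECULATIVE_TAGS and not tags & CREDIBLE_TAGS:
        return "speculative"
    # no hashtags, or conflicting signals
    return "unlabeled"
```

A streaming pipeline would apply this per tweet and then compare features (retweet counts, account age, etc.) between the two resulting sets.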

    A Video Bioinformatics Method to Quantify Cell Spreading and Its Application to Cells Treated with Rho-Associated Protein Kinase and Blebbistatin

    Get PDF
    Commercial software is available for performing video bioinformatics analysis on cultured cells. Such software is convenient and can often be used to create suitable protocols for quantitative analysis of video data with relatively little background in image processing. This chapter demonstrates that CL-Quant, a commercial program produced by DRVision, can be used to automatically analyze cell spreading in time-lapse videos of human embryonic stem cells (hESC). Two cell spreading protocols were developed and tested. One was professionally created by engineers at DRVision and adapted to this project; the other was created by an undergraduate student with one month of experience using CL-Quant. Both protocols successfully segmented small spreading colonies of hESC and were, in general, in good agreement with the ground truth measured using ImageJ. Overall, the professional protocol performed better segmentation, while the user-generated protocol demonstrated that someone with relatively little CL-Quant background can successfully create protocols. The protocols were applied to hESC that had been treated with ROCK inhibitors or blebbistatin, which tend to cause rapid attachment and spreading of hESC colonies. All treatments enabled hESC to attach rapidly. Cells treated with the ROCK inhibitors or blebbistatin spread more than controls and often looked stressed. The spreading analysis protocol can provide a very rapid method to evaluate the cytotoxicity of chemical treatments and reveal effects on the cell's cytoskeleton. While hESC are presented in this chapter, other cell types could also be used in conjunction with the spreading protocol.
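The core measurement behind such protocols can be sketched very simply: threshold each time-lapse frame and track the foreground area over time, with an increasing area indicating spreading. This is a minimal numpy-only stand-in assuming a global intensity threshold; CL-Quant's actual segmentation is proprietary and considerably more sophisticated.

```python
import numpy as np

def cell_area(frame: np.ndarray, threshold: float) -> int:
    """Count pixels above the intensity threshold as cell/colony area."""
    return int(np.count_nonzero(frame > threshold))

def spreading_curve(frames, threshold: float):
    """Area per frame; a rising curve indicates the colony is spreading."""
    return [cell_area(f, threshold) for f in frames]
```

Comparing curves between treated (e.g. ROCK inhibitor) and control videos would then quantify the difference in spreading rate.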

    Developing a systems and informatics based approach to lifestyle monitoring within eHealth: part I - technology and data management

    Get PDF
    Lifestyle monitoring forms a subset of telecare in which data derived from sensors located in the home are used to identify variations in behaviour that are indicative of a change in care needs. Key to this are the performance of the sensors themselves and the way in which information from multiple sources is integrated within the decision-making process. The paper therefore considers the functions of the key sensors currently deployed and places their operation within the context of a proposed multi-level system structure which takes due cognisance of the requisite informatics framework.
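As a toy illustration of "identifying variations in behaviour", the sketch below flags a day whose sensor-event count departs from a baseline by more than a few standard deviations. This single-level statistical rule is a crude stand-in for the multi-level decision structure the paper proposes; the threshold `k` is an arbitrary assumption.

```python
import statistics

def flag_deviation(baseline_counts, today_count, k: float = 2.0) -> bool:
    """Flag a day whose daily sensor-event count deviates from the baseline
    mean by more than k standard deviations."""
    mu = statistics.mean(baseline_counts)
    sd = statistics.pstdev(baseline_counts) or 1.0  # guard against zero variance
    return abs(today_count - mu) / sd > k
```

A real system would fuse multiple sensor streams (movement, appliance use, door contacts) before any such decision is made.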

    Mechatronics & the cloud

    Get PDF
    Conventionally, the engineering design process has assumed that the design team is able to exercise control over all elements of the design, either directly or, in the case of sub-systems, indirectly through their specifications. The introduction of Cyber-Physical Systems (CPS) and the Internet of Things (IoT) means that a design team can no longer assume control over all elements of a system, particularly as the actual system configuration may well be dynamically reconfigured in real time according to user (and vendor) context and need. Additionally, the integration of the Internet of Things with elements of Big Data means that information becomes a commodity to be autonomously traded by and between systems, again according to context and need, all of which has implications for the privacy of system users. The paper therefore considers the relationship between mechatronics and cloud-based technologies in relation to issues such as the distribution of functionality and user privacy.

    A Parallel Deconvolution Algorithm in Perfusion Imaging

    Get PDF

    Global disease monitoring and forecasting with Wikipedia

    Full text link
    Infectious disease is a leading threat to public health, economic stability, and other key social structures. Efforts to mitigate these impacts depend on accurate and timely monitoring to measure the risk and progress of disease. Traditional, biologically-focused monitoring techniques are accurate but costly and slow; in response, new techniques based on social internet data such as social media and search queries are emerging. These efforts are promising, but important challenges in the areas of scientific peer review, breadth of diseases and countries, and forecasting hamper their operational usefulness. We examine a freely available, open data source for this use: access logs from the online encyclopedia Wikipedia. Using linear models, language as a proxy for location, and a systematic yet simple article selection procedure, we tested 14 location-disease combinations and demonstrate that these data feasibly support an approach that overcomes these challenges. Specifically, our proof-of-concept yields models with r^2 up to 0.92, forecasting value up to the 28 days tested, and several pairs of models similar enough to suggest that transferring models from one location to another without re-training is feasible. Based on these preliminary results, we close with a research agenda designed to overcome these challenges and produce a disease monitoring and forecasting system that is significantly more effective, robust, and globally comprehensive than the current state of the art. Comment: 27 pages; 4 figures; 4 tables. Version 2: Cite McIver & Brownstein and adjust novelty claims accordingly; revise title; various revisions for clarity
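The "linear models" step can be sketched as an ordinary least-squares fit of case counts against Wikipedia article view counts, scored with the r^2 statistic the abstract reports. The data shapes below are assumptions for illustration; the paper's actual feature selection and time-lagging are more involved.

```python
import numpy as np

def fit_disease_model(views: np.ndarray, cases: np.ndarray) -> np.ndarray:
    """OLS fit of case counts on article view counts.
    views: (n_periods, n_articles); cases: (n_periods,).
    Returns coefficients with the intercept first."""
    X = np.column_stack([np.ones(len(views)), views])
    coef, *_ = np.linalg.lstsq(X, cases, rcond=None)
    return coef

def r_squared(views: np.ndarray, cases: np.ndarray, coef: np.ndarray) -> float:
    """Coefficient of determination of the fitted model."""
    X = np.column_stack([np.ones(len(views)), views])
    pred = X @ coef
    ss_res = float(np.sum((cases - pred) ** 2))
    ss_tot = float(np.sum((cases - np.mean(cases)) ** 2))
    return 1.0 - ss_res / ss_tot
```

Forecasting then amounts to fitting against view counts lagged by up to 28 days and evaluating r^2 at each lag.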

    Automated Classification System for HEp-2 Cell Patterns

    Get PDF
    Human Epithelial Type-2 (HEp-2) cells are essential in diagnosing autoimmune diseases. Indirect immunofluorescence (IIF) imaging is a fundamental technique for detecting antinuclear antibodies in HEp-2 cells. The four main HEp-2 cell patterns to be identified are nucleolar, homogeneous, speckled, and centromere. The most commonly used method to classify the patterns is manual evaluation, which is prone to human error. This paper proposes an automated method for classifying HEp-2 cell patterns. The first stage is image enhancement using histogram-equalization contrast adjustment and a Wiener filter. The second stage uses a Sobel filter and a mean filter for segmentation. The third stage performs feature extraction based on shape properties. The last stage classifies the patterns based on the extracted shape features. The results obtained are more than 90% for nucleolar and centromere and about 70% for homogeneous and speckled. For future work, another feature extraction method needs to be introduced to increase the accuracy of the classification result; the suggested approach is to extract and analyze data based on image texture.
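The first two stages of this pipeline can be sketched with plain numpy: histogram-equalization contrast adjustment, then a Sobel edge map thresholded into a segmentation mask. This is a drastically simplified stand-in; the paper's Wiener and mean filters are omitted, and the threshold rule is an assumption.

```python
import numpy as np

def equalize(img: np.ndarray) -> np.ndarray:
    """Stage 1: histogram-equalization contrast adjustment for 8-bit data."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(float)
    span = cdf.max() - cdf.min()
    cdf = (cdf - cdf.min()) / (span if span else 1.0) * 255.0
    return cdf[img.astype(np.uint8)]

def sobel_magnitude(img: np.ndarray) -> np.ndarray:
    """Sobel gradient magnitude (valid interior region only)."""
    f = img.astype(float)
    gx = (f[2:, :-2] + 2 * f[2:, 1:-1] + f[2:, 2:]) \
       - (f[:-2, :-2] + 2 * f[:-2, 1:-1] + f[:-2, 2:])
    gy = (f[:-2, 2:] + 2 * f[1:-1, 2:] + f[2:, 2:]) \
       - (f[:-2, :-2] + 2 * f[1:-1, :-2] + f[2:, :-2])
    return np.hypot(gx, gy)

def segment(img: np.ndarray) -> np.ndarray:
    """Stage 2 (simplified): threshold the edge map at its mean intensity."""
    mag = sobel_magnitude(img)
    return mag > mag.mean()
```

Stages 3-4 would then compute shape descriptors (area, perimeter, compactness) per segmented region and feed them to a classifier.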

    Looking in the Right Place for Anomalies: Explainable AI through Automatic Location Learning

    Full text link
    Deep learning has now become the de facto approach to the recognition of anomalies in medical imaging. The 'black box' way such networks classify medical images into anomaly labels, however, poses problems for their acceptance, particularly with clinicians. Current explainable AI methods offer justifications through visualizations such as heat maps but cannot guarantee that the network is focusing on the relevant image region fully containing the anomaly. In this paper, we develop an approach to explainable AI in which the anomaly, when present, is assured to overlap the expected location. This is made possible by automatically extracting location-specific labels from textual reports and learning the association of expected locations to labels using a hybrid combination of Bi-Directional Long Short-Term Memory Recurrent Neural Networks (Bi-LSTM) and DenseNet-121. Using this expected location to bias the subsequent attention-guided inference network, based on ResNet-101, results in the isolation of the anomaly at the expected location when present. The method is evaluated on a large chest X-ray dataset. Comment: 5 pages; paper presented as a poster at the International Symposium on Biomedical Imaging, 2020, Paper Number 65
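The label-extraction step can be illustrated with simple keyword matching: pair each finding mentioned in a free-text report with its nearest preceding location mention. The vocabularies below are a small hypothetical subset, and keyword matching stands in for the paper's Bi-LSTM tagger.

```python
import re

# Hypothetical, illustrative vocabularies; the paper learns these from reports.
LOCATIONS = ["left upper lobe", "right lower lobe", "left lung",
             "right lung", "cardiac silhouette"]
FINDINGS = ["opacity", "effusion", "nodule", "cardiomegaly"]

def extract_location_labels(report: str):
    """Return (finding, location) pairs from a free-text radiology report,
    attaching each finding to the nearest preceding location mention."""
    text = report.lower()
    pairs = []
    for finding in FINDINGS:
        for m in re.finditer(finding, text):
            best = None
            for loc in LOCATIONS:
                i = text.rfind(loc, 0, m.start())
                if i != -1 and (best is None or i > best[0]):
                    best = (i, loc)
            pairs.append((finding, best[1] if best else "unspecified"))
    return pairs
```

Such (finding, location) pairs are what would supervise the location-association network, which in turn biases the attention-guided classifier.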
    • ā€¦
    corecore