
    Improving Bayesian statistics understanding in the age of Big Data with the bayesvl R package

    The exponential growth of social data, in both volume and complexity, has increasingly exposed many of the shortcomings of the conventional frequentist approach to statistics, and the scientific community has called for careful usage of the approach and its inferences. Meanwhile, the alternative method, Bayesian statistics, still faces considerable barriers to more widespread application. The bayesvl R package is an open program designed for implementing Bayesian modeling and analysis using the Stan language's No-U-Turn Sampler (NUTS). The package combines the ability to construct Bayesian network models using directed acyclic graphs (DAGs), the Markov chain Monte Carlo (MCMC) simulation technique, and the graphics capability of the ggplot2 package. As a result, it can improve the user experience and intuitive understanding when constructing and analyzing Bayesian network models. A case example is offered to illustrate the usefulness of the package for Big Data analytics and cognitive computing.
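    The workflow the abstract describes (specify a model as a DAG, sample it with NUTS, plot diagnostics) can be illustrated outside R. The sketch below is not the bayesvl API; it is an analogous workflow in Python using PyMC, whose default sampler is also NUTS, with the model, data, and variable names invented for illustration.

```python
# Analogous Bayesian workflow in Python with PyMC (not the bayesvl R API).
# The model, synthetic data, and variable names are illustrative assumptions.
import numpy as np
import pymc as pm
import arviz as az

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 1.5 * x + rng.normal(scale=0.5, size=100)  # synthetic observations

with pm.Model() as model:
    # Priors and likelihood together define the model's directed acyclic graph
    alpha = pm.Normal("alpha", mu=0, sigma=1)
    beta = pm.Normal("beta", mu=0, sigma=1)
    sigma = pm.HalfNormal("sigma", sigma=1)
    pm.Normal("y_obs", mu=alpha + beta * x, sigma=sigma, observed=y)
    # MCMC with the No-U-Turn Sampler (NUTS), as in Stan
    idata = pm.sample(1000, tune=1000, chains=4)

az.plot_trace(idata)  # trace and density diagnostics, akin to bayesvl's ggplot2 output
```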

    Unmasking Clever Hans Predictors and Assessing What Machines Really Learn

    Current learning machines have successfully solved hard application problems, reaching high accuracy and displaying seemingly "intelligent" behavior. Here we apply recent techniques for explaining decisions of state-of-the-art learning machines and analyze various tasks from computer vision and arcade games. This showcases a spectrum of problem-solving behaviors ranging from naive and short-sighted to well-informed and strategic. We observe that standard performance evaluation metrics can be oblivious to distinguishing these diverse problem-solving behaviors. Furthermore, we propose our semi-automated Spectral Relevance Analysis, which provides a practically effective way of characterizing and validating the behavior of nonlinear learning machines. This helps to assess whether a learned model indeed delivers reliably for the problem it was conceived for. Finally, our work intends to add a voice of caution to the ongoing excitement about machine intelligence and pledges to evaluate and judge some of these recent successes in a more nuanced manner. Comment: Accepted for publication in Nature Communications.
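    As a rough sketch of the spectral step in such an analysis: given per-sample relevance heatmaps (e.g., from Layer-wise Relevance Propagation, assumed precomputed here), one can cluster them spectrally and inspect small clusters for anomalous "Clever Hans" strategies. This is a minimal reconstruction from the abstract, not the authors' released implementation.

```python
# Minimal sketch of a Spectral Relevance Analysis-style step, assuming
# `heatmaps` holds precomputed relevance maps of shape (n_samples, H, W).
import numpy as np
from sklearn.cluster import SpectralClustering

def spray_clusters(heatmaps: np.ndarray, n_clusters: int = 5) -> np.ndarray:
    """Cluster flattened relevance heatmaps; small clusters are candidate
    anomalous ('Clever Hans') decision strategies worth manual inspection."""
    flat = heatmaps.reshape(len(heatmaps), -1)
    # Normalize each map so clustering compares spatial structure, not scale
    flat = flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12)
    labels = SpectralClustering(
        n_clusters=n_clusters, affinity="nearest_neighbors", random_state=0
    ).fit_predict(flat)
    return labels

# Usage: labels = spray_clusters(heatmaps); then review samples in rare clusters.
```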

    Bubble-Wall Plot: A New Tool for Data Visualization

    This research aimed to design a new data visualization tool, named the Bubble-Wall Plot, and posited that it could be an effective tool for developing data visualization systems. The research reviewed seven data visualization approaches for identifying outliers, including Line Charts, Parallel Coordinates Plots, Scatter Plots, TreeMaps, Glyphs, Pixel-based techniques, and Radial visualizations, and summarized the challenges facing current approaches. Two principles were adopted to design the new tool: keep it simple, and keep it small. As a result, the newly designed Bubble-Wall Plot was successfully adopted to develop a warning system for identifying outliers at a case-study company, which was deployed for user acceptance testing in May 2021. The main contribution is that this newly designed tool, in the simplest style, was well designed and proven effective for developing a warning visualization system.
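    The abstract does not give construction details for the plot itself, so the following is a speculative sketch of a "wall of bubbles" outlier view: the grid layout, bubble sizing, and the |z| > 3 outlier rule are all assumptions made for illustration, not the paper's design.

```python
# Speculative sketch of a bubble-wall outlier view; layout, sizing, and the
# |z| > 3 flagging rule are assumptions, not the paper's actual construction.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
values = rng.normal(size=120)
values[[7, 42, 99]] += 6  # inject a few outliers

z = (values - values.mean()) / values.std()
cols = 12
xs = np.arange(len(values)) % cols   # grid column of each bubble
ys = np.arange(len(values)) // cols  # grid row of each bubble

plt.scatter(xs, ys,
            s=40 + 40 * np.abs(z),                       # size tracks magnitude
            c=np.where(np.abs(z) > 3, "red", "grey"))    # red flags outliers
plt.gca().invert_yaxis()
plt.title("Bubble wall: red bubbles exceed |z| > 3")
plt.show()
```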

    Navigated Ultrasound in Laparoscopic Surgery


    Travails in the third dimension: a critical evaluation of three-dimensional geographical visualization

    Several broad questions are posed about the role of the third dimension in data visualization. First, how far have we come in developing effective 3D displays for the analysis of spatial and other data? Second, when is it appropriate to use 3D techniques in visualising data, which 3D techniques are most appropriate for particular applications, and when might 2D approaches be more appropriate? (Indeed, is 3D always better than 2D?) Third, what can we learn from other communities in which 3D graphics and visualization technologies have been developed? And finally, what are the key R&D challenges in making effective use of the third dimension for visualising data across the spatial and related sciences? Answers to these questions will be based on several lines of evidence: the extensive literature on data and information visualization; visual perception research; computer games technology; and the author’s experiments with a prototype 3D data visualization system.

    Advanced Visualization and Intuitive User Interface Systems for Biomedical Applications

    Modern scientific research produces data at rates that far outpace our ability to comprehend and analyze it. Such sources include medical imaging data and computer simulations, where technological advancements and increasing spatiotemporal resolution generate growing amounts of data from each scan or simulation. A bottleneck has developed whereby medical professionals and researchers are unable to fully use the advanced information available to them. By integrating computer science, computer graphics, artistic ability and medical expertise, scientific visualization of medical data has become a new field of study. The objective of this thesis is to develop two visualization systems that use advanced visualization, natural user interface technologies and the large amount of biomedical data available to produce results of clinical utility and to overcome the data bottleneck that has developed. Computational Fluid Dynamics (CFD) is a tool used to study the quantities associated with the movement of blood by computer simulation. We developed methods of processing spatiotemporal CFD data and displaying it in stereoscopic 3D with the ability to spatially navigate through the data. We used this method with two sets of display hardware: a full-scale visualization environment and a small-scale desktop system. The advanced display and data navigation abilities provide the user with the means to better understand the relationship between the vessel's form and function. Low-cost 3D, depth-sensing cameras capture and process user body motion to recognize motions and gestures. Such devices allow users to use hand motions as an intuitive interface to computer applications. We developed algorithms to process and prepare the biomedical and scientific data for use with a custom control application. The application interprets user gestures as commands to a visualization tool and allows the user to control the visualization of multi-dimensional data. The intuitive interface allows the user to control the visualization of data without manual contact with an interaction device. In developing these methods and software tools we have leveraged recent trends in advanced visualization and intuitive interfaces in order to efficiently visualize biomedical data in a way that provides meaningful information and can be used to further appreciate it.
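    The gesture-to-command mapping the thesis describes can be sketched as a simple dispatcher: recognized gestures, assumed here to arrive as strings from a depth camera's recognizer, are translated into visualization commands. The gesture names and command set below are invented placeholders, not the thesis's actual interface.

```python
# Illustrative gesture-to-command dispatcher; gesture names, commands, and the
# upstream recognizer are assumptions, not the thesis's actual interface.
from typing import Callable, Dict

class VisualizationController:
    def __init__(self) -> None:
        self.zoom, self.angle, self.frame = 1.0, 0.0, 0

    def zoom_in(self) -> None:   self.zoom *= 1.1      # enlarge the view
    def rotate(self) -> None:    self.angle += 15.0    # orbit the dataset
    def step_time(self) -> None: self.frame += 1       # advance spatiotemporal data

def make_dispatcher(ctrl: VisualizationController) -> Dict[str, Callable[[], None]]:
    # Each recognized gesture maps to exactly one visualization command
    return {"swipe_up": ctrl.zoom_in, "circle": ctrl.rotate, "push": ctrl.step_time}

ctrl = VisualizationController()
dispatch = make_dispatcher(ctrl)
for gesture in ["swipe_up", "circle", "push"]:  # stand-in for a live gesture stream
    dispatch[gesture]()
```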

    Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data, we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy, and its hallmark will be ‘team science’.
    http://deepblue.lib.umich.edu/bitstream/2027.42/134522/1/13742_2016_Article_117.pd
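    One concrete instance of the automated classification of heterogeneous data the article surveys is a pipeline that handles mixed numeric and categorical features. The sketch below uses scikit-learn; the column names, toy records, and model choice are assumptions, not the article's setup.

```python
# Illustrative classification pipeline for heterogeneous (mixed-type) records;
# column names, toy data, and the model choice are illustrative assumptions.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame({
    "age": [34, 61, 47, 55], "biomarker": [1.2, 3.4, 2.1, 2.9],
    "scan_site": ["A", "B", "A", "C"], "diagnosis": [0, 1, 0, 1],
})

pre = ColumnTransformer([
    ("num", StandardScaler(), ["age", "biomarker"]),                 # scale numeric features
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["scan_site"]),  # encode categories
])
clf = Pipeline([("pre", pre), ("model", RandomForestClassifier(random_state=0))])
clf.fit(df.drop(columns="diagnosis"), df["diagnosis"])
```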

    TrauMAP - Integrating Anatomical and Physiological Simulation (Dissertation Proposal)

    In trauma, many injuries impact anatomical structures, which may in turn affect physiological processes - not only processes within the injured structures, but also ones occurring in physical proximity to them. Our goal with this research is to model mechanical interactions of different body systems and their impingement on underlying physiological processes. We are particularly concerned with pathological situations in which body system functions that normally do not interact become dependent as a result of mechanical behavior. Toward that end, the proposed TrauMAP system (Trauma Modeling of Anatomy and Physiology) consists of three modules: (1) a hypothesis generator for suggesting possible structural changes that result from the direct injuries sustained; (2) an information source for responding to operator queries about anatomical structures, physiological processes, and pathophysiological processes; and (3) a continuous system simulator for simulating and illustrating anatomical and physiological changes in three dimensions. Models that can capture such changes may serve as an infrastructure for more detailed modeling and benefit surgical planning, surgical training, and general medical education, enabling students to better visualize, in an interactive environment, certain basic anatomical and physiological dependencies.
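    The proposal's three-module decomposition can be captured as a skeletal architecture. In the sketch below, every class and method name is an invented placeholder for, respectively, the hypothesis generator, the query-driven information source, and the continuous simulator the proposal describes.

```python
# Skeletal sketch of the three-module TrauMAP architecture from the proposal;
# all class and method names are invented placeholders, not the real system.
from typing import List

class HypothesisGenerator:
    def suggest_structural_changes(self, injuries: List[str]) -> List[str]:
        """Module 1: propose structural changes that could follow direct injuries."""
        return [f"possible damage near {site}" for site in injuries]

class InformationSource:
    def query(self, structure: str) -> str:
        """Module 2: answer operator queries about anatomy and (patho)physiology."""
        return f"known anatomy/physiology for {structure}"

class ContinuousSimulator:
    def step(self, state: dict, dt: float) -> dict:
        """Module 3: advance coupled anatomical/physiological state through time."""
        return dict(state)  # placeholder dynamics: real system integrates ODEs

# The modules cooperate: generated hypotheses seed simulations, and queries
# let the operator inspect the structures and processes involved.
```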