
    Adoption of Free Open Source Geographic Information System Solution for Health Sector in Zanzibar Tanzania

    The study aims at developing an in-depth understanding of how Open Source Geographic Information System technology is used to provide solutions for data visualization in the health sector of Zanzibar, Tanzania. The study focuses on implementing health visualization solutions to bridge the gap during the transition from proprietary software to Free Open-Source Software using the Key Indicator Data System. The developed tool facilitates data integration between the two District Health Information Software versions and hence serves as a gateway solution during the transition process. Implementation challenges were also identified, including outdated spatial data and the reluctance of key users to adopt the new Geographical Information System technologies. Participatory action research and interviews were used to understand the requirements for the new tool and to facilitate smooth system development for better health service delivery.

    Supporting the active learning of collaborative database browsing techniques

    We describe the implications of a study of database browsing behaviour for the development of a system to support more effective browsing. In particular we consider the importance of collaborative working, both in learning browsing skills and in co-operating on a shared information-retrieval task. From our study, we believe that an interface to support collaboration should promote awareness of the activities of others, better visualization of the information structures being browsed, and effective communication of the browsing process.

    Information visualization for business applications

    Business applications are generating ever-increasing amounts of data, and more and more of these data arrive in real time. Data mining is a standard feature in most data management systems for retrieving complex information, but the challenge is how to represent this information effectively for better analysis and decision making. Even with this rich information, the challenge remains to derive the most business value from the ever-increasing amount of available information. Information visualization can go a long way towards solving this problem by making such complex, high-volume data representable in business applications. Through this research, an attempt will be made to address this problem by prototyping an interactive, data-driven business application through visualization. The key focus of this research work is the 'Presentation Tier' of a business application, visually representing information in an aesthetic manner for better understanding.
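    The abstract above does not describe a specific toolkit, but a minimal sketch can illustrate what an interactive "presentation tier" over tabular business data might look like. The endpoint of the idea is a chart the user can hover, zoom, and filter rather than a static report figure; the data, column names, and chart choice below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch: a minimal interactive "presentation tier" built with
# Plotly Express over toy business data (all names/values are placeholders).
import pandas as pd
import plotly.express as px

# Toy sales data standing in for the output of a data-mining / query layer.
sales = pd.DataFrame({
    "month":   ["Jan", "Feb", "Mar", "Jan", "Feb", "Mar"],
    "region":  ["East", "East", "East", "West", "West", "West"],
    "revenue": [120_000, 135_000, 128_000, 98_000, 110_000, 121_000],
})

# Grouped bar chart with hover tooltips; zooming, hovering, and legend
# toggling are what distinguish this from a static report figure.
fig = px.bar(sales, x="month", y="revenue", color="region",
             barmode="group", title="Monthly revenue by region")
fig.show()  # renders in a browser or notebook output
```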

    Immersive and Collaborative Data Visualization Using Virtual Reality Platforms

    Effective data visualization is a key part of the discovery process in the era of big data. It is the bridge between the quantitative content of the data and human intuition, and thus an essential component of the scientific path from data into knowledge and understanding. Visualization is also essential in the data mining process, directing the choice of the applicable algorithms, and in helping to identify and remove bad data from the analysis. However, the high complexity or high dimensionality of modern data sets represents a critical obstacle. How do we visualize interesting structures and patterns that may exist in hyper-dimensional data spaces? A better understanding of how we can perceive and interact with multi-dimensional information poses some deep questions in the field of cognition technology and human-computer interaction. To this effect, we are exploring the use of immersive virtual reality platforms for scientific data visualization, both as software and as inexpensive commodity hardware. These potentially powerful and innovative tools for multi-dimensional data visualization can also provide an easy and natural path to collaborative data visualization and exploration, where scientists can interact with their data and their colleagues in the same visual space. Immersion provides benefits beyond the traditional desktop visualization tools: it leads to a demonstrably better perception of a datascape geometry, more intuitive data understanding, and a better retention of the perceived relationships in the data.
    Comment: 6 pages, refereed proceedings of 2014 IEEE International Conference on Big Data, page 609, ISBN 978-1-4799-5665-

    Bridging the Gap between NASA Hydrological Data and the Geospatial Community

    There is a vast and ever-increasing amount of data on the Earth's interconnected energy and hydrological systems, available from NASA remote sensing and modeling systems, and yet one challenge persists: increasing the usefulness of these data for, and thus their use by, the geospatial communities. The Hydrology Data and Information Services Center (HDISC), part of the Goddard Earth Sciences DISC, has continually worked to better understand the hydrological data needs of geospatial end users, to be better able to bridge the gap between NASA data and the geospatial communities. This paper will cover some of the hydrological data sets available from HDISC, and the various tools and services developed for data searching, data subsetting, format conversion, online visualization and analysis, interoperable access, etc., to facilitate the integration of NASA hydrological data by end users. The NASA Goddard data analysis and visualization system, Giovanni, is described. Two case examples of user-customized data services are given, involving the EPA BASINS (Better Assessment Science Integrating point & Non-point Sources) project and the CUAHSI Hydrologic Information System, with the common requirement of on-the-fly retrieval of long-duration time series for a geographical point.
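    To make the "on-the-fly retrieval of a long-duration time series for a geographical point" concrete, here is a sketch of the general request pattern. The endpoint, parameter names, and response format below are placeholders, not the real Giovanni/HDISC interface; the point is only that a client supplies a variable, a lat/lon location, and a date range, and gets back a parseable time series.

```python
# Hypothetical illustration only: the endpoint and parameters are NOT the real
# Giovanni/HDISC API; they sketch the general point-time-series request pattern.
import io
import requests
import pandas as pd

BASE_URL = "https://example.org/hydrology/timeseries"  # placeholder endpoint

params = {
    "variable": "precipitation",   # hypothetical parameter names and values
    "lat": 38.99,
    "lon": -76.84,
    "start": "2000-01-01",
    "end": "2010-12-31",
    "format": "csv",
}

resp = requests.get(BASE_URL, params=params, timeout=60)
resp.raise_for_status()

# Parse the returned CSV into a time-indexed series for further analysis.
ts = pd.read_csv(io.StringIO(resp.text), parse_dates=["date"], index_col="date")
print(ts.describe())
```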

    Deep learning with convolutional neural networks for decoding and visualization of EEG pathology

    We apply convolutional neural networks (ConvNets) to the task of distinguishing pathological from normal EEG recordings in the Temple University Hospital EEG Abnormal Corpus. We use two basic, shallow and deep ConvNet architectures recently shown to decode task-related information from EEG at least as well as established algorithms designed for this purpose. In decoding EEG pathology, both ConvNets reached substantially better accuracies (about 6% better, ~85% vs. ~79%) than the only published result for this dataset, and were still better when using only 1 minute of each recording for training and only six seconds of each recording for testing. We used automated methods to optimize architectural hyperparameters and found intriguingly different ConvNet architectures, e.g., with max pooling as the only nonlinearity. Visualizations of the ConvNet decoding behavior showed that they used spectral power changes in the delta (0-4 Hz) and theta (4-8 Hz) frequency range, possibly alongside other features, consistent with expectations derived from spectral analysis of the EEG data and from the textual medical reports. Analysis of the textual medical reports also highlighted the potential for accuracy increases by integrating contextual information, such as the age of subjects. In summary, the ConvNets and visualization techniques used in this study constitute a next step towards clinically useful automated EEG diagnosis and establish a new baseline for future work on this topic.
    Comment: Published at IEEE SPMB 2017, https://www.ieeespmb.org/2017
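    For readers unfamiliar with the "shallow ConvNet" style of EEG decoder mentioned above, the following PyTorch sketch shows the typical temporal-then-spatial convolution structure followed by pooling and a linear classifier. It is an assumption-laden illustration, not the authors' exact architecture: channel count, crop length, filter counts, and kernel sizes are placeholders.

```python
# Minimal sketch (assumptions, not the published architecture): a shallow
# ConvNet for binary EEG classification, in the spirit of the model above.
import torch
import torch.nn as nn

class ShallowEEGNet(nn.Module):
    def __init__(self, n_channels=21, n_samples=600, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution applied to each electrode's signal.
            nn.Conv2d(1, 40, kernel_size=(1, 25)),
            # Spatial convolution mixing information across electrodes.
            nn.Conv2d(40, 40, kernel_size=(n_channels, 1)),
            nn.BatchNorm2d(40),
            nn.ELU(),
            # Temporal pooling; the abstract notes variants where pooling is
            # effectively the only nonlinearity.
            nn.AvgPool2d(kernel_size=(1, 75), stride=(1, 15)),
            nn.Dropout(0.5),
        )
        # Infer the flattened feature size with a dummy forward pass.
        with torch.no_grad():
            n_out = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(n_out, n_classes)

    def forward(self, x):  # x: (batch, 1, channels, time)
        return self.classifier(self.features(x).flatten(start_dim=1))

# Example forward pass on a random batch of 2-second crops at 300 Hz.
model = ShallowEEGNet()
logits = model(torch.randn(8, 1, 21, 600))
print(logits.shape)  # torch.Size([8, 2])
```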

    Data Visualization and Rapid Analytics: Applying Tableau Desktop to Support Library Decision-Making

    Data visualization offers librarians the ability to better manage, explore, and present information collected by various individuals throughout a library organization. This article discusses The Ohio State University Libraries' experiments with Tableau, a sophisticated data-visualization and rapid-analytics software package. Tableau allows librarians to blend and leverage data collected from a number of disparate sources, including transaction logs, Google Analytics, and e-resource usage reports. The article provides context for incorporating data visualization into the OSU Libraries assessment program and shares examples of visualizations created for two data-analysis projects. The benefits of blending and simultaneously viewing visualizations of data from multiple sources are articulated and explored. The article concludes with a short discussion of potential future projects for visualizing library data using Tableau Desktop.
    Publisher does not allow open access until after publication.