
    Modeling IoT-aware Business Processes - A State of the Art Report

    This research report presents an analysis of the state of the art of modeling Internet of Things (IoT)-aware business processes. The IoT links the physical world to the digital world. Traditionally, information about events and processes in the physical world would be entered into the digital world by humans, who would then use this information to control the physical world. In the IoT paradigm, the physical world is equipped with sensors and actuators that create a direct link with the digital world. Business processes are used to coordinate a complex environment involving multiple actors working toward a common goal, typically in the context of administrative work. In the past few years, we have seen research efforts on the possibilities of modeling IoT-aware business processes, extending process coordination directly to real-world entities. This set of research efforts is relatively small compared to the overall research effort on the IoT, and much of the work is still at an early research stage. To create a basis for a bridge between IoT and BPM, the goal of this report is to collect and analyze the state of the art of existing frameworks for modeling IoT-aware business processes. Comment: 42 pages

    An n-sided polygonal model to calculate the impact of cyber security events

    This paper presents a model to graphically represent the impact of cyber events (e.g., attacks, countermeasures) in a polygonal system of n sides. The approach considers information about all entities composing an information system (e.g., users, IP addresses, communication protocols, physical and logical resources). Every axis is composed of entities that contribute to the execution of the security event. Each entity has an associated weighting factor that measures its contribution, computed using a multi-criteria methodology named CARVER. The graphical representation of cyber events is depicted as straight lines (one dimension) or polygons (two or more dimensions). Geometrical operations are used to compute the size (i.e., length, perimeter, surface area) and thus the impact of each event. As a result, it is possible to identify and compare the magnitude of cyber events. A case study with multiple security events is presented as an illustration of how the model is built and computed. Comment: 16 pages, 5 figures, 2 tables, 11th International Conference on Risks and Security of Internet and Systems (CRiSIS 2016), Roscoff, France, September 2016
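
    As a rough illustration of the geometric idea (the equal-angle axis placement, the example weights, and the event_impact helper below are simplified assumptions, not the paper's exact formulation), the size of an event spanning n weighted entities can be computed with the shoelace formula:

        import math

        def event_impact(weights):
            """Size of a cyber event whose n contributing entities are placed
            on equally spaced radial axes at distances given by their weights."""
            n = len(weights)
            pts = [(w * math.cos(2 * math.pi * i / n),
                    w * math.sin(2 * math.pi * i / n))
                   for i, w in enumerate(weights)]
            if n == 1:
                return weights[0]                     # one dimension: a length
            if n == 2:
                (x1, y1), (x2, y2) = pts              # degenerate polygon: a segment
                return math.hypot(x2 - x1, y2 - y1)
            area = 0.0                                # shoelace formula for the area
            for i in range(n):
                x1, y1 = pts[i]
                x2, y2 = pts[(i + 1) % n]
                area += x1 * y2 - x2 * y1
            return abs(area) / 2.0

        # hypothetical events: a larger size means a larger impact
        print(event_impact([0.8, 0.6, 0.9]))          # attack touching three entities
        print(event_impact([0.4]))                    # countermeasure on one entity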

    Big Data Visualization Tools

    Data visualization is the presentation of data in a pictorial or graphical format, and a data visualization tool is the software that generates this presentation. Data visualization provides users with intuitive means to interactively explore and analyze data, enabling them to effectively identify interesting patterns, infer correlations and causalities, and support sense-making activities. Comment: This article appears in Encyclopedia of Big Data Technologies, Springer, 201

    Building Data-Driven Pathways From Routinely Collected Hospital Data: A Case Study on Prostate Cancer

    Background: Routinely collected data in hospitals is complex, typically heterogeneous, and scattered across multiple Hospital Information Systems (HIS). This big data, created as a byproduct of health care activities, has the potential to provide a better understanding of diseases, unearth hidden patterns, and improve services and reduce costs. The extent and uses of such data rely on its quality, which is not consistently checked nor fully understood. Nevertheless, using routine data to construct data-driven clinical pathways, describing processes and trends, is a key topic receiving increasing attention in the literature. Traditional algorithms do not cope well with unstructured processes or data, and do not produce clinically meaningful visualizations. Supporting systems that provide additional information, context, and quality assurance inspection are needed.
    Objective: The objective of the study is to explore how routine hospital data can be used to develop data-driven pathways that describe the journeys that patients take through care, and their potential uses in biomedical research; it proposes a framework for the construction, quality assessment, and visualization of patient pathways for clinical studies and decision support, using a case study on prostate cancer.
    Methods: Data pertaining to prostate cancer patients were extracted from eight different HIS at a large UK hospital, validated, and complemented with information from the local cancer registry. Data-driven pathways were built for each of the 1904 patients, and an expert knowledge base, containing rules on the prostate cancer biomarker, was used to assess the completeness and utility of the pathways for a specific clinical study. Software components were built to provide meaningful visualizations of the constructed pathways.
    Results: The proposed framework and pathway formalism enable the summarization, visualization, and querying of complex patient-centric clinical information, as well as the computation of quality indicators and dimensions. A novel graphical representation of the pathways allows the synthesis of such information.
    Conclusions: Clinical pathways built from routinely collected hospital data can unearth information about patients and diseases that may otherwise be unavailable or overlooked in hospitals. Data-driven clinical pathways allow heterogeneous data (i.e., semistructured and unstructured data) to be collated over a unified data model and data quality dimensions to be assessed. This work has enabled further research on prostate cancer and its biomarkers, and on the development and application of methods to mine, compare, analyze, and visualize pathways constructed from routine data. This is an important development for the reuse of big data in hospitals.
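
    As a minimal sketch of the pathway idea (the event types, the PSA value, and the biomarker_before_treatment rule are illustrative assumptions, not the study's actual knowledge base), a pathway can be represented as a patient's time-ordered events drawn from several HIS, with knowledge-base rules applied as simple checks over that sequence:

        from dataclasses import dataclass
        from datetime import date

        @dataclass
        class Event:
            when: date
            source: str      # which HIS the record came from
            kind: str        # e.g. "biomarker", "diagnosis", "treatment"
            value: str

        def build_pathway(events):
            """A pathway is simply the patient's events in chronological order."""
            return sorted(events, key=lambda e: e.when)

        def biomarker_before_treatment(pathway):
            """Hypothetical knowledge-base rule: the pathway is complete enough
            for the study only if a biomarker reading precedes the first treatment."""
            for e in pathway:
                if e.kind == "biomarker":
                    return True
                if e.kind == "treatment":
                    return False
            return False

        pathway = build_pathway([
            Event(date(2015, 3, 1), "HIS-3", "treatment", "radiotherapy"),
            Event(date(2015, 1, 10), "HIS-1", "biomarker", "PSA 8.2 ng/mL"),
        ])
        print(biomarker_before_treatment(pathway))    # True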

    The Unfulfilled Potential of Data-Driven Decision Making in Agile Software Development

    With the general trend towards data-driven decision making (DDDM), organizations are looking for ways to use DDDM to improve their decisions. However, few studies have looked into the practitioners' view of DDDM, in particular in agile organizations. In this paper we investigated the experiences of using DDDM and how data can improve decision making. A questionnaire was emailed to 124 industry practitioners at agile software development companies, of whom 84 answered. The results show that few practitioners reported widespread use of DDDM in their current decision-making practices. The practitioners were more positive about its future use for higher-level and more general decision making, fairly positive about its use for requirements elicitation and prioritization decisions, and less positive about its future use at the team level. The practitioners do see a lot of potential for DDDM in an agile context; however, that potential is currently unfulfilled.

    Graph Signal Processing: Overview, Challenges and Applications

    Research in Graph Signal Processing (GSP) aims to develop tools for processing data defined on irregular graph domains. In this paper we first provide an overview of core ideas in GSP and their connection to conventional digital signal processing. We then summarize recent developments in basic GSP tools, including methods for sampling, filtering, and graph learning. Next, we review progress in several application areas using GSP, including processing and analysis of sensor network data, biological data, and applications to image processing and machine learning. We finish by providing a brief historical perspective to highlight how concepts recently developed in GSP build on top of prior research in other areas. Comment: To appear, Proceedings of the IEEE
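
    As a minimal example of one of the basic tools mentioned above (the toy path graph, the test signal, and the 1/(1+lambda) spectral response are arbitrary choices for illustration), a low-pass graph filter can be implemented through the eigendecomposition of the graph Laplacian:

        import numpy as np

        # toy undirected path graph on four nodes
        A = np.array([[0, 1, 0, 0],
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)
        L = np.diag(A.sum(axis=1)) - A        # combinatorial graph Laplacian

        lam, U = np.linalg.eigh(L)            # graph frequencies and Fourier basis

        x = np.array([1.0, -1.0, 1.0, -1.0])  # a highly oscillatory graph signal
        x_hat = U.T @ x                       # graph Fourier transform

        h = 1.0 / (1.0 + lam)                 # simple low-pass spectral response
        x_smooth = U @ (h * x_hat)            # filter, then inverse transform
        print(x_smooth)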

    Visual analytics for supply network management: system design and evaluation

    We propose a visual analytic system to augment and enhance the decision-making processes of supply chain managers. Several design requirements drive the development of our integrated architecture and lead to three primary capabilities of our system prototype. First, a visual analytic system must integrate various relevant views and perspectives that highlight different structural aspects of a supply network. Second, the system must deliver required information on demand and update the visual representation via user-initiated interactions. Third, the system must provide both descriptive and predictive analytic functions for managers to gain contingency intelligence. Based on these capabilities, we implement an interactive web-based visual analytic system. Our system enables managers to interactively apply visual encodings based on different node and edge attributes to facilitate mental map matching between abstract attributes and visual elements. Grounded in cognitive fit theory, we demonstrate that an interactive visual system that dynamically adjusts visual representations to the decision environment can significantly enhance decision-making processes in a supply network setting. We conduct multi-stage evaluation sessions with prototypical users that collectively confirm the value of our system. Our results indicate a positive reaction to our system. We conclude with implications and future research opportunities. The authors would like to thank the participants of the 2015 Businessvis Workshop at IEEE VIS, Prof. Benoit Montreuil, and Dr. Driss Hakimi for their valuable feedback on an earlier version of the software; Prof. Manpreet Hora for assisting with, and Georgia Tech graduate students for participating in, the evaluation sessions; and the two anonymous reviewers for their detailed comments and suggestions. The study was in part supported by the Tennenbaum Institute at Georgia Tech, Award # K9305. Accepted manuscript
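
    The attribute-to-encoding step can be sketched in a few lines (the supplier data, the risk attribute, and the size range are hypothetical; the actual system is an interactive web application):

        def encode_node_sizes(nodes, attr, size_range=(4, 20)):
            """Map a user-selected numeric node attribute onto marker sizes so
            the drawing reflects that structural aspect of the supply network."""
            values = [n[attr] for n in nodes]
            lo, hi = min(values), max(values)
            span = (hi - lo) or 1.0
            s_lo, s_hi = size_range
            return {n["id"]: s_lo + (n[attr] - lo) / span * (s_hi - s_lo)
                    for n in nodes}

        suppliers = [{"id": "S1", "risk": 0.2},
                     {"id": "S2", "risk": 0.9},
                     {"id": "S3", "risk": 0.5}]
        print(encode_node_sizes(suppliers, "risk"))   # S2 is drawn largest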

    MITK-ModelFit: A generic open-source framework for model fits and their exploration in medical imaging -- design, implementation and application on the example of DCE-MRI

    Many medical imaging techniques utilize fitting approaches for quantitative parameter estimation and analysis. Common examples are pharmacokinetic modeling in DCE MRI/CT, ADC calculation and IVIM modeling in diffusion-weighted MRI, and Z-spectra analysis in chemical exchange saturation transfer MRI. Most available software tools are limited to a special purpose and do not allow for custom developments and extensions. Furthermore, they are mostly designed as stand-alone solutions using external frameworks and thus cannot be easily incorporated natively into the analysis workflow. We present a framework for medical image fitting tasks that is included in MITK, following a rigorous open-source, well-integrated, and operating-system-independent policy. Software-engineering-wise, the local models, the fitting infrastructure, and the results representation are abstracted and can thus be easily adapted to any model fitting task on image data, independent of image modality or model. Several ready-to-use libraries for model fitting and use cases, including fit evaluation and visualization, were implemented. Their embedding into MITK allows for easy data loading and pre- and post-processing, and thus a natural inclusion of model fitting into an overarching workflow. As an example, we present a comprehensive set of plug-ins for the analysis of DCE MRI data, which we validated on existing and novel digital phantoms, yielding competitive deviations between fit and ground truth. Providing a very flexible environment, our software mainly addresses developers of medical imaging software that includes model fitting algorithms and tools. Additionally, the framework is of high interest to users in the domain of perfusion MRI, as it offers feature-rich, freely available, validated tools to perform pharmacokinetic analysis on DCE MRI data, with both interactive and automated batch processing workflows. Comment: 31 pages, 11 figures. URL: http://mitk.org/wiki/MITK-ModelFit
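
    MITK-ModelFit itself is written in C++; the Python sketch below only mirrors the stated design idea of keeping the local model separate from the generic fitting infrastructure (the mono-exponential model and the fit_voxel helper are assumptions for illustration, not the framework's API):

        import numpy as np
        from scipy.optimize import curve_fit

        def exp_decay(t, a, k):
            """Stand-in local model; any model with signature f(t, *params) works."""
            return a * np.exp(-k * t)

        def fit_voxel(model, t, signal, p0):
            """Generic fitting step, independent of the particular model."""
            params, _ = curve_fit(model, t, signal, p0=p0)
            return params

        t = np.linspace(0, 5, 50)
        signal = exp_decay(t, 2.0, 0.7) + 0.02 * np.random.randn(t.size)
        print(fit_voxel(exp_decay, t, signal, p0=(1.0, 1.0)))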

    Anisotropic Radial Layout for Visualizing Centrality and Structure in Graphs

    This paper presents a novel method for the layout of undirected graphs, where nodes (vertices) are constrained to lie on a set of nested, simple, closed curves. Such a layout is useful for simultaneously displaying structural centrality and vertex distance information for graphs in many domains, including social networks. Closed curves are a more general constraint than the previously proposed circles and afford our method more flexibility to preserve vertex relationships compared to existing radial layout methods. The proposed approach modifies the multidimensional scaling (MDS) stress to include the estimation of a vertex depth or centrality field, as well as a term that penalizes discord between the structural centrality of vertices and their alignment with this carefully estimated field. We also propose a visualization strategy for the proposed layout and demonstrate its effectiveness using three social network datasets. Comment: Appears in the Proceedings of the 25th International Symposium on Graph Drawing and Network Visualization (GD 2017)
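
    Schematically (the paper's exact formulation is not given in the abstract; the symbols and the trade-off weight below are assumptions), the modified objective combines a standard MDS stress with a centrality-alignment penalty over an estimated depth field:

        E(X, \phi) = \sum_{i<j} w_{ij} \, (\lVert x_i - x_j \rVert - d_{ij})^2
                     + \lambda \sum_i (\phi(x_i) - c_i)^2

    Here d_ij denotes graph-theoretic distances, c_i the structural centrality of vertex i, \phi the smooth depth field whose level sets yield the nested closed curves, and \lambda balances distance preservation against centrality alignment.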