
    Bundled Crossings Revisited

    An effective way to reduce clutter in a graph drawing that has (many) crossings is to group edges that travel in parallel into bundles. Each edge can participate in many such bundles. Any crossing in this bundled graph occurs between two bundles, i.e., as a bundled crossing. We consider the problem of bundled crossing minimization: A graph is given and the goal is to find a bundled drawing with at most k bundled crossings. We show that the problem is NP-hard when we require a simple drawing. Our main result is an FPT algorithm (in k) when we require a simple circular layout. These results make use of the connection between bundled crossings and graph genus. Comment: Appears in the Proceedings of the 27th International Symposium on Graph Drawing and Network Visualization (GD 2019).

    Bundled Crossings Revisited

    An effective way to reduce clutter in a graph drawing that has (many) crossings is to group edges that travel in parallel into bundles. Each edge can participate in many such bundles. Any crossing in this bundled graph occurs between two bundles, i.e., as a bundled crossing. We consider the problem of bundled crossing minimization: A graph is given and the goal is to find a bundled drawing with at most k bundled crossings. We show that the problem is NP-hard when we require a simple drawing. Our main result is an FPT algorithm (in k) for simple circular layouts where vertices must be placed on a circle and edges must be drawn inside the circle. These results make use of the connection between bundled crossings and graph genus. We also consider bundling crossings in a given drawing, in particular for storyline visualizations.
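    The FPT result concerns simple circular layouts, where vertices lie on a circle and every edge is drawn as a chord inside it. As a rough illustration of that combinatorial setting only (not the paper's algorithm, and counting plain rather than bundled crossings), the following Python sketch counts pairwise chord crossings: two chords cross exactly when one endpoint of the second chord lies between the endpoints of the first along the circle and the other does not.

```python
# Minimal sketch: counting pairwise chord crossings in a circular layout.
# Vertices are identified with their clockwise positions 0..n-1 on the circle;
# this counts plain crossings, not the bundled crossings studied in the paper.

from itertools import combinations

def crossings_in_circular_layout(edges):
    """Count crossing pairs among chords of a circle (edges = (u, v) position pairs)."""
    def between(x, lo, hi):
        # True if position x lies strictly between lo and hi going clockwise.
        return (lo < x < hi) if lo < hi else (x > lo or x < hi)

    count = 0
    for (a, b), (c, d) in combinations(edges, 2):
        if {a, b} & {c, d}:
            continue  # chords sharing an endpoint cannot cross
        # Chords cross iff exactly one endpoint of (c, d) separates a from b.
        if between(c, a, b) != between(d, a, b):
            count += 1
    return count

if __name__ == "__main__":
    # K4 drawn on a circle: only the two "diagonal" chords cross.
    k4 = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2), (1, 3)]
    print(crossings_in_circular_layout(k4))  # -> 1
```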

    An Information-Theoretic Framework for Evaluating Edge Bundling Visualization

    Edge bundling is a promising graph visualization approach for simplifying the visual result of a graph drawing. Many edge bundling methods have been developed that generate diverse graph layouts. However, it is difficult to justify one edge bundling method and its resulting layout over others, because a clear theoretical evaluation framework is absent from the literature. In this paper, we propose an information-theoretic framework to evaluate the visual results of edge bundling techniques. We first illustrate the advantage of edge bundling visualizations for large graphs and pinpoint the ambiguity that arises in the resulting drawings. Second, we define and quantify the amount of information that an edge bundling visualization delivers about the underlying network, using information theory. Third, we propose a new algorithm that evaluates edge bundling layouts using the mutual information between a raw network dataset and its edge bundling visualization. Comparison examples between different edge bundling techniques, based on the proposed framework, are presented.
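    The central quantity in this framework is mutual information. As a hedged illustration of that quantity only (the paper's estimator over networks and their bundled drawings is not reproduced here), the Python sketch below evaluates the textbook definition I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) ) from a joint probability table.

```python
# Minimal sketch: mutual information from a joint probability table.
# Only the textbook definition is shown; estimating the joint distribution
# over a network and its bundled drawing is the paper's contribution and is
# not reproduced here.

import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits, given a 2-D array of joint probabilities p(x, y)."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()                # normalize to a probability table
    px = joint.sum(axis=1, keepdims=True)      # marginal p(x), shape (nx, 1)
    py = joint.sum(axis=0, keepdims=True)      # marginal p(y), shape (1, ny)
    nz = joint > 0                             # zero cells contribute nothing
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

if __name__ == "__main__":
    # Perfectly correlated binary variables: I(X;Y) = H(X) = 1 bit.
    print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # -> 1.0
```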

    DEPLOYING, IMPROVING AND EVALUATING EDGE BUNDLING METHODS FOR VISUALIZING LARGE GRAPHS

    A tremendous increase in the scale of graphs has been witnessed in a wide range of fields, which demands efficient and effective visualization techniques to assist users in gaining a better understanding of large graphs. Conventional node-link diagrams are often used to visualize graphs, but excessive edge crossings can easily incur severe visual clutter in the node-link diagram of a large graph. Edge bundling can effectively remedy visual clutter and reveal high-level graph structures. Although significant efforts have been devoted to developing edge bundling, three challenging problems remain. First, edge bundling techniques are often computationally expensive and are not easy to deploy for web-based applications. The state-of-the-art edge bundling methods often require special system support, such as high-end GPU acceleration for large graphs, which makes these methods less portable, especially for ubiquitous mobile devices. Second, the quantitative quality of edge bundling results is barely assessed in the literature. Currently, the comparison of edge bundling methods mainly focuses on computational performance and perceptual results. Third, although the family of edge bundling techniques produces a rich set of bundling layouts, there is a lack of a generic method to generate different styles of edge bundling. In this research, I aim to address these problems and have made the following contributions. First, I provide an efficient framework to deploy edge bundling on web-based platforms by exploiting standard graphics hardware functions and libraries. My framework can generate high-quality edge bundling results on web-based platforms and achieves a speedup of 50X over the previous state-of-the-art edge bundling method on a graph with half a million edges. Second, I propose a new moving least squares based approach that lowers the algorithmic complexity of edge bundling. In addition, my approach can generate better bundling results than other methods according to a quality metric. Third, I provide an information-theoretic metric to evaluate edge bundling methods; with this metric, domain users can choose appropriate edge bundling methods with proper parameters for their applications. Last but not least, I present a deep learning framework for edge bundling visualizations. Through a training process that learns the results of a specific edge bundling method, my deep learning framework can infer the final layout of that method, and it is generic enough to generate the corresponding results of different edge bundling methods. Adviser: Hongfeng Y
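    Neither the moving least squares method nor the deep learning framework described above is specified in this abstract, so the sketch below is only a generic illustration of how geometric edge bundling tends to work: each edge is subdivided into control points, and control points of nearby, roughly parallel ("compatible") edges are iteratively pulled toward each other. The compatibility test, step size, and iteration count are arbitrary illustrative choices, not the author's algorithm.

```python
# Simplified, generic edge bundling sketch (not the moving least squares or
# deep learning methods from this dissertation): edges are polylines with
# control points, and control points of "compatible" (similar-direction,
# nearby) edges are pulled toward each other for a few iterations.

import numpy as np

def subdivide(p, q, segments=8):
    """Control points of the straight edge p -> q, shape (segments + 1, 2)."""
    t = np.linspace(0.0, 1.0, segments + 1)[:, None]
    return (1 - t) * np.asarray(p, dtype=float) + t * np.asarray(q, dtype=float)

def compatible(e1, e2, min_cos=0.7, max_dist=1.0):
    """Crude compatibility test: similar direction and nearby midpoints."""
    d1, d2 = e1[-1] - e1[0], e2[-1] - e2[0]
    cos = abs(d1 @ d2) / (np.linalg.norm(d1) * np.linalg.norm(d2) + 1e-12)
    return cos >= min_cos and np.linalg.norm(e1.mean(0) - e2.mean(0)) <= max_dist

def bundle(edges, iterations=30, step=0.05):
    """Iteratively attract control points of compatible edges; endpoints stay fixed."""
    polylines = [subdivide(p, q) for p, q in edges]
    for _ in range(iterations):
        for i, a in enumerate(polylines):
            pull = np.zeros_like(a)
            for j, b in enumerate(polylines):
                if i != j and compatible(a, b):
                    pull += b - a                # attract toward matching control points
            a[1:-1] += step * pull[1:-1]         # interior points move, endpoints do not
    return polylines

if __name__ == "__main__":
    # Two nearly parallel edges get drawn together in the middle.
    bundled = bundle([((0, 0), (4, 0)), ((0, 0.4), (4, 0.4))])
    print(bundled[0][4], bundled[1][4])          # midpoints pulled toward each other
```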

    Visualizing Spatio-Temporal data

    The amount of spatio-temporal data produced every day has skyrocketed in recent years due to commercial GPS systems and smart devices. Together with this, the need for tools and techniques to analyze this kind of data has also increased. A major task of spatio-temporal data analysis is to discover relationships and patterns among spatially and temporally scattered events. However, most existing visualization techniques implement a top-down approach, i.e., they require prior knowledge of existing patterns. In this dissertation, I present my novel visualization technique called Storygraph, which supports bottom-up discovery of patterns. Since Storygraph presents an integrated view, events can be analyzed without losing either the temporal or the spatial context. In addition, Storygraph can handle spatio-temporal uncertainty, making it ideal for data extracted from text. In the subsequent chapters, I demonstrate the versatility and effectiveness of Storygraph along with case studies from my published works. Finally, I also discuss edge bundling in Storygraph to enhance its aesthetics and improve its readability.

    Engineering Approaches for Improving Cortical Interfacing and Algorithms for the Evaluation of Treatment Resistant Epilepsy

    Epilepsy is a group of disorders that cause seizures in approximately 2.2 million people in the United States. Over 30% of these patients have epilepsies that do not respond to treatment with anti-epileptic drugs. For this population, focal resection surgery could offer long-term seizure freedom. Surgery candidates undergo a myriad of tests and monitoring to determine where and when seizures occur. The “gold standard” method for focus identification involves the placement of electrocorticography (ECoG) grids in the sub-dural space, followed by continual monitoring and visual inspection of the patient’s cortical activity. This process, however, is highly subjective and uses dated technology. Multiple studies were performed to investigate how the evaluation process could benefit from an algorithmic adjunct using current ECoG technology, and how the use of new microECoG technology could further improve the process. Computational algorithms can quickly and objectively find signal characteristics that may not be detectable with visual inspection, but many assume the data are stationary and/or linear, which biological data are not. An empirical mode decomposition (EMD) based algorithm was developed to detect potential seizures and tested on data collected from eight patients undergoing monitoring for focal resection surgery. EMD does not require linearity or stationarity and is data driven. The results suggest that a biological data driven algorithm could serve as a useful tool to objectively identify changes in cortical activity associated with seizures. Next, the use of microECoG technology was investigated. Though both ECoG and microECoG grids are composed of electrodes resting on the surface of the cortex, changing the diameter of the electrodes creates non-trivial changes in the physics of the electrode-tissue interface that need to be accounted for. Experimenting with different recording configurations showed that proper grounding, referencing, and amplification are critical to obtain high quality neural signals from microECoG grids. Finally, the relationship between data collected from the cortical surface with micro and macro electrodes was studied. Simultaneous recordings of the two electrode types showed differences in power spectra that suggest the inclusion of activity, possibly from deep structures, by macroelectrodes that is not accessible by microelectrodes. Doctoral Dissertation, Bioengineering, 201
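    As a rough illustration of the empirical mode decomposition idea only (not the detection algorithm developed in this work), the sketch below assumes the PyEMD package (published on PyPI as EMD-signal), decomposes a signal into intrinsic mode functions, and flags windows whose fast-IMF energy exceeds a simple threshold; the window length and threshold are arbitrary placeholders.

```python
# Illustrative sketch only: EMD-based change detection on a 1-D signal.
# Assumes the PyEMD package (pip install EMD-signal); the windowing and the
# threshold below are arbitrary and are NOT the algorithm from this work.

import numpy as np
from PyEMD import EMD

def flag_windows(signal, fs, win_sec=2.0, threshold=3.0):
    """Return start times (s) of windows whose fast-IMF energy is unusually high."""
    imfs = EMD().emd(np.asarray(signal, dtype=float))
    fast = imfs[:2].sum(axis=0)                  # first IMFs carry the fastest oscillations
    win = int(win_sec * fs)
    energies = np.array([np.sum(fast[i:i + win] ** 2)
                         for i in range(0, len(fast) - win + 1, win)])
    baseline = np.median(energies) + 1e-12       # robust reference level
    return [k * win / fs for k, e in enumerate(energies) if e > threshold * baseline]

if __name__ == "__main__":
    fs = 256
    t = np.arange(0, 20, 1 / fs)
    x = np.sin(2 * np.pi * 1.0 * t) + 0.05 * np.random.randn(t.size)
    x[10 * fs:12 * fs] += np.sin(2 * np.pi * 20.0 * t[10 * fs:12 * fs])  # burst of fast activity
    print(flag_windows(x, fs))                   # intended to flag windows around t = 10-12 s
```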

    The Permit Power Revisited: The Theory and Practice of Regulatory Permits in the Administrative State

    Two decades ago, Professor Richard Epstein fired a shot at the administrative state that has gone largely unanswered in legal scholarship. His target was the permit power, under which legislatures prohibit a specified activity by statute and delegate to administrative agencies the discretionary power to authorize the activity under terms the agency mandates in a regulatory permit. Accurately describing the permit power as an enormous power in the state, Epstein bemoaned that it had received scant attention in the academic literature. He sought to fill that gap. Centered on the premise that the permit power represents a complete inversion of the proper distribution of power within a legal system, Epstein launched a scathing critique of regulatory permitting in operation, condemning it as a racket for administrative abuses and excesses. Epstein's assessment of the permit power was and remains accurate in three respects: it is vast in scope, it is ripe for administrative abuse, and it has been largely ignored in legal scholarship. The problem is that, beyond what he got right about the permit power, most of Epstein's critique was based on an incomplete caricature of permitting in theory and practice. This Article is the first to return comprehensively to the permit power since Epstein's critique, offering a deep account of the theory and practice of regulatory permits in the administrative state. This Article opens by defining the various types of regulatory permits and describing the scope of permitting in the regulatory state. From there it compares different permit design approaches and explores the advantages of general permits, including their ability to mitigate many of the concerns Epstein advanced. This Article then applies a theoretical model to environmental degradation problems and concludes that if certain conditions are met, general permits can effectively respond to many of the complex policy problems of the future. Finally, this Article adds to the scholarship initiated by Epstein by proposing a set of default rules and exceptions for permit design and suggesting how they apply to complex policy problems.

    Problems and Solutions Regarding Indigenous Peoples Split by International Borders


    AccuSyn: Using Simulated Annealing to Declutter Genome Visualizations

    We apply Simulated Annealing, a well-known metaheuristic for obtaining near-optimal solutions to optimization problems, to discover conserved synteny relations (similar features) in genomes. The analysis of synteny gives biologists insights into the evolutionary history of species and the functional relationships between genes. However, as even simple organisms have huge numbers of genomic features, syntenic plots initially present an enormous clutter of connections, making the structure difficult to understand. We address this problem by using Simulated Annealing to minimize link crossings. Our interactive web-based synteny browser, AccuSyn, visualizes syntenic relations with circular plots of chromosomes and draws links between similar blocks of genes. It also brings together a huge amount of genomic data by integrating an adjacent view and additional tracks to visualize the details of the blocks and accompanying genomic data, respectively. Our work shows multiple ways to manually declutter a synteny plot and then thoroughly explains how we integrated Simulated Annealing, along with human intervention in a human-in-the-loop approach, to achieve an accurate representation of conserved synteny relations for any genome. The goal of AccuSyn was to make a fairly complete tool combining ideas from four major areas: genetics, information visualization, heuristic search, and human-in-the-loop interaction. Our results contribute to a better understanding of synteny plots and show the potential that decluttering algorithms have for syntenic analysis, adding more clues for evolutionary development. At this writing, AccuSyn is actively used in research at the University of Saskatchewan and has already produced a visualization of the recently sequenced wheat genome.
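    AccuSyn's actual objective function and move set are not given in the abstract, so the following Python sketch is only a generic illustration of the decluttering idea: simulated annealing that reorders blocks around a circle, accepting worse orders with a temperature-dependent probability, so that the number of link (chord) crossings decreases over time. The cooling schedule and swap move are arbitrary illustrative choices.

```python
# Generic simulated annealing sketch for reducing chord crossings in a circular
# plot; the objective, move set, and cooling schedule are illustrative choices
# and not AccuSyn's actual implementation.

import math
import random
from itertools import combinations

def crossings(order, links):
    """Count crossing pairs of links, given the circular order of blocks."""
    pos = {block: i for i, block in enumerate(order)}
    def separates(x, a, b):
        lo, hi = sorted((pos[a], pos[b]))
        return lo < pos[x] < hi
    total = 0
    for (a, b), (c, d) in combinations(links, 2):
        if {a, b} & {c, d}:
            continue                              # links sharing a block cannot cross
        if separates(c, a, b) != separates(d, a, b):
            total += 1
    return total

def anneal(blocks, links, steps=20000, t0=2.0, cooling=0.9995, seed=0):
    """Swap two blocks per step; accept worse orders with probability exp(-delta / T)."""
    rng = random.Random(seed)
    order = list(blocks)
    cost = crossings(order, links)
    temperature = t0
    for _ in range(steps):
        i, j = rng.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]   # propose a swap
        delta = crossings(order, links) - cost
        if delta <= 0 or rng.random() < math.exp(-delta / temperature):
            cost += delta                         # accept the move
        else:
            order[i], order[j] = order[j], order[i]  # reject: undo the swap
        temperature *= cooling
    return order, cost

if __name__ == "__main__":
    blocks = list("ABCDEF")
    links = [("A", "D"), ("B", "E"), ("C", "F"), ("A", "B")]
    print(anneal(blocks, links))                  # typically reaches zero crossings
```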