
    Gunrock: GPU Graph Analytics

    For large-scale graph analytics on the GPU, the irregularity of data access and control flow and the complexity of programming GPUs have presented two significant challenges to developing a programmable high-performance graph library. "Gunrock", our graph-processing system designed specifically for the GPU, uses a high-level, bulk-synchronous, data-centric abstraction focused on operations on a vertex or edge frontier. Gunrock achieves a balance between performance and expressiveness by coupling high-performance GPU computing primitives and optimization strategies with a high-level programming model that allows programmers to quickly develop new graph primitives with small code size and minimal GPU programming knowledge. We characterize the performance of various optimization strategies and evaluate Gunrock's overall performance on different GPU architectures across a wide range of graph primitives, from traversal-based and ranking algorithms to triangle counting and bipartite-graph-based algorithms. The results show that on a single GPU, Gunrock achieves on average at least an order-of-magnitude speedup over Boost and PowerGraph, performance comparable to the fastest GPU hardwired primitives and to CPU shared-memory graph libraries such as Ligra and Galois, and better performance than any other high-level GPU graph library. Comment: 52 pages; invited paper for ACM Transactions on Parallel Computing (TOPC); an extended version of the PPoPP'16 paper "Gunrock: A High-Performance Graph Processing Library on the GPU".
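    The frontier-centric model the abstract describes can be pictured with a short CPU-side sketch: each bulk-synchronous step expands the current vertex frontier ("advance") and keeps only newly visited vertices ("filter"). The Python below is an illustrative analogue, not Gunrock's actual C++/CUDA API; the names bfs_frontier, advance, and filter are invented for the example.

        from collections import defaultdict

        def bfs_frontier(adj, source):
            """Breadth-first search as repeated advance/filter steps on a frontier."""
            depth = {source: 0}
            frontier = [source]                   # current vertex frontier
            while frontier:
                next_frontier = []
                for u in frontier:                # advance: expand every frontier vertex
                    for v in adj[u]:
                        if v not in depth:        # filter: keep unvisited vertices only
                            depth[v] = depth[u] + 1
                            next_frontier.append(v)
                frontier = next_frontier          # bulk-synchronous step boundary
            return depth

        if __name__ == "__main__":
            adj = defaultdict(list)
            for u, v in [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]:
                adj[u].append(v)
                adj[v].append(u)
            print(bfs_frontier(adj, 0))           # {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}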

    An Interactive Deformable Model Segmentation Algorithm Driven by Morphological Dilations and Erosions Constrained by an Exclusion Band

    This study introduces an interactive image segmentation algorithm for the extraction of ill-defined edges (faint, blurred, or partially broken) often observed in small-scale imaging. It is based on a simplified deformable elastic model evolution paradigm. Segmentation is achieved as a two-step region growing, shrinking, and merging simulation constrained by an exclusion band built around the edges of the regions of interest, defined from a variation image. The simulation starts from a set of unlabeled markers and their respective elastic models. During the first step, model evolution occurs entirely outside the exclusion band, driven by alternate action-reaction movements. Forward and backward movements are performed by constrained binary morphological dilations and erosions; the constraints control how far models can move through narrow gaps. At the end of the first step, the models remaining after merging operations receive unique and exclusive labels. In the second and final step, model expansion occurs entirely inside the exclusion band, now driven only by unconstrained binary morphological dilations. A point where two labeled models come into contact defines an edge point. The simulation continues until the concurrent expansion of all models comes to a complete stop, at which point the edges of the regions of interest have been extracted. Interactivity makes it possible to correct small imperfections in edge positioning by changing a parameter controlling action-reaction or by changing the markers' size, position, and shape. Loosely inspired by traditional approaches such as PDE level-set curve evolution and immersion simulation, the algorithm presents a solution to the problem of "synchronizing the concurrent evolution of a large number of models" and an "automatic stopping criterion" for the front propagation. An integer-arithmetic implementation ensures linear execution time. Results from real applications show that even ill-defined edges can be located with the desired accuracy, thanks to the algorithm's features and to the interactivity exerted by the user during the segmentation procedure.
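    The two-step constrained growth can be sketched with SciPy's binary morphology, where the exclusion band enters as the mask of a masked dilation. This is a minimal toy, assuming a hand-made band and two markers; it omits the paper's action-reaction movements, erosions, and merging logic.

        import numpy as np
        from scipy.ndimage import binary_dilation

        shape = (7, 7)
        band = np.zeros(shape, bool)
        band[:, 3] = True                       # toy exclusion band: the middle column

        labels = np.zeros(shape, int)
        labels[3, 0] = 1                        # marker of the left region
        labels[3, 6] = 2                        # marker of the right region

        # Step 1: grow each model to convergence, but only outside the band.
        for lab in (1, 2):
            grown = binary_dilation(labels == lab, mask=~band, iterations=-1)
            labels[grown & (labels == 0)] = lab

        # Step 2: unconstrained dilations until the fronts meet inside the band;
        # points where two labels come into contact are the extracted edge points.
        while True:
            moved = False
            for lab in (1, 2):
                grown = binary_dilation(labels == lab) & (labels == 0)
                if grown.any():
                    labels[grown] = lab
                    moved = True
            if not moved:                       # concurrent expansion has stopped
                break
        print(labels)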

    Advanced reduction techniques for model checking


    Semiautomatic contour detection of breast lesions in ultrasonic images with morphological operators and average radial derivative function

    This work presents a computerized lesion segmentation technique for breast ultrasound images. Established techniques were applied, including morphological filtering, the watershed transformation, and the average radial derivative function. To evaluate the performance of the proposed method, two protocols were established: in the first, the resulting segmentation contours were compared with those of 24 gold-standard simulated ultrasound-like images, and in the second, with 36 breast ultrasound images manually delineated by two senior radiologists. Two evaluation parameters were used: the percentage of coincidence (CP) and the proportional distance (PD). The former indicates the similarity between contours, while the latter expresses their dissimilarity. The accuracy of the proposed method was evaluated by considering images with CP > 80% and PD < 10% as adequately delineated; it was higher than 80% for real images and higher than 88% for simulated images.
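    The two evaluation measures can be sketched as follows; the abstract does not give their exact formulas, so CP is read here as the percentage of overlapping area between the segmented and reference regions, and PD as the mean contour-to-contour distance normalized by the equivalent radius of the reference region. These are plausible illustrative readings, not the authors' definitive definitions.

        import numpy as np
        from scipy.ndimage import binary_erosion, distance_transform_edt

        def contour(mask):
            """One-pixel-thick boundary of a binary region."""
            return mask & ~binary_erosion(mask)

        def coincidence_percentage(seg, ref):
            """CP: overlap of the two regions, in percent of their union."""
            return 100.0 * (seg & ref).sum() / (seg | ref).sum()

        def proportional_distance(seg, ref):
            """PD: mean symmetric contour distance over the equivalent radius, in percent."""
            d_to_ref = distance_transform_edt(~contour(ref))
            d_to_seg = distance_transform_edt(~contour(seg))
            dists = np.concatenate([d_to_ref[contour(seg)], d_to_seg[contour(ref)]])
            radius = np.sqrt(ref.sum() / np.pi)          # radius of an equal-area disc
            return 100.0 * dists.mean() / radius

        yy, xx = np.mgrid[:64, :64]
        ref = (xx - 32) ** 2 + (yy - 32) ** 2 < 20 ** 2  # reference delineation
        seg = (xx - 34) ** 2 + (yy - 32) ** 2 < 19 ** 2  # slightly shifted result
        print(coincidence_percentage(seg, ref), proportional_distance(seg, ref))
        # "adequately delineated" would require CP > 80 and PD < 10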

    Indeterministic Handling of Uncertain Decisions in Duplicate Detection

    In current research, duplicate detection is usually treated as a deterministic process in which tuples are either declared to be duplicates or not. Most often, however, it is not completely clear whether two tuples represent the same real-world entity. Deterministic approaches ignore this uncertainty, which in turn can lead to false decisions. In this paper, we present an indeterministic approach for handling uncertain decisions in a duplicate detection process by using a probabilistic target schema: instead of deciding between multiple possible worlds, all of these worlds can be modeled in the resulting data. This approach minimizes the negative impact of false decisions. Furthermore, the duplicate detection process becomes almost fully automatic, and human effort can be reduced to a large extent. Unfortunately, a fully indeterministic approach is by definition too expensive (in time as well as in storage) and hence impractical. For that reason, we additionally introduce several semi-indeterministic methods for heuristically reducing the set of indeterministically handled decisions in a meaningful way.
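    The core idea can be sketched in a few lines: pairs whose similarity falls between two thresholds are not forced into a hard match/non-match decision but are kept as two possible worlds with attached probabilities. The thresholds, the similarity function, and the use of the raw similarity score as a probability are all illustrative assumptions, not the paper's actual model.

        from difflib import SequenceMatcher

        T_MATCH, T_NONMATCH = 0.9, 0.6          # assumed decision thresholds

        def classify(a, b):
            """Return a list of (world, probability) pairs for two tuples."""
            sim = SequenceMatcher(None, a, b).ratio()
            if sim >= T_MATCH:
                return [("duplicate", 1.0)]     # certain enough: deterministic match
            if sim <= T_NONMATCH:
                return [("distinct", 1.0)]      # certain enough: deterministic non-match
            # uncertain zone: keep both possible worlds instead of deciding
            return [("duplicate", sim), ("distinct", 1.0 - sim)]

        print(classify("Jon Smith", "J. Smith"))     # uncertain: two possible worlds
        print(classify("Jon Smith", "Maria Lopez"))  # deterministic non-match

    In this sketch, a semi-indeterministic method would amount to narrowing the gap between T_NONMATCH and T_MATCH so that fewer pairs are carried forward as possible worlds.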

    SODALITE@RT: Orchestrating Applications on Cloud-Edge Infrastructures

    IoT-based applications need to be dynamically orchestrated on cloud-edge infrastructures for reasons such as performance, regulations, or cost. In this context, a crucial problem is facilitating the work of DevOps teams in deploying, monitoring, and managing such applications by providing the necessary tools and platforms. The SODALITE@RT open-source framework aims at addressing this scenario. In this paper, we present the main features of SODALITE@RT: modeling of cloud-edge resources and applications using open standards and infrastructural code, and automated deployment, monitoring, and management of the applications in the target infrastructures based on such models. The capabilities of SODALITE@RT are demonstrated through a relevant case study.
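    Model-driven orchestration of this kind can be loosely illustrated as reconciling a declarative application model against the observed state of cloud and edge targets. The model layout, component names, and reconcile routine below are invented for illustration and are not SODALITE@RT's real API or modeling standard.

        desired_model = {
            "video-ingest": {"target": "edge-node-1", "replicas": 2},
            "analytics":    {"target": "cloud-vm-3", "replicas": 1},
        }
        actual_state = {"video-ingest": 1}       # component -> running replicas

        def reconcile(model, state):
            """Emit the deployment actions needed to match the declared model."""
            for component, spec in model.items():
                missing = spec["replicas"] - state.get(component, 0)
                for _ in range(missing):
                    print(f"deploy {component} on {spec['target']}")

        reconcile(desired_model, actual_state)   # one more ingest replica, one analytics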

    Using Karnaugh Maps in Software Requirements Analysis

    Faulty requirements leading to design deficiencies have been shown to be an avoidable root cause of many product failures. This paper is an effort to push the boundaries of system safety by proposing a novel approach for discovering faulty or missing software requirements by adapting a proven methodology heretofore used in circuit analysis. Karnaugh Mapping is employed in Application-Specific Integrated Circuit (ASIC) design to minimize power consumption, facilitate temperature control, increase functionality, and minimize the number of physical logic gates. Karnaugh Maps (K-Maps) are ideally suited to impose order on the logical requirements that describe the operation of electronic circuits. With the assumption that software requirements are expressible as logical statements, this paper assesses the ability of Karnaugh Mapping to effectively deconstruct and rationalize developmental requirements in the analysis of software, and it seeks to demonstrate that K-Maps can be used not only to minimize the number of requirements but also to detect missing requirements. The analysis conducted in the course of developing this paper indicates that K-Maps can effectively identify faulty requirements in two examples of varying complexity, provided that semantic conventions are established and observed.
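    The underlying move can be shown in a few lines: treat each requirement as a Boolean condition over system inputs, enumerate the full truth table (which a K-Map lays out spatially), and flag any input combination that no requirement covers as a candidate missing requirement. The three-variable example below is invented for illustration and does not come from the paper.

        from itertools import product

        # Hypothetical requirements over (power_on, sensor_ok, manual_override):
        requirements = {
            "R1: run when powered and the sensor is healthy": lambda p, s, m: p and s,
            "R2: run under manual override when powered":     lambda p, s, m: p and m,
            "R3: stay off when unpowered":                    lambda p, s, m: not p,
        }

        for p, s, m in product([False, True], repeat=3):
            covered = [name for name, cond in requirements.items() if cond(p, s, m)]
            if not covered:
                print(f"uncovered combination (missing requirement?): "
                      f"power={p}, sensor={s}, override={m}")
        # Only power=True, sensor=False, override=False is uncovered here:
        # no requirement specifies the system's behavior in that state.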

    Scalable and Reliable Middlebox Deployment

    Middleboxes are pervasive in modern computer networks, providing functionality beyond mere packet forwarding. Load balancers, intrusion detection systems, and network address translators are typical examples. Despite their benefits, middleboxes pose several challenges with respect to scalability and reliability. The goal of this thesis is to devise middlebox deployment solutions that are cost-effective, scalable, and fault tolerant. The thesis makes three main contributions: first, distributed service function chaining, with multiple instances of a middlebox deployed on different physical servers to optimize resource usage; second, Constellation, a geo-distributed middlebox framework that enables a middlebox application to operate with high performance across wide-area networks; and third, a fault-tolerant service function chaining system.
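    A service function chain can be pictured as a composition of per-packet functions, each standing in for a middlebox; the chain forwards a packet through them in order and stops if one drops it. This toy Python sketch is illustrative only: the middlebox behaviors and addresses are invented, and real chains (such as those the thesis distributes across servers) process packets at line rate rather than as dictionaries.

        def firewall(pkt):
            return None if pkt["dst_port"] == 23 else pkt      # drop telnet traffic

        def nat(pkt):
            return dict(pkt, src_ip="203.0.113.7")             # rewrite the source address

        def load_balancer(pkt):
            backends = ["10.0.0.1", "10.0.0.2"]
            return dict(pkt, dst_ip=backends[hash(pkt["src_ip"]) % len(backends)])

        def run_chain(pkt, chain):
            """Pass a packet through each middlebox; None means it was dropped."""
            for middlebox in chain:
                pkt = middlebox(pkt)
                if pkt is None:
                    return None
            return pkt

        packet = {"src_ip": "192.168.1.5", "dst_ip": "198.51.100.9", "dst_port": 80}
        print(run_chain(packet, [firewall, nat, load_balancer]))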

    On normalized compression distance and large malware
