439 research outputs found

    An Object-Oriented Framework for Explicit-State Model Checking

    This paper presents a conceptual architecture for an object-oriented framework to support the development of formal verification tools (i.e., model checkers). The objective of the architecture is to support the reuse of algorithms and to encourage a modular design of tools. The conceptual framework is accompanied by a C++ implementation which provides reusable algorithms for the simulation and verification of explicit-state models, as well as a model representation for simple models based on guard-based process descriptions. The framework has been successfully used to develop a model checker for a subset of PROMELA.
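    A minimal sketch, assuming nothing about the paper's actual API, of how such a framework can decouple a guard-based model representation from a reusable explicit-state search algorithm. All type and function names below are illustrative; the real framework targets PROMELA-style models rather than this toy encoding.

    // Illustrative sketch only: names and interfaces are assumptions, not the
    // framework's actual API. It shows one way to separate a guard-based model
    // representation from a reusable explicit-state search algorithm.
    #include <cstdint>
    #include <functional>
    #include <iostream>
    #include <queue>
    #include <unordered_set>
    #include <vector>

    using State = std::uint64_t;  // encoded global state (hypothetical encoding)

    // Guard-based process description: a transition fires only if its guard holds.
    struct Transition {
        std::function<bool(State)> guard;
        std::function<State(State)> effect;
    };

    struct Model {
        State initial;
        std::vector<Transition> transitions;
    };

    // Reusable algorithm: explicit-state BFS reachability check of a safety
    // property ("bad" states must be unreachable).
    bool violates_safety(const Model& m, const std::function<bool(State)>& bad) {
        std::unordered_set<State> visited{m.initial};
        std::queue<State> frontier;
        frontier.push(m.initial);
        while (!frontier.empty()) {
            State s = frontier.front();
            frontier.pop();
            if (bad(s)) return true;  // counterexample state found
            for (const auto& t : m.transitions) {
                if (!t.guard(s)) continue;
                State next = t.effect(s);
                if (visited.insert(next).second) frontier.push(next);
            }
        }
        return false;  // full state space explored, no bad state reachable
    }

    int main() {
        // Toy model: a counter that increments while below 5; check it never reaches 7.
        Model counter{0, {{[](State s) { return s < 5; },
                           [](State s) { return s + 1; }}}};
        std::cout << (violates_safety(counter, [](State s) { return s == 7; })
                          ? "violation found" : "property holds")
                  << '\n';
    }

    Keeping the search generic over any Model is what would let the same reachability algorithm be reused across different model representations, which is the kind of modularity the abstract describes.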

    Metal-polymer functionally graded materials for removing guided wave reflections at beam end boundaries

    This paper investigates the potential of a metal-polymer functionally graded material (FGM) to remove beam end boundary wave reflections, which produce complicated interference patterns in the response signals used by guided wave damage identification methodologies. The metal-polymer FGM matches the material properties of a metal beam for total wave transmission on one side and is continuously graded to a viscoelastic polymer on the other side. An Aluminium-Polycarbonate (Al-PC) FGM was fabricated and characterised using microscopy, hardness testing and through-transmission ultrasonics to verify the continuous gradient. Measurements of guided waves on an aluminium beam attached to the FGM at one end show a reduction in boundary wave reflections that varies with wave frequency. A damaged aluminium beam with the FGM attached produced promising improvements in a damage identification system.
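    The abstract gives no formulas, but the standard normal-incidence picture clarifies why matching material properties suppresses the reflection: the reflection coefficient R = (Z2 - Z1)/(Z2 + Z1) vanishes when the acoustic impedances Z = density × wave speed are equal. The sketch below uses approximate textbook values for aluminium and polycarbonate, not measurements from the paper.

    // Illustrative only: standard impedance-mismatch picture, not the paper's analysis.
    // Material values are approximate textbook numbers.
    #include <iostream>

    double impedance(double density_kg_m3, double wave_speed_m_s) {
        return density_kg_m3 * wave_speed_m_s;
    }

    double reflection_coefficient(double z1, double z2) {
        return (z2 - z1) / (z2 + z1);
    }

    int main() {
        const double z_al = impedance(2700.0, 6320.0);   // aluminium (approx.)
        const double z_pc = impedance(1200.0, 2270.0);   // polycarbonate (approx.)

        // Abrupt Al -> PC interface: large impedance mismatch, strong reflection.
        std::cout << "Al|PC reflection: " << reflection_coefficient(z_al, z_pc) << '\n';

        // FGM side matched to aluminium: the mismatch (and hence the reflection)
        // vanishes, while the graded transition toward the polymer absorbs the
        // wave gradually instead of reflecting it at the beam end.
        std::cout << "Al|matched FGM reflection: "
                  << reflection_coefficient(z_al, z_al) << '\n';
    }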

    Staging Laparoscopy for Hilar Cholangiocarcinoma: Is it Still Worthwhile?

    This study was designed to evaluate the benefit of staging laparoscopy (SL) in patients with suspected hilar cholangiocarcinoma (HCCA) during the past 10 years. Only 50-60% of patients with HCCA who undergo laparotomy are ultimately amenable to a potentially curative resection. In a previous study, we recommended routine use of SL to prevent unnecessary laparotomies. The accuracy of imaging techniques, however, has significantly improved during the past decade, which is likely to impact the yield and accuracy of SL. From 2000 to 2010, 195 patients with suspected HCCA were analyzed. The yield and accuracy of SL were calculated by dividing the total number of avoided laparotomies by the total number of laparoscopies (yield) or by the total number of patients with unresectable disease (accuracy). Factors associated with better yield and accuracy were assessed. Of 195 patients with HCCA, 175 underwent SL. The yield of SL was 14% and the accuracy was 32%. Operative morbidity of SL was 3%, and operative morbidity of laparotomy for unresectable disease was 33%. No clear factors that influenced the yield of SL were found. The overall yield and accuracy of SL for HCCA in the present series decreased to 14% and 32%, respectively, compared with earlier reports. This finding is likely the result of the improved imaging techniques that evolved during the past decade. The place of SL in the workup of patients with HCCA needs to be reconsidered, and one should decide whether the declining additional value of SL still outweighs its drawbacks.
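    Restated as formulas (a worked restatement of the definitions above; the underlying counts are implied by the reported percentages rather than stated directly):

    \[
      \text{yield} = \frac{\text{avoided laparotomies}}{\text{laparoscopies performed}},
      \qquad
      \text{accuracy} = \frac{\text{avoided laparotomies}}{\text{patients with unresectable disease}}
    \]

    With 175 laparoscopies and a 14% yield, roughly 24-25 laparotomies were avoided.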

    Application profiling and resource management for MapReduce

    The scale of data generated and processed is growing exponentially in the Big Data era, posing a challenge far beyond the capabilities of a single computing system: processing such a vast amount of data on a single machine is impracticable in terms of time and cost. Hence, distributed systems that can harness very large clusters of commodity computers and process data within restrictive time deadlines are imperative. In this thesis, we target two aspects of distributed systems: application profiling and resource management. We study a MapReduce system, a programming paradigm for large-scale distributed computing, in detail and present solutions to three key problems. Firstly, this thesis analyzes the characteristics of jobs running on the MapReduce system and shows that the application scope of MapReduce has been extended beyond its original design goal of large-scale data processing. This observation motivates a Workload Characteristic Oriented Scheduler (WCO), which strives to co-locate tasks of possibly different MapReduce jobs with complementary resource usage characteristics. Secondly, this thesis studies the current job priority mechanism with a focus on resource management. In the MapReduce system, job priority exists only at the scheduling level: high-priority jobs are placed at the front of the scheduling queue and dispatched first, but resources are shared fairly among jobs running on the same worker node without any consideration of their priorities. To resolve this, this thesis presents a non-intrusive slot layering solution, which dynamically allocates resources between running jobs based on their priority, reducing the execution time of high-priority jobs while improving overall throughput. Last, motivated by the underutilization of resources at individual worker nodes, this thesis proposes Local Resource Shaper (LRS), which smooths the resource consumption of each individual job by automatically tuning the execution of concurrent jobs to maximize resource utilization while minimizing resource contention.
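    A minimal sketch, not taken from the thesis, of the co-location idea behind WCO: among pending tasks with known dominant resource demands, pick the one whose demand best complements what the worker node is already running. The two-resource model, the scoring rule and all names are illustrative assumptions.

    // Illustrative sketch of complementary-resource co-location (the idea behind
    // WCO as described above); everything here is an assumption for illustration,
    // not the thesis's implementation.
    #include <iostream>
    #include <string>
    #include <vector>

    struct Task {
        std::string job;
        double cpu_demand;  // normalized 0..1
        double io_demand;   // normalized 0..1
    };

    struct NodeLoad {
        double cpu_used = 0.0;  // normalized 0..1
        double io_used = 0.0;
    };

    // Score a candidate by how well it fills the node's slack on each resource;
    // reject demands that would exceed capacity.
    double fit_score(const NodeLoad& node, const Task& t) {
        double cpu_after = node.cpu_used + t.cpu_demand;
        double io_after = node.io_used + t.io_demand;
        if (cpu_after > 1.0 || io_after > 1.0) return -1.0;  // would overload the node
        // Prefer the task that brings both resources closest to full utilization.
        return cpu_after + io_after;
    }

    const Task* pick_task(const NodeLoad& node, const std::vector<Task>& pending) {
        const Task* best = nullptr;
        double best_score = 0.0;
        for (const auto& t : pending) {
            double s = fit_score(node, t);
            if (s > best_score) { best_score = s; best = &t; }
        }
        return best;  // nullptr if nothing fits
    }

    int main() {
        NodeLoad node{0.8, 0.1};  // node already runs a CPU-heavy map task
        std::vector<Task> pending = {{"sortJob", 0.7, 0.2},   // CPU-bound candidate
                                     {"grepJob", 0.1, 0.7}};  // I/O-bound candidate
        if (const Task* t = pick_task(node, pending))
            std::cout << "co-locate: " << t->job << '\n';  // picks the I/O-bound task
    }

    The same skeleton could be extended toward the priority-aware slot layering described above by weighting the score with job priority; that variant is likewise an assumption, not the thesis's mechanism.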

    Response
