530 research outputs found

    A Polynomial Time Algorithm for Deciding Branching Bisimilarity on Totally Normed BPA

    Strong bisimilarity on normed BPA is polynomial-time decidable, while weak bisimilarity on totally normed BPA is NP-hard. It is natural to ask where the computational complexity of branching bisimilarity on totally normed BPA lies. This paper confirms that the problem is polynomial-time decidable. To our knowledge, in the presence of silent transitions, this is the first bisimilarity-checking algorithm on infinite-state systems that runs in polynomial time. The result exhibits an instance in which branching bisimilarity and weak bisimilarity are both decidable but lie in different complexity classes (unless NP=P), which was not known before. The algorithm takes the partition refinement approach, and the final implementation can be thought of as a generalization of the previous algorithm of Czerwiński and Lasota. Unexpectedly, however, the correctness of the algorithm cannot be obtained directly from previous works, and the correctness proof turns out to be subtle. The proof depends on the existence of a carefully defined refinement operation fitted to our algorithm, and on elaborately developed techniques that differ substantially from previous work. (32 pages)
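    The paper's algorithm is not spelled out in the abstract. Purely as an illustration of the partition refinement approach it builds on, the sketch below computes strong bisimilarity classes on a small finite labelled transition system; the paper's setting (branching bisimilarity on BPA with silent transitions) is substantially harder. All names here are illustrative, not taken from the paper.

```python
# Illustrative only: naive partition refinement computing strong
# bisimilarity classes of a small finite LTS. States start in one
# block (the coarsest partition) and blocks are split until every
# state in a block has the same "signature" of outgoing moves.

def bisimilarity_classes(states, transitions):
    """transitions: a set of (source, action, target) triples."""
    partition = [set(states)]
    changed = True
    while changed:
        changed = False

        def signature(s):
            # For each outgoing transition, record (action, target block).
            sig = set()
            for (src, a, tgt) in transitions:
                if src == s:
                    block = next(i for i, b in enumerate(partition)
                                 if tgt in b)
                    sig.add((a, block))
            return frozenset(sig)

        new_partition = []
        for block in partition:
            groups = {}
            for s in block:
                groups.setdefault(signature(s), set()).add(s)
            if len(groups) > 1:
                changed = True
            new_partition.extend(groups.values())
        partition = new_partition
    return partition
```

    For example, two states that both reach the same deadlocked state on action `a` end up in one class, while the deadlocked state forms a class of its own.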

    Coupling Ground Penetrating Radar Applications with Continually Changing Decomposing Human Remains

    Locating the clandestine burial of human remains has long perplexed law enforcement officials involved in crime scene investigations, and continues to bewilder all the scientific disciplines that have been incorporated into their search and recovery. Locating concealed human remains can often be compared to the proverbial search for a needle in a haystack. Many notable forensic specialists and law enforcement agencies, in an effort to alleviate some of the bewilderment that commonly accompanies the search for a buried body, suggest that multidisciplinary search efforts are becoming more of a necessity and less of an option. Research at the University of Tennessee's Anthropological Research Facility (ARF) in Knoxville supports this view through a collaborative research effort directed toward the development of more efficient and effective methods in the search for, and detection of, buried human remains. The Department of Anthropology, in conjunction with the University's Department of Biosystems Engineering and Environmental Science, has correlated the use of ground penetrating radar (GPR) with postmortem processes of decomposing human targets. Two- and three-dimensional imagery programs were utilized to optimize the analysis and interpretation of the data acquired over the past eight months. The processed images were then compared to models of human decompositional stages. The results of this research confirm that GPR can do no more than enhance field methods in the search for clandestine burials, but that, when coupled with target-specific geophysical imagery software, it contributes valuable working knowledge with regard to the contents of the burial itself. Hence, such resources can only be seen as beneficial to a search team's endeavors.

    Design of Model For Restructure Transformation of Public Sector

    Public-sector institutions such as government universities are composed of many physical as well as logical threads that provide valuable services to the public. Over time, through repeated modification of its software modules, the structure of the system deteriorates and becomes very difficult to understand whenever further modification is required; and after a certain period, modification is inevitably needed to keep meeting public service requirements. Repeatedly modified software modules become twisted and tangled, much like noodles on a plate of chow mein. Such a program structure greatly decreases scalability, reliability, efficiency, and robustness, increases the complexity of the software modules, and raises their maintenance cost; repeated ad hoc modification is therefore not a good choice, and reengineering is a better one. In this paper we introduce a new methodology, pattern-based reengineering, which focuses not only on logical threads but also on physical entities, reducing overall complexity. It is proved that the transformation does not alter the semantics of the restructured program.

    Early detection of ripple propagation in evolving software systems

    Ripple effect analysis is the analysis of the consequential knock-on effects of a change to a software system. In the first part of this study, ripple effect analysis methods are classified into several categories based on the types of information the methods analyse and produce. A comparative and analytical study of methods from these categories was performed in an attempt to assist maintainers in the selection of ripple effect analysis methods for use in different phases of the software maintenance process. It was observed that existing methods are most usable in the later stages of the software maintenance process and not at an early stage, when strategic decisions concerning project scheduling are made. The second part of the work addresses the problem of tracing the ripple effect of a change at a stage earlier in the maintenance process than existing ripple effect analysis methods allow. Particular emphasis is placed upon the development of ripple effect analysis methods for analysing system documentation. The ripple effect analysis methods described in this thesis involve manipulating a novel graph theory model called a Ripple Propagation Graph. The model is based on the thematic structure of documentation, previous release information, and expert judgement concerning potential ripple effects. In the third part of the study, the Ripple Propagation Graph model and the analysis methods are applied and evaluated using examples of documentation structure and a major case study.
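    The thesis's Ripple Propagation Graph is not detailed in the abstract. As a hypothetical sketch of the general idea only, the toy model below represents documentation themes as nodes and expert-judged propagation likelihoods as weighted edges, then traverses the graph to estimate which themes a change is likely to reach; every name and number here is an assumption for illustration, not the thesis's actual model.

```python
# Toy ripple-propagation sketch: graph maps each node to a list of
# (neighbour, probability) pairs, where the probability encodes an
# expert judgement that a change propagates along that edge. The
# traversal keeps the best (highest) propagation likelihood found
# for each node and prunes paths below a threshold.

def ripple_reach(graph, start, threshold=0.1):
    """Return {node: best propagation likelihood} for nodes whose
    likelihood of being affected is at least `threshold`."""
    best = {start: 1.0}
    frontier = [start]
    while frontier:
        node = frontier.pop()
        for nbr, p in graph.get(node, []):
            likelihood = best[node] * p
            if likelihood >= threshold and likelihood > best.get(nbr, 0.0):
                best[nbr] = likelihood
                frontier.append(nbr)  # re-examine with improved likelihood
    return best
```

    For instance, with edges requirements → design (0.8) and design → code (0.5), a requirements change reaches code with likelihood 0.4, while a weak design → tests edge (0.05) falls below the threshold and is pruned.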

    Software engineering : redundancy is key

    Software engineers are human, and so they make many mistakes: typically, 1 out of every 10 to 100 tasks goes wrong. The only way to avoid these mistakes is to introduce redundancy into the software engineering process. This article is a plea to consciously introduce several levels of redundancy for each programming task. Depending on the required level of correctness, expressed as a residual error probability (typically 10⁻³ to 10⁻¹⁰), each programming task must be carried out redundantly 4 to 8 times. This number is hardly influenced by the size of a programming endeavour. Training software engineers does have some effect, as untrained software engineers require twice as many redundant tasks to deliver software of a desired quality. More compact programming, for instance by using domain-specific languages, only reduces the number of redundant tasks by a small constant.
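    A back-of-the-envelope check of these figures, under the naive assumption that redundant executions fail independently (which is not the article's model; its figures of 4 to 8 presumably come from a more refined analysis): if each attempt goes wrong with probability p, then n independent attempts all go wrong with probability p^n, so a residual error target t requires n ≥ log t / log p.

```python
import math

# Naive independence estimate: smallest n with p**n <= target,
# i.e. n = ceil(log(target) / log(p)). log10 is used because the
# typical inputs are exact powers of ten.

def required_redundancy(p, target):
    """p: per-attempt error probability; target: residual error goal."""
    return math.ceil(math.log10(target) / math.log10(p))
```

    With p = 0.1 and a 10⁻³ target this gives 3 redundant executions; with p = 0.01 and a 10⁻¹⁰ target it gives 5, the same order of magnitude as the article's 4 to 8.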

    Energy and Route Optimization of Moving Devices

    This thesis highlights our efforts in energy and route optimization of moving devices. We have focused on three categories of such devices: industrial robots in a multi-robot environment, generic vehicles in a vehicle routing problem (VRP) context, and automated guided vehicles (AGVs) in a large-scale flexible manufacturing system (FMS). In the first category, the aim is to develop a non-intrusive energy optimization technique, based on a given set of paths and sequences of operations, such that the original cycle time is not exceeded. We develop an optimization procedure based on a mathematical programming model that aims to minimize energy consumption and peak power. Our technique has several advantages. It is non-intrusive, i.e. it requires limited changes in the robot program and can be implemented easily. Moreover, it is model-free, in the sense that no particular, and perhaps secret, parameter or dynamic model is required. Furthermore, the optimization can be done offline, within seconds, using a generic solver. Through careful experiments, we have shown that it is possible to reduce energy and peak power by up to about 30% and 50%, respectively. The second category of moving devices comprises generic vehicles in a VRP context. We have developed a hybrid optimization approach that integrates a distributed algorithm based on a gossip protocol with a column generation (CG) algorithm, and it manages to solve the tested problems faster than the CG algorithm alone. The algorithm is developed for a VRP variation including time windows (VRPTW), which is meant to model the scheduling and routing of caregivers in the context of home healthcare routing and scheduling problems (HHRSPs). Moreover, the developed algorithm can easily be parallelized to further increase its efficiency. The last category deals with AGVs.
The choice of AGVs was not arbitrary; by design, we decided to transfer our knowledge of energy optimization and routing algorithms to a class of moving devices in which both techniques are of interest. Initially, we improve an existing method of conflict-free AGV scheduling and routing so that the new algorithm can manage larger problems. A heuristic version of the algorithm manages to solve the problem instances in a reasonable amount of time. Later, we develop strategies to reduce energy consumption. The study is carried out using an AGV system installed at Volvo Cars. The results are promising: (1) the algorithm reduces performance measures such as makespan by up to 50%, while reducing the total travelled distance of the vehicles by about 14%, leading to an energy saving of roughly 14% compared to the results obtained from the original traffic controller; (2) it is possible to reduce the cruise velocities so that more energy is saved, up to 20%, while the new makespan remains better than the original one.
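    The thesis's optimization models are not given in the abstract. Purely as a toy illustration of why redistributing move times within a fixed cycle time saves energy, the sketch below uses a kinetic-energy-like proxy E = Σ dᵢ²/tᵢ for segments of length dᵢ traversed in time tᵢ; by the Cauchy-Schwarz inequality this proxy is minimized when tᵢ is proportional to dᵢ. The proxy and the closed form are assumptions for illustration, not the thesis's mathematical programming model.

```python
# Toy energy model: allocating a fixed cycle time across move
# segments. With proxy energy sum(d**2 / t) per segment, the optimal
# allocation spends time on each segment in proportion to its length.

def allocate_times(distances, cycle_time):
    """Split cycle_time across segments proportionally to distance."""
    total = sum(distances)
    return [cycle_time * d / total for d in distances]

def energy(distances, times):
    """Kinetic-energy-like proxy: sum of d_i**2 / t_i."""
    return sum(d * d / t for d, t in zip(distances, times))
```

    For segments of lengths 2, 1, 1 and a cycle time of 4, the proportional allocation (2, 1, 1) yields a lower proxy energy than splitting the time equally, illustrating the kind of non-intrusive, timing-only saving the first part of the thesis targets.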

    Restoring product focus across the value stream through organizational restructuring

    Thesis (S.M.)--Massachusetts Institute of Technology, Engineering Systems Division; and (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management; in conjunction with the Leaders for Manufacturing Program at MIT, 2008. Includes bibliographical references (p. 93-94). Businesses take deliberate action to change their internal context when managers believe that better performance lies beyond the capabilities of assets in their present configuration. A typical course of action is reorganization. A key consideration for organizational design is how the relationships between an organization's structure, the structure of its products, and the structure of its processes influence the value delivered to customers. In some sense, products, processes, and the organization should "fit" each other. This thesis presents a framework for thinking about product architecture, enterprise architecture, and the value stream of processes that binds them together. Critical to any enterprise architecture are process owners, who control and improve organizational processes, and product owners, who manage the end-to-end development of products. When a product is significantly complex, independent tiers of product ownership might be established to ensure that different levels of products - systems, subsystems, or components - are managed with appropriate developmental objectives in mind. For example, some components must be distinct to a single product; other components can be common among several products. The proposed framework shows how product and enterprise architectures can be integrated to support the development of complex systems. The thesis also presents a case study to which the proposed framework is applied. The study focuses on a business that has recently restructured its organization to achieve better alignment with the complex products it develops.
Using the proposed framework, the new organizational structure is evaluated to determine whether the new enterprise architecture positions the business to increase customer value and accomplish its long-term goals. By Jeffrey M. Pasqual (S.M., M.B.A.).