2,258 research outputs found

    Engineering Planar-Separator and Shortest-Path Algorithms

    "Algorithm engineering" denotes the process of designing, implementing, testing, analyzing, and refining computational procedures to improve their performance. We consider three graph problems -- planar separation, single-pair shortest-path routing, and multimodal shortest-path routing -- and conduct a systematic study in order to: classify different kinds of input; draw concrete recommendations for choosing the parameters involved; and identify and tune the crucial parts of the algorithms.
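    The single-pair shortest-path baseline against which engineered variants are usually measured is Dijkstra's algorithm; a minimal heap-based sketch (the graph below is illustrative, not from the paper):

```python
# A standard heap-based Dijkstra for single-pair shortest paths.
# The example graph is made up for illustration.
import heapq

def dijkstra(graph, source, target):
    """graph: dict node -> list of (neighbour, edge_weight)."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry, already settled with a shorter path
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")  # target unreachable

g = {"a": [("b", 2), ("c", 5)], "b": [("c", 1)]}
print(dijkstra(g, "a", "c"))  # 3, via a -> b -> c
```

    Engineering studies of the kind the abstract describes typically start from this textbook version and then tune the priority queue, the graph representation, and problem-specific speed-up techniques.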

    Tree decompositions of graphs: Saving memory in dynamic programming

    We propose a simple and effective heuristic for saving memory in dynamic programming on tree decompositions when solving graph optimization problems. The introduced "anchor technique" is based on a tree-like set covering problem. We substantiate our findings with experimental results. Our strategy has negligible running-time overhead but achieves memory savings of between 60% and 98% for nice tree decompositions and path decompositions.
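    The memory-saving idea -- discard a child's table as soon as the parent's table has been computed -- can be sketched on the simplest case, a treewidth-1 decomposition (a tree). The problem choice (maximum-weight independent set), names, and weights below are illustrative; the paper's anchor technique for general tree decompositions is more elaborate:

```python
# Illustrative sketch: bottom-up dynamic programming on a tree, popping each
# child's table the moment it has been merged into its parent. This shows
# only the memory-saving idea, on maximum-weight independent set.

def max_weight_independent_set(children, weights, root):
    """children: dict node -> list of child nodes; weights: dict node -> weight."""
    table = {}  # node -> (best weight with node excluded, with node included)

    def solve(v):
        excl, incl = 0, weights[v]
        for c in children.get(v, []):
            solve(c)
            c_excl, c_incl = table.pop(c)  # free the child's table immediately
            excl += max(c_excl, c_incl)    # v excluded: child may go either way
            incl += c_excl                 # v included: child must be excluded
        table[v] = (excl, incl)

    solve(root)
    return max(table[root])

# Path a - b - c with weights 3, 2, 3: picking a and c gives 6.
tree = {"a": ["b"], "b": ["c"]}
w = {"a": 3, "b": 2, "c": 3}
print(max_weight_independent_set(tree, w, "a"))  # 6
```

    With `table.pop`, at most one table per node on the current root-to-leaf path is alive at a time, which is the flavour of saving the paper quantifies for real tree and path decompositions.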

    An Optimisation-based Framework for Complex Business Process: Healthcare Application

    The Irish healthcare system is currently facing major pressures due to rising demand, caused by population growth, ageing, and high expectations of service quality. This pressure creates a need for support from research institutions in decision areas such as resource allocation and performance measurement. While approaches such as modelling, simulation, multi-criteria decision analysis, performance management, and optimisation can, when applied skilfully, improve healthcare performance, they represent just one part of the solution. Accordingly, to achieve significant and sustainable performance, this research aims to develop a practical yet effective optimisation-based framework for managing complex processes in the healthcare domain. Through an extensive review of the literature on the aforementioned solution techniques, the limitations of using each technique on its own are identified in order to define a practical integrated approach for developing the proposed framework. During the framework validation phase, real-time strategies were optimised to address Emergency Department performance issues in a major hospital. Results show the potential for a significant reduction in patients' average length of stay (48% of average patient throughput time) while reducing the over-reliance on overstretched nursing resources, increasing staff utilisation by between 7% and 10%. Given the high uncertainty in healthcare service demand, the integrated framework allows decision makers to find optimal staff schedules that improve Emergency Department performance. The proposed optimum staff schedule reduces patients' average waiting time by 57% and reduces the proportion of patients who leave without treatment from 17% to 8%. The developed framework has been implemented by the hospital partner with a high level of success.

    DO SOME FINANCIAL PRODUCT FEATURES NEGATIVELY AFFECT CONSUMER DECISIONS? A REVIEW OF EVIDENCE. ESRI RESEARCH SERIES NUMBER 78 SEPTEMBER 2018

    This paper reviews international evidence on consumer decision-making in retail financial markets. Specifically, we identify and evaluate research from multiple disciplines and methods that links specific features of products to the quality of consumer decisions. The notion of product ‘features’ is broadly defined to include not only product attributes but also emergent properties such as product complexity and the salience of information disclosure. We document areas of concern from a consumer protection perspective and describe some common themes, including consumers' inability to consider all important attributes and whether they can easily discern how the provider makes its profit. We conclude that there is a case for closer integration of empirical evidence and financial regulation.

    The Traveling Salesman Problem

    This paper presents a self-contained introduction to algorithmic and computational aspects of the traveling salesman problem and related problems, along with their theoretical prerequisites, as seen from the point of view of an operations researcher who wants to solve practical problem instances. Extensive computational results are reported for most of the algorithms described. Optimal solutions are reported for instances with up to several thousand nodes, as well as heuristic solutions of provably very high quality for larger instances.
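    Two of the classic construction and improvement heuristics such surveys cover are nearest-neighbour tour building followed by 2-opt; a compact sketch on a made-up four-city instance (not one of the paper's benchmarks):

```python
# Nearest-neighbour construction plus 2-opt improvement for Euclidean TSP.
# The four-point instance is made up for illustration.
import math
from itertools import combinations

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(pts, tour):
    return sum(dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def nearest_neighbour(pts):
    """Greedy construction: always visit the closest unvisited city."""
    unvisited = set(range(1, len(pts)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist(pts[last], pts[j]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

def two_opt(pts, tour):
    """Repeatedly reverse a tour segment while that shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i, j in combinations(range(len(tour)), 2):
            if j - i < 2:
                continue  # reversing a single city changes nothing
            new = tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]
            if tour_length(pts, new) < tour_length(pts, tour) - 1e-9:
                tour, improved = new, True
    return tour

pts = [(0, 0), (0, 1), (1, 0), (1, 1)]  # unit square
t = two_opt(pts, nearest_neighbour(pts))
print(tour_length(pts, t))  # 4.0, the optimal square tour
```

    On the unit square, 2-opt uncrosses any diagonal edges the greedy construction may have introduced, reaching the optimal tour of length 4.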

    Cutset Sampling for Bayesian Networks

    The paper presents a new sampling methodology for Bayesian networks that samples only a subset of the variables and applies exact inference to the rest. Cutset sampling is a network-structure-exploiting application of the Rao-Blackwellisation principle to sampling in Bayesian networks. It improves convergence by exploiting memory-based inference algorithms. It can also be viewed as an anytime approximation of the exact cutset-conditioning algorithm developed by Pearl. Cutset sampling can be implemented efficiently when the sampled variables constitute a loop-cutset of the Bayesian network and, more generally, when the induced width of the network's graph conditioned on the observed sampled variables is bounded by a constant w. We demonstrate empirically the benefit of this scheme on a range of benchmarks.
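    The Rao-Blackwellisation idea -- sample only the cutset, compute the rest exactly, and average the exact conditionals -- can be sketched on a tiny made-up diamond network, where fixing A makes the remainder singly connected. The network, CPT numbers, and simple forward sampling of the cutset are illustrative; the paper's full Gibbs-based cutset sampling is not reproduced here:

```python
# Rao-Blackwellised estimate over a one-variable cutset {A} in a diamond
# network A -> B, A -> C, (B, C) -> D, all variables binary. CPTs made up.
import random

random.seed(0)

pA = 0.6                                   # P(A=1)
pB = {0: 0.2, 1: 0.7}                      # P(B=1 | A)
pC = {0: 0.4, 1: 0.8}                      # P(C=1 | A)
pD = {(0, 0): 0.1, (0, 1): 0.5,
      (1, 0): 0.6, (1, 1): 0.9}            # P(D=1 | B, C)

def p_d_given_a(a):
    """Exact P(D=1 | A=a): with A fixed the loop is cut, so the remaining
    network is small enough to enumerate exactly."""
    total = 0.0
    for b in (0, 1):
        for c in (0, 1):
            pb = pB[a] if b else 1 - pB[a]
            pc = pC[a] if c else 1 - pC[a]
            total += pb * pc * pD[(b, c)]
    return total

# Cutset sampling estimate of P(D=1): sample A only, average the exact
# conditionals instead of 0/1 indicator samples (lower variance).
n = 10000
est = sum(p_d_given_a(1 if random.random() < pA else 0) for _ in range(n)) / n

exact = pA * p_d_given_a(1) + (1 - pA) * p_d_given_a(0)
print(round(exact, 4))  # 0.5692; est is close to this
```

    Averaging exact conditionals rather than raw samples of D is exactly what reduces the estimator's variance relative to plain sampling of all variables.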

    Heuristic assignment of CPDs for probabilistic inference in junction trees

    Much research has been done on the efficient computation of probabilistic queries posed to Bayesian networks (BNs). One of the popular architectures for exact inference on BNs is the junction tree (JT) based architecture. Among the different architectures developed, HUGIN is the most efficient JT-based architecture. The Global Propagation (GP) method used in the HUGIN architecture is arguably one of the best methods for probabilistic inference in BNs. Before propagation, initialization is performed to obtain the potential of each cluster in the JT. Then, with the GP method, each cluster potential becomes the cluster marginal through message passing with its neighbouring clusters. Many researchers have proposed improvements to make this message propagation more efficient; still, the GP method can be very slow for dense networks. As BNs are applied to larger, more complex, and more realistic applications, developing more efficient inference algorithms has become increasingly important. Towards this goal, in this paper we present heuristics for initialization that avoid unnecessary message passing among the clusters of the JT, thereby improving the performance of the architecture by passing fewer messages.
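    The initialize-then-propagate pipeline the abstract refers to can be sketched on the smallest possible junction tree, two cliques {A,B} and {B,C} joined by separator {B}; the CPD numbers are made up, and the paper's heuristic CPD-to-clique assignment is not reproduced:

```python
# Illustrative sketch: HUGIN-style initialization and two-pass propagation
# on a two-clique junction tree {A,B} - {B,C} with separator {B}.
# All variables are binary; CPD numbers are invented for the example.
pA = 0.3                    # P(A=1)
pB_A = {0: 0.9, 1: 0.2}     # P(B=1 | A)
pC_B = {0: 0.5, 1: 0.8}     # P(C=1 | B)

# Initialization: multiply each CPD into one clique containing its family.
phi_AB = {(a, b): (pA if a else 1 - pA) * (pB_A[a] if b else 1 - pB_A[a])
          for a in (0, 1) for b in (0, 1)}
phi_BC = {(b, c): pC_B[b] if c else 1 - pC_B[b]
          for b in (0, 1) for c in (0, 1)}
sep_B = {0: 1.0, 1: 1.0}

# Collect pass: message from {A,B} into {B,C} through the separator.
msg = {b: sum(phi_AB[(a, b)] for a in (0, 1)) for b in (0, 1)}
phi_BC = {(b, c): phi_BC[(b, c)] * msg[b] / sep_B[b]
          for b in (0, 1) for c in (0, 1)}
sep_B = msg

# Distribute pass: message from {B,C} back into {A,B}.
msg = {b: sum(phi_BC[(b, c)] for c in (0, 1)) for b in (0, 1)}
phi_AB = {(a, b): phi_AB[(a, b)] * msg[b] / sep_B[b]
          for a in (0, 1) for b in (0, 1)}

# After both passes each clique potential is its joint marginal, so the two
# cliques agree on B's marginal.
pB_from_AB = sum(phi_AB[(a, 1)] for a in (0, 1))
pB_from_BC = sum(phi_BC[(1, c)] for c in (0, 1))
print(round(pB_from_AB, 2), round(pB_from_BC, 2))  # 0.69 0.69
```

    The paper's heuristics target the initialization step of this pipeline, assigning CPDs to cliques so that fewer of these separator messages are needed.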

    Topology sensitive algorithms for large scale uncapacitated covering problem

    ix, 89 leaves : ill. ; 29 cm
    Solving NP-hard facility location problems in wireless network planning is a common scenario. In our research, we study the covering problem, a well-known facility location problem with applications in wireless network deployment. We focus on networks with a sparse structure. First, we analyze two heuristics for building tree decompositions, based on vertex separators and perfect elimination orders, and extend the vertex separator heuristic to improve its running time. Second, we propose a dynamic programming algorithm based on the tree decomposition to solve the covering problem optimally on the network, and we develop several heuristic techniques to speed it up. Experimental results show that one variant of the dynamic programming algorithm surpasses state-of-the-art commercial mathematical optimization software on several occasions.
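    The treewidth-1 special case of such a dynamic program -- minimum vertex cover on a tree -- gives the flavour of DP over a tree decomposition; the instance below is illustrative, and the thesis's algorithm handles general sparse networks via their decompositions:

```python
# Illustrative sketch: bottom-up DP for minimum vertex cover on a tree, the
# simplest covering problem solvable over a (trivial) tree decomposition.

def min_vertex_cover(children, root):
    """children: dict node -> list of child nodes. Returns the cover size."""
    def solve(v):
        # (cost if v is NOT in the cover, cost if v IS in the cover)
        out_cost, in_cost = 0, 1
        for c in children.get(v, []):
            c_out, c_in = solve(c)
            out_cost += c_in            # v uncovered: child must cover the edge
            in_cost += min(c_out, c_in) # v covered: child is free to choose
        return out_cost, in_cost
    return min(solve(root))

# Star with centre "r" and three leaves: covering the centre alone suffices.
print(min_vertex_cover({"r": ["x", "y", "z"]}, "r"))  # 1
```

    On a general tree decomposition the same recurrence runs over bags instead of single vertices, with one table entry per cover/uncover pattern of a bag, which is where the thesis's speed-up heuristics apply.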