
    Trying again to fail-first

    For constraint satisfaction problems (CSPs), Haralick and Elliott [1] introduced the Fail-First Principle and defined it in terms of minimizing branch depth. By devising a range of variable ordering heuristics, each in turn trying harder to fail first, Smith and Grant [2] showed that adherence to this strategy does not guarantee a reduction in search effort. The present work builds on Smith and Grant. It benefits from a new framework for characterizing heuristic performance that defines two policies: one concerned with enhancing the likelihood of correctly extending a partial solution, the other with minimizing the effort to prove insolubility. The Fail-First Principle can be restated as calling for adherence to the second, fail-first policy, while discounting the first, promise policy. Our work corrects some deficiencies in the work of Smith and Grant, and goes on to confirm their finding that the Fail-First Principle, as originally defined, is insufficient. We then show that adherence to the fail-first policy must be measured in terms of the size of insoluble subtrees, not branch depth. We also show that for soluble problems, both policies must be considered in evaluating heuristic performance. Hence, even in its proper form the Fail-First Principle is insufficient. Finally, we show that the "FF" series of heuristics devised by Smith and Grant is a powerful tool for evaluating heuristic performance, including the subtle relations between heuristic features and adherence to a policy.
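    The fail-first idea can be made concrete with a minimal backtracking sketch (this is the standard minimum-remaining-values rule, not the authors' FF heuristic series): branch on the unassigned variable with the fewest surviving values, so that dead ends are reached early and insoluble subtrees stay small. The toy colouring instance and all names are illustrative.

```python
def solve(assignment, domains, constraints):
    """Backtracking search with a fail-first (MRV) variable ordering:
    always branch on the unassigned variable with the fewest remaining
    values, so failures surface as early as possible."""
    if len(assignment) == len(domains):
        return dict(assignment)
    # Fail-first heuristic: pick the variable with the smallest domain.
    var = min((v for v in domains if v not in assignment),
              key=lambda v: len(domains[v]))
    for value in domains[var]:
        assignment[var] = value
        if all(ok(assignment) for ok in constraints):
            result = solve(assignment, domains, constraints)
            if result is not None:
                return result
        del assignment[var]
    return None  # every value failed: this subtree is insoluble

# Toy instance: colour three mutually adjacent regions.
domains = {"A": ["r", "g"], "B": ["r", "g"], "C": ["r", "g", "b"]}
def differ(x, y):
    # Constraint: x and y get different values (vacuously true
    # while either variable is still unassigned).
    return lambda a: x not in a or y not in a or a[x] != a[y]
constraints = [differ("A", "B"), differ("B", "C"), differ("A", "C")]
print(solve({}, domains, constraints))  # {'A': 'r', 'B': 'g', 'C': 'b'}
```

    MRV here branches on A and B (domain size 2) before C (size 3); with a promise-style ordering the trade-off discussed in the abstract would appear on soluble instances.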

    Confinement induced instability of thin elastic film

    A confined incompressible elastic film does not deform uniformly when subjected to adhesive interfacial stresses, but with undulations that have a characteristic wavelength scaling linearly with the thickness of the film. In the classical peel geometry, undulations appear along the contact line below a critical film thickness or below a critical curvature of the plate. Perturbation analysis of the stress equilibrium equations shows that for a critically confined film the total excess energy indeed attains a minimum for a finite amplitude of the perturbations, which grows with further increase in the confinement. Comment: 11 pages, 6 figures

    Assessing riveted connections to Eurocode 3

    The focus of this paper is the assessment of wrought iron and early steel riveted connections in the future, with recommendations as to how different codes currently deal with the assessment and what may change if alternative codes are adopted. As British Standards are being replaced by Eurocodes for design, it is inevitable that assessment codes of practice based on British Standards will be replaced by those based on Eurocodes. This progression will ensure that future structures are designed and assessed using codes based on similar philosophies. However, it will also lead to older structures, designed according to older codes with different philosophies and constructed of materials not covered by the Eurocodes, being assessed according to Eurocode-based assessment codes. A similar situation already exists, with structures being assessed using assessment codes based on British Standards that were written for the design of steel structures. This has led the leading asset-owning organisations, such as Network Rail and Highways England, to include guidance on adapting calculations to account for different material types.

    Electricity in Central America: paradigms, reforms and the energy trilemma

    A new global energy era is emerging, one driven by the confluence of energy security, climate politics and energy equity issues. This ‘energy trilemma’ is shaping the global political economy of energy, which in turn influences how decisions are made about how energy is provided—referred to as global energy governance. This article analyzes historical and contemporary developments in Central America’s power sectors. This is a region that has long been an implementation space for global policy priorities, but has been overlooked by those engaging with the challenges of the energy trilemma. During the 1990s and 2000s, the statist model of energy governance gave way to a market-led model in the Central American isthmus. This led to the privatization of state-owned utilities and the promotion of a regional electricity market. During this period, the dominance of largely hydro-based renewable electricity generation diminished, to be replaced by imported fossil fuel-based generation. Oil price increases during the early 2000s highlighted the region’s dependence on imports, with some countries turning to energy rationing. Increasingly interventionist state policies, which now seek to reduce oil dependence, improve energy efficiency and expand access to electricity, are being pursued in the region. This interventionist turn reflects the pressures of the energy trilemma, although energy security, particularly the need to reduce dependence on imported oil, remains the most important driver.

    Allocation in Practice

    How do we allocate scarce resources? How do we fairly allocate costs? These are two pressing challenges facing society today. I discuss two recent projects at NICTA concerning resource and cost allocation. In the first, we have been working with FoodBank Local, a social startup working in collaboration with food bank charities around the world to optimise the logistics of collecting and distributing donated food. Before we can distribute this food, we must decide how to allocate it to different charities and food kitchens. This gives rise to a fair division problem with several new dimensions, rarely considered in the literature. In the second, we have been looking at cost allocation within the distribution network of a large multinational company. This also has several new dimensions rarely considered in the literature. Comment: To appear in Proc. of the 37th edition of the German Conference on Artificial Intelligence (KI 2014), Springer LNCS
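    As an illustration of the kind of fair division problem described (not the method used in either NICTA project), here is a round-robin sketch: agents take turns picking their most-valued remaining item, which for additive valuations yields an allocation that is envy-free up to one good (EF1). The charity names and valuations are invented.

```python
from itertools import cycle

def round_robin(agents, items, value):
    """Round-robin fair division: agents take turns picking their
    most-valued remaining item. A standard EF1 baseline for additive
    valuations, used here purely as an illustration."""
    remaining = set(items)
    bundles = {a: [] for a in agents}
    for agent in cycle(agents):
        if not remaining:
            break
        # Each agent greedily takes its most-valued remaining item.
        pick = max(remaining, key=lambda i: value[agent][i])
        bundles[agent].append(pick)
        remaining.remove(pick)
    return bundles

# Hypothetical food-bank example: two charities value pallets differently.
value = {"charity1": {"bread": 3, "milk": 5, "rice": 1},
         "charity2": {"bread": 4, "milk": 2, "rice": 6}}
print(round_robin(["charity1", "charity2"], ["bread", "milk", "rice"], value))
# {'charity1': ['milk', 'bread'], 'charity2': ['rice']}
```

    The new dimensions mentioned in the abstract (logistics, perishability, repeated allocation) are exactly what such a textbook baseline leaves out.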

    Scalable Parallel Numerical Constraint Solver Using Global Load Balancing

    We present a scalable parallel solver for numerical constraint satisfaction problems (NCSPs). Our parallelization scheme consists of homogeneous worker solvers, each of which runs on an available core and communicates with the others via the global load balancing (GLB) method. The parallel solver is implemented in X10, which provides an implementation of GLB as a library. In experiments, several NCSPs from the literature were solved, attaining up to a 516-fold speedup using 600 cores of the TSUBAME2.5 supercomputer. Comment: To be presented at the X10'15 Workshop
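    The worker-pool idea can be sketched in Python (the paper itself uses X10's GLB library, which does distributed work stealing rather than a shared queue): homogeneous workers pop interval boxes from a common pool, prune boxes that cannot contain a solution, and push sub-boxes back. Names are illustrative, and CPython threads will not give real speedup; the point is the dynamic load-balancing structure.

```python
import threading, queue

def parallel_bisect(f, box, tol=1e-6, workers=4):
    """Branch-and-prune sketch for a 1-D numerical constraint f(x) = 0:
    worker threads pop interval boxes from a shared queue, prune boxes
    with no sign change, and split the rest. The shared queue stands in
    for GLB's distributed work stealing."""
    tasks = queue.Queue()
    tasks.put(box)
    solutions, lock = [], threading.Lock()

    def worker():
        while True:
            try:
                lo, hi = tasks.get(timeout=0.1)  # idle workers drain out
            except queue.Empty:
                return
            if f(lo) * f(hi) > 0:
                continue  # no sign change: prune this box
            if hi - lo < tol:
                with lock:
                    solutions.append((lo, hi))  # box is small enough
                continue
            mid = (lo + hi) / 2  # split and put both halves back
            tasks.put((lo, mid))
            tasks.put((mid, hi))

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return solutions

roots = parallel_bisect(lambda x: x * x - 2, (0.0, 2.0))
print(roots)  # one narrow box bracketing sqrt(2)
```

    Real NCSP solvers replace the sign-change test with interval-arithmetic contractors, and GLB replaces the central queue so that no single node becomes a bottleneck at 600 cores.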

    Fracture of a biopolymer gel as a viscoplastic disentanglement process

    We present an extensive experimental study of mode-I, steady, slow crack dynamics in gelatin gels. Taking advantage of the sensitivity of the elastic stiffness to gel composition and history, we confirm and extend the model for the fracture of physical hydrogels proposed in a previous paper (Nature Materials, doi:10.1038/nmat1666 (2006)), which attributes decohesion to the viscoplastic pull-out of the network-constituting chains. We thus propose that, in contrast with chemically cross-linked gels, reversible gels fracture without chain scission.

    Galaxies in a box: A simulated view of the interstellar medium

    We review progress in the development of physically realistic three-dimensional simulated models of the galaxy. We consider scales from star-forming molecular clouds to the full spiral disc. Models are computed using hydrodynamic (HD) or magnetohydrodynamic (MHD) equations and may include cosmic rays or tracer particles. The range of dynamical scales between the full galaxy structure and the turbulent scales of supernova (SN) explosions, and even cloud collapse to form stars, makes it impossible with current computing tools and resources to resolve all of these in one model. We therefore consider a hierarchy of models and how they can be related to enhance our understanding of the complete galaxy. Comment: Chapter in Large Scale Magnetic Fields in the Universe

    Random Costs in Combinatorial Optimization

    The random cost problem is the problem of finding the minimum in an exponentially long list of random numbers. By definition, this problem cannot be solved faster than by exhaustive search. It is shown that a classical NP-hard optimization problem, number partitioning, is essentially equivalent to the random cost problem. This explains the poor performance of heuristic approaches to the number partitioning problem and allows us to calculate the probability distributions of the optimum and sub-optimum costs. Comment: 4 pages, RevTeX, 2 figures (eps), submitted to PR
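    The equivalence can be felt in a tiny sketch (illustrative only): number partitioning asks for the sign assignment minimizing the absolute subset-sum difference, and the abstract's point is that these 2^n candidate costs behave like an exponentially long list of independent random numbers, so exhaustive search is essentially unavoidable.

```python
from itertools import product

def best_partition_cost(nums):
    """Number partitioning by exhaustive search: try all 2^n sign
    assignments (equivalently, all two-way splits of the numbers)
    and return the minimum |subset-sum difference|."""
    return min(abs(sum(s * x for s, x in zip(signs, nums)))
               for signs in product((1, -1), repeat=len(nums)))

print(best_partition_cost([4, 5, 6, 7, 8]))  # -> 0, since 4+5+6 = 7+8
```

    Heuristics such as greedy or differencing only probe a vanishing fraction of this effectively random cost list, which is why their typical result is far from the true optimum on hard instances.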