
    The PITA System: Tabling and Answer Subsumption for Reasoning under Uncertainty

    Many real-world domains require the representation of a measure of uncertainty. The most common such representation is probability, and the combination of probability with logic programs has given rise to the field of Probabilistic Logic Programming (PLP), leading to languages such as the Independent Choice Logic, Logic Programs with Annotated Disjunctions (LPADs), ProbLog, PRISM and others. These languages share a similar distribution semantics, and methods have been devised to translate programs between them. The complexity of computing the probability of queries to general PLP programs is very high due to the need to combine the probabilities of explanations that may not be mutually exclusive. As one alternative, the PRISM system reduces the complexity of query answering by restricting the form of programs it can evaluate. As an entirely different alternative, Possibilistic Logic Programs adopt a simpler metric of uncertainty than probability. Each of these approaches -- general PLP, restricted PLP, and Possibilistic Logic Programming -- can be useful in different domains, depending on the form of uncertainty to be represented, the form of programs needed to model problems, and the scale of the problems to be solved. In this paper, we show how the PITA system, which originally supported the general PLP language of LPADs, can also efficiently support restricted PLP and Possibilistic Logic Programs. PITA relies on tabling with answer subsumption and consists of a transformation along with an API of library functions that interface with answer subsumption.
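
    To make the shared distribution semantics concrete: the probability of a query is the sum of the probabilities of the possible worlds in which the query succeeds, where each world is a truth assignment to independent probabilistic facts. Below is a minimal sketch of this computation on an invented two-coin program; the facts, probabilities, and function names are illustrative only and are not PITA's actual API.

    # Distribution semantics by brute-force world enumeration (toy example).
    from itertools import product

    # Independent probabilistic facts: name -> probability of being true.
    prob_facts = {"heads1": 0.5, "heads2": 0.6}

    def query_holds(world):
        # Toy query: "at least one coin lands heads". Its two explanations,
        # {heads1} and {heads2}, are not mutually exclusive, which is why
        # their probabilities cannot simply be added -- the source of the
        # complexity discussed in the abstract.
        return world["heads1"] or world["heads2"]

    def query_probability():
        names = list(prob_facts)
        total = 0.0
        for bits in product([True, False], repeat=len(names)):
            world = dict(zip(names, bits))
            weight = 1.0
            for name, truth in world.items():
                weight *= prob_facts[name] if truth else 1.0 - prob_facts[name]
            if query_holds(world):
                total += weight
        return total

    print(query_probability())  # 1 - 0.5 * 0.4 = 0.8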

    Degeneracy: a link between evolvability, robustness and complexity in biological systems

    A full accounting of biological robustness remains elusive, both in terms of the mechanisms by which robustness is achieved and the forces that have caused robustness to grow over evolutionary time. Although its importance to topics such as ecosystem services and resilience is well recognized, the broader relationship between robustness and evolution is only starting to be fully appreciated. A renewed interest in this relationship has been prompted by evidence that mutational robustness can play a positive role in the discovery of adaptive innovations (evolvability) and evidence of an intimate relationship between robustness and complexity in biology. This paper offers a new perspective on the mechanics of evolution and the origins of complexity, robustness, and evolvability. Here we explore the hypothesis that degeneracy, a partial overlap in the functioning of multi-functional components, plays a central role in the evolution and robustness of complex forms. In support of this hypothesis, we present evidence that degeneracy is a fundamental source of robustness, that it is intimately tied to multi-scaled complexity, and that it establishes conditions necessary for system evolvability.
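
    As a toy numerical illustration of the paper's central definition, the sketch below contrasts pure redundancy (identical components) with degeneracy (partially overlapping multi-functional components) under single-component knockouts. The components and function labels are invented, and this is not the authors' model.

    # Each component covers a set of functions.
    redundant = {"a1": {"F1", "F2"}, "a2": {"F1", "F2"}}  # identical copies
    degenerate = {"b1": {"F1", "F2"}, "b2": {"F2", "F3"}, "b3": {"F1", "F3"}}

    def robust_to_single_knockout(system, required):
        # True if every required function stays covered after removing
        # any single component.
        return all(
            required <= set().union(*(fns for c, fns in system.items() if c != ko))
            for ko in system
        )

    print(robust_to_single_knockout(redundant, {"F1", "F2"}))         # True
    print(robust_to_single_knockout(degenerate, {"F1", "F2", "F3"}))  # True
    # Both systems survive any single knockout, but the degenerate one covers
    # a wider functional repertoire with partially overlapping parts -- the
    # property the paper links to evolvability.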

    The Use of Parametric and Non-Parametric Frontier Methods to Measure Productive Efficiency in the Industrial Sector: A Comparative Study

    Parametric frontier models and non-parametric methods have monopolised the recent literature on productive efficiency measurement, and empirical applications have usually dealt with one or the other group of techniques. This paper applies a range of both types of approach in an industrial organisation setting. Their joint use can improve the accuracy of both, although some methodological difficulties can arise. The robustness of the different methods in ranking productive units allows us to make a comparative analysis of them. Empirical results concern productive and market demand structure, returns to scale, and the sources of productive inefficiency. The techniques are illustrated using data from the US electric power industry.

    Keywords: productive efficiency; parametric frontiers; DEA; industrial sector
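
    As a concrete example of the non-parametric side, here is a minimal sketch of the input-oriented CCR DEA model solved as a linear program: a unit's efficiency is the smallest factor theta by which its inputs can be scaled down while some nonnegative combination of observed units still matches its outputs. The three units and their data are invented, not the US electric power data used in the paper.

    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[2.0], [4.0], [3.0]])  # inputs: one row per unit
    Y = np.array([[1.0], [2.0], [1.0]])  # outputs: one row per unit

    def dea_efficiency(o, X, Y):
        # Variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
        n, m = X.shape
        s = Y.shape[1]
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j X[j,i] - theta * X[o,i] <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # Outputs: -sum_j lambda_j Y[j,r] <= -Y[o,r]
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (n + 1))
        return res.x[0]

    for o in range(3):
        print(f"unit {o}: efficiency {dea_efficiency(o, X, Y):.3f}")
    # Units 0 and 1 are efficient (1.000); unit 2 scores 0.667.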

    Multilevel semantic analysis and problem-solving in the flight domain

    A computer-based cockpit system capable of assisting the pilot in such important tasks as monitoring, diagnosis, and trend analysis was developed. The system is endowed with a knowledge base and organized so that it enhances the pilot's control over the aircraft while simultaneously reducing his workload.

    A specialised constraint approach for stable matching problems

    Constraint programming is a generalised framework designed to solve combinatorial problems. The framework is made up of a set of predefined independent components and generalised algorithms, a very versatile structure which allows a variety of rich combinatorial problems to be represented and solved relatively easily.

    Stable matching problems consist of a set of participants wishing to be matched into pairs or groups in a stable manner. A matching is said to be stable if there is no pair or group of participants that would rather make a private arrangement to improve their situation and thus undermine the matching. There are many important "real life" applications of stable matching problems across the world. These include the Hospitals/Residents problem, in which a set of graduating medical students, known as residents, must be assigned to hospital posts, and the assignment of children to schools, which some authorities also treat as a stable matching problem. A number of classical stable matching problems have efficient specialised algorithmic solutions. Constraint programming solutions to stable matching problems have been investigated in the past, and these solutions match the theoretically optimal time complexities of the algorithmic solutions. However, empirical evidence has shown that in practice the constraint solutions run significantly slower than the specialised algorithms, and their memory requirements prevent them from solving problems which the specialised algorithms can solve in a fraction of a second.

    My contribution investigates modelling stable matching problems as specialised constraints. The motivation behind this approach is to find solutions that retain the versatility of constraint models while significantly reducing the performance gap between constraint and specialised algorithmic solutions. To this end, specialised constraint solutions have been developed for the stable marriage problem and the Hospitals/Residents problem. Empirical evidence shows that these solutions can solve significantly larger problems than previously published constraint solutions, and that on these larger instances they come within a factor of four of the time required by the algorithmic solutions. It is also shown that, through further specialisation, these constraint solutions can be made to run significantly faster, although these improvements come at the cost of versatility.

    As a demonstration of that versatility, it is shown that, by adding simple side constraints, richer problems can easily be modelled. These richer problems add extra criteria and/or an optimisation requirement to the original stable matching problems; many have been proven NP-hard, and some have no known algorithmic solutions. Results from empirical studies show that the specialised constraint models are feasible solutions to these richer problems, and they also provide some insight into the structure of problems which have had little or no previous study.
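
    For context, the classical specialised algorithm against which the constraint models for the stable marriage problem are measured is the Gale-Shapley proposal algorithm. A minimal sketch with invented preference lists follows; this is the algorithmic baseline, not the thesis's specialised constraint encoding.

    def gale_shapley(men_prefs, women_prefs):
        # Returns a man-optimal stable matching as a dict: woman -> man.
        free = list(men_prefs)                  # men not yet matched
        next_choice = {m: 0 for m in men_prefs}
        # rank[w][m] = position of m in w's list (lower = preferred).
        rank = {w: {m: i for i, m in enumerate(ps)}
                for w, ps in women_prefs.items()}
        engaged = {}                            # woman -> man
        while free:
            m = free.pop()
            w = men_prefs[m][next_choice[m]]    # m's best untried woman
            next_choice[m] += 1
            if w not in engaged:
                engaged[w] = m
            elif rank[w][m] < rank[w][engaged[w]]:
                free.append(engaged[w])         # w trades up; old partner freed
                engaged[w] = m
            else:
                free.append(m)                  # w rejects m
        return engaged

    men = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
    women = {"w1": ["m2", "m1"], "w2": ["m1", "m2"]}
    print(gale_shapley(men, women))  # {'w1': 'm2', 'w2': 'm1'}: no blocking pair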

    Is utility transferable? A revealed preference analysis

    We provide a revealed preference analysis of the transferable utility hypothesis, which is widely used in economic models. First, we establish revealed preference conditions that must be satisfied for observed group behavior to be consistent with Pareto efficiency under transferable utility. Next, we show that these conditions are easily testable by means of integer programming methods. The tests are entirely nonparametric, which makes them robust with respect to specification errors. Finally, we demonstrate the practical usefulness of our conditions by means of an application to Spanish consumption data. To the best of our knowledge, this is the first empirical test of the transferable utility hypothesis.
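
    The paper's transferable-utility conditions themselves are checked with integer programming. As a simpler illustration of the revealed-preference machinery involved, the sketch below tests the classical GARP consistency condition -- a standard check swapped in for exposition, not the paper's TU test -- on invented price and quantity data.

    import numpy as np

    P = np.array([[1.0, 2.0], [2.0, 1.0]])  # prices: one row per observation
    Q = np.array([[3.0, 1.0], [1.0, 3.0]])  # chosen bundles

    def satisfies_garp(P, Q):
        n = len(P)
        cost = P @ Q.T                        # cost[t, s] = p_t . q_s
        # Direct revealed preference: t R0 s iff p_t.q_t >= p_t.q_s.
        R = cost.diagonal()[:, None] >= cost
        # Transitive closure (Warshall).
        for k in range(n):
            R = R | (R[:, [k]] & R[[k], :])
        # Violation: t revealed preferred to s while s's bundle is strictly
        # more expensive than t's at s's prices (p_s.q_s > p_s.q_t).
        strict = cost.diagonal()[:, None] > cost
        return not np.any(R & strict.T)

    print(satisfies_garp(P, Q))  # True: these two observations are consistent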

    Putting Iterative Proportional Fitting on the researcher’s desk

    ‘Iterative Proportional Fitting’ (IPF) is a mathematical procedure originally developed to combine the information from two or more datasets. IPF is a well-established technique, and the theoretical and practical considerations behind the method have been thoroughly explored and reported. In this paper the theory of IPF is investigated: a mathematical definition of the procedure and a review of the relevant literature are given. So that IPF can be readily accessible to researchers, the procedure has been automated in Visual Basic, and a description of the program and a ‘User Guide’ are provided. IPF is employed in various disciplines but has been particularly useful in census-related analysis, providing updated population statistics and estimates of individual-level attribute characteristics. To illustrate the practical application of IPF, various case studies are described. In the future, demand for individual-level data is likely to increase, and it is believed that the IPF procedure and the Visual Basic program have the potential to facilitate research in geography and other disciplines.
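
    The core of the procedure is simple to state for a two-dimensional table: alternately rescale the rows and then the columns of a seed table until its margins match the target totals. The paper's implementation is in Visual Basic, so the Python sketch below, with invented seed and targets, is purely illustrative.

    import numpy as np

    def ipf(seed, row_targets, col_targets, tol=1e-9, max_iter=1000):
        table = seed.astype(float).copy()
        for _ in range(max_iter):
            table *= (row_targets / table.sum(axis=1))[:, None]  # fit rows
            table *= col_targets / table.sum(axis=0)             # fit columns
            if np.allclose(table.sum(axis=1), row_targets, atol=tol):
                break
        return table

    seed = np.array([[1.0, 2.0], [3.0, 4.0]])
    fitted = ipf(seed, row_targets=np.array([30.0, 70.0]),
                 col_targets=np.array([40.0, 60.0]))
    print(fitted)                                  # cells keep the seed's structure
    print(fitted.sum(axis=1), fitted.sum(axis=0))  # margins match the targets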

    Explanation-Based Large Neighborhood Search

    One of the most well-known and widely used local search techniques for solving optimization problems in Constraint Programming is the Large Neighborhood Search (LNS) algorithm. Such a technique is, by nature, very flexible and can easily be integrated within standard backtracking procedures. One of its drawbacks is that the relaxation process is quite often problem dependent. Several works have been dedicated to overcoming this issue through problem-independent parameters; nevertheless, such generic approaches need to be carefully parameterized at the instance level. In this paper, we demonstrate that the issue of finding a problem-independent neighborhood generation technique for LNS can be addressed using explanation-based neighborhoods. An explanation is a subset of constraints and decisions which justifies a solver event such as a domain modification or a conflict. We evaluate our proposal on a set of optimization problems. We show that our approach is at least competitive with, and sometimes better than, state-of-the-art algorithms, and that it can easily be combined with state-of-the-art neighborhoods. Such results pave the way to a new use of explanation-based approaches for improving search.
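
    For readers unfamiliar with LNS, the toy sketch below shows the generic relax-and-reoptimise loop: fix most of the incumbent solution, free a fragment, re-solve the fragment, and keep improvements. The random choice of fragment is precisely the problem-dependent ingredient the paper replaces with explanation-based neighborhoods, and the random sub-solver here stands in for a backtracking search.

    import random

    random.seed(0)
    target = [random.randrange(10) for _ in range(20)]  # toy hidden optimum
    cost = lambda sol: sum(abs(a - b) for a, b in zip(sol, target))

    def relax(sol, k=5):
        # Keep the incumbent but mark k random positions as free (None).
        freed = set(random.sample(range(len(sol)), k))
        return [None if i in freed else v for i, v in enumerate(sol)]

    def reoptimize(partial):
        # Toy sub-solver: try random values for the freed positions.
        return [random.randrange(10) if v is None else v for v in partial]

    def lns(initial, iterations=2000):
        best = initial
        for _ in range(iterations):
            candidate = reoptimize(relax(best))
            if cost(candidate) < cost(best):  # accept only improvements
                best = candidate
        return best

    print(cost(lns([0] * 20)))  # approaches 0 as iterations grow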