
    A Framework for the Flexible Integration of a Class of Decision Procedures into Theorem Provers

    The role of decision procedures is often essential in theorem proving. Decision procedures can reduce the search space of the heuristic components of a prover and increase its abilities. However, in some applications only a small number of conjectures fall within the scope of the available decision procedures, and some of these conjectures could, in an informal sense, fall ‘just outside’ that scope. In these situations a problem arises because lemmas have to be invoked or the decision procedure has to communicate with the heuristic component of the theorem prover. This problem is also related to the general problem of how to flexibly integrate decision procedures into heuristic theorem provers. In this paper we address such problems and describe a framework for the flexible integration of decision procedures into other proof methods. The proposed framework can be used in different theorem provers, for different theories and for different decision procedures, and new decision procedures can simply be ‘plugged in’ to the system. As an illustration, we describe an instantiation of this framework within the Clam proof-planning system, to which it is well suited, and we report on some results obtained with this implementation.
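
    A minimal sketch of the ‘plugged-in’ idea, assuming a hypothetical Python interface: each decision procedure reports whether a conjecture falls within its scope, and the prover's heuristic component takes over when none applies. The names DecisionProcedure, LinearArithmetic and try_decision_procedures are illustrative only and do not come from the paper or from Clam.

    # Hypothetical sketch of a pluggable decision-procedure interface; an
    # illustration of the idea, not the framework or the Clam instantiation.
    from typing import Optional, Protocol

    class DecisionProcedure(Protocol):
        def in_scope(self, conjecture: str) -> bool: ...  # is the goal decidable here?
        def decide(self, conjecture: str) -> bool: ...    # decide an in-scope goal

    class LinearArithmetic:
        def in_scope(self, conjecture: str) -> bool:
            # Placeholder syntactic test standing in for a real scope check.
            return all(op not in conjecture for op in ("*", "^"))

        def decide(self, conjecture: str) -> bool:
            return True  # placeholder for an actual decision algorithm

    def try_decision_procedures(conjecture: str, procedures: list) -> Optional[bool]:
        """Ask each plugged-in procedure in turn; return None if none applies,
        so the heuristic component can take over (e.g. by speculating lemmas)."""
        for dp in procedures:
            if dp.in_scope(conjecture):
                return dp.decide(conjecture)
        return None

    print(try_decision_procedures("x + 1 > x", [LinearArithmetic()]))  # True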

    A hyper-heuristic for adaptive scheduling in computational grids

    In this paper we present the design and implementation of a hyper-heuristic for efficiently scheduling independent jobs in computational grids. An efficient scheduling of jobs to grid resources depends on many parameters, among others the characteristics of the resources and jobs (such as computing capacity, consistency of computing, workload, etc.). Moreover, these characteristics change over time due to the dynamic nature of the grid environment, so the planning of jobs to resources should be done adaptively. Existing ad hoc scheduling methods (batch and immediate mode) have shown their efficacy for certain types of resource and job characteristics. However, as stand-alone methods, they are not able to produce the best planning of jobs to resources for different types of grid resources and job characteristics. In this work we have designed and implemented a hyper-heuristic that uses a set of ad hoc (immediate and batch mode) scheduling methods to schedule jobs to grid resources according to the grid and job characteristics. The hyper-heuristic is a high-level algorithm that examines the state and characteristics of the grid system (jobs and resources) and selects and applies the ad hoc method that yields the best planning of jobs. The resulting hyper-heuristic-based scheduler can thus be used to develop network-aware applications that need efficient planning of jobs to resources. The hyper-heuristic has been tested and evaluated in a dynamic setting through a prototype of a grid simulator. The experimental evaluation showed the usefulness of the hyper-heuristic for planning jobs to resources as compared to planning without knowledge of the resource and job characteristics.
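
    A rough illustration of the selection mechanism, under strongly simplified assumptions (jobs characterised only by length, resources only by speed and ready time); the names GridState, mct_immediate, min_min and HyperHeuristic are hypothetical and not from the authors' scheduler.

    # Toy sketch: a high-level selector that evaluates two classic ad hoc methods
    # (immediate-mode MCT and batch-mode Min-Min) on the current grid state and
    # reports the one yielding the better (lower) makespan.
    import random
    from dataclasses import dataclass

    @dataclass
    class GridState:
        job_lengths: list       # workload of each waiting job
        resource_speeds: list   # computing capacity of each resource
        resource_ready: list    # time at which each resource becomes free

    def mct_immediate(state):
        """Immediate mode: assign each job, in arrival order, to the resource
        giving the minimum completion time; return the resulting makespan."""
        ready = state.resource_ready[:]
        for length in state.job_lengths:
            ct = [r + length / s for r, s in zip(ready, state.resource_speeds)]
            best = min(range(len(ready)), key=lambda i: ct[i])
            ready[best] = ct[best]
        return max(ready)

    def min_min(state):
        """Batch mode Min-Min: repeatedly schedule the (job, resource) pair
        with the smallest completion time; return the resulting makespan."""
        ready, jobs = state.resource_ready[:], state.job_lengths[:]
        while jobs:
            best_job, best_res, best_ct = None, None, float("inf")
            for j, length in enumerate(jobs):
                for i, speed in enumerate(state.resource_speeds):
                    ct = ready[i] + length / speed
                    if ct < best_ct:
                        best_job, best_res, best_ct = j, i, ct
            ready[best_res] = best_ct
            jobs.pop(best_job)
        return max(ready)

    class HyperHeuristic:
        """Examine the current grid state, evaluate each low-level method on it
        and report the one yielding the best (lowest) makespan."""
        def __init__(self, methods):
            self.methods = methods

        def schedule(self, state):
            results = {name: m(state) for name, m in self.methods.items()}
            best = min(results, key=results.get)
            return best, results[best]

    random.seed(0)
    state = GridState(job_lengths=[random.uniform(1, 10) for _ in range(20)],
                      resource_speeds=[1.0, 2.0, 4.0],
                      resource_ready=[0.0, 0.0, 0.0])
    print(HyperHeuristic({"mct": mct_immediate, "min-min": min_min}).schedule(state))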

    Multi-objective model for optimizing railway infrastructure asset renewal

    This work was inspired by a real problem at the company Infraestruturas de Portugal, EP. A multi-objective model for managing railway infrastructure asset renewal is presented. The model aims to optimize three objectives while respecting operational constraints: levelling investment throughout multiple years, minimizing total cost, and minimizing work start postponements. Its output is an optimized intervention schedule. The model is based on a case study from a Portuguese infrastructure management company, which specified the objectives and constraints, and it reflects management practice on railway infrastructure. The results show that investment levelling greatly influences the other objectives and that total cost fluctuations may range from insignificant to important, depending on the condition of the infrastructure. The results structure is argued to be general and suggests a practical methodology for analysing trade-offs and selecting a solution for implementation.
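
    One possible way to formalise the trade-off between the three objectives is a weighted-sum scalarisation; the notation below is a hypothetical sketch, not the paper's exact model:

    \min_{x} \; \lambda_1 \sum_{t} \bigl| I_t(x) - \bar{I}(x) \bigr|
              + \lambda_2 \, C(x)
              + \lambda_3 \sum_{a} \bigl( s_a(x) - s_a^{\min} \bigr)

    Here x is an intervention schedule satisfying the operational constraints, I_t(x) is the investment planned for year t and \bar{I}(x) the mean annual investment (the first term levels investment across years), C(x) is the total renewal cost, and s_a(x) and s_a^{\min} are the scheduled and earliest admissible start years of work a (the last term penalises postponements); \lambda_1, \lambda_2, \lambda_3 are trade-off weights.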

    Autonomous 3D Exploration of Large Structures Using an UAV Equipped with a 2D LIDAR

    This paper addresses the challenge of exploring large, unknown, and unstructured industrial environments with an unmanned aerial vehicle (UAV). The resulting system combines well-known components and techniques with a new manoeuvre that uses a low-cost 2D laser to measure a 3D structure. Our approach combines frontier-based exploration, the Lazy Theta* path planner, and a flyby sampling manoeuvre to create a 3D map of large scenarios. One of the novelties of the system is that all the algorithms rely on the multi-resolution octomap for the world representation. We used a Hardware-in-the-Loop (HitL) simulation environment to collect accurate measurements of the capability of the open-source system to run online and on board the UAV in real time. Our approach is compared with different reference heuristics in this simulation environment and shows better performance with regard to the amount of explored space. With the proposed approach, the UAV is able to explore 93% of the search space in under 30 min, generating a path without repetition that adjusts to the occupied space and covers indoor locations, irregular structures, and suspended obstacles. Funded by the European Union: Marie Sklodowska-Curie 64215, MULTIDRONE (H2020-ICT-731667), and HYFLIERS (H2020-ICT-779411).
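
    A minimal sketch of the frontier-selection step, assuming a toy dict-based occupancy grid rather than an octomap; frontier_voxels and next_goal are hypothetical names illustrating the general frontier-based technique, not the authors' on-board code.

    # Toy frontier-based exploration step on a coarse 3D occupancy grid:
    # free voxels bordering unknown space are frontiers; the closest one becomes
    # the next goal (a planner such as Lazy Theta* would then plan a path to it).
    from itertools import product

    NEIGHBOURS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

    def frontier_voxels(grid, bounds):
        """Return free voxels with at least one unknown 6-neighbour inside bounds."""
        (xmin, xmax), (ymin, ymax), (zmin, zmax) = bounds
        frontiers = []
        for v, state in grid.items():
            if state != "free":
                continue
            for dx, dy, dz in NEIGHBOURS:
                n = (v[0] + dx, v[1] + dy, v[2] + dz)
                inside = xmin <= n[0] <= xmax and ymin <= n[1] <= ymax and zmin <= n[2] <= zmax
                if inside and n not in grid:  # not in the map => unknown
                    frontiers.append(v)
                    break
        return frontiers

    def next_goal(position, grid, bounds):
        """Pick the closest frontier (squared Euclidean distance) as the next goal."""
        frontiers = frontier_voxels(grid, bounds)
        if not frontiers:
            return None  # nothing left to explore
        return min(frontiers,
                   key=lambda f: sum((a - b) ** 2 for a, b in zip(f, position)))

    # Tiny toy map: a 3x3x1 free slab inside a 5x5x3 otherwise-unknown volume.
    grid = {(x, y, 1): "free" for x, y in product(range(1, 4), range(1, 4))}
    print(next_goal((0, 0, 0), grid, ((0, 4), (0, 4), (0, 2))))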

    New efficient algorithms for multiple change-point detection with kernels

    Several statistical approaches based on reproducing kernels have been proposed to detect abrupt changes arising in the full distribution of the observations, and not only in the mean or variance. Some of these approaches enjoy good statistical properties (oracle inequality, etc.). Nonetheless, they have a high computational cost both in terms of time and memory, which makes their application difficult even for small and medium sample sizes (n < 10^4). This computational issue is addressed by first describing a new efficient and exact algorithm for kernel multiple change-point detection with an improved worst-case complexity that is quadratic in time and linear in space, allowing medium-size signals (up to n ≈ 10^5) to be handled. Second, a faster approximation algorithm is described. It is based on a low-rank approximation to the Gram matrix and is linear in time and space. This approximation algorithm can be applied to large-scale signals (n ≥ 10^6). These exact and approximation algorithms have been implemented in R and C for various kernels. The computational and statistical performance of these new algorithms has been assessed through empirical experiments. The runtime of the new algorithms is observed to be faster than that of other considered procedures. Finally, simulations confirmed the higher statistical accuracy of kernel-based approaches in detecting changes that are not only in the mean. These simulations also illustrate the flexibility of kernel-based approaches for analysing complex biological profiles made of DNA copy number and allele B frequencies. An R package implementing the approach will be made available on GitHub.
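
    To make the underlying segmentation problem concrete, the following is a deliberately naive dynamic-programming sketch of kernel multiple change-point detection with a Gaussian kernel. It illustrates the exact criterion being optimised, not the improved quadratic-time/linear-space or low-rank algorithms of the paper, and the function names are hypothetical.

    # Naive kernel change-point detection by dynamic programming (small n only).
    import numpy as np

    def gaussian_gram(x, bandwidth=1.0):
        """Gram matrix K[s, t] = exp(-(x_s - x_t)^2 / (2 * bandwidth^2))."""
        diff = x[:, None] - x[None, :]
        return np.exp(-diff ** 2 / (2 * bandwidth ** 2))

    def segment_cost(K, i, j):
        """Kernel least-squares cost of the segment x_i, ..., x_{j-1} (half-open)."""
        block = K[i:j, i:j]
        return np.trace(block) - block.sum() / (j - i)

    def kernel_changepoints(x, n_bkps, bandwidth=1.0):
        """Exact DP over all segmentations with n_bkps change points.
        Roughly O(n^2) segment evaluations, each O(n^2) here: toy sizes only."""
        n = len(x)
        K = gaussian_gram(x, bandwidth)
        cost = np.full((n_bkps + 2, n + 1), np.inf)  # cost[k, j]: best cost of x[:j] in k segments
        path = np.zeros((n_bkps + 2, n + 1), dtype=int)
        cost[0, 0] = 0.0
        for k in range(1, n_bkps + 2):
            for j in range(k, n + 1):
                for i in range(k - 1, j):
                    c = cost[k - 1, i] + segment_cost(K, i, j)
                    if c < cost[k, j]:
                        cost[k, j], path[k, j] = c, i
        bkps, j = [], n          # backtrack the segment boundaries
        for k in range(n_bkps + 1, 0, -1):
            bkps.append(j)
            j = path[k, j]
        return sorted(bkps)[:-1]  # drop the trailing boundary at n

    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0, 1, 40), rng.normal(3, 1, 40)])
    print(kernel_changepoints(x, n_bkps=1))  # a single change expected near index 40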