A Generic library of problem-solving methods for scheduling applications
In this paper we describe a generic library of problem-solving methods (PSMs) for scheduling applications. Although some attempts have been made in the past at developing libraries of scheduling methods, these provide only limited coverage: in some cases they are specific to a particular scheduling domain; in others they simply implement a particular scheduling technique; in others still they fail to provide the required degree of depth and precision. Our library is based on a structured approach, whereby we first develop a scheduling task ontology and then construct a task-specific but domain-independent model of scheduling problem-solving, which generalises from specific approaches to scheduling problem-solving. Different PSMs are then constructed uniformly by specialising the generic model of scheduling problem-solving. Our library has been evaluated on a number of real-life and benchmark applications to demonstrate its generic and comprehensive nature.
The Epistemology of scheduling problems
Scheduling is a knowledge-intensive task spanning many activities in day-to-day life. It deals with the temporally-bound assignment of jobs to resources. Although scheduling has been extensively researched in the AI community for the past 30 years, efforts have primarily focused on specific applications, algorithms, or 'scheduling shells', and no comprehensive analysis exists of the nature of scheduling problems that provides a formal account of what scheduling is, independently of the way scheduling problems can be approached. Research on KBS development by reuse makes use of ontologies to provide knowledge-level specifications of reusable KBS components. In this paper we describe a task ontology which formally characterises the nature of scheduling problems, independently of particular application domains and independently of how the problems can be solved. Our results provide a comprehensive, domain-independent and formally specified reference model for scheduling applications. This can be used as the basis for further analyses of the class of scheduling problems and also as a concrete reusable resource to support knowledge acquisition and system development in scheduling applications.
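The abstract's core notion, the temporally-bound assignment of jobs to resources, can be sketched as a minimal data structure. The class and field names below are illustrative only, not those of the paper's ontology:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Job:
    name: str
    duration: int  # in abstract time units

@dataclass(frozen=True)
class Assignment:
    job: Job
    resource: str
    start: int  # start time of the job on the resource

def overlaps(a, b):
    """Two assignments conflict if they share a resource and overlap in time."""
    if a.resource != b.resource:
        return False
    return a.start < b.start + b.job.duration and b.start < a.start + a.job.duration

def is_valid_schedule(assignments):
    """A schedule is valid if no two of its assignments conflict."""
    return all(not overlaps(a, b)
               for i, a in enumerate(assignments)
               for b in assignments[i + 1:])

jobs = [Job("grind", 2), Job("polish", 3)]
schedule = [Assignment(jobs[0], "machine-1", 0),
            Assignment(jobs[1], "machine-1", 2)]
```

Here `is_valid_schedule(schedule)` is `True`: the two jobs run back to back on the same machine without overlapping.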
A machine learning approach for efficient uncertainty quantification using multiscale methods
Several multiscale methods account for sub-grid-scale features using coarse-scale basis functions. For example, in the Multiscale Finite Volume method the coarse-scale basis functions are obtained by solving a set of local problems over dual-grid cells. We introduce a data-driven approach for the estimation of these coarse-scale basis functions. Specifically, we employ a neural network predictor, fitted on a set of solution samples, which learns to generate subsequent basis functions at a lower computational cost than solving the local problems. The computational advantage of this approach is realized in uncertainty quantification tasks, where a large number of realizations have to be evaluated. We attribute the ability to learn these basis functions to the modularity of the local problems and the redundancy of the permeability patches between samples. The proposed method is evaluated on elliptic problems, yielding very promising results. (Comment: Journal of Computational Physics, 2017.)
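As a rough illustration of the idea of replacing local solves with a learned predictor, the toy sketch below fits a small one-hidden-layer network that maps flattened permeability patches to basis-function values. The dimensions, synthetic data, and architecture are all invented here and are not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: each sample is a flattened permeability patch (assumed 5x5 = 25
# values) and the target is the basis function sampled on the same patch.
n_samples, n_in, n_hidden, n_out = 200, 25, 32, 25
X = rng.normal(size=(n_samples, n_in))
# Hypothetical ground truth: basis values as a smooth function of the patch.
W_true = rng.normal(size=(n_in, n_out)) / n_in
Y = np.tanh(X @ W_true)

# One-hidden-layer network trained by plain gradient descent on squared error.
W1 = rng.normal(size=(n_in, n_hidden)) * 0.1
W2 = rng.normal(size=(n_hidden, n_out)) * 0.1
lr = 0.05
for _ in range(500):
    H = np.tanh(X @ W1)          # hidden activations
    pred = H @ W2                # predicted basis-function values
    err = pred - Y
    # Backpropagate the squared-error gradient.
    gW2 = H.T @ err / n_samples
    gH = err @ W2.T * (1 - H**2)
    gW1 = X.T @ gH / n_samples
    W1 -= lr * gW1
    W2 -= lr * gW2

mse = float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2))
print(f"training MSE: {mse:.4f}")
```

Once fitted, evaluating the network on a new patch is a handful of matrix products, which is where the claimed advantage over repeatedly solving local problems would come from in a many-realization setting.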
An Ontological formalization of the planning task
In this paper we propose a generic task ontology which formalizes the space of planning problems. Although planning is one of the oldest research areas in Artificial Intelligence and attempts have been made in the past at developing task ontologies for planning, these formalizations suffer from serious limitations: they do not exhibit the required level of formalization and precision, and they usually fail to include some of the key concepts required for specifying planning problems. In contrast with earlier proposals, our task ontology formalizes the nature of the planning task independently of any planning paradigm, specific domains, or applications, and provides a fine-grained, precise and comprehensive characterization of the space of planning problems. Finally, in addition to producing a formal specification, we have also operationalized the ontology into a set of executable definitions, which provide a concrete reusable resource for knowledge acquisition and system development in planning applications.
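The space of planning problems such an ontology characterises can be illustrated with a minimal STRIPS-style representation: an initial state, actions with preconditions and effects, and a goal. The names below are invented for illustration and are not drawn from the paper's ontology:

```python
from typing import NamedTuple, FrozenSet

class Action(NamedTuple):
    name: str
    pre: FrozenSet[str]     # facts that must hold for the action to apply
    add: FrozenSet[str]     # facts the action makes true
    delete: FrozenSet[str]  # facts the action makes false

def applicable(state, action):
    return action.pre <= state

def apply_action(state, action):
    return (state - action.delete) | action.add

# A one-action example: picking a block up from a table.
pick = Action("pick",
              pre=frozenset({"on-table"}),
              add=frozenset({"holding"}),
              delete=frozenset({"on-table"}))
initial = frozenset({"on-table"})
goal = frozenset({"holding"})
assert applicable(initial, pick)
assert goal <= apply_action(initial, pick)
```

A planning problem is then the triple (initial state, available actions, goal), and a solution is any action sequence whose successive application reaches a goal-satisfying state.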
Neural ODEs with stochastic vector field mixtures
It was recently shown that neural ordinary differential equation models cannot solve fundamental and seemingly straightforward tasks even with high-capacity vector field representations. This paper introduces two further fundamental tasks that baseline methods cannot solve, and proposes mixtures of stochastic vector fields as a model class capable of solving these essential problems. Dynamic vector field selection is of critical importance for our model, and our approach is to propagate component uncertainty over the integration interval with a technique based on forward filtering. We also formalise several loss functions that encourage desirable properties of the trajectory paths; of particular interest are those that directly encourage fewer expected function evaluations. Experimentally, we demonstrate that our model class is capable of capturing the natural dynamics of human behaviour, a notoriously volatile application area. Baseline approaches cannot adequately model this problem.
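A bare-bones sketch of a (deterministic) mixture of two vector fields integrated with explicit Euler steps may help fix ideas. The fields, the fixed mixture weight, and the step size are all invented here, and the paper's stochastic component selection and forward-filtering machinery are omitted:

```python
import numpy as np

def field_a(x):
    """Rotational component: spins trajectories about the origin."""
    return np.array([-x[1], x[0]])

def field_b(x):
    """Contractive component: pulls trajectories toward the origin."""
    return -0.5 * x

def mixture_step(x, w, dt):
    """One explicit Euler step under a convex mixture of the two fields.
    Here `w` is a fixed mixture weight; in the paper the component choice is
    stochastic and its uncertainty is propagated by forward filtering."""
    return x + dt * (w * field_a(x) + (1.0 - w) * field_b(x))

x = np.array([1.0, 0.0])
for _ in range(100):  # integrate from t=0 to t=1
    x = mixture_step(x, w=0.7, dt=0.01)
```

Because the contractive component shrinks the state while the rotation preserves its norm (up to Euler discretisation error), the trajectory spirals inward; composing qualitatively different fields in this way is what gives a mixture model its expressiveness over a single vector field.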
E-BioFlow: Different Perspectives on Scientific Workflows
We introduce a new type of workflow design system called e-BioFlow and illustrate it by means of a simple sequence alignment workflow. E-BioFlow, intended to model advanced scientific workflows, enables the user to model a workflow from three different but strongly coupled perspectives: the control flow perspective, the data flow perspective, and the resource perspective. All three perspectives are of equal importance, but workflow designers from different domains prefer different perspectives as entry points for their design, and a single workflow designer may prefer different perspectives in different stages of workflow design. Each perspective provides its own type of information, visualisation and support for validation. Combining these three perspectives in a single application provides a new and flexible way of modelling workflows.
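A toy sketch of how the three perspectives might sit side by side in one workflow record (task and resource names invented, not e-BioFlow's actual model):

```python
# Control-flow, data-flow, and resource views of one toy workflow.
workflow = {
    "tasks": ["fetch", "align", "report"],
    "control_flow": [("fetch", "align"), ("align", "report")],      # execution order
    "data_flow": [("fetch.sequences", "align.input"),
                  ("align.result", "report.data")],                 # data dependencies
    "resources": {"align": "alignment-service", "report": "local"}, # who runs what
}

def downstream(task, edges):
    """All tasks reachable from `task` along control-flow edges."""
    found, frontier = set(), {task}
    while frontier:
        frontier = {b for (a, b) in edges if a in frontier} - found
        found |= frontier
    return found
```

Keeping the three views in one record, rather than deriving two of them from the third, is what lets a designer enter from whichever perspective suits their domain and validate each view separately.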
Individual data analysis and Unified Theories of Cognition: A methodological proposal
Unified theories regularly appear in psychology. They also regularly fail to fulfil all of their goals. Newell (1990) called for their revival, using computer modelling as a way to avoid the pitfalls of previous attempts. His call, embodied in the Soar project, has so far failed to produce the breakthrough it promised. One of the reasons for the lack of success of Newell's approach is that the methodology commonly used in psychology, based on controlling potentially confounding variables by using group data, is not the best way forward for developing unified theories of cognition. Instead, we propose an approach in which (a) the problems related to group averages are alleviated by analysing subjects individually; (b) there is a close interaction between theory building and experimentation; and (c) computer technology is used to routinely test versions of the theory on a wide range of data. The advantages of this approach heavily outweigh the disadvantages.
A comparison of techniques for learning and using mathematics and a study of their relationship to logical principles
Various techniques exist for learning mathematical concepts, such as experimentation and exploration, and likewise for using mathematics, such as modelling and simulation. For a clear application of such techniques in mathematics education, there should be a clear distinction between them.
A recently developed theory of fuzzy concepts can be applied to analyse the four techniques mentioned. For each of the four techniques one can ask how it relates to deduction, induction and abduction as logical principles. An empirical study was conducted with students aged 12-13, aimed at examining these three reasoning processes.