The Influence of k-Dependence on the Complexity of Planning
A planning problem is k-dependent if each action has at most k pre-conditions on variables unaffected by the action. This concept is well-founded since k is a constant for all but a few of the standard planning domains, and is known to have implications for tractability. In this paper, we present several new complexity results for P(k), the class of k-dependent planning problems with binary variables and polytree causal graphs. The problem of plan generation for P(k) is equivalent to determining how many times each variable can change. Using this fact, we present a polytime plan generation algorithm for P(2) and P(3). For constant k > 3, we introduce and use the notion of a cover to find conditions under which plan generation for P(k) is polynomial.
Working with Complexity: a Participatory Systems-Based Process for Planning and Evaluating Rural Water, Sanitation and Hygiene Services
Individuals working within the water, sanitation and hygiene for development (WASH) sector grapple daily with complex technical, social, economic, and environmental issues that often produce unexpected outcomes that are difficult to plan for and resolve. Here we propose a method we call the ‘Participatory Systems-based Planning and Evaluation Process’ (PS-PEP), which combines structural factor analysis and collaborative modeling to guide teams of practitioners, researchers, and other stakeholders through a process of modeling and interpreting how factors systemically and dynamically influence sustained access to WASH services. The use and utility of the PS-PEP is demonstrated with a regional team of water committee members in the municipality of Jalapa, Nicaragua, who participated in a two-day modeling workshop. Water committee members left the workshop with a clear set of action items for water service planning and management in Jalapa, informed by the analysis of systemic influences and dependencies between key service factors. In so doing, we find that the PS-PEP provides a powerful tool for WASH project or program planning, evaluation, management and policy, the continued use of which could offer unprecedented growth in understanding of WASH service complexity for a broad spectrum of service contexts.
The organisation of sociality: a manifesto for a new science of multi-agent systems
In this paper, we pose and motivate a challenge, namely the need for a new science of multi-agent systems. We propose that this new science should be grounded theoretically on a richer conception of sociality, and methodologically on the extensive use of computational modelling for real-world applications and social simulations. Here, the steps we set forth towards meeting that challenge are mainly theoretical. In this respect, we provide a new model of multi-agent systems that reflects a fully explicated conception of cognition, both at the individual and the collective level. Finally, the mechanisms and principles underpinning the model will be examined with particular emphasis on the contributions provided by contemporary organisation theory.
Pedestrian demand modelling of large cities: an applied example from London
This paper introduces a methodology for the development of city-wide pedestrian demand models and shows its application to London. The approach used for modelling is Multiple Regression Analysis of independent variables against the dependent variable of observed pedestrian flows. The test samples were from manual observation studies of average total pedestrian flow per hour on 237 sample sites. The model will provide predicted flow values for all 7,526 street segments in the 25 square kilometres of Central London. It has been independently validated by Transport for London and is being tested against further observation data. The longer-term aim is to extend the model to the entire Greater London area and to incorporate additional policy levers for use as a transport planning and evaluation tool.
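The regression setup described above can be sketched as ordinary least squares on a design matrix of street-level predictors. A minimal sketch follows; the predictor variables, coefficients, and data below are synthetic stand-ins chosen for illustration, not the variables or fitted values of the London model.

```python
import numpy as np

# Illustrative only: synthetic predictors standing in for the model's
# independent variables; the London model's actual variables are not
# specified in the abstract.
rng = np.random.default_rng(0)
n_sites = 237  # number of manually observed sample sites
X = rng.uniform(size=(n_sites, 3))           # predictor matrix
X = np.column_stack([np.ones(n_sites), X])   # prepend an intercept column
true_beta = np.array([50.0, 120.0, -30.0, 80.0])  # invented "true" effects
flows = X @ true_beta + rng.normal(scale=10.0, size=n_sites)  # noisy flows/hour

# Ordinary least squares: beta minimises ||X @ beta - flows||^2
beta, *_ = np.linalg.lstsq(X, flows, rcond=None)

# Predict the flow for a previously unobserved street segment
new_segment = np.array([1.0, 0.4, 0.2, 0.7])
predicted_flow = new_segment @ beta
```

With 237 observations and modest noise, the fitted coefficients recover the generating ones closely, which is the sense in which such a model can extrapolate from sampled sites to all 7,526 segments.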
Influence-Optimistic Local Values for Multiagent Planning --- Extended Version
Recent years have seen the development of methods for multiagent planning
under uncertainty that scale to tens or even hundreds of agents. However, most
of these methods either make restrictive assumptions on the problem domain, or
provide approximate solutions without any guarantees on quality. Methods in the
former category typically build on heuristic search using upper bounds on the
value function. Unfortunately, no techniques exist to compute such upper bounds
for problems with non-factored value functions. To allow for meaningful
benchmarking through measurable quality guarantees on a very general class of
problems, this paper introduces a family of influence-optimistic upper bounds
for factored decentralized partially observable Markov decision processes
(Dec-POMDPs) that do not have factored value functions. Intuitively, we derive
bounds on very large multiagent planning problems by subdividing them into
sub-problems and, for each of these sub-problems, making optimistic
assumptions with respect to the influence exerted by the rest of the system.
We numerically compare the different upper bounds and demonstrate how we can
achieve a non-trivial guarantee that a heuristic solution for problems with
hundreds of agents is close to optimal. Furthermore, we provide evidence that
the upper bounds may improve the effectiveness of heuristic influence search,
and discuss further potential applications to multiagent planning.
Comment: Long version of IJCAI 2015 paper (and extended abstract at AAMAS 2015).
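The core decomposition idea, bounding a large problem by summing each sub-problem's value under its most favourable external influence, can be sketched as below. The sub-problems, candidate values, and heuristic value are invented for illustration and do not reproduce the paper's Dec-POMDP machinery.

```python
# Hypothetical sketch: each sub-problem reports the best value it could
# achieve under each of a few assumed influences from the rest of the system.
# Taking the maximum per sub-problem (optimism) and summing yields an upper
# bound on the achievable joint value.

def influence_optimistic_upper_bound(subproblem_values):
    """subproblem_values: one list of candidate values per sub-problem,
    each value computed under a different assumed external influence."""
    return sum(max(values) for values in subproblem_values)

# Three invented sub-problems, evaluated under a few assumed influences each.
candidates = [[4.0, 6.5, 5.2], [10.0, 9.1], [3.3, 2.8, 3.9]]
upper = influence_optimistic_upper_bound(candidates)  # 6.5 + 10.0 + 3.9 = 20.4

# A heuristic joint solution can then be certified: if it achieves 19.5,
# it is provably within (20.4 - 19.5) / 20.4 of optimal.
heuristic_value = 19.5
gap = (upper - heuristic_value) / upper
```

The bound is valid because no coordinated joint policy can make every sub-problem do better than its own best case; this is what allows the non-trivial guarantees mentioned above for problems with hundreds of agents.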
Detailed analysis of the cell-inactivation mechanism by accelerated protons and light ions
Published survival data for V79 cells irradiated by monoenergetic protons,
helium-3, carbon, and oxygen ions and for CHO cells irradiated by carbon ions
have been analyzed using the probabilistic two-stage model of cell
inactivation. Three different classes of DNA damages formed by traversing
particles have been distinguished, namely severe single-track damages which
might lead to cell inactivation directly, less severe damages where cell
inactivation is caused by their combinations, and damages of negligible
severity that can be repaired easily. Probabilities of single ions to form
these damages have been assessed in dependence on their linear energy transfer
(LET) values.
Damage induction probabilities increase with atomic number and LET. While
combined damages play a crucial role at lower LET values, single-track damages
dominate in high-LET regions. The yields of single-track lethal damages for
protons have been compared with the Monte Carlo estimates of complex DNA
lesions, indicating that lethal events correlate well with complex DNA
double-strand breaks. The decrease in the single-track damage probability for
protons of LET above approx. 30 keV/μm, suggested by limited experimental
evidence, is discussed, together with the consequent differences in the
mechanisms of biological effects between protons and heavier ions. Applications
of the results in hadrontherapy treatment planning are outlined.
Comment: submitted to Physics in Medicine and Biology.
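The single-track component of such models has a simple numerical core: if particle traversals through a cell nucleus are Poisson-distributed with mean λ and each traversal independently produces a lethal lesion with probability p, the surviving fraction is exp(-λp). The sketch below illustrates only this textbook relationship; the probabilities used are invented, not fitted values from the analysis above.

```python
import math

def survival_single_track(mean_traversals, p_lethal):
    """Surviving fraction when a Poisson(mean_traversals) number of particle
    traversals each independently inactivates the cell with probability
    p_lethal. Survival requires zero lethal events, giving exp(-lambda * p)."""
    return math.exp(-mean_traversals * p_lethal)

# Illustrative only: a higher-LET ion carries a larger per-track lethal
# probability, steepening the survival curve at the same mean traversal count.
s_low = survival_single_track(10.0, 0.05)   # low-LET-like: ~0.61
s_high = survival_single_track(10.0, 0.30)  # high-LET-like: ~0.05
```

This is why, as noted above, single-track damages dominate in high-LET regions: the per-traversal lethal probability rises with LET, so fewer traversals suffice for inactivation.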
Taming Numbers and Durations in the Model Checking Integrated Planning System
The Model Checking Integrated Planning System (MIPS) is a temporal least
commitment heuristic search planner based on a flexible object-oriented
workbench architecture. Its design clearly separates explicit and symbolic
directed exploration algorithms from the set of on-line and off-line computed
estimates and associated data structures. MIPS has shown distinguished
performance in the last two international planning competitions. In the last
event the description language was extended from pure propositional planning to
include numerical state variables, action durations, and plan quality objective
functions. Plans were no longer sequences of actions but time-stamped
schedules. As a participant of the fully automated track of the competition,
MIPS has proven to be a general system; in each track and every benchmark
domain it efficiently computed plans of remarkable quality. This article
introduces and analyzes the most important algorithmic novelties that were
necessary to tackle the new layers of expressiveness in the benchmark problems
and to achieve a high level of performance. The extensions include critical
path analysis of sequentially generated plans to generate corresponding optimal
parallel plans. The linear time algorithm to compute the parallel plan bypasses
known NP hardness results for partial ordering by scheduling plans with respect
to the set of actions and the imposed precedence relations. The efficiency of
this algorithm also allows us to improve the exploration guidance: for each
encountered planning state the corresponding approximate sequential plan is
scheduled. One major strength of MIPS is its static analysis phase that grounds
and simplifies parameterized predicates, functions and operators, that infers
knowledge to minimize the state description length, and that detects domain
object symmetries. The latter aspect is analyzed in detail. MIPS has been
developed to serve as a complete and optimal state space planner, with
admissible estimates, exploration engines and branching cuts. In the
competition version, however, certain performance compromises had to be made,
including floating-point arithmetic, weighted heuristic search exploration
according to an inadmissible estimate, and parameterized optimization
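The critical-path scheduling step described above, turning a sequentially generated plan into a parallel schedule given action durations and precedence constraints, can be sketched as a single earliest-start pass over the plan in its (already topological) sequential order. The actions, durations, and precedence edges below are invented examples, not from MIPS or its benchmark domains.

```python
# Hypothetical plan: action durations and precedence edges, where an edge
# p -> a means action a may only start after p has finished.
durations = {"load": 2.0, "drive": 5.0, "fuel": 1.0, "unload": 2.0}
precedence = {"load": [], "fuel": [], "drive": ["load", "fuel"], "unload": ["drive"]}
sequential_order = ["load", "fuel", "drive", "unload"]  # topological order

# One linear pass: each action starts as soon as all its predecessors finish.
start = {}
for action in sequential_order:
    start[action] = max(
        (start[p] + durations[p] for p in precedence[action]), default=0.0
    )

makespan = max(start[a] + durations[a] for a in durations)
# "load" and "fuel" run in parallel; the schedule finishes at 2 + 5 + 2 = 9,
# shorter than the sequential plan's 2 + 1 + 5 + 2 = 10.
```

Because the precedence relations are already fixed by the sequential plan, this pass is linear in the number of actions and edges, which is how the NP-hardness of general partial-order scheduling is bypassed.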
- …