    On the computational complexity of temporal projection and some related problems

    Get PDF
    One kind of temporal reasoning is temporal projection - the computation of the consequences of a set of events. This problem is related to a number of other temporal reasoning tasks such as story understanding, plan validation, and planning. We show that one particularly simple case of temporal projection on partially ordered events turns out to be harder than previously conjectured. However, given the restrictions of this problem, planning and story understanding are easy. Additionally, we show that plan validation, one of the intended applications of temporal projection, is tractable for an even larger class of plans. The incomplete decision procedure for the temporal projection problem that has been proposed by other authors, however, fails to be complete in the case where we have shown plan validation to be tractable.
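
    To make the term concrete, here is a toy sketch (our illustration, not taken from the paper) of temporal projection in the simplest, totally ordered case; the paper's hardness results concern the subtler case of partially ordered events.

```python
# Toy sketch of temporal projection for a totally ordered sequence of events
# (illustration only, not the paper's formalism): starting from an initial
# state, apply each event's effects whenever its preconditions hold.

def project(initial_state, events):
    """initial_state: set of facts; events: list of (preconds, adds, deletes)."""
    state = set(initial_state)
    for preconds, adds, deletes in events:
        if preconds <= state:              # event is applicable in this state
            state = (state - deletes) | adds
    return state

# Hypothetical example: open a door, then walk through it.
events = [
    ({"door_closed"}, {"door_open"}, {"door_closed"}),
    ({"door_open"},   {"inside"},    set()),
]
print(project({"door_closed"}, events))    # -> {'door_open', 'inside'}
```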

    Refining complexity analyses in planning by exploiting the exponential time hypothesis

    Get PDF
    The use of computational complexity in planning, and in AI in general, has always been a disputed topic. A major problem with ordinary worst-case analyses is that they do not provide any quantitative information: they do not tell us much about the running time of concrete algorithms, nor do they tell us much about the running time of optimal algorithms. We address problems like this by presenting results based on the exponential time hypothesis (ETH), which is a widely accepted hypothesis concerning the time complexity of 3-SAT. By using this approach, we provide, for instance, almost matching upper and lower bounds on the time complexity of propositional planning. Funding Agencies: National Graduate School in Computer Science (CUGS), Sweden; Swedish Research Council (VR) [621-2014-4086].
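
    For reference, a standard formulation of the exponential time hypothesis is sketched below; this is common background knowledge, not a quotation from the paper.

```latex
% Standard formulation of the exponential time hypothesis (ETH);
% background knowledge, not quoted from the paper.
% Let $s_3 = \inf\{\delta : \text{3-SAT with } n \text{ variables is solvable in time } O(2^{\delta n})\}$.
\[
  \textbf{ETH:}\quad s_3 > 0,
  \qquad \text{equivalently, 3-SAT cannot be solved in time } 2^{o(n)}.
\]
```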

    Controversial significance of early S100B levels after cardiac surgery

    Get PDF
    BACKGROUND: The brain-derived protein S100B has been shown to be a useful marker of brain injury of different etiologies. Cognitive dysfunction after cardiac surgery using cardiopulmonary bypass has been reported to occur in up to 70% of patients. In this study we tried to evaluate S100B as a marker for cognitive dysfunction after coronary bypass surgery with cardiopulmonary bypass in a model where the inflow of S100B from shed mediastinal blood was corrected for. METHODS: 56 patients scheduled for coronary artery bypass grafting underwent prospective neuropsychological testing. The test scores were standardized and an impairment index was constructed. S100B was sampled at the end of surgery, hourly for the first 6 hours, and then 8, 10, 15, 24 and 48 hours after surgery. None of the patients received autotransfusion. RESULTS: In simple linear analysis, no significant relation was found between S100B levels and neuropsychological outcome. In a backwards stepwise regression analysis, three variables - S100B levels at the end of cardiopulmonary bypass, S100B levels 1 hour later, and the age of the patients - were found to explain part of the neuropsychological deterioration (r = 0.49, p < 0.005). CONCLUSIONS: In this study we found that S100B levels 1 hour after surgery seem to be the most informative. Our attempt to control for the increased levels of S100B caused by contamination from the surgical field did not yield different results. We conclude that the clinical value of S100B as a predictive measurement of postoperative cognitive dysfunction after cardiac surgery is limited.

    Some Fixed Parameter Tractability Results for Planning with Non-Acyclic Domain-Transition Graphs

    No full text
    Bäckström studied the parameterised complexity of planning when the domain-transition graphs (DTGs) are acyclic. He used the parameters d (domain size), k (number of paths in the DTGs) and w (treewidth of the causal graph), and showed that planning is fixed-parameter tractable (fpt) in these parameters, and fpt in only the parameter k if the causal graph is a polytree. We continue this work by considering some additional cases of non-acyclic DTGs. In particular, we consider the case where each strongly connected component (SCC) in a DTG must be a simple cycle, and we show that planning is fpt for this case if the causal graph is a polytree. This is done by first preprocessing the instance to construct an equivalent abstraction and then applying Bäckström's technique to this abstraction. We use the parameters d and k, reinterpreting k as the number of paths in the condensation of a DTG, together with two new parameters: c (the number of contracted cycles along a path) and pmax (an upper bound for walking around cycles, when not unbounded).
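
    The preprocessing idea builds on the standard notion of the condensation of a graph. Below is a minimal, self-contained sketch (our illustration, not the paper's abstraction construction) that contracts every SCC of a DTG into a single node, yielding the acyclic condensation on which an acyclic-DTG technique could then be applied.

```python
# Sketch: contract every strongly connected component (SCC) of a directed
# graph (e.g. a domain-transition graph) into one node, producing its
# acyclic condensation.  Illustration only; not the paper's algorithm.
from collections import defaultdict

def strongly_connected_components(nodes, edges):
    """Kosaraju's algorithm: returns a list of SCCs, each a set of nodes."""
    adj, radj = defaultdict(list), defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        radj[v].append(u)

    visited, order = set(), []

    def dfs_finish_order(start):
        # Iterative DFS on the forward graph, recording nodes by finish time.
        stack = [(start, iter(adj[start]))]
        visited.add(start)
        while stack:
            node, it = stack[-1]
            advanced = False
            for w in it:
                if w not in visited:
                    visited.add(w)
                    stack.append((w, iter(adj[w])))
                    advanced = True
                    break
            if not advanced:
                order.append(node)
                stack.pop()

    for u in nodes:
        if u not in visited:
            dfs_finish_order(u)

    # Second pass: DFS on the reversed graph in decreasing finish order.
    assigned, sccs = set(), []
    for u in reversed(order):
        if u in assigned:
            continue
        comp, stack = set(), [u]
        assigned.add(u)
        while stack:
            x = stack.pop()
            comp.add(x)
            for w in radj[x]:
                if w not in assigned:
                    assigned.add(w)
                    stack.append(w)
        sccs.append(comp)
    return sccs

def condensation(nodes, edges):
    """Contract each SCC to a single node; the resulting graph is a DAG."""
    sccs = strongly_connected_components(nodes, edges)
    comp_of = {u: i for i, comp in enumerate(sccs) for u in comp}
    dag_edges = {(comp_of[u], comp_of[v]) for u, v in edges
                 if comp_of[u] != comp_of[v]}
    return sccs, dag_edges

# Hypothetical DTG: values 1, 2, 3 form a simple cycle; 0 and 4 are acyclic.
nodes = [0, 1, 2, 3, 4]
edges = [(0, 1), (1, 2), (2, 3), (3, 1), (3, 4)]
print(condensation(nodes, edges))   # ([{0}, {1, 2, 3}, {4}], {(0, 1), (1, 2)})
```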

    Planning using Transformation between Equivalent Formalisms: A Case Study of Efficiency

    No full text
    We have considered two planning formalisms which are known to be expressively equivalent: the CPS formalism, using propositional atoms, and the SAS+ formalism, using multi-valued state variables. As a case study, we have modified a well-known partial-order planner for CPS into an 'identical' planner for SAS+ and we have considered two encodings of SAS+ into CPS. It turns out that it is more efficient to solve SAS+ instances by using the SAS+ planner directly than to first encode them as CPS instances and use the CPS planner. For one encoding of SAS+ into CPS, which is a polynomial reduction, the CPS planner has a polynomially or exponentially larger search space, depending on the goal selection strategy. For the other encoding, which is not a polynomial reduction, the CPS search space is of the same size as the SAS+ search space, but the cost per node can be exponentially higher in this case. On the other hand, solving CPS instances by encoding them as SAS+ instances and..
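
    As an illustration of the general idea behind such encodings (a hypothetical sketch, not necessarily either of the two encodings studied in the paper), a multi-valued SAS+ variable can be translated into one propositional atom per variable-value pair, as shown below.

```python
# Sketch of translating multi-valued (SAS+-style) variables into propositional
# atoms: each variable v with domain D(v) yields the atoms "v=d" for d in D(v),
# exactly one of which is true at a time.  Hypothetical illustration only.

def encode_state(sas_state, domains):
    """Map a SAS+ state {var: value} to the set of true propositional atoms."""
    atoms = set()
    for var, value in sas_state.items():
        assert value in domains[var], f"{value!r} not in domain of {var!r}"
        atoms.add(f"{var}={value}")
    return atoms

def encode_operator(pre, post):
    """Translate SAS+ pre-/post-conditions into STRIPS-style add/delete lists."""
    preconds = {f"{var}={val}" for var, val in pre.items()}
    add = {f"{var}={val}" for var, val in post.items()}
    # Deleting the old value requires knowing it; here we assume the
    # precondition mentions every variable changed by the operator.
    delete = {f"{var}={pre[var]}" for var in post if var in pre}
    return preconds, add, delete

# Hypothetical example with two multi-valued variables.
domains = {"robot": {"room1", "room2"}, "door": {"open", "closed"}}
state = {"robot": "room1", "door": "closed"}
print(encode_state(state, domains))
print(encode_operator({"robot": "room1", "door": "open"}, {"robot": "room2"}))
```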

    Expressive Equivalence of Planning Formalisms

    No full text
    A concept of expressive equivalence for planning formalisms based on polynomial transformations is defined. It is argued that this definition is reasonable and useful both from a theoretical and from a practical perspective; if two languages are equivalent, then theoretical results carry over and, more practically, we can model an application problem in one language and then easily use a planner for the other language. In order to cope with the problem of exponentially sized solutions for planning problems, an even stronger concept of expressive equivalence is introduced, using the novel ESP-reduction. Four different formalisms for propositional planning are then analyzed, namely two variants of STRIPS, ground TWEAK and the SAS+ formalism. Although these may seem to exhibit different degrees of expressive power, it is proven that they are, in fact, expressively equivalent under ESP-reduction. This means that neither negative goals, partial initial states nor multi-value..