
    Causes that Make a Difference

    Biologists studying complex causal systems typically identify some factors as causes and treat other factors as background conditions. For example, when geneticists explain biological phenomena, they often foreground genes and relegate the cellular milieu to the background. But factors in the milieu are as causally necessary as genes for the production of phenotypic traits, even traits at the molecular level such as amino acid sequences. Gene-centered biology has been criticized on the grounds that because there is parity among causes, the “privileging” of genes reflects a reductionist bias, not an ontological difference. The idea that there is an ontological parity among causes is related to a philosophical puzzle identified by John Stuart Mill: what, other than our interests or biases, could possibly justify identifying some causes as the actual or operative ones, and other causes as mere background? The aim of this paper is to solve this conceptual puzzle and to explain why there is not an ontological parity between genes and the other factors. It turns out that solving this puzzle helps answer a seemingly unrelated philosophical question: what kind of causal generality matters in biology?

    Complexity of Manipulating Sequential Allocation

    Sequential allocation is a simple allocation mechanism in which agents are given pre-specified turns and each agent gets the most preferred item that is still available. It has long been known that sequential allocation is not strategyproof. Bouveret and Lang (2014) presented a polynomial-time algorithm to compute a best response of an agent with respect to additively separable utilities and claimed that (1) their algorithm correctly finds a best response, and (2) each best response results in the same allocation for the manipulator. We show that both claims are false via an example. We then show that the problem of computing a best response is in fact NP-complete. On the other hand, the insights and results of Bouveret and Lang (2014) for the case of two agents still hold.
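    As a rough sketch of the sequential allocation mechanism the abstract describes (agents pick in a pre-specified turn order, each taking its most preferred remaining item), the Python below is illustrative only: the turn sequence and preference orders are invented, and it does not implement the best-response computation studied in the paper.

```python
# Sketch of sequential allocation: agents take pre-specified turns, and on
# each turn the active agent receives its most preferred item still available.
# Turn sequence and preference orders are illustrative assumptions.

def sequential_allocation(turns, preferences, items):
    """turns: agent ids in picking order;
    preferences: agent id -> list of items, most preferred first;
    items: items initially available."""
    remaining = set(items)
    allocation = {agent: [] for agent in preferences}
    for agent in turns:
        if not remaining:
            break
        # The agent truthfully picks its most preferred remaining item.
        pick = next(item for item in preferences[agent] if item in remaining)
        allocation[agent].append(pick)
        remaining.remove(pick)
    return allocation

turns = [1, 2, 1, 2]                      # alternating picking sequence
prefs = {1: ["a", "b", "c", "d"],         # agent 1's ranking
         2: ["b", "a", "d", "c"]}         # agent 2's ranking
print(sequential_allocation(turns, prefs, ["a", "b", "c", "d"]))
# -> {1: ['a', 'c'], 2: ['b', 'd']}
```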

    Tool use and related errors in ideational apraxia: The quantitative simulation of patient error profiles

    The behaviour of ideational apraxic patients on simple tasks involving multiple objects is typically marked by a variety of errors. While some of these errors concern the sequential organisation of action through time, many relate to the misuse of, or failure to use, necessary or appropriate tools. In this paper we apply the computational model of Cooper & Shallice (2000) to five standard multiple object tasks used in clinical assessment and demonstrate how, when lesioned, the model can account for the error profiles of two ideational apraxic patients discussed by Rumiati et al. (2001). Application of the model to the multiple object tasks demonstrates the generality of the model, while the account of the error profiles extends previous work (Cooper et al., 2005), in which ideational apraxia was argued to arise from a generalised disturbance of object representations that are held to trigger action schemas.

    What's Decidable About Sequences?

    We present a first-order theory of sequences with integer elements, Presburger arithmetic, and regular constraints, which can model significant properties of data structures such as arrays and lists. We give a decision procedure for the quantifier-free fragment, based on an encoding into the first-order theory of concatenation; the procedure has PSPACE complexity. The quantifier-free fragment of the theory of sequences can express properties such as sortedness and injectivity, as well as Boolean combinations of periodic and arithmetic facts relating the elements of the sequence and their positions (e.g., "for all even i's, the element at position i has value i+3 or 2i"). The resulting expressive power is orthogonal to that of the most expressive decidable logics for arrays. Some examples demonstrate that the fragment is also suitable to reason about sequence-manipulating programs within the standard framework of axiomatic semantics.
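    For a concrete feel of the properties the quantifier-free fragment can express (sortedness, injectivity, and arithmetic facts tying elements to their positions), here is a small Python sketch that simply checks such properties on a finite integer sequence; it is not the paper's decision procedure, and the function names and example sequence are assumptions.

```python
# Executable checks of properties the quantifier-free fragment can express.
# These are plain predicates over a finite sequence, not the paper's
# decision procedure; the function names are illustrative assumptions.

def is_sorted(seq):
    # Sortedness: each element is <= its successor.
    return all(seq[i] <= seq[i + 1] for i in range(len(seq) - 1))

def is_injective(seq):
    # Injectivity: no value occurs at two positions.
    return len(set(seq)) == len(seq)

def even_position_fact(seq):
    # "For all even i's, the element at position i has value i+3 or 2i."
    return all(seq[i] in (i + 3, 2 * i) for i in range(0, len(seq), 2))

example = [3, 4, 5, 9, 8, 1]
print(is_sorted(example), is_injective(example), even_position_fact(example))
# -> False True True  (positions 0, 2, 4 hold 3 = 0+3, 5 = 2+3, 8 = 2*4)
```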

    The Analysis of design and manufacturing tasks using haptic and immersive VR - Some case studies

    The use of virtual reality in interactive design and manufacture has been researched extensively, but the practical application of this technology in industry is still very much in its infancy. This is surprising, as one would have expected that, after some 30 years of research, commercial applications of interactive design or manufacturing planning and analysis would be widespread throughout the product design domain. One of the major but less well-known advantages of VR technology is that logging the user yields a great deal of rich data which can be used to automatically generate designs or manufacturing instructions, analyse design and manufacturing tasks, map engineering processes and, tentatively, acquire expert knowledge. The authors feel that the benefits of VR in these areas have not been fully disseminated to the wider industrial community and, with the advent of cheaper PC-based VR solutions, a wider appreciation of the capabilities of this type of technology may encourage companies to adopt VR solutions for some of their product design processes. With this in mind, this paper describes in detail applications of haptics in assembly, demonstrating how user task logging can lead to the analysis of design and manufacturing tasks at a level of detail not previously possible, as well as giving usable engineering outputs. The haptic 3D VR study involves the use of a Phantom and 3D system to analyse and compare this technology against real-world user performance. This work demonstrates that the detailed logging of tasks in a virtual environment gives considerable potential for understanding how virtual tasks can be mapped onto their real-world equivalents, as well as showing how haptic process plans can be generated in a similar manner to the conduit design and assembly planning HMD VR tool reported in Part A. The paper concludes with a view as to how the authors feel the use of VR systems in product design and manufacturing should evolve in order to enable the industrial adoption of this technology in the future.
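    As a minimal, hypothetical sketch of the kind of user task logging the abstract refers to (timestamped events from a virtual assembly session that can later be analysed), the event names, fields, and part identifiers below are illustrative assumptions, not the authors' logging format.

```python
# Minimal sketch of timestamped task logging in a virtual assembly session.
# Event names, fields, and part identifiers are hypothetical.
import time
from dataclasses import dataclass, field

@dataclass
class TaskLogger:
    events: list = field(default_factory=list)

    def log(self, event, **details):
        # Record the event with a wall-clock timestamp and free-form details.
        self.events.append({"t": time.time(), "event": event, **details})

    def durations(self):
        # Elapsed time between consecutive events, a basis for task analysis.
        times = [e["t"] for e in self.events]
        return [b - a for a, b in zip(times, times[1:])]

logger = TaskLogger()
logger.log("grasp", part="conduit_clip")
logger.log("move", target="rail_slot_3")
logger.log("insert", part="conduit_clip")
print(logger.durations())
```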

    The time course of routine action

    Previous studies of action selection in routinized tasks have used error rates as their sole dependent measure (e.g. Reason, 1979; Schwartz et al., 1998). Consequently, conclusions about the underlying mechanisms of correct behavior are necessarily indirect. The present experiment examines the performance of normal subjects in the prototypical coffee task (Botvinick & Plaut, 2004) when carried out in a virtual environment on screen. This has the advantage of (a) constraining the possible errors more tightly than a real-world environment, and (b) giving access to latencies as an additional, finer-grained measure of performance. We report error data and the timing of action selection at the crucial branching points for the production of routinized task sequences, both with and without a secondary task. Processing branching points leads to increased latencies. The presence of the secondary task has a greater effect on latencies at branching points than at equivalent non-branching points. Furthermore, error data and latencies dissociate, suggesting that exact timing is a valid and valuable source of information when trying to understand the processes that govern routine tasks. The results of the experiment are discussed in relation to their implications for computational accounts of routine action selection.
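    A small Python sketch of the kind of latency comparison reported above (mean action-selection latency at branching versus non-branching points, with and without a secondary task); the trial records are invented placeholders, not the experiment's data.

```python
# Sketch of the latency comparison: mean action-selection latency grouped by
# point type (branching vs non-branching) and by presence of a secondary task.
# The trial records below are invented placeholders, not experimental data.
from collections import defaultdict
from statistics import mean

trials = [
    {"point": "branching",     "dual_task": False, "latency_ms": 950},
    {"point": "branching",     "dual_task": False, "latency_ms": 1010},
    {"point": "branching",     "dual_task": True,  "latency_ms": 1380},
    {"point": "branching",     "dual_task": True,  "latency_ms": 1450},
    {"point": "non-branching", "dual_task": False, "latency_ms": 700},
    {"point": "non-branching", "dual_task": False, "latency_ms": 730},
    {"point": "non-branching", "dual_task": True,  "latency_ms": 800},
    {"point": "non-branching", "dual_task": True,  "latency_ms": 840},
]

groups = defaultdict(list)
for t in trials:
    groups[(t["point"], t["dual_task"])].append(t["latency_ms"])

for (point, dual), values in sorted(groups.items()):
    print(f"{point:14s} dual_task={dual}: mean latency {mean(values):.0f} ms")
```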