
    Postural adjustments in catching: on the interplay between segment stabilization and equilibrium control

    The purpose of this study was to investigate postural adjustments in one-handed ball catching. Specifically, the functional roles of anticipatory postural adjustments (APA) during the initial arm raising and of subsequent postural adjustments (SPA) for equilibrium control and ball-hand impact were scrutinized. Full-body kinematics and kinetics allowed an analysis of the mechanical consequences of raising the arm and preparing for ball-hand impact. APA for catching were suggested to serve segment stabilization. SPA had a functional role in equilibrium control through an inverted-pendulum mechanism but were also involved in preparing for the impact of the ball on the hand, as illustrated by an increased postural response at the end of the movement. These results were compared with raising the arm in a well-studied reaction-time task, for which an additional counter-rotation equilibrium mechanism was observed. Together, our findings demonstrate that postural adjustments should be investigated in relation to their specific functional task constraints, rather than by generalizing their functional role across different tasks.

    Fixed priority scheduling with pre-emption thresholds and cache-related pre-emption delays: integrated analysis and evaluation

    Commercial off-the-shelf programmable platforms for real-time systems typically contain a cache to bridge the gap between processor speed and main-memory speed. Because cache-related pre-emption delays (CRPD) can have a significant influence on the computation times of tasks, CRPD have been integrated into the response-time analysis for fixed-priority pre-emptive scheduling (FPPS). This paper presents a CRPD-aware response-time analysis of sporadic tasks with arbitrary deadlines for fixed-priority pre-emption threshold scheduling (FPTS), generalizing earlier work. The analysis is complemented by an optimal (pre-emption) threshold assignment algorithm, assuming the priorities of tasks are given. We further improve upon these results by presenting an algorithm that searches for a layout of tasks in memory that makes a task set schedulable. The paper includes an extensive comparative evaluation of the schedulability ratios of FPPS and FPTS, taking CRPD into account. The practical relevance of our work stems from FPTS support in AUTOSAR, a standardized development model for the automotive industry.
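
    For reference, the classical worst-case response-time recurrence for fixed-priority pre-emptive scheduling, extended with a CRPD term, has the following general shape (a simplified textbook form given here for orientation only, not the paper's exact FPTS analysis; the symbols $C_i$, $T_j$, $hp(i)$, and $\gamma_{i,j}$ denote worst-case execution time, period, the set of higher-priority tasks, and a per-pre-emption CRPD cost, and are assumptions of this sketch):
    \[
    R_i^{(k+1)} = C_i + \sum_{\tau_j \in hp(i)} \left\lceil \frac{R_i^{(k)}}{T_j} \right\rceil \bigl( C_j + \gamma_{i,j} \bigr), \qquad R_i^{(0)} = C_i,
    \]
    iterated to a fixed point; the task is deemed schedulable if the fixed point does not exceed its deadline. Pre-emption thresholds and arbitrary deadlines complicate this basic picture, which is what the integrated analysis in the paper addresses.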

    Generalizing attentional control across dimensions and tasks: evidence from transfer of proportion-congruent effects

    Three experiments investigated transfer of list-wide proportion congruent (LWPC) effects from a set of congruent and incongruent items with different frequency (inducer task) to a set of congruent and incongruent items with equal frequency (diagnostic task). Experiments 1 and 2 mixed items from horizontal and vertical Simon tasks. Tasks always involved different stimuli that varied on the same dimension (colour) in Experiment 1 and on different dimensions (colour, shape) in Experiment 2. Experiment 3 mixed trials from a manual Simon task with trials from a vocal Stroop task, with colour being the relevant stimulus in both tasks. There were two major results. First, we observed transfer of LWPC effects in Experiments 1 and 3, when tasks shared the relevant dimension, but not in Experiment 2. Second, sequential modulations of congruency effects transferred in Experiment 1 only. Hence, the different transfer patterns suggest that LWPC effects and sequential modulations arise from different mechanisms. Moreover, the observation of transfer supports an account of LWPC effects in terms of list-wide cognitive control, while being at odds with accounts in terms of stimulus–response (contingency) learning and item-specific control.

    The Limits of Post-Selection Generalization

    While statistics and machine learning offer numerous methods for ensuring generalization, these methods often fail in the presence of adaptivity---the common practice in which the choice of analysis depends on previous interactions with the same dataset. A recent line of work has introduced powerful, general-purpose algorithms that ensure post hoc generalization (also called robust or post-selection generalization), which says that, given the output of the algorithm, it is hard to find any statistic for which the data differs significantly from the population it came from. In this work we show several limitations on the power of algorithms satisfying post hoc generalization. First, we show a tight lower bound on the error of any algorithm that satisfies post hoc generalization and answers adaptively chosen statistical queries, showing a strong barrier to progress in post-selection data analysis. Second, we show that post hoc generalization is not closed under composition, despite many examples of such algorithms exhibiting strong composition properties.
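
    As background for the query model used in such lower bounds, a statistical query and its adaptive use can be stated as follows (a generic textbook formulation assumed here for illustration, not necessarily the paper's exact definitions):
    \[
    q : \mathcal{X} \to [0,1], \qquad q(P) = \mathbb{E}_{x \sim P}[q(x)], \qquad q(S) = \frac{1}{n} \sum_{i=1}^{n} q(x_i),
    \]
    where $P$ is the population and $S$ the sample of size $n$. In the adaptive setting the analyst chooses each query $q_t$ as a function of the earlier answers $a_1, \dots, a_{t-1}$, and the mechanism is accurate if $|a_t - q_t(P)| \le \alpha$ for every $t$; the lower bound concerns how large this error $\alpha$ must be for any mechanism that also satisfies post hoc generalization.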

    An Infinitesimal Probabilistic Model for Principal Component Analysis of Manifold Valued Data

    We provide a probabilistic and infinitesimal view of how the principal component analysis (PCA) procedure can be generalized to the analysis of nonlinear manifold-valued data. Starting with the probabilistic PCA interpretation of the Euclidean PCA procedure, we show how PCA can be generalized to manifolds in an intrinsic way that does not resort to linearization of the data space. The underlying probability model is constructed by mapping a Euclidean stochastic process to the manifold using stochastic development of Euclidean semimartingales. The construction uses a connection and bundles of covariant tensors to allow global transport of principal eigenvectors, and the model is thereby an example of how principal fiber bundles can be used to handle the lack of a global coordinate system and orientation that characterizes manifold-valued statistics. We show how curvature implies non-integrability of the equivalent of Euclidean principal subspaces, and how the stochastic flows provide an alternative to explicit construction of such subspaces. We describe estimation procedures for inference of parameters and prediction of principal components, and we give examples of properties of the model on embedded surfaces.
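
    The Euclidean probabilistic PCA model that serves as the starting point is usually written as the latent-variable formulation below (standard notation assumed here; it is not quoted from the paper):
    \[
    x = \mu + W z + \epsilon, \qquad z \sim \mathcal{N}(0, I_q), \qquad \epsilon \sim \mathcal{N}(0, \sigma^2 I_d),
    \]
    so that marginally $x \sim \mathcal{N}(\mu, W W^{\top} + \sigma^2 I_d)$. The manifold generalization described in the abstract replaces this linear Gaussian construction with a stochastic process developed onto the manifold, so that no single linear principal subspace needs to exist globally.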

    Teleological Dispositions


    Spontaneous Meta-Arithmetic as the First Step Toward School Algebra

    Taking as a point of departure the vision of school algebra as a formalized meta-discourse of arithmetic, we have been following six pairs of 7th-grade students (12-13 years old) as they gradually modify their spontaneous meta-arithmetic toward the “official” algebraic form of talk. In this paper we take a look at the very beginning of this process. Preliminary analyses of the data have shown, unsurprisingly, that while reflecting on arithmetic processes and relations, the uninitiated 7th graders were employing colloquial means, which could not protect them against occasional ambiguities. More unexpectedly, this spontaneous meta-arithmetic, although not supported by any previous algebraic schooling, displayed some algebra-like features not normally found in everyday discourse.

    Generalizing the Taylor Principle

    The paper generalizes the Taylor principle---the proposition that central banks can stabilize the macroeconomy by raising their interest rate instrument more than one-for-one in response to higher inflation---to an environment in which reaction coefficients in the monetary policy rule evolve according to a Markov process. We derive a long-run Taylor principle that delivers unique bounded equilibria in two standard models. Policy can satisfy the Taylor principle in the long run, even while deviating from it substantially for brief periods or modestly for prolonged periods. Macroeconomic volatility can be higher in periods when the Taylor principle is not satisfied, not because of indeterminacy, but because monetary policy amplifies the impacts of fundamental shocks. Regime change alters the qualitative and quantitative predictions of a conventional new Keynesian model, yielding fresh interpretations of existing empirical work.
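
    For concreteness, a Taylor-type rule with regime-switching coefficients can be written in the simple form below (an illustrative specification with a Markov state $s_t$; the notation is assumed rather than taken from the paper):
    \[
    i_t = \alpha(s_t)\, \pi_t + \gamma(s_t)\, y_t + \varepsilon_t, \qquad s_t \in \{1, \dots, S\} \text{ following a Markov chain},
    \]
    where $i_t$ is the policy rate, $\pi_t$ inflation, and $y_t$ the output gap. The conventional Taylor principle requires $\alpha(s_t) > 1$ in every regime; the long-run version instead imposes a weaker condition that weights the regime-specific coefficients by the transition probabilities, so that temporary or modest violations remain consistent with a unique bounded equilibrium.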