Oculomotor involvement in spatial working memory is task-specific
Many everyday tasks, such as remembering where you parked, require the capacity to store and manipulate information about the visual and spatial properties of the world. The ability to represent, remember, and manipulate spatial information is known as visuospatial working memory (VSWM). Despite substantial interest in VSWM, the mechanisms responsible for this ability remain debated. One influential idea is that VSWM depends on activity in the eye-movement (oculomotor) system. However, this has proved difficult to test because experimental paradigms that disrupt oculomotor control also interfere with other cognitive systems, such as spatial attention. Here, we present data from a novel paradigm that selectively disrupts activation in the oculomotor system. We show that the inability to make eye movements is associated with impaired performance on the Corsi Blocks task, but not on Arrow Span, Visual Patterns, Size Estimation or Digit Span tasks. It is argued that the oculomotor system is required to encode and maintain spatial locations indicated by a change in physical salience, but not non-salient spatial locations indicated by the meaning of a symbolic cue. This suggestion offers a way to reconcile the currently conflicting evidence regarding the role of the oculomotor system in spatial working memory.
The impact of secondary tasks on multitasking in a virtual environment
One experiment is described that examined the possible involvement of working memory in the Virtual Errands Test (McGeorge et al., 2000), which requires participants to complete errands within a virtual environment presented on a computer screen. Time was limited, so participants had to swap between tasks (multitask) efficiently to complete the errands. Forty-two undergraduates participated, all attempting the test twice. On one of these occasions they were asked to perform a concurrent task throughout (the order of single- and dual-task conditions was counterbalanced). The type of secondary task was manipulated between groups. Twenty-one participants were asked to randomly generate months of the year aloud in the dual-task condition, while another twenty-one were asked to suppress articulation by repeating the word "December". An overall dual-task effect on the Virtual Errands Test was observed, although this was qualified by an interaction with the order of single- and dual-task conditions. Analysis of the secondary task data showed a drop in performance (relative to baseline) under dual-task conditions, and that drop was greater for the random generation group than for the articulatory suppression group. These data are interpreted as suggesting that the central executive and phonological loop components of working memory are implicated in this test of multitasking.
Isotopic investigation of diet and residential mobility in the Neolithic of the Lower Rhine Basin
Multiple isotopic systems (C, N, O, S, Sr, Pb) are applied to investigate diet and mobility amongst the Middle Neolithic populations at Schipluiden and Swifterbant (Netherlands). A review of carbon and nitrogen isotope analyses of European Mesolithic and Neolithic populations shows a shift in diet from the Mesolithic to the Neolithic, but also great variety in Neolithic diets, several of which incorporate fish. At Swifterbant (c.4300–4000 BC) the population had a diet largely based on terrestrial and freshwater resources, despite proximity to tidal waters. Only one individual (of 10) showed evidence for migration. In contrast, at Schipluiden (c.3600–3400 BC) there were migrants who had a diet lower in marine resources than those without evidence for migration. The faunal spectrum and isotopic similarities with sites in the Iron Gates Gorge suggest that sturgeon may have been important. There is some evidence that migrants at Schipluiden were not accorded the formal burial given to locally born people.
High risk prescribing in older adults: Prevalence, clinical and economic implications and potential for intervention at the population level
Background: High risk prescribing can compromise independent wellbeing and quality of life in older adults. The aims of this project are to determine the prevalence, risk factors, clinical consequences, and costs of high risk prescribing, and to assess the impact of interventions on high risk prescribing in older people. Methods: The proposed project will utilise data from the 45 and Up Study, a large-scale cohort of 267,153 men and women aged 45 and over recruited during 2006-2009 from the state of New South Wales, Australia, linked to a range of administrative health datasets. High risk prescribing will be assessed using three indicators: polypharmacy (use of five or more medicines); Beers Criteria (an explicit measure of potentially inappropriate medication use); and Drug Burden Index (a pharmacologic dose-dependent measure of cumulative exposure to anticholinergic and sedative medicines). Individual risk factors from the 45 and Up Study questionnaire, and health system characteristics from health datasets, that are associated with the likelihood of high risk prescribing will be identified. The main outcome measures will include hospitalisation (first admission to hospital, total days in hospital, cause-specific hospitalisation); admission to institutionalised care; all-cause mortality; and, where possible, cause-specific mortality. Economic costs to the health care system and implications of high risk prescribing will also be investigated. In addition, changes in high risk prescribing will be evaluated in relation to certain routine medicines-related interventions. The statistical analysis will be conducted using standard pharmaco-epidemiological methods, including descriptive analysis and univariate and multivariate regression analysis, controlling for relevant confounding factors using a number of different approaches. Discussion: The availability of large-scale data is useful for identifying opportunities to improve prescribing, and health, in older adults.
The size of the 45 and Up Study, along with linkage to health databases, provides an important opportunity to investigate the relationship between high risk prescribing and adverse outcomes in a real-world population of older adults. © 2013 Gnjidic et al.; licensee BioMed Central Ltd
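Two of the three indicators named above reduce to simple computations. The polypharmacy threshold is stated in the protocol (five or more medicines); for the Drug Burden Index, the form widely used in the pharmacological literature sums D / (δ + D) over anticholinergic and sedative medicines, where D is the daily dose taken and δ the minimum recommended daily dose. A minimal sketch, with hypothetical dose values:

```python
def drug_burden_index(exposures):
    """Sum D / (delta + D) over anticholinergic/sedative medicines,
    where D is the daily dose taken and delta the minimum recommended
    daily dose. A medicine taken at its minimum dose contributes 0.5,
    so the index grows with both the number of medicines and the dose
    of each."""
    return sum(d / (delta + d) for d, delta in exposures)

def is_polypharmacy(n_medicines, threshold=5):
    """Polypharmacy indicator as defined in the protocol:
    use of five or more medicines."""
    return n_medicines >= threshold

# Two sedative medicines, each taken at its minimum recommended dose.
print(drug_burden_index([(10.0, 10.0), (5.0, 5.0)]))  # 1.0
print(is_polypharmacy(6))                              # True
```

The dose-dependent form means the index distinguishes a patient on many low-dose exposures from one on fewer, higher-dose exposures, which a simple medicine count cannot.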
Reconstruction of Hydraulic Data by Machine Learning
Numerical simulation models associated with hydraulic engineering take a wide array of data into account to produce predictions: rainfall contribution to the drainage basin (characterized by soil nature, infiltration capacity and moisture), current water height in the river, topography, nature and geometry of the river bed, etc. This data is tainted with uncertainties related to an imperfect knowledge of the field, measurement errors on the physical parameters calibrating the equations of physics, an approximation of the latter, etc. These uncertainties can lead the model to overestimate or underestimate the flow and height of the river. Moreover, complex assimilation models often require numerous evaluations of physical solvers to evaluate these uncertainties, limiting their use for some real-time operational applications. In this study, we explore the possibility of building a predictor for river height at an observation point based on drainage basin time series data. An array of data-driven techniques is assessed for this task, including statistical models, machine learning techniques and deep neural network approaches. These are assessed on several metrics, offering an overview of the possibilities related to hydraulic time-series. An important finding is that for the same hydraulic quantity, the best predictors vary depending on whether the data is produced using a physical model or real observations.
Comment: Submitted to SimHydro 201
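The simplest of the statistical baselines mentioned above can be sketched as a lagged linear (autoregressive) predictor of river height. The series below is synthetic stand-in data, not the study's dataset, and `make_lagged_features` is an illustrative helper, not a function from the paper:

```python
import numpy as np

def make_lagged_features(series, n_lags):
    """Design matrix holding the n_lags previous values for each step."""
    X = np.column_stack([series[i:len(series) - n_lags + i]
                         for i in range(n_lags)])
    y = series[n_lags:]
    return X, y

# Hypothetical signal: river height as a smoothed response to rainfall,
# plus measurement noise.
rng = np.random.default_rng(0)
rain = rng.random(200)
height = np.convolve(rain, [0.5, 0.3, 0.2])[:200] + 0.01 * rng.standard_normal(200)

X, y = make_lagged_features(height, n_lags=3)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares AR fit
rmse = float(np.sqrt(np.mean((X @ coef - y) ** 2)))
```

The machine learning and deep network approaches assessed in the study would replace the least-squares fit with a more expressive model over the same kind of lagged feature matrix.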
Contrasting Effects of Energy Transfer in Determining Efficiency Improvements in Ternary Polymer Solar Cells
Crystallizable, high-mobility conjugated polymers have been employed as secondary donor materials in ternary polymer solar cells in order to improve device efficiency by broadening their spectral response range and enhancing charge dissociation and transport. We demonstrate contrasting effects of two crystallizable polymers, namely PffBT4T-2OD and PDPP2TBT, in determining the efficiency improvements in PTB7-Th:PC71BM host blends. A notable power conversion efficiency of 11% can be obtained by introducing 10% PffBT4T-2OD (relative to PTB7-Th), while the efficiency of PDPP2TBT-incorporated ternary devices decreased dramatically despite an enhancement in hole mobility and light absorption. Blend morphology studies suggest that both PffBT4T-2OD and PDPP2TBT are well dissolved within the host PTB7-Th phase and facilitate an increased degree of phase separation between polymer and fullerene domains. Whilst negligible charge transfer was determined in binary blends of each polymer mixture, effective energy transfer was identified from PffBT4T-2OD to PTB7-Th that contributes to an improvement in ternary blend device efficiency. In contrast, energy transfer from PTB7-Th to PDPP2TBT worsened the efficiency of the ternary device due to inefficient charge dissociation between PDPP2TBT and PC71BM.
The Parametric Inverse Problem in Transient Scattering
Scattering problems in many areas of applied physics are governed by the wave equation. In the most usual situation, we are given the incident wave (input) and the scatterer(s) and attempt, through analytical, experimental, or numerical methods, to produce the scattered waves (output). Such procedures can be carried out in either the frequency domain or the time domain and are categorized under the general heading of "forward problems." In a less usual, but no less important situation, we are given the incident wave (input) and the scattered waves (output) and attempt to find the scatterer(s) that produced the output. In this case, we call the procedures "inverse" problems.
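The governing relation referred to above can be written explicitly. For a scalar field u (for instance, acoustic pressure) propagating with speed c, and with the total field split into incident and scattered parts, the wave equation reads:

```latex
\nabla^{2} u(\mathbf{r}, t) - \frac{1}{c^{2}} \frac{\partial^{2} u}{\partial t^{2}}(\mathbf{r}, t) = 0,
\qquad u = u_{\mathrm{inc}} + u_{\mathrm{scat}}
```

In the forward problem, u_inc and the scatterer are known and u_scat is computed; in the inverse problem, u_inc and u_scat are known and the scatterer (or, in the parametric setting, a small set of parameters describing it) is sought.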
Paramedic confidence in estimating external blood loss
Introduction: Studies have identified that visual estimation of blood loss is highly inaccurate; however, no research has investigated the relationship between this practice and the confidence of estimation by paramedics. The aim of this study was to determine paramedic confidence in the estimation and reporting of external blood loss due to medical or trauma aetiology, within an Australasian paramedic context.
Methods: Between July and September 2015, a cross-sectional survey was distributed through Australasian paramedic professional bodies to determine confidence in estimating and documenting external blood loss. Using Likert scale and free text responses, participants provided demographic information and their self-perceived confidence in estimating and documenting external blood loss.
Results: Five thousand six hundred paramedics were invited to participate in an online survey. Two hundred and eight responses were received (3.8% response rate). A total of 86.6% of participants reported documenting blood loss in clinical reports; however, only 47.8% of participants believed their estimation of external blood loss was accurate, with 13% reporting underestimation and 33.5% reporting overestimation of blood loss. Additionally, only 51.6% of participants agreed or strongly agreed that they were confident in their estimation of blood loss.
Conclusion: This research demonstrates that the majority of paramedics estimate and document external blood loss, yet nearly half do not feel confident in doing so, despite indicating its importance. Educational and organisational changes are recommended to reflect the clear evidence against this practice. Further research is recommended to identify appropriate physiological parameters and practical assessment tools to replace this inaccurate form of clinical assessment.
Multi-institutional evaluation of a Pareto navigation guided automated radiotherapy planning solution for prostate cancer
© The Author(s) 2024. Background: Current automated planning solutions are calibrated using trial and error or machine learning on historical datasets. Neither method allows for the intuitive exploration of differing trade-off options during calibration, which may aid in ensuring automated solutions align with clinical preference. Pareto navigation provides this functionality and offers a potential calibration alternative. The purpose of this study was to validate an automated radiotherapy planning solution with a novel multi-dimensional Pareto navigation calibration interface across two external institutions for prostate cancer. Methods: The implemented "Pareto Guided Automated Planning" (PGAP) methodology was developed in RayStation using scripting and consisted of a Pareto navigation calibration interface built upon a "Protocol Based Automatic Iterative Optimisation" planning framework. 30 previous patients were randomly selected by each institution (IA and IB), 10 for calibration and 20 for validation. Utilising the Pareto navigation interface, automated protocols were calibrated to the institutions' clinical preferences. A single automated plan (VMATAuto) was generated for each validation patient, with plan quality compared against the previously treated clinical plan (VMATClinical) both quantitatively, using a range of DVH metrics, and qualitatively, through blind review at the external institution. Results: PGAP led to marked improvements across the majority of rectal dose metrics, with Dmean reduced by 3.7 Gy and 1.8 Gy for IA and IB respectively (p < 0.001). For bladder, results were mixed, with low and intermediate dose metrics reduced for IB but increased for IA. Differences, whilst statistically significant (p < 0.05), were small and not considered clinically relevant. The reduction in rectum dose was not at the expense of PTV coverage (D98% was generally improved with VMATAuto), but was somewhat detrimental to PTV conformality. The prioritisation of rectum over conformality was, however, aligned with preferences expressed during calibration and was a key driver in both institutions demonstrating a clear preference towards VMATAuto, with 31/40 considered superior to VMATClinical upon blind review. Conclusions: PGAP enabled intuitive adaptation of automated protocols to an institution's planning aims and yielded plans more congruent with the institution's clinical preference than the locally produced manual clinical plans.
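The navigation step at the heart of such an interface can be sketched in miniature. One common formulation (assumed here; the abstract does not specify PGAP's internals) precomputes "anchor" plans, each minimising one objective, and lets the user steer by forming convex combinations of their objective vectors. The anchor values below are hypothetical:

```python
import numpy as np

# Hypothetical anchor plans: each row is an objective vector obtained by
# minimising one objective (rectum Dmean in Gy, bladder Dmean in Gy,
# conformity index). Lower is better for every column.
anchors = np.array([
    [20.0, 45.0, 1.30],   # rectum-sparing anchor
    [35.0, 30.0, 1.25],   # bladder-sparing anchor
    [30.0, 40.0, 1.05],   # conformity anchor
])

def navigate(weights):
    """One Pareto navigation step: normalise the user's slider weights
    to a convex combination and interpolate between anchor plans."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()          # convex combination (weights sum to 1)
    return w @ anchors       # interpolated objective vector

# Steering towards rectum sparing while retaining some conformity.
trade_off = navigate([0.6, 0.1, 0.3])
```

Interactively adjusting the weights and re-interpolating is what gives the planner the intuitive, real-time exploration of trade-offs that trial-and-error calibration lacks; a full system would re-optimise a deliverable plan near the chosen trade-off point.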