A deep learning account of how language affects thought
How can words shape meaning? Shared labels highlight commonalities between concepts, whereas contrasting labels make differences apparent. To address such findings, we propose a deep learning account that spans perception to decision (i.e., labelling). The model takes photographs as input, transforms them to semantic representations through computations that parallel the ventral visual stream, and finally determines the appropriate linguistic label. The underlying theory is that minimising error on two prediction tasks (predicting the meaning and the label of a stimulus) requires a compromise in the network's semantic representations. Thus, differences in label use, whether across languages or levels of expertise, manifest as differences in the semantic representations that support label discrimination. We confirm these predictions in simulations involving fine-grained and coarse-grained labels. We hope these and allied efforts, which model perception, semantics, and labelling at scale, will advance developmental and neurocomputational accounts of concept and language learning.
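The two-objective compromise described in the abstract can be illustrated with a minimal sketch. This is not the authors' model: it assumes a single linear shared layer feeding a semantic head and a label head, invented toy data, and a hypothetical `train` helper, but it shows how jointly minimising semantic and label prediction error makes the learned representations depend on the labelling scheme (fine- vs coarse-grained).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stimuli: 20 items with 8 input features (stand-ins for photographs).
X = rng.normal(size=(20, 8))
# Semantic targets: 4-dimensional feature vectors (stand-ins for meaning).
S = rng.normal(size=(20, 4))
# Fine-grained labels: 4 classes of 5 items; coarse labels merge class pairs.
fine = np.repeat(np.arange(4), 5)
coarse = fine // 2

def train(labels, n_classes, epochs=500, lr=0.05):
    """Jointly minimise semantic (MSE) and label (cross-entropy) error.

    Returns the learned shared semantic representations of the stimuli.
    """
    W = rng.normal(scale=0.1, size=(8, 4))           # shared layer
    Ws = rng.normal(scale=0.1, size=(4, 4))          # semantic head
    Wl = rng.normal(scale=0.1, size=(4, n_classes))  # label head
    Y = np.eye(n_classes)[labels]                    # one-hot labels
    for _ in range(epochs):
        H = X @ W                                    # shared representation
        dS = (H @ Ws - S) / len(X)                   # semantic prediction error
        Z = H @ Wl                                   # label logits
        P = np.exp(Z - Z.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)            # softmax
        dL = (P - Y) / len(X)                        # label prediction error
        # Both error signals backpropagate into the shared layer, forcing
        # a compromise between the two prediction tasks.
        gW = X.T @ (dS @ Ws.T + dL @ Wl.T)
        Ws -= lr * (H.T @ dS)
        Wl -= lr * (H.T @ dL)
        W -= lr * gW
    return X @ W

# Representations learned under fine- vs coarse-grained labelling differ,
# even though the stimuli and semantic targets are identical.
H_fine = train(fine, 4)
H_coarse = train(coarse, 2)
```

The point of the sketch is structural, not quantitative: because the shared layer serves both heads, changing only the label set changes the semantic representations.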
Task inhibition, conflict, and the n-2 repetition cost: A combined computational and empirical approach
Task inhibition (also known as backward inhibition) is a hypothesised form of cognitive inhibition evident in multi-task situations, with the role of facilitating switching between multiple, competing tasks. This article presents a novel cognitive computational model of a backward inhibition mechanism. By combining aspects of previous cognitive models of task switching and conflict monitoring, the model instantiates the theoretical proposal that backward inhibition is the direct result of conflict between multiple task representations. In a first simulation, we demonstrate that the model produces two effects widely observed in the empirical literature, specifically, reaction time costs for both n-1 task switches and n-2 task repetitions. Through a systematic search of parameter space, we demonstrate that these effects are a general property of the model's theoretical content, and not of specific parameter settings. We further demonstrate that the model captures previously reported empirical effects of inter-trial interval on n-2 repetition costs. A final simulation extends the paradigm of switching between tasks of asymmetric difficulty to three tasks, and generates novel predictions for n-2 repetition costs. Specifically, the model predicts that n-2 repetition costs associated with hard-easy-hard alternations are greater than those for easy-hard-easy alternations. Finally, we report two behavioural experiments testing this hypothesis, with results consistent with the model predictions.
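For concreteness, the n-2 repetition cost discussed above is computed from a trial sequence as the mean reaction time on return trials (A-B-A) minus that on all-different trials (C-B-A). The helper below is an illustrative sketch of that scoring convention, not part of the authors' model, and the trial data are invented.

```python
def n2_repetition_cost(tasks, rts):
    """Mean RT on n-2 task repetitions (A-B-A) minus n-2 switches (C-B-A).

    Only trials that are also n-1 switches enter either condition, since
    an n-2 repetition is defined as switching away and then back.
    """
    aba, cba = [], []
    for i in range(2, len(tasks)):
        if tasks[i] == tasks[i - 1]:
            continue                      # n-1 repetition: excluded
        if tasks[i] == tasks[i - 2]:
            aba.append(rts[i])            # A-B-A: return to the inhibited task
        elif tasks[i - 1] != tasks[i - 2]:
            cba.append(rts[i])            # C-B-A: all three tasks differ
    return sum(aba) / len(aba) - sum(cba) / len(cba)

# Invented example: the single A-B-A trial is 50 ms slower than the
# C-B-A trials, giving a positive n-2 repetition cost.
tasks = ["A", "B", "A", "C", "B", "A"]
rts = [500, 500, 650, 600, 600, 600]
print(n2_repetition_cost(tasks, rts))  # -> 50.0
```

A positive value indicates that returning to a just-abandoned task is slower than moving to a third task, which is the behavioural signature attributed to backward inhibition.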
Assessing Cumulative Health Risks from Exposure to Environmental Mixtures—Three Fundamental Questions
Differential exposure to mixtures of environmental agents, including biological, chemical, physical, and psychosocial stressors, can contribute to increased vulnerability of human populations and ecologic systems. Cumulative risk assessment is a tool for organizing and analyzing information to evaluate the probability and seriousness of harmful effects caused by simultaneous or sequential exposure to multiple environmental stressors. In this article we focus on elucidating key challenges that must be addressed to determine whether and to what degree differential exposure to environmental mixtures contributes to increased vulnerability of exposed populations. In particular, the emphasis is on examining three fundamental and interrelated questions that must be addressed as part of the process to assess cumulative risk: a) Which mixtures are most important from a public health perspective? b) What is the nature (i.e., duration, frequency, timing) and magnitude (i.e., exposure concentration and dose) of relevant cumulative exposures for the population of interest? c) What is the mechanism (e.g., toxicokinetic or toxicodynamic) and consequence (e.g., additive, less than additive, more than additive) of the mixture's interactive effects on exposed populations? The focus is primarily on human health effects from chemical mixtures, and the goal is to reinforce the need for improved assessment of cumulative exposure and better understanding of the biological mechanisms that determine toxicologic interactions among mixture constituents.
Using BOX-PCR to exclude a clonal outbreak of melioidosis
Background
Although melioidosis in endemic regions is usually caused by a diverse range of Burkholderia pseudomallei strains, clonal outbreaks from contaminated potable water have been described. Furthermore, B. pseudomallei is classified as a CDC Group B bioterrorism agent. Ribotyping, pulsed-field gel electrophoresis (PFGE) and multilocus sequence typing (MLST) have been used to identify genetically related B. pseudomallei isolates, but they are time-consuming and technically challenging for many laboratories.
Methods
We have adapted repetitive sequence typing using a BOX A1R primer for typing B. pseudomallei and compared BOX-PCR fingerprinting results on a wide range of well-characterized B. pseudomallei isolates with MLST and PFGE performed on the same isolates.
Results
BOX-PCR typing compared favourably with MLST and PFGE performed on the same isolates, both discriminating between the majority of multilocus sequence types and showing relatedness between epidemiologically linked isolates from various outbreak clusters.
Conclusion
Our results suggest that BOX-PCR can be used to exclude a clonal outbreak of melioidosis within 10 hours of receiving the bacterial strains.
Short-term acclimation in adults does not predict offspring acclimation potential to hypoxia
Abstract The prevalence of hypoxic areas in coastal waters is predicted to increase and lead to reduced biodiversity. While the adult stages of many estuarine invertebrates can cope with short periods of hypoxia, it remains unclear whether that ability is present if animals are bred and reared under chronic hypoxia. We firstly investigated the effect of moderate, short-term environmental hypoxia (40% air saturation for one week) on metabolic performance in adults of an estuarine amphipod, and the fitness consequences of prolonged exposure. We then reared the offspring of hypoxia-exposed parents under hypoxia, and assessed their oxyregulatory ability under declining oxygen tensions as juveniles and adults. Adults from the parental generation were able to acclimate their metabolism to hypoxia after one week, employing mechanisms typically associated with prolonged exposure. Their progeny, however, did not develop the adult pattern of respiratory regulation when reared under chronic hypoxia, but instead exhibited a poorer oxyregulatory ability than their parents. We conclude that species that appear hypoxia-tolerant when tested in short-term experiments could be physiologically compromised as adults if they develop under hypoxia. Consequently, we propose that the increased prevalence of hypoxia in coastal regions will have marked effects on some species currently considered hypoxia tolerant.
Measurement of the Forward-Backward Asymmetry in the B -> K(*) mu+ mu- Decay and First Observation of the Bs -> phi mu+ mu- Decay
We reconstruct the rare decays B+ -> K+ mu+ mu-, B0 -> K*0 mu+ mu-, and Bs -> phi mu+ mu- in a data sample of ppbar collisions at sqrt(s) = 1.96 TeV collected by the CDF II detector at the Fermilab Tevatron Collider, and report their branching ratios. In addition, we report the measurement of the differential branching ratio and the muon forward-backward asymmetry in the B+ -> K+ mu+ mu- and B0 -> K*0 mu+ mu- decay modes, and the K*0 longitudinal polarization in the B0 -> K*0 mu+ mu- decay mode, as functions of the squared dimuon mass. These results are consistent with the theoretical predictions of the standard model and with the most recent determinations from other experiments, and are of comparable accuracy. We also report the first observation of the Bs -> phi mu+ mu- decay and measure its branching ratio to be B(Bs -> phi mu+ mu-) = (1.44 +/- 0.33 +/- 0.46) x 10^-6. With 27 +/- 6 signal events, this is the rarest Bs decay observed.
Comment: 7 pages, 2 figures, 3 tables. Submitted to Phys. Rev. Lett.
Measurements of the properties of Lambda_c(2595), Lambda_c(2625), Sigma_c(2455), and Sigma_c(2520) baryons
We report measurements of the resonance properties of Lambda_c(2595)+ and
Lambda_c(2625)+ baryons in their decays to Lambda_c+ pi+ pi- as well as
Sigma_c(2455)++,0 and Sigma_c(2520)++,0 baryons in their decays to Lambda_c+
pi+/- final states. These measurements are performed using data corresponding
to 5.2/fb of integrated luminosity from ppbar collisions at sqrt(s) = 1.96 TeV,
collected with the CDF II detector at the Fermilab Tevatron. Exploiting the
largest available charmed baryon sample, we measure masses and decay widths
with uncertainties comparable to the world averages for Sigma_c states, and
significantly smaller uncertainties than the world averages for excited
Lambda_c+ states.
Comment: added one reference and one table, changed order of figures, 17 pages, 15 figures
Search for a New Heavy Gauge Boson Wprime with Electron + missing ET Event Signature in ppbar collisions at sqrt(s)=1.96 TeV
We present a search for a new heavy charged vector boson W' decaying to an electron-neutrino pair in ppbar collisions at a center-of-mass energy of 1.96 TeV. The data were collected with the CDF II detector and correspond to an integrated luminosity of 5.3 fb^-1. No significant excess above the standard model expectation is observed, and we set upper limits on the W' production cross section times branching ratio. Assuming standard model couplings to fermions and the neutrino from the W' boson decay to be light, we exclude a W' boson with mass less than 1.12 TeV/c^2 at the 95% confidence level.
Comment: 7 pages, 2 figures. Submitted to PR
Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV
The performance of muon reconstruction, identification, and triggering in CMS
has been studied using 40 inverse picobarns of data collected in pp collisions
at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection
criteria covering a wide range of physics analysis needs have been examined.
For all considered selections, the efficiency to reconstruct and identify a
muon with a transverse momentum pT larger than a few GeV is above 95% over the
whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4,
while the probability to misidentify a hadron as a muon is well below 1%. The
efficiency to trigger on single muons with pT above a few GeV is higher than
90% over the full eta range, and typically substantially better. The overall
momentum scale is measured to a precision of 0.2% with muons from Z decays. The
transverse momentum resolution varies from 1% to 6% depending on pseudorapidity
for muons with pT below 100 GeV and, using cosmic rays, it is shown to be
better than 10% in the central region up to pT = 1 TeV. Observed distributions
of all quantities are well reproduced by the Monte Carlo simulation.
Comment: Replaced with published version. Added journal reference and DOI