Neural scaling laws for an uncertain world
Autonomous neural systems must efficiently process information in a wide
range of novel environments, which may have very different statistical
properties. We consider the problem of how to optimally distribute receptors
along a one-dimensional continuum consistent with the following design
principles. First, neural representations of the world should obey a neural
uncertainty principle---making as few assumptions as possible about the
statistical structure of the world. Second, neural representations should
convey, as much as possible, equivalent information about environments with
different statistics. The results of these arguments resemble the structure of
the visual system and provide a natural explanation of the behavioral
Weber-Fechner law, a foundational result in psychology. Because the derivation
is extremely general, this suggests that similar scaling relationships should
be observed not only in sensory continua, but also in neural representations of
``cognitive'' one-dimensional quantities such as time or numerosity.
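The scale-invariant receptor layout the abstract argues for can be illustrated with a minimal sketch (illustrative only, not the authors' derivation): if receptors are spaced geometrically along the continuum, the relative spacing between neighbors is constant everywhere, which is exactly the constant Weber fraction behind the Weber-Fechner law.

```python
import numpy as np

# Hypothetical sketch: place receptors with geometric (logarithmic)
# spacing along a one-dimensional continuum. The values below are
# illustrative assumptions, not parameters from the paper.
x_min, x_max, n = 0.1, 100.0, 50
receptors = np.geomspace(x_min, x_max, n)

# The relative spacing between adjacent receptors (a Weber fraction)
# is the same everywhere on the continuum: scale-invariant resolution.
weber_fractions = np.diff(receptors) / receptors[:-1]
assert np.allclose(weber_fractions, weber_fractions[0])
```

Because the spacing grows in proportion to position, the representation conveys equivalent information about stimuli regardless of their absolute scale.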
Towards a neural-level cognitive architecture: modeling behavior in working memory tasks with neurons
Constrained by results from classic behavioral experiments we
provide a neural-level cognitive architecture for modeling behavior
in working memory tasks. We propose a canonical
microcircuit that can be used as a building block for working
memory, decision making and cognitive control. The controller
controls gates to route the flow of information between
the working memory and the evidence accumulator and sets
parameters of the circuits. We show that this type of cognitive
architecture can account for results in behavioral experiments
such as judgment of recency, probe recognition, and
delayed-match-to-sample. In addition, the neural dynamics generated
by the cognitive architecture provides a good match with neurophysiological
data from rodents and monkeys. For instance,
it generates cells tuned to a particular amount of elapsed time
(time cells), to a particular position in space (place cells) and
to a particular amount of accumulated evidence.
http://sites.bu.edu/tcn/files/2019/05/Cogsci2019_TiganjEtal.pdf
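The gated routing described above can be sketched in a few lines. This is a hypothetical toy, not the authors' model: a controller opens or closes a gate between a working-memory store and an evidence accumulator, so evidence flows only when the controller permits it.

```python
# Illustrative sketch of a controller-gated microcircuit; all names
# and values here are assumptions for exposition, not the paper's code.
class Microcircuit:
    def __init__(self):
        self.working_memory = []   # stored items
        self.evidence = 0.0        # accumulator state
        self.gate_open = False     # set by the controller

    def set_gate(self, is_open):
        # The controller routes information by opening/closing gates.
        self.gate_open = is_open

    def encode(self, item):
        self.working_memory.append(item)

    def step(self, match_strength):
        # Evidence from comparing a probe against working memory reaches
        # the accumulator only while the gate is open.
        if self.gate_open:
            self.evidence += match_strength
        return self.evidence

circuit = Microcircuit()
circuit.encode("probe")
circuit.set_gate(True)
circuit.step(0.4)
circuit.step(0.4)   # evidence accumulates while the gate is open
circuit.set_gate(False)
circuit.step(1.0)   # ignored: gate closed, accumulator unchanged
```

The same gate-and-accumulate motif can serve working memory, decision making, and cognitive control, which is the sense in which the microcircuit is a reusable building block.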
Evidence accumulation in a Laplace domain decision space
Evidence accumulation models of simple decision-making have long assumed that
the brain estimates a scalar decision variable corresponding to the
log-likelihood ratio of the two alternatives. Typical neural implementations of
this algorithmic cognitive model assume that large numbers of neurons are each
noisy exemplars of the scalar decision variable. Here we propose a neural
implementation of the diffusion model in which many neurons construct and
maintain the Laplace transform of the distance to each of the decision bounds.
As in classic findings from brain regions including LIP, the firing rate of
neurons coding for the Laplace transform of net accumulated evidence grows to a
bound during random dot motion tasks. However, rather than noisy exemplars of a
single mean value, this approach makes the novel prediction that firing rates
grow to the bound exponentially and that, across neurons, there should be a
distribution of different growth rates. A second set of neurons records an
approximate inversion of the Laplace transform; these neurons directly estimate
net accumulated evidence. In analogy to time cells and place cells observed in the hippocampus
and other brain regions, the neurons in this second set have receptive fields
along a "decision axis." This prediction is consistent with recent rodent
recordings. This theoretical approach places simple evidence accumulation
models in the same mathematical language as recent proposals for representing
time and space in cognitive models of memory.
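The core prediction can be sketched numerically (an illustrative toy under assumed parameters, not the paper's implementation): simulate a drift-diffusion trajectory, then let a bank of neurons, each with its own rate constant, encode the real Laplace transform of the distance to the upper bound. Each neuron's firing grows toward the bound exponentially, but at a different rate across the bank.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical drift-diffusion trajectory; drift, noise, and step
# parameters are illustrative assumptions.
bound = 1.0
drift, noise, dt, steps = 0.5, 0.3, 0.01, 200
x = np.cumsum(drift * dt + noise * np.sqrt(dt) * rng.standard_normal(steps))
x = np.clip(x, -bound, bound)          # absorb at the bounds

# One rate constant s per neuron; the bank jointly encodes the
# Laplace transform exp(-s * d) of the distance d to the upper bound.
s = np.linspace(1.0, 10.0, 20)
distance = bound - x                    # shrinks as evidence accumulates
F = np.exp(-np.outer(s, distance))     # shape: (neurons, time)

# Each row rises toward 1 as the trajectory nears the bound, but with a
# different exponential rate, so the population is a distribution of
# growth rates rather than noisy copies of one scalar variable.
```

A second population approximating the inverse transform would then show receptive fields at particular values of net evidence, the "decision axis" analog of time cells and place cells.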
Rapid presentation rate negatively impacts the contiguity effect in free recall
It is well-known that in free recall participants tend to recall
words presented close together in time in sequence, reflecting
a form of temporal binding in memory. This contiguity effect
is robust, having been observed across many different experimental
manipulations. In order to explore a potential boundary
on the contiguity effect, participants performed a free recall
task in which items were presented at rates ranging from 2 Hz
to 8 Hz. Participants were still able to recall items even at
the fastest presentation rate, though accuracy decreased. Importantly,
the contiguity effect flattened as presentation rates
increased. These findings illuminate possible constraints on
the temporal encoding of episodic memories.
http://sites.bu.edu/tcn/files/2019/05/RSVP_FR.pdf
Action Evaluation in the Theory and Practice of Conflict Resolution
Questions of evaluation are important to conveners, participants and funders of conflict resolution initiatives. Yet good evaluation is tied to a number of complicated questions concerning what constitutes success and failure in projects that may be multi-dimensional or only part of an effort to settle a larger conflict. Rothman has offered Action Evaluation as a methodology that seeks to incorporate goal setting and evaluation into project designs. He argues that this will improve a project by monitoring the changing nature of goals through the life of a conflict resolution intervention, and Action Evaluation’s self-conscious attention to goal setting offers a mechanism for developing and committing an intervention to specific internal and external standards of evaluation. This article examines Action Evaluation as a theory of practice, considering its conceptual strengths and examining specific issues of its implementation.
Some Guidelines for Conceptualizing Success in Conflict Resolution Evaluation
The immediate job of project evaluation is to decide what worked and what didn’t. However, the more challenging task is making sense of why success or failure occurred and, in so doing, to propose appropriate future action. Effective evaluation of conflict resolution initiatives is complicated since interventions involve multiple goals and cross-level connections where indirect effects are often not seen in the short run. This paper argues that there is no single best instrument or method for evaluating the extent to which conflict resolution practice has been successful. However, this does not mean that evaluation should be ignored. Instead, projects need to develop methods that are good enough to be applied in contextually appropriate ways. To assist in this process, this article offers six guidelines for deciding when, how, and the extent to which specific conflict resolution interventions are effective. Good evaluation requires a self-conscious effort to articulate the most significant goals of different groups of participants and to track goal evolution in the course of a project using multiple, operational criteria. It should address the question of transfer: the ways in which direct work with only a small number of project participants is expected to have more extensive, indirect effects on the course of the wider conflict. If it is done well, good evaluation helps practitioners define future activities and helps interveners and funders to imagine good-enough conflict management, asking not whether they have fully resolved a complicated conflict but whether they have improved conditions sufficiently so that the parties in the conflict are more likely to develop the capacity to manage it constructively in the future.
Constitutional Law - Charitable Tax Exemptions - Granting of Tax Benefits to Discriminatory Fraternal Orders is a Violation of the Equal Protection Aspect of the Fifth Amendment