The Importance Of Community: Investing In Effective Community-Based Crime Prevention Strategies
After more than a year of listening to our community, researching evidence-based practices, and evaluating our own efforts, 'The Importance of Community' inaugural report unequivocally asserts that our greatest potential for reducing homicides and incarceration lies in collective community action and targeted interventions aimed at serving narrowly defined populations. In this report, The Indianapolis Foundation summarizes years of community-based recommendations and provides a specific community investment plan based on multiple community convenings, crime-prevention reports, and listening to our community.
Empowerment as a metric for Optimization in HCI
We propose a novel metric for optimizing human-computer interfaces, based on the information-theoretic capacity of empowerment, a task-independent universal utility measure. For agent-environment systems with stochastic transitions, empowerment measures how much influence an agent has on its environment, as sensed by the agent's own sensors. It captures the uncertainty in human-machine systems arising from different sources (e.g. noise, delays, errors) as a single quantity. We suggest the potential empowerment has as an objective optimality criterion in user interface design optimization, contributing to more solid theoretical foundations for HCI.
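As a sketch of the quantity involved: for a discrete one-step system, empowerment is the Shannon channel capacity of the action-to-sensor channel p(s'|a), which can be computed with the Blahut-Arimoto algorithm. The function name and the toy transition matrices below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def _divergence_bits(p_sa, p_s):
    """Per-action KL divergence D( p(s'|a) || p(s') ) in bits."""
    ps = np.broadcast_to(p_s, p_sa.shape)
    mask = p_sa > 0
    terms = np.zeros_like(p_sa)
    terms[mask] = p_sa[mask] * np.log2(p_sa[mask] / ps[mask])
    return terms.sum(axis=1)

def empowerment(p_s_given_a, iters=200):
    """One-step empowerment: channel capacity max_{p(a)} I(A; S'),
    computed with the Blahut-Arimoto algorithm.

    p_s_given_a: (num_actions, num_sensor_states) row-stochastic array.
    Returns the capacity in bits."""
    p_sa = np.asarray(p_s_given_a, dtype=float)
    p_a = np.full(p_sa.shape[0], 1.0 / p_sa.shape[0])   # uniform start
    for _ in range(iters):
        p_s = p_a @ p_sa                                # marginal sensor distribution
        p_a = p_a * np.exp2(_divergence_bits(p_sa, p_s))  # Blahut-Arimoto update
        p_a /= p_a.sum()
    return float(p_a @ _divergence_bits(p_sa, p_a @ p_sa))

# A deterministic 4-action system reaching 4 distinct sensor states has
# empowerment log2(4) = 2 bits; a channel whose outcome ignores the
# action has empowerment 0.
print(empowerment(np.eye(4)))              # -> 2.0
print(empowerment(np.full((4, 4), 0.25)))  # -> 0.0
```

The capacity view makes the "influence sensed by the agent's sensors" reading precise: noise, delays, and errors all shrink the same single number.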
Block-Conditional Missing at Random Models for Missing Data
Two major ideas in the analysis of missing data are (a) the EM algorithm
[Dempster, Laird and Rubin, J. Roy. Statist. Soc. Ser. B 39 (1977) 1--38] for
maximum likelihood (ML) estimation, and (b) the formulation of models for the
joint distribution of the data Y and missing data indicators M, and the
associated "missing at random" (MAR) condition under which a model for M
is unnecessary [Rubin, Biometrika 63 (1976) 581--592]. Most previous work has
treated Y and M as single blocks, yielding selection or pattern-mixture
models depending on how their joint distribution is factorized. This paper
explores "block-sequential" models that interleave subsets of the variables
and their missing data indicators, and then make parameter restrictions based
on assumptions in each block. These include models that are not MAR. We examine
a subclass of block-sequential models we call block-conditional MAR (BCMAR)
models, and an associated block-monotone reduced likelihood strategy that
typically yields consistent estimates by selectively discarding some data.
Alternatively, full ML estimation can often be achieved via the EM algorithm.
We examine in some detail BCMAR models for the case of two multinomially
distributed categorical variables, and a two-block structure where the first
block is categorical and the second block arises from a (possibly multivariate)
exponential family distribution.
Comment: Published in Statistical Science (http://www.imstat.org/sts/) at
http://dx.doi.org/10.1214/10-STS344 by the Institute of Mathematical
Statistics (http://www.imstat.org)
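To make the EM idea concrete, here is a minimal sketch, not the paper's BCMAR machinery, of ML estimation by EM for a 2x2 multinomial in which the second variable is missing at random given the first; the counts and function name are invented for illustration.

```python
import numpy as np

def em_two_way(n_full, n_xonly, iters=100):
    """EM for the cell probabilities pi[x, y] of two binary variables
    (X, Y) when Y is missing at random given X for some units.

    n_full:  2x2 counts with both X and Y observed.
    n_xonly: length-2 counts with only X observed (Y missing)."""
    n_full = np.asarray(n_full, dtype=float)
    n_xonly = np.asarray(n_xonly, dtype=float)
    pi = np.full((2, 2), 0.25)                        # uniform starting values
    for _ in range(iters):
        # E-step: allocate the incomplete counts over Y via current p(y|x)
        p_y_given_x = pi / pi.sum(axis=1, keepdims=True)
        expected = n_full + n_xonly[:, None] * p_y_given_x
        # M-step: closed-form multinomial ML from the completed counts
        pi = expected / expected.sum()
    return pi

# Under MAR given X, EM recovers p(y|x) from the complete cases and
# p(x) from all cases.
pi_hat = em_two_way([[30, 10], [10, 30]], [20, 20])
print(pi_hat)   # -> approximately [[0.375, 0.125], [0.125, 0.375]]
```

Here MAR makes a model for the missingness indicator unnecessary, exactly the point (b) of the abstract; the EM iterations only ever touch the data model.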
Model of Coordination Flow in Remote Collaborative Interaction
© 2015 IEEE. We present an information-theoretic approach for modelling coordination in human-human interaction and measuring coordination flow in a remote collaborative tracking task. Building on Shannon's mutual information, coordination flow measures, for stochastic collaborative systems, how much influence the environment has on the joint control of the collaborating parties. We demonstrate the application of the approach on interactive human data recorded in a user study and reveal the amount of effort required for creating rigorous models. Our initial results suggest the potential coordination flow has, as an objective, task-independent measure, in supporting designers of human collaborative systems and in providing better theoretical foundations for the science of Human-Computer Interaction.
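Since coordination flow builds on Shannon's mutual information, a minimal illustration of the underlying quantity may help; the joint distribution below is a toy example, not the paper's measure or data.

```python
import numpy as np

def mutual_information_bits(p_xy):
    """Shannon mutual information I(X; Y) in bits from a joint
    probability table p_xy[i, j] = P(X = i, Y = j)."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    indep = p_x @ p_y                       # product of marginals
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / indep[mask])))

# Perfectly coupled binary variables share 1 bit; independent ones share 0.
print(mutual_information_bits([[0.5, 0.0], [0.0, 0.5]]))      # -> 1.0
print(mutual_information_bits([[0.25, 0.25], [0.25, 0.25]]))  # -> 0.0
```

In practice the joint table would be estimated by histogramming recorded (environment state, joint control) pairs from a study like the tracking task described above.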
Fatigue damage of notched boron/epoxy laminates under constant amplitude loading
Fatigue damage in (0, ±45) and (0, ±45, 90) boron/epoxy laminates was studied with X-ray radiography and scanning electron microscopy. In addition, limited tests for residual strength and stiffness were performed. The results of this study suggest that the ±45-degree plies play a key role in the fatigue process of boron/epoxy laminates that contain them. The fatigue process in the ±45-degree plies starts as intralaminar matrix cracks.
Surface-sampled simulations of turbulent flow at high Reynolds number
A new approach to turbulence simulation, based on a combination of large-eddy
simulation (LES) for the whole flow and an array of non-space-filling
quasi-direct numerical simulations (QDNS), which sample the response of
near-wall turbulence to large-scale forcing, is proposed and evaluated. The
technique overcomes some of the cost limitations of turbulence simulation,
since the main flow is treated with a coarse-grid LES, with the equivalent of
wall functions supplied by the near-wall sampled QDNS. Two cases are tested, at
friction Reynolds number Re_τ = 4200 and 20,000. The total grid node count
for the first case is less than half a million and less than two million for
the second case, with the calculations only requiring a desktop computer. A
good agreement with published DNS is found at Re_τ = 4200, both in terms of
the mean velocity profile and the streamwise velocity fluctuation statistics,
which correctly show a substantial increase in near-wall turbulence levels due
to a modulation of near-wall streaks by large-scale structures. The trend
continues at Re_τ = 20,000, in agreement with experiment, which represents
one of the major achievements of the new approach. A number of detailed aspects
of the model, including numerical resolution, LES-QDNS coupling strategy and
sub-grid model are explored. A low level of grid sensitivity is demonstrated
for both the QDNS and LES aspects. Since the method does not assume a law of
the wall, it can in principle be applied to flows that are out of equilibrium.
Comment: Author accepted version. Accepted for publication in the
International Journal for Numerical Methods in Fluids on 26 April 201
Accounting for the Electoral Effects of Short Term Economic Fluctuations: The Role of Incumbency-Oriented and Policy-Oriented Voting
Several previous studies have found considerable evidence of incumbency-oriented voting, i.e. voting for or against the incumbent president and candidates of his party on the basis of fluctuations in economic conditions. This study explores the hypothesis that voting in response to economic conditions is often policy-oriented.
The Rationality of Candidates who Challenge Incumbents in Congressional Elections
Making use of the numerous resources available to them, incumbent congressmen have come to enjoy very high rates of success in getting reelected. Typically, however, incumbents are challenged by relatively weak, unknown candidates, while potentially much stronger candidates are deterred. So why do these weak candidates engage in such apparently foolish behavior?
Previous research has suggested several answers to this question. It is commonly argued that weak, inexperienced candidates either misperceive the odds against them, or that they are actually using a congressional campaign to pursue nonpolitical goals or political goals other than winning office. Others point out that weak candidates may be induced to run despite a low probability of victory because their political opportunity costs are low or because a stronger-than-expected showing may serve as an investment in future campaigns. This paper argues, however, that there is a much simpler and more direct reason why weak candidates choose to run against incumbents: they do so to maximize their probability of being elected to Congress.
The Demise of California's Public Schools Reconsidered
I’m not sure when I first got interested in this particular line of research—the fact that I have a son who is now 10 and that we had to make a lot of decisions about his educational future probably got me a bit worried, but I think it actually dates back to when we first arrived in California in the fall of ’79. It seemed that all anyone was talking about was Proposition 13, which had passed by a nearly 2-to-1 margin (65 to 35 percent) the previous year. Everywhere we went, it was Proposition 13 this and Proposition 13 that. Some people felt that the voters had just gotten into an angry snit and had irrationally gone on an antigovernment crusade without thinking about the consequences; people on the other side felt that they had been provoked by then-governor Jerry Brown’s inane fiscal policies. I don’t know if we ever sorted that out, but the conventional wisdom, both among public-policy experts and the voters on the street, has been that Proposition 13 was roughly equal to the Sylmar earthquake, except that we inflicted it upon ourselves
Approval Voting: the Case of the 1968 Election
For most of this nation's history, electoral democracy has consisted mainly of competition between the candidates of two major national parties. Thus in most elections for national office the present method of categorical voting, that is, voters vote for one candidate only, has served adequately. Except for the electoral college, this method satisfies a simple democratic criterion: the winner is preferred to the loser by a simple majority of the voters.
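The contrast between categorical (plurality) voting and approval voting can be sketched with a toy tally; the ballots below are invented for illustration, not 1968 returns.

```python
from collections import Counter

def plurality_winner(votes):
    """Categorical voting: each voter names exactly one candidate."""
    return Counter(votes).most_common(1)[0][0]

def approval_winner(ballots):
    """Approval voting: each ballot is the set of candidates the voter
    approves of; the winner is the candidate approved by the most voters."""
    tally = Counter()
    for ballot in ballots:
        tally.update(set(ballot))   # at most one approval per candidate per voter
    return tally.most_common(1)[0][0]

# Seven voters, three candidates. Under plurality A wins with 3 votes,
# but C is approved by a majority (4 of 7) and wins under approval voting.
plurality = plurality_winner(["A", "A", "A", "B", "B", "C", "C"])
approval = approval_winner([{"A"}, {"A"}, {"A"},
                            {"B", "C"}, {"B", "C"}, {"C"}, {"C"}])
print(plurality, approval)   # -> A C
```

With more than two serious candidates, as in 1968, the two rules can pick different winners even though both reduce to the same majority criterion in a two-way race.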