Empowerment as a metric for Optimization in HCI
We propose a novel metric for optimizing human-computer interfaces, based on the information-theoretic capacity of empowerment, a task-independent universal utility measure. For agent-environment systems with stochastic transitions, empowerment measures how much influence an agent has on its environment, insofar as that influence can be sensed by the agent's own sensors. It captures the uncertainty in human-machine systems arising from different sources (e.g. noise, delays, errors) as a single quantity. We suggest that empowerment has potential as an objective optimality criterion in user interface design optimization, contributing to more solid theoretical foundations for HCI.
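Empowerment is usually formalized as the Shannon channel capacity between an agent's actions and its subsequent sensor states. The abstract gives no computation, but a minimal sketch of that quantity for a small stochastic interface model might look like the following; the Blahut-Arimoto iteration and the toy two-command noise model are assumptions for illustration, not part of the paper.

    import numpy as np

    def empowerment_bits(p_s_given_a, iters=500, tol=1e-10):
        # Channel capacity max_{p(a)} I(A; S') via the Blahut-Arimoto iteration.
        # p_s_given_a: (num_actions, num_states) array, rows are p(s' | a).
        num_actions = p_s_given_a.shape[0]
        p_a = np.full(num_actions, 1.0 / num_actions)   # start from uniform actions
        for _ in range(iters):
            p_s = p_a @ p_s_given_a                     # marginal over successor states
            # exp of the KL divergence D(p(s'|a) || p(s')) for each action
            ratio = np.where(p_s_given_a > 0, p_s_given_a / p_s, 1.0)
            d = np.exp((p_s_given_a * np.log(ratio)).sum(axis=1))
            new_p_a = p_a * d / (p_a @ d)
            if np.abs(new_p_a - p_a).max() < tol:
                p_a = new_p_a
                break
            p_a = new_p_a
        p_s = p_a @ p_s_given_a
        ratio = np.where(p_s_given_a > 0, p_s_given_a / p_s, 1.0)
        return float((p_a[:, None] * p_s_given_a * np.log2(ratio)).sum())

    # Toy interface: two commands whose sensed outcome is misread 10% of the time.
    noise = 0.1
    p = np.array([[1 - noise, noise],
                  [noise, 1 - noise]])
    print(empowerment_bits(p))   # approaches 1 bit as the noise goes to 0

In this reading, more noise, delay, or error in the interface shrinks the capacity of the action-to-sensor channel, which is why a single empowerment number can summarize several sources of uncertainty.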
Fatigue damage of notched boron/epoxy laminates under constant amplitude loading
Fatigue damage in (0, ±45) and (0, ±45, 90) boron/epoxy laminates was studied with X-ray radiography and scanning electron microscopy. In addition, limited tests for residual strength and stiffness were performed. The results of this study suggest that the ±45-degree plies play a key role in the fatigue process of boron/epoxy laminates that contain them. The fatigue process in the ±45-degree plies starts as intralaminar matrix cracks.
The Importance Of Community: Investing In Effective Community-Based Crime Prevention Strategies
After more than a year of listening to our community, researching evidence-based practices, and evaluating our own efforts, 'The Importance of Community' inaugural report unequivocally asserts that our greatest potential for reducing homicides and crime-related incarceration is deeply rooted in collective community action and targeted interventions aimed at serving narrowly defined populations. In this report, The Indianapolis Foundation summarizes years of community-based recommendations and provides a specific community investment plan based on multiple community convenings, crime-prevention reports, and listening to our community.
The Rationality of Candidates who Challenge Incumbents in Congressional Elections
Making use of the numerous resources available to them, incumbent congressmen have come to enjoy very high rates of success in getting reelected. Typically, however, incumbents are challenged by relatively weak, unknown candidates, while potentially much stronger candidates are deterred. So why do these weak candidates engage in such apparently foolish behavior?
Previous research has suggested several answers to this question. It is commonly argued that weak, inexperienced candidates either misperceive the odds against them or are actually using a congressional campaign to pursue nonpolitical goals, or political goals other than winning office. Others point out that weak candidates may be induced to run by a low probability of victory because their political opportunity costs are low or because a stronger than expected showing may serve as an investment in future campaigns. This paper argues, however, that there is a much simpler and more direct reason why weak candidates choose to run against incumbents: they do so to maximize their probability of being elected to Congress.
The Demise of California's Public Schools Reconsidered
I’m not sure when I first got interested in this particular line of research—the fact that I have a son who is now 10 and that we had to make a lot of decisions about his educational future probably got me a bit worried, but I think it actually dates back to when we first arrived in California in the fall of ’79. It seemed that all anyone was talking about was Proposition 13, which had passed by a nearly 2-to-1 margin (65 to 35 percent) the previous year. Everywhere we went, it was Proposition 13 this and Proposition 13 that. Some people felt that the voters had just gotten into an angry snit and had irrationally gone on an antigovernment crusade without thinking about the consequences; people on the other side felt that they had been provoked by then-governor Jerry Brown’s inane fiscal policies. I don’t know if we ever sorted that out, but the conventional wisdom, both among public-policy experts and the voters on the street, has been that Proposition 13 was roughly equal to the Sylmar earthquake, except that we inflicted it upon ourselves
Approval Voting: the Case of the 1968 Election
For most of this nation's history, electoral democracy has consisted mainly of competition between the candidates of two major national parties. Thus, in most elections for national office, the present method of categorical voting, in which voters vote for one candidate only, has served adequately. Except for the electoral college, this method satisfies a simple democratic criterion: the winner is preferred to the loser by a simple majority of the voters.
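The contrast the abstract draws is between categorical (plurality) voting, where each ballot names one candidate, and approval voting, where each ballot may approve any subset of candidates. A minimal sketch of the two tallies follows; the three-candidate ballots are invented for illustration and are not data from the 1968 election.

    from collections import Counter

    # Hypothetical ballots: each voter's approval set, with the first
    # entry taken as that voter's single "categorical" (plurality) vote.
    ballots = [
        ["A"], ["A"], ["A", "C"],
        ["B", "C"], ["B"], ["B", "C"],
        ["C"],
    ]

    plurality = Counter(b[0] for b in ballots)          # one vote per ballot
    approval = Counter(c for b in ballots for c in b)   # one vote per approved candidate

    print(plurality.most_common())  # A and B tie at 3 under plurality
    print(approval.most_common())   # C leads under approval despite few first choices

The point of such examples is that a candidate acceptable to many voters but the first choice of few can win under approval voting while losing under categorical voting.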
Quantitative magnetic resonance image analysis via the EM algorithm with stochastic variation
Quantitative Magnetic Resonance Imaging (qMRI) provides researchers with insight into pathological and physiological alterations of living tissue, with the help of which they hope to predict (local) therapeutic efficacy early and determine an optimal treatment schedule. However, the analysis of qMRI has been limited to ad hoc heuristic methods. Our research provides a powerful statistical framework for image analysis and sheds light on future localized adaptive treatment regimes tailored to the individual's response. We assume that, in an imperfect world, we only observe a blurred and noisy version of the underlying pathological/physiological changes via qMRI, due to measurement errors or unpredictable influences. We use a hidden Markov random field to model the spatial dependence in the data and develop a maximum likelihood approach via the Expectation-Maximization algorithm with stochastic variation. An important improvement over previous work is the assessment of variability in parameter estimation, which is the valid basis for statistical inference. More importantly, we focus on the expected changes rather than image segmentation. Our research has shown that the approach is powerful in both simulation studies and on a real dataset, while quite robust in the presence of some model assumption violations.
Comment: Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org), DOI: http://dx.doi.org/10.1214/07-AOAS157.
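The abstract names the modelling ingredients (Gaussian-type emissions, a hidden Markov random field for spatial dependence, EM with stochastic variation) but not their implementation. The following is a heavily simplified sketch of that combination, with a mean-field E-step standing in for the paper's stochastic variation and a synthetic image in place of real qMRI data; all names and parameter values are assumptions.

    import numpy as np

    def em_hmrf(y, beta=1.0, iters=30):
        # Toy EM-style fit of a two-class Gaussian hidden Markov random field.
        # y: 2-D array of voxel intensities; hidden label z in {0, 1} per voxel;
        # Gaussian emissions N(mu_k, sigma_k^2); Potts-style coupling beta.
        # The mean-field E-step is a heuristic stand-in for stochastic variation.
        mu = np.array([y.mean() - y.std(), y.mean() + y.std()])
        sigma = np.array([y.std(), y.std()])
        q = np.full(y.shape + (2,), 0.5)            # q[i, j, k] = P(z_ij = k)

        def neighbour_sum(f):                       # sum over the 4-neighbourhood
            s = np.zeros_like(f)
            s[1:, :] += f[:-1, :]; s[:-1, :] += f[1:, :]
            s[:, 1:] += f[:, :-1]; s[:, :-1] += f[:, 1:]
            return s

        for _ in range(iters):
            # E-step (mean field): emission log-likelihood + neighbour agreement
            log_post = np.empty_like(q)
            for k in (0, 1):
                loglik = -0.5 * ((y - mu[k]) / sigma[k]) ** 2 - np.log(sigma[k])
                log_post[..., k] = loglik + beta * neighbour_sum(q[..., k])
            log_post -= log_post.max(axis=-1, keepdims=True)
            q = np.exp(log_post)
            q /= q.sum(axis=-1, keepdims=True)
            # M-step: responsibility-weighted means and variances of each class
            for k in (0, 1):
                w = q[..., k]
                mu[k] = (w * y).sum() / w.sum()
                sigma[k] = np.sqrt((w * (y - mu[k]) ** 2).sum() / w.sum() + 1e-6)
        return q.argmax(axis=-1), mu, sigma

    # Synthetic example: a noisy square "lesion" on a flat background.
    img = np.zeros((32, 32))
    img[10:20, 10:20] = 2.0
    img += np.random.default_rng(1).normal(scale=0.7, size=img.shape)
    labels, mu, sigma = em_hmrf(img)
    print(mu, sigma)

The paper's focus on expected changes rather than hard segmentation corresponds, in this sketch, to working with the posterior probabilities q rather than the argmax labels.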
Block-Conditional Missing at Random Models for Missing Data
Two major ideas in the analysis of missing data are (a) the EM algorithm [Dempster, Laird and Rubin, J. Roy. Statist. Soc. Ser. B 39 (1977) 1-38] for maximum likelihood (ML) estimation, and (b) the formulation of models for the joint distribution of the data and the missing-data indicators, and the associated "missing at random" (MAR) condition under which a model for the missing-data mechanism is unnecessary [Rubin, Biometrika 63 (1976) 581-592]. Most previous work has treated the data and the indicators as single blocks, yielding selection or pattern-mixture models depending on how their joint distribution is factorized. This paper explores "block-sequential" models that interleave subsets of the variables and their missing-data indicators, and then make parameter restrictions based on assumptions in each block. These include models that are not MAR. We examine a subclass of block-sequential models we call block-conditional MAR (BCMAR) models, and an associated block-monotone reduced likelihood strategy that typically yields consistent estimates by selectively discarding some data. Alternatively, full ML estimation can often be achieved via the EM algorithm. We examine in some detail BCMAR models for the case of two multinomially distributed categorical variables, and a two-block structure where the first block is categorical and the second block arises from a (possibly multivariate) exponential family distribution.
Comment: Published in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org), DOI: http://dx.doi.org/10.1214/10-STS344.
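The abstract only names the block-monotone reduced likelihood strategy. As a very rough illustration of the "selectively discarding some data" idea for two categorical variables, one might factor the joint distribution block by block and estimate each factor from the cases that are complete through that block. The data frame and the exact reduction rule below are assumptions for illustration, not the paper's estimator.

    import numpy as np
    import pandas as pd

    # Hypothetical data: categorical y1 (block 1) and y2 (block 2),
    # with NaN marking missing values. The specific values are invented.
    df = pd.DataFrame({
        "y1": ["a", "a", "b", np.nan, "b", "a", np.nan, "b"],
        "y2": ["x", np.nan, "y", "y", np.nan, "x", "x", "y"],
    })

    # Block-monotone reduction (a rough reading of the strategy): for the second
    # block, keep only cases whose first block is observed, so the retained data
    # have a monotone block pattern. Each factor is then estimated from the
    # cases that are complete up through its block.
    p_y1 = df["y1"].dropna().value_counts(normalize=True)            # block 1 factor
    both = df.dropna(subset=["y1", "y2"])                            # monotone cases
    p_y2_given_y1 = (both.groupby("y1")["y2"]
                         .value_counts(normalize=True))              # block 2 factor

    print(p_y1)
    print(p_y2_given_y1)

Discarding the non-monotone cases sacrifices some efficiency relative to full ML via EM, which is the trade-off the abstract alludes to.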
Presidential Influence on Congressional Appropriations Decisions
We investigate the extent to which possession of the veto allows the president to influence congressional decisions regarding regular annual appropriations legislation. The most important implication of our analysis is that the influence the veto conveys is asymmetrical: it allows the president to restrain Congress when he prefers to appropriate less to an agency than Congress does, but it does not provide him an effective means of extracting higher appropriations when he prefers to spend more. This asymmetry derives from Constitutional limitations on the veto, the sequencing of the appropriations process provided by the Budget and Accounting Act of 1921, and the presence of a de facto reversionary expenditure level contained in continuing resolutions (Fenno, 1966). We find strong support for this proposition in a regression of presidential requests upon congressional appropriations decisions.
