Transparency by Design: Closing the Gap Between Performance and Interpretability in Visual Reasoning
Visual question answering requires high-order reasoning about an image, which
is a fundamental capability needed by machine systems to follow complex
directives. Recently, modular networks have been shown to be an effective
framework for performing visual reasoning tasks. While modular networks were
initially designed with a degree of model transparency, their performance on
complex visual reasoning benchmarks was lacking. Current state-of-the-art
approaches do not provide an effective mechanism for understanding the
reasoning process. In this paper, we close the performance gap between
interpretable models and state-of-the-art visual reasoning methods. We propose
a set of visual-reasoning primitives which, when composed, manifest as a model
capable of performing complex reasoning tasks in an explicitly-interpretable
manner. The fidelity and interpretability of the primitives' outputs enable an
unparalleled ability to diagnose the strengths and weaknesses of the resulting
model. Critically, we show that these primitives are highly performant,
achieving state-of-the-art accuracy of 99.1% on the CLEVR dataset. We also show
that our model is able to effectively learn generalized representations when
provided a small amount of data containing novel object attributes. Using the
CoGenT generalization task, we show more than a 20 percentage point improvement
over the current state of the art.
Comment: CVPR 2018 pre-print
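The compositional idea behind such primitives can be illustrated with a toy symbolic analogue (the actual model composes learned neural modules over image features; the scene representation and function names below are illustrative assumptions, not the paper's implementation):

```python
# Toy sketch: reasoning primitives composed over a symbolic scene.
# Each primitive maps a set of objects to a set of objects (or a count),
# so a question becomes a chain of primitive applications.
scene = [
    {"color": "red", "shape": "cube"},
    {"color": "blue", "shape": "sphere"},
    {"color": "red", "shape": "sphere"},
]

def attend(objs, attr, value):
    # keep only objects matching an attribute (analogue of an attention primitive)
    return [o for o in objs if o[attr] == value]

def count(objs):
    # terminal primitive: reduce an object set to an answer
    return len(objs)

# "How many red spheres are there?" as a composition of primitives
n = count(attend(attend(scene, "color", "red"), "shape", "sphere"))
```

Because each intermediate object set is inspectable, the chain itself serves as a trace of the reasoning, which is the transparency property the abstract emphasizes.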
Bayesian analysis of multiple direct detection experiments
Bayesian methods offer a coherent and efficient framework for implementing
uncertainties into induction problems. In this article, we review how this
approach applies to the analysis of dark matter direct detection experiments.
In particular we discuss the exclusion limit of XENON100 and the debated hints
of detection under the hypothesis of a WIMP signal. Within parameter inference,
marginalizing consistently over uncertainties to extract robust posterior
probability distributions, we find that the claimed tension between XENON100
and the other experiments can be partially alleviated in the isospin-violating
scenario, while the elastic scattering model appears to be compatible with the
frequentist statistical approach. We then move to model comparison, for which
Bayesian methods are particularly well suited. Firstly, we investigate the
annual modulation seen in CoGeNT data, finding that there is weak evidence for
a modulation. Modulation models due to other physics compare unfavorably with
the WIMP models, paying the price for their excessive complexity. Secondly, we
confront several coherent scattering models to determine the current best
physical scenario compatible with the experimental hints. We find that
exothermic and inelastic dark matter are moderately disfavored against the
elastic scenario, while the isospin-violating model has similar evidence.
Lastly, the Bayes factor gives inconclusive evidence for an incompatibility
between the data sets of XENON100 and the hints of detection. The same question
assessed with goodness of fit would indicate a 2 sigma discrepancy. More data
are therefore needed to settle this question.
Comment: 29 pages, 8 figures; invited review for the special issue of the
journal Physics of the Dark Universe; matches the published version
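The model-comparison step described above hinges on the Bayes factor, the ratio of the two models' marginal likelihoods (evidences), usually read off a Jeffreys-style scale. A minimal sketch, assuming log-evidences are already available from some sampler; the function names and the exact thresholds are ours (threshold conventions vary across the literature):

```python
import math

def bayes_factor(log_evidence_1, log_evidence_2):
    # B12 = Z1 / Z2, computed in log-space for numerical stability
    return math.exp(log_evidence_1 - log_evidence_2)

def jeffreys_verdict(ln_b):
    # Jeffreys-style interpretation of |ln B12|; "inconclusive" matches
    # the situation described in the abstract for XENON100 vs. the hints
    s = abs(ln_b)
    if s < 1.0:
        return "inconclusive"
    elif s < 2.5:
        return "weak"
    elif s < 5.0:
        return "moderate"
    return "strong"
```

Note the asymmetry with goodness-of-fit tests mentioned in the abstract: a Bayes factor penalizes model complexity automatically through the evidence, which is why the overly flexible modulation models "pay the price" relative to the WIMP models.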
Self-healing composites: A review
Self-healing composites are composite materials capable of automatic recovery when damaged. They are inspired by biological systems such as the human skin which are naturally able to heal themselves. This paper reviews work on self-healing composites with a focus on capsule-based and vascular healing systems. Complementing previous survey articles, the paper provides an updated overview of the various self-healing concepts proposed over the past 15 years, and a comparative analysis of healing mechanisms and fabrication techniques for building capsules and vascular networks. Based on the analysis, factors that influence healing performance are presented to reveal key barriers and potential research directions.
Inhibition and young children's performance on the Tower of London task
Young children, when performing problem solving tasks, show a tendency to break task rules and produce incomplete solutions. We propose that this tendency can be explained by understanding problem solving within the context of the development of “executive functions” – general cognitive control functions, which serve to regulate the operation of the cognitive system. This proposal is supported by the construction of two computational models that simulate separately the performance of 3–4 year old and 5–6 year old children on the Tower of London planning task. We seek in particular to capture the emerging role of inhibition in the older group. The basic framework within which the models are developed is derived from Fox and Das’ Domino model [Fox, J., & Das, S. (2000). Safe and sound: Artificial intelligence in hazardous applications. Cambridge, MA: MIT Press] and Norman and Shallice’s [Norman, D.A., & Shallice, T. (1986). Attention to action: Willed and automatic control of behaviour. In R. Davidson, G. Schwartz, & D. Shapiro (Eds.), Consciousness and Self Regulation (Vol. 4). New York: Plenum] theory of willed and automatic action. Two strategies and a simple perceptual bias are implemented within the models and comparisons between model and child performance reveal a good fit for the key dependent measures (number of rule breaks and percentage of incomplete solutions) of the two groups.
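The "rule breaks" dependent measure refers to moves that violate the task's constraints (e.g., placing a ball on a peg that is already full, or moving from an empty peg). A minimal sketch of such a legality check, assuming the standard three-peg Tower of London configuration with peg capacities 3, 2, and 1 (the data layout is ours, not the models'):

```python
# Toy Tower of London state: each peg is a list of balls, bottom first.
# Standard apparatus: pegs hold at most 3, 2, and 1 balls respectively.
CAPACITY = [3, 2, 1]

def legal_move(state, src, dst):
    # a move breaks the rules if the source peg is empty
    # or the destination peg is already at capacity
    return len(state[src]) > 0 and len(state[dst]) < CAPACITY[dst]

def apply_move(state, src, dst):
    # only the top ball may be moved; an illegal move counts as a rule break
    if not legal_move(state, src, dst):
        raise ValueError("rule break")
    state[dst].append(state[src].pop())
    return state
```

In a simulation of the kind described, the younger-child model would attempt moves without this check (accumulating rule breaks), while the older-child model's inhibition mechanism would suppress the illegal candidates.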
Entropy of a quantum channel
The von Neumann entropy of a quantum state is a central concept in physics
and information theory, having a number of compelling physical interpretations.
There is a certain perspective that the most fundamental notion in quantum
mechanics is that of a quantum channel, as quantum states, unitary evolutions,
measurements, and discarding of quantum systems can each be regarded as certain
kinds of quantum channels. Thus, an important goal is to define a consistent
and meaningful notion of the entropy of a quantum channel. Motivated by the
fact that the entropy of a state can be formulated as the difference of the
number of physical qubits and the "relative entropy distance" between the state
and the maximally mixed state, here we define the entropy of a channel as the
difference of the number of physical qubits of the channel output and the
"relative entropy distance" between the channel and the
completely depolarizing channel. We prove that this definition satisfies all of
the axioms, recently put forward in [Gour, IEEE Trans. Inf. Theory 65, 5880
(2019)], required for a channel entropy function. The task of quantum channel
merging, in which the goal is for the receiver to merge his share of the
channel with the environment's share, gives a compelling operational
interpretation of the entropy of a channel. We define Renyi and min-entropies
of a channel and prove that they satisfy the axioms required for a channel
entropy function. Among other results, we also prove that a smoothed version of
the min-entropy of a channel satisfies the asymptotic equipartition property.
Comment: v2: 29 pages, 1 figure
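The state-level identity that motivates the definition can be checked numerically: for an n-qubit state ρ, S(ρ) = n − D(ρ‖π), where π is the maximally mixed state and D is the quantum relative entropy. A minimal NumPy sketch, using base-2 logarithms (the function names are ours):

```python
import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr[rho log2 rho], computed from the eigenvalues of rho
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]  # 0 log 0 = 0 by convention
    return float(-np.sum(ev * np.log2(ev)))

def relative_entropy(rho, sigma):
    # D(rho || sigma) = Tr[rho (log2 rho - log2 sigma)];
    # assumes sigma is full rank (true for the maximally mixed state)
    er, Ur = np.linalg.eigh(rho)
    es, Us = np.linalg.eigh(sigma)
    log_rho = Ur @ np.diag(np.log2(np.clip(er, 1e-300, None))) @ Ur.conj().T
    log_sig = Us @ np.diag(np.log2(es)) @ Us.conj().T
    return float(np.real(np.trace(rho @ (log_rho - log_sig))))

# one qubit (n = 1): S(rho) equals n minus the relative entropy distance to pi
rho = np.diag([0.75, 0.25])
pi = np.eye(2) / 2
assert abs(von_neumann_entropy(rho) - (1 - relative_entropy(rho, pi))) < 1e-9
```

The channel-level definition in the abstract replaces the state with the channel and the maximally mixed state with the completely depolarizing channel, using a channel relative entropy in place of D.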
Seasonal Use of Marijuana and Cocaine by Arrestees in Anchorage, Alaska
Previously presented at the Western Society of Criminology, Honolulu, HI, Feb 2005.
This paper explores the relation between season (fall, winter, spring, and summer) and drug use among arrestees. The analysis examines seasonal differences in the proportions of drug tests positive for marijuana or cocaine among recently arrested and booked suspects in Anchorage, Alaska. The study is based on Arrestee Drug Abuse Monitoring (ADAM) data collected in Anchorage between 1999 and the third quarter of 2003.
Paper supported in part by Grant No. 2002-BJ-CX-K018 from the Bureau of Justice Statistics, Office of Justice Programs, U.S. Department of Justice.
Contents: Abstract / Seasonal Use of Drugs / Data and Methods of Analysis / Seasonality and Marijuana Use / Seasonality and Cocaine Use / Discussion / References
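A seasonal comparison of positive-test proportions of the kind described is commonly assessed with a pooled two-proportion z-test. The paper does not specify its test, and the counts below are hypothetical; this is only a sketch of the standard method:

```python
import math

def two_proportion_z(pos1, n1, pos2, n2):
    # pooled two-proportion z-test, e.g. summer vs. winter positive rates;
    # pos = number of positive tests, n = number of arrestees tested
    p1, p2 = pos1 / n1, pos2 / n2
    p = (pos1 + pos2) / (n1 + n2)  # pooled proportion under H0: p1 == p2
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# hypothetical counts: 60/200 positive in summer vs. 40/200 in winter;
# |z| > 1.96 would indicate a difference at the 5% level
z = two_proportion_z(60, 200, 40, 200)
```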