New Complexity Results and Algorithms for the Minimum Tollbooth Problem
The inefficiency of the Wardrop equilibrium of nonatomic routing games can be
eliminated by placing tolls on the edges of a network so that the socially
optimal flow is induced as an equilibrium flow. A solution where the minimum
number of edges are tolled may be preferable over others due to its ease of
implementation in real networks. In this paper we consider the minimum
tollbooth (MINTB) problem, which seeks social optimum inducing tolls with
minimum support. We prove for single commodity networks with linear latencies
that the problem is NP-hard to approximate within a certain constant factor, through
a reduction from the minimum vertex cover problem. Insights from network design
motivate us to formulate a new variation of the problem where, in addition to
placing tolls, it is allowed to remove unused edges by the social optimum. We
prove that this new problem remains NP-hard even for single commodity networks
with linear latencies, using a reduction from the partition problem. On the
positive side, we give the first exact polynomial solution to the MINTB problem
in an important class of graphs---series-parallel graphs. Our algorithm solves
MINTB by first tabulating the candidate solutions for subgraphs of the
series-parallel network and then combining them optimally.
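The tabulate-and-combine pattern over a series-parallel decomposition can be sketched on a much simpler objective than MINTB's. The sketch below is not the paper's recurrence (which must track tolls that induce the socially optimal flow); it only illustrates the shape of a dynamic program over series and parallel compositions, using a toy objective and a nested-tuple encoding that are assumptions of this sketch.

```python
# Sketch of a dynamic program over a series-parallel decomposition.
# Networks are nested tuples: ('edge',) for a single edge, ('S', a, b) for a
# series composition, ('P', a, b) for a parallel composition. The toy
# objective tabulated here is the minimum number of edges meeting every
# source-sink path (a minimum-cardinality edge cut), not the MINTB objective.

def min_edge_cut(net) -> int:
    kind = net[0]
    if kind == 'edge':
        return 1                      # cutting the single edge suffices
    _, a, b = net
    if kind == 'S':
        # In series, disconnecting either component disconnects the whole network
        return min(min_edge_cut(a), min_edge_cut(b))
    # In parallel, every component must be disconnected
    return min_edge_cut(a) + min_edge_cut(b)

# Two parallel branches: a two-edge path and a single edge -> cut size 2
net = ('P', ('S', ('edge',), ('edge',)), ('edge',))
print(min_edge_cut(net))  # → 2
```

The real algorithm stores a table of candidate toll solutions per subnetwork instead of a single number, but the series/parallel combination step has the same recursive structure.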
Learning associations between clinical information and motion-based descriptors using a large scale MR-derived cardiac motion atlas
The availability of large scale databases containing imaging and non-imaging
data, such as the UK Biobank, represents an opportunity to improve our
understanding of healthy and diseased bodily function. Cardiac motion atlases
provide a space of reference in which the motion fields of a cohort of subjects
can be directly compared. In this work, a cardiac motion atlas is built from
cine MR data from the UK Biobank (~ 6000 subjects). Two automated quality
control strategies are proposed to reject subjects with insufficient image
quality. Based on the atlas, three dimensionality reduction algorithms are
evaluated to learn data-driven cardiac motion descriptors, and statistical
methods used to study the association between these descriptors and non-imaging
data. Results show a positive correlation between the atlas motion descriptors
and body fat percentage, basal metabolic rate, hypertension, smoking status and
alcohol intake frequency. The proposed method outperforms ejection fraction,
the most commonly used descriptor of cardiac function, in identifying changes
in cardiac function due to these known cardiovascular risk factors. In
conclusion, this work represents a framework for further investigation of the
factors influencing cardiac health.
Comment: 2018 International Workshop on Statistical Atlases and Computational
Modeling of the Heart
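The analysis step described above, learning low-dimensional motion descriptors and testing their association with non-imaging variables, can be sketched generically. Everything below uses synthetic stand-in data, and PCA is only one plausible choice; the paper evaluates three dimensionality reduction algorithms on real UK Biobank motion fields.

```python
import numpy as np

# Synthetic stand-ins for per-subject motion descriptors and a phenotype;
# the study uses UK Biobank cine MR data, not random matrices.
rng = np.random.default_rng(0)
n_subjects, n_features = 200, 50
motion = rng.normal(size=(n_subjects, n_features))

# PCA by SVD of the centered matrix: one simple descriptor choice
X = motion - motion.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
descriptors = U[:, :5] * s[:5]          # first 5 principal-component scores

# Association with a non-imaging variable (synthetic, driven by component 1)
phenotype = 0.8 * descriptors[:, 0] + rng.normal(scale=0.5, size=n_subjects)
r = np.corrcoef(descriptors[:, 0], phenotype)[0, 1]
print(f"Pearson r with first descriptor: {r:.2f}")
```

In the study, the phenotype slot would hold variables such as body fat percentage or smoking status, and the correlation analysis would be adjusted for confounders.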
How many radio-loud quasars can be detected by the Gamma-Ray Large Area Space Telescope?
In the unification scheme, radio quasars and FR II radio galaxies come from
the same parent population, but viewed at different angles. Based on the
Comptonization models for the gamma-ray emission from active galactic nuclei
(AGNs), we estimate the number of radio quasars and FR II radio galaxies to be
detected by the Gamma-Ray Large Area Space Telescope (GLAST) using the
luminosity function (LF) of their parent population derived from the
flat-spectrum radio quasar (FSRQ) LF. We find that ~1200 radio quasars will be
detected by GLAST, if the soft seed photons for Comptonization come from the
regions outside the jets. We also consider the synchrotron self-Comptonization
(SSC) model, and find it unlikely to be responsible for gamma-ray emission from
radio quasars. We find that no FR II radio galaxies will be detected by GLAST.
Our results show that most radio AGNs to be detected by GLAST will be FSRQs
(~99 % for the external Comptonization model, EC model), while the remainder
(~1 %) will be steep-spectrum radio quasars (SSRQs). This implies that FSRQs
will still be good candidates for identifying gamma-ray AGNs even for the GLAST
sources. The contribution of all radio quasars and FR II radio galaxies to the
extragalactic gamma-ray background (EGRB) is calculated, which accounts for ~30
% of the EGRB.
Comment: 4 pages, accepted by ApJ Letters
Large-scale Quality Control of Cardiac Imaging in Population Studies: Application to UK Biobank
In large population studies such as the UK Biobank (UKBB), quality control of the acquired images by visual assessment is unfeasible. In this paper, we apply a recently developed fully-automated quality control pipeline for cardiac MR (CMR) images to the first 19,265 short-axis (SA) cine stacks from the UKBB. We present the results for the three estimated quality metrics (heart coverage, inter-slice motion and image contrast in the cardiac region) as well as their potential associations with factors including acquisition details and subject-related phenotypes. Up to 14.2% of the analysed SA stacks had sub-optimal coverage (i.e. missing basal and/or apical slices), however most of them were limited to the first year of acquisition. Up to 16% of the stacks were affected by noticeable inter-slice motion (i.e. average inter-slice misalignment greater than 3.4 mm). Inter-slice motion was positively correlated with weight and body surface area. Only 2.1% of the stacks had an average end-diastolic cardiac image contrast below 30% of the dynamic range. These findings will be highly valuable both for the scientists involved in UKBB CMR acquisition and for those who use the dataset for research purposes.
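The three thresholds reported above can be expressed as a simple per-stack flagging rule. This is an illustrative sketch only: the field names and data structure are hypothetical, not the pipeline's actual output format, and the real metrics are estimated from the images by the automated pipeline.

```python
from dataclasses import dataclass

@dataclass
class StackQC:
    """Hypothetical per-stack QC summary; fields are illustrative only."""
    has_basal_slice: bool
    has_apical_slice: bool
    mean_interslice_shift_mm: float  # average in-plane slice misalignment
    cardiac_contrast: float          # end-diastolic contrast, fraction of dynamic range

def qc_flags(s: StackQC) -> list[str]:
    """Apply the three thresholds from the abstract to one SA cine stack."""
    flags = []
    if not (s.has_basal_slice and s.has_apical_slice):
        flags.append("sub-optimal coverage")
    if s.mean_interslice_shift_mm > 3.4:
        flags.append("inter-slice motion")
    if s.cardiac_contrast < 0.30:
        flags.append("low contrast")
    return flags

print(qc_flags(StackQC(True, False, 4.1, 0.5)))
# → ['sub-optimal coverage', 'inter-slice motion']
```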
Structural Change in (Economic) Time Series
Methods for detecting structural changes, or change points, in time series
data are widely used in many fields of science and engineering. This chapter
sketches some basic methods for the analysis of structural changes in time
series data. The exposition is confined to retrospective methods for univariate
time series. Several recent methods for dating structural changes are compared
using a time series of oil prices spanning more than 60 years. The methods
broadly agree for the first part of the series up to the mid-1980s, for which
changes are associated with major historical events, but provide somewhat
different solutions thereafter, reflecting a gradual increase in oil prices
that is not well described by a step function. As a further illustration, 1990s
data on the volatility of the Hang Seng stock market index are reanalyzed.
Comment: 12 pages, 6 figures
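A minimal retrospective change-point method of the kind surveyed above can be sketched as binary segmentation with a CUSUM-style split statistic for mean shifts. This is a generic illustration on synthetic data, not one of the specific dating methods compared in the chapter, and the threshold below is an ad hoc assumption.

```python
import numpy as np

def cusum_split(x):
    """Find the split index maximizing the between-segment sum of squares."""
    n = len(x)
    mean_all = x.mean()
    best_k, best_stat = None, 0.0
    for k in range(2, n - 1):
        left, right = x[:k], x[k:]
        # Variance explained by splitting the segment at k
        stat = (k * (left.mean() - mean_all) ** 2
                + (n - k) * (right.mean() - mean_all) ** 2)
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat

def binary_segmentation(x, min_gain=25.0, offset=0):
    """Recursively split wherever the gain exceeds an ad hoc threshold."""
    if len(x) < 4:
        return []
    k, stat = cusum_split(x)
    if k is None or stat < min_gain:
        return []
    return (binary_segmentation(x[:k], min_gain, offset)
            + [offset + k]
            + binary_segmentation(x[k:], min_gain, offset + k))

rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
print(binary_segmentation(series))  # change point detected near index 100
```

Step functions like the one fitted here are exactly what the oil-price example strains: a gradual trend is poorly approximated by a small number of level shifts.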
Testing linear hypotheses in high-dimensional regressions
For a multivariate linear model, Wilks' likelihood ratio test (LRT)
constitutes one of the cornerstone tools. However, the computation of its
quantiles under the null or the alternative requires complex analytic
approximations and, more importantly, these distributional approximations are
feasible only for a moderate dimension of the dependent variable.
On the other hand, assuming that the data dimension as well as the number
of regression variables are fixed while the sample size grows, several
asymptotic approximations have been proposed in the literature for Wilks' Lambda,
including the widely used chi-square approximation. In this paper, we consider
necessary modifications to Wilk's test in a high-dimensional context,
specifically assuming a high data dimension and a large sample size.
Based on recent random matrix theory, the correction we propose to Wilk's test
is asymptotically Gaussian under the null and simulations demonstrate that the
corrected LRT has very satisfactory size and power, certainly in the setting
where both the data dimension and the sample size are large, but also for
moderately large data dimensions. As a byproduct, we explain why the standard chi-square
approximation fails for high-dimensional data. We also introduce a new
procedure for the classical multiple sample significance test in MANOVA which
is valid for high-dimensional data.
Comment: Accepted 02/2012 for publication in "Statistics". 20 pages, 2 figures
and 2 tables
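For context, the classical quantities the abstract builds on can be sketched directly: in one-way MANOVA, Wilks' Lambda is det(E)/det(E+H) for the within- and between-group scatter matrices, and Bartlett's statistic is the standard chi-square approximation that the paper argues breaks down when the dimension grows with the sample size. The sketch below is the textbook construction, not the paper's corrected test.

```python
import numpy as np

def wilks_lambda(groups):
    """Textbook one-way MANOVA Wilks' Lambda = det(E) / det(E + H)."""
    X = np.vstack(groups)
    grand = X.mean(axis=0)
    E = np.zeros((X.shape[1], X.shape[1]))   # within-group scatter
    H = np.zeros_like(E)                     # between-group scatter
    for g in groups:
        c = g - g.mean(axis=0)
        E += c.T @ c
        d = g.mean(axis=0) - grand
        H += len(g) * np.outer(d, d)
    return np.linalg.det(E) / np.linalg.det(E + H)

def bartlett_chi2(lam, n, p, k):
    """Bartlett's chi-square statistic with df = p * (k - 1); this is the
    classical approximation that fails for high-dimensional data."""
    return -(n - 1 - (p + k) / 2) * np.log(lam)

rng = np.random.default_rng(1)
g1 = rng.normal(size=(30, 3))
g2 = rng.normal(size=(30, 3)) + 1.0       # mean-shifted second group
lam = wilks_lambda([g1, g2])
stat = bartlett_chi2(lam, n=60, p=3, k=2)
print(f"Lambda = {lam:.3f}, chi2 stat = {stat:.1f}")
```

Here p = 3 is small relative to n = 60, so the approximation is reasonable; the paper's random-matrix correction targets the regime where p grows with n.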
Heisenberg antiferromagnet on Cayley trees: low-energy spectrum and even/odd site imbalance
To understand the role of local sublattice imbalance in low-energy spectra of
s=1/2 quantum antiferromagnets, we study the s=1/2 quantum nearest neighbor
Heisenberg antiferromagnet on the coordination 3 Cayley tree. We perform
many-body calculations using an implementation of the density matrix
renormalization group (DMRG) technique for generic tree graphs. We discover
that the bond-centered Cayley tree has a quasidegenerate set of a low-lying
tower of states and an "anomalous" singlet-triplet finite-size gap scaling. For
understanding the construction of the first excited state from the many-body
ground state, we consider a wave function ansatz given by the single-mode
approximation, which yields a high overlap with the DMRG wave function.
Observing the ground-state entanglement spectrum leads us to a picture of the
low-energy degrees of freedom being "giant spins" arising out of sublattice
imbalance, which helps us analytically understand the scaling of the
finite-size spin gap. The Schwinger-boson mean-field theory has been
generalized to nonuniform lattices, and ground states have been found which are
spatially inhomogeneous in the mean-field parameters.
Comment: 19 pages, 12 figures, 6 tables. Changes made to manuscript after
referee suggestions: parts reorganized, clarified discussion on Fibonacci
tree, typos corrected