A Practical Method to Estimate Information Content in the Context of 4D-Var Data Assimilation. I: Methodology
Data assimilation obtains improved estimates of the state of a physical system
by combining imperfect model results with sparse and noisy observations of reality.
Not all observations used in data assimilation are equally valuable. The ability to
characterize the usefulness of different data points is important for analyzing the
effectiveness of the assimilation system, for data pruning, and for the design of future
sensor systems.
This paper focuses on the four dimensional variational (4D-Var) data assimilation
framework. Metrics from information theory are used to quantify the contribution
of observations to decreasing the uncertainty with which the system state is known.
We establish a relationship between several information-theoretic metrics
and the variational cost function and its gradient under linear Gaussian assumptions.
Based on this insight we derive an ensemble-based computational procedure to estimate
the information content of various observations in the context of 4D-Var. The
approach is illustrated on linear and nonlinear test problems. In the companion paper
[Singh et al.(2011)] the methodology is applied to a global chemical data assimilation
problem.
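Under the linear Gaussian assumptions mentioned in the abstract, one standard information-content metric is the Shannon information gain I = ½ ln(det B / det A), where B is the background (prior) covariance and A the analysis (posterior) covariance. A minimal sketch of that quantity (the matrices below are invented for illustration, not taken from the paper):

```python
import numpy as np

def shannon_information(B, H, R):
    """Information gained by assimilating observations y = Hx + noise with covariance R.

    Uses the standard Gaussian/linear posterior covariance
    A = (B^-1 + H^T R^-1 H)^-1 and returns 0.5 * ln(det(B)/det(A)).
    """
    A = np.linalg.inv(np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H)
    _, logdet_B = np.linalg.slogdet(B)
    _, logdet_A = np.linalg.slogdet(A)
    return 0.5 * (logdet_B - logdet_A)

# Toy example: 2-state system, one noisy observation of the first component.
B = np.diag([1.0, 1.0])          # prior uncertainty
H = np.array([[1.0, 0.0]])       # observation operator: first component only
R = np.array([[0.25]])           # observation error variance
print(shannon_information(B, H, R))  # positive: the observation reduces uncertainty
```

A more informative observation (smaller R) yields a larger information gain, which is the basis for ranking observations by usefulness.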
Evaluation of Motion Artifact Metrics for Coronary CT Angiography
Purpose
This study quantified the performance of coronary artery motion artifact metrics relative to human observer ratings. Motion artifact metrics have been used as part of motion correction and best-phase selection algorithms for Coronary Computed Tomography Angiography (CCTA). However, the lack of ground truth makes it difficult to validate how well the metrics quantify the level of motion artifact. This study investigated five motion artifact metrics, including two novel metrics, using a dynamic phantom, clinical CCTA images, and an observer study that provided ground-truth motion artifact scores from a series of pairwise comparisons.
Method
Five motion artifact metrics were calculated for the coronary artery regions on both phantom and clinical CCTA images: positivity, entropy, normalized circularity, Fold Overlap Ratio (FOR), and Low-Intensity Region Score (LIRS). CT images were acquired of a dynamic cardiac phantom that simulated cardiac motion and contained six iodine-filled vessels of varying diameter, with regions of soft plaque and calcifications. Scans were repeated with different gantry start angles. Images were reconstructed at five phases of the motion cycle. Clinical images were acquired from 14 CCTA exams with patient heart rates ranging from 52 to 82 bpm. The vessels and shading artifacts were manually segmented by three readers and combined to create ground-truth artifact regions. Motion artifact levels were also assessed by readers using a pairwise comparison method to establish a ground-truth reader score. Kendall's Tau coefficients were calculated to evaluate the statistical agreement in ranking between the motion artifact metrics and the reader scores. Linear regression between the reader scores and the metrics was also performed.
Results
On phantom images, the Kendall's Tau coefficients of the five motion artifact metrics were 0.50 (normalized circularity), 0.35 (entropy), 0.82 (positivity), 0.77 (FOR), and 0.77 (LIRS), where a higher Kendall's Tau signifies higher agreement. The FOR, LIRS, and transformed positivity (the fourth root of positivity) were further evaluated on the clinical images. The Kendall's Tau coefficients of the selected metrics were 0.59 (FOR), 0.53 (LIRS), and 0.21 (transformed positivity). On the clinical data, a Motion Artifact Score, defined as the product of the FOR and LIRS metrics, further improved agreement with reader scores, with a Kendall's Tau coefficient of 0.65.
Conclusion
The FOR metric, the LIRS metric, and their product provided the highest agreement with the readers in motion artifact ranking, and the highest linear correlation to the reader scores. The validated motion artifact metrics may be useful for developing and evaluating methods to reduce motion artifacts in CCTA images.
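Kendall's Tau, the rank-agreement statistic used above, counts concordant versus discordant pairs between two rankings. A small self-contained sketch (tau-a, no tie handling; the metric values and reader scores below are invented, not from the study):

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's Tau-a: (concordant - discordant) / total pairs, assuming no ties."""
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1    # pair ordered the same way in both rankings
        elif s < 0:
            discordant += 1    # pair ordered oppositely
    n_pairs = len(x) * (len(x) - 1) // 2
    return (concordant - discordant) / n_pairs

# Hypothetical per-image values: an artifact metric vs. reader scores.
metric_values = [0.12, 0.45, 0.30, 0.80, 0.55]
reader_scores = [1.0, 3.0, 2.0, 5.0, 4.0]
print(kendall_tau(metric_values, reader_scores))  # 1.0: identical rankings
```

A value of 1 means the metric orders images exactly as the readers do, 0 means no association, and -1 means a fully reversed ordering.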
Full Resolution Image Compression with Recurrent Neural Networks
This paper presents a set of full-resolution lossy image compression methods
based on neural networks. Each of the architectures we describe can provide
variable compression rates during deployment without requiring retraining of
the network: each network need only be trained once. All of our architectures
consist of a recurrent neural network (RNN)-based encoder and decoder, a
binarizer, and a neural network for entropy coding. We compare RNN types (LSTM,
associative LSTM) and introduce a new hybrid of GRU and ResNet. We also study
"one-shot" versus additive reconstruction architectures and introduce a new
scaled-additive framework. We compare to previous work, showing improvements of
4.3%-8.8% AUC (area under the rate-distortion curve), depending on the
perceptual metric used. As far as we know, this is the first neural network
architecture that is able to outperform JPEG at image compression across most
bitrates on the rate-distortion curve on the Kodak dataset images, with and
without the aid of entropy coding.
Comment: Updated with content for CVPR; supplemental material moved to an external link due to size limitations
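The AUC figure quoted above is the area under the rate-distortion curve (a quality metric plotted against bits per pixel). A minimal sketch of such a comparison using trapezoidal integration; the bitrates and quality values below are invented for illustration:

```python
def rd_auc(bpp, quality):
    """Trapezoidal area under a rate-distortion curve (quality vs. bits per pixel)."""
    return sum((bpp[i + 1] - bpp[i]) * (quality[i + 1] + quality[i]) / 2
               for i in range(len(bpp) - 1))

# Hypothetical operating points for two codecs at matched bitrates.
bpp = [0.25, 0.5, 1.0, 2.0]            # bits per pixel
quality_a = [0.80, 0.88, 0.93, 0.96]   # e.g. a perceptual score for codec A
quality_b = [0.76, 0.85, 0.91, 0.95]   # codec B at the same bitrates

improvement = (rd_auc(bpp, quality_a) - rd_auc(bpp, quality_b)) / rd_auc(bpp, quality_b)
print(f"{100 * improvement:.1f}% AUC improvement")
```

As in the paper, the percentage depends on which quality (perceptual) metric defines the curve.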
Assessing systemic risk due to fire sales spillover through maximum entropy network reconstruction
Assessing systemic risk in financial markets is of great importance but it
often requires data that are unavailable or available at a very low frequency.
For this reason, systemic risk assessment with partial information is
potentially very useful for regulators and other stakeholders. In this paper we
consider systemic risk due to fire sales spillover and portfolio rebalancing by
using the risk metrics defined by Greenwood et al. (2015). Using the Maximum
Entropy principle, we propose a method to assess aggregate and individual banks'
systemicness and vulnerability, and to statistically test for changes in these
variables when only the size of each bank and the capitalization of the
investment assets are available. We demonstrate the effectiveness of our method
on 2001-2013 quarterly data for US banks whose portfolio composition is available.
Comment: 36 pages, 6 figures; accepted at the Journal of Economic Dynamics and Control
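When only the row sums (bank sizes) and column sums (asset capitalizations) of a bank-by-asset holdings matrix are known, the unconstrained maximum-entropy estimate is the outer product of the marginals divided by the total. This is a simplified sketch of that baseline, not the paper's full estimation and testing procedure; all numbers are invented:

```python
def max_entropy_matrix(row_sums, col_sums):
    """Max-entropy matrix with given marginals: W[i][j] = row_i * col_j / total."""
    total = sum(row_sums)
    assert abs(total - sum(col_sums)) < 1e-9, "marginals must balance"
    return [[r * c / total for c in col_sums] for r in row_sums]

# Toy example: 2 banks, 3 asset classes (hypothetical sizes).
banks = [60.0, 40.0]          # total portfolio size per bank
assets = [50.0, 30.0, 20.0]   # total capitalization per asset class
W = max_entropy_matrix(banks, assets)
# By construction, the row and column sums of W reproduce the given marginals.
```

The reconstructed matrix can then be plugged into portfolio-overlap risk metrics in place of the unobserved true holdings.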