Network reconstruction via density sampling
Reconstructing weighted networks from partial information is necessary in many important circumstances, e.g. for a correct estimation of systemic risk. It has been shown that, in order to achieve an accurate reconstruction, it is crucial to reliably replicate the empirical degree sequence, which is however unknown in many realistic situations. More recently, it has been found that the knowledge of the degree sequence can be replaced by the knowledge of the strength sequence, which is typically accessible, complemented by that of the total number of links, thus considerably relaxing the observational requirements. Here we further relax these requirements and devise a procedure that remains valid even when the total number of links is unavailable. We assume that, apart from the heterogeneity induced by the degree sequence itself, the network is homogeneous, so that its (global) link density can be estimated by sampling subsets of nodes with representative density. We show that the best way of sampling nodes is the random selection scheme, any other procedure being biased towards unrealistically large, or small, link densities. We then introduce in detail our core technique for reconstructing both the topology and the link weights of the unknown network. When tested on real economic and financial data sets, our method achieves remarkable accuracy and is very robust with respect to the sampled subsets, thus representing a reliable practical tool whenever the available topological information is restricted to small subsets of nodes.
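As a minimal sketch (not the authors' exact estimator), random node sampling gives an unbiased estimate of the global link density on a homogeneous toy graph; the graph model and sample sizes below are illustrative choices only:

```python
import random

def link_density(adj, nodes):
    """Link density of the subgraph induced by `nodes` (undirected, no self-loops)."""
    nodes = list(nodes)
    n = len(nodes)
    if n < 2:
        return 0.0
    node_set = set(nodes)
    links = sum(1 for u in nodes for v in adj[u] if v in node_set and u < v)
    return links / (n * (n - 1) / 2)

def estimate_density(adj, sample_size, n_samples, rng):
    """Estimate the global link density by averaging the densities of
    subgraphs induced by uniformly random node subsets."""
    all_nodes = list(adj)
    return sum(link_density(adj, rng.sample(all_nodes, sample_size))
               for _ in range(n_samples)) / n_samples

# Toy homogeneous (Erdos-Renyi-like) graph with link probability p = 0.1.
rng = random.Random(0)
n, p = 200, 0.1
adj = {i: set() for i in range(n)}
for i in range(n):
    for j in range(i + 1, n):
        if rng.random() < p:
            adj[i].add(j)
            adj[j].add(i)

true_d = link_density(adj, range(n))
est_d = estimate_density(adj, sample_size=30, n_samples=200, rng=rng)
```

Any non-uniform selection rule (e.g. preferring high-degree nodes) would systematically shift the induced-subgraph densities, which is the bias the abstract warns against.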
SketchSampler: Sketch-based 3D Reconstruction via View-dependent Depth Sampling
Reconstructing a 3D shape based on a single sketch image is challenging due
to the large domain gap between a sparse, irregular sketch and a regular, dense
3D shape. Existing works try to employ the global feature extracted from sketch
to directly predict the 3D coordinates, but they usually suffer from losing
fine details that are not faithful to the input sketch. Through analyzing the
3D-to-2D projection process, we notice that the density map that characterizes
the distribution of 2D point clouds (i.e., the probability of points projected
at each location of the projection plane) can be used as a proxy to facilitate
the reconstruction process. To this end, we first translate a sketch via an
image translation network to a more informative 2D representation that can be
used to generate a density map. Next, a 3D point cloud is reconstructed via a
two-stage probabilistic sampling process: first recovering the 2D points (i.e.,
the x and y coordinates) by sampling the density map; and then predicting the
depth (i.e., the z coordinate) by sampling the depth values at the ray
determined by each 2D point. Extensive experiments are conducted, and both
quantitative and qualitative results show that our proposed approach
significantly outperforms other baseline methods.
Comment: 16 pages, 7 figures, accepted by ECCV 2022
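The two-stage probabilistic sampling can be sketched schematically; everything below (the Gaussian density map and the per-ray depth model) is a synthetic placeholder for what the paper's networks would predict:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2D density map over a 32x32 projection plane; in the paper it
# would come from the image-translation network, here it is a synthetic blob.
H = W = 32
ys, xs = np.mgrid[0:H, 0:W]
density = np.exp(-((xs - 16) ** 2 + (ys - 16) ** 2) / 40.0)
density /= density.sum()

# Stage 1: recover 2D points (x, y) by sampling the density map.
n_points = 1000
flat_idx = rng.choice(H * W, size=n_points, p=density.ravel())
y2d, x2d = np.unravel_index(flat_idx, (H, W))

# Stage 2: for the ray through each 2D point, sample a depth z from a
# per-ray depth distribution (a placeholder Gaussian per ray).
depth_mean = 0.5 + 0.01 * (x2d - 16)     # hypothetical per-ray mean depth
z = rng.normal(depth_mean, 0.05)

points_3d = np.stack([x2d, y2d, z], axis=1)   # reconstructed point cloud
```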
High-resolution distributed sampling of bandlimited fields with low-precision sensors
The problem of sampling a discrete-time sequence of spatially bandlimited
fields with a bounded dynamic range, in a distributed,
communication-constrained, processing environment is addressed. A central unit,
having access to the data gathered by a dense network of fixed-precision
sensors, operating under stringent inter-node communication constraints, is
required to reconstruct the field snapshots to maximum accuracy. Both
deterministic and stochastic field models are considered. For stochastic
fields, results are established in the almost-sure sense. The feasibility of
having a flexible tradeoff between the oversampling rate (sensor density) and
the analog-to-digital converter (ADC) precision, while achieving an exponential
accuracy in the number of bits per Nyquist-interval per snapshot is
demonstrated. This exposes an underlying "conservation of bits" principle:
the bit-budget per Nyquist-interval per snapshot (the rate) can be distributed
along the amplitude axis (sensor-precision) and space (sensor density) in an
almost arbitrary discrete-valued manner, while retaining the same (exponential)
distortion-rate characteristics. Achievable information scaling laws for field
reconstruction over a bounded region are also derived: with N one-bit sensors
per Nyquist-interval, the maximum pointwise distortion goes to zero as O(1/N)
in the sensor density, or exponentially in the total network bitrate per
Nyquist-interval. This is shown to be possible
with only nearest-neighbor communication, distributed coding, and appropriate
interpolation algorithms. For a fixed, nonzero target distortion, the number of
fixed-precision sensors and the network rate needed is always finite.
Comment: 17 pages, 6 figures; paper withdrawn from IEEE Transactions on Signal
Processing and re-submitted to the IEEE Transactions on Information Theory
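The precision-versus-density tradeoff admits a minimal numerical sketch (a dithered one-bit averaging toy, not the paper's distributed coding scheme): many one-bit sensors can stand in for a single high-precision ADC, with distortion vanishing as the density grows.

```python
import numpy as np

rng = np.random.default_rng(1)

def one_bit_estimate(v, n_sensors, rng):
    """Estimate a field value v in [-1, 1] from n_sensors one-bit readings.
    Each sensor outputs sign(v + d) with dither d ~ Uniform[-1, 1], so the
    expected sign equals v; averaging the bits recovers v."""
    dither = rng.uniform(-1.0, 1.0, n_sensors)
    return np.sign(v + dither).mean()

v = 0.3
rmse = {}
for n in (100, 10_000):
    ests = np.array([one_bit_estimate(v, n, rng) for _ in range(200)])
    rmse[n] = float(np.sqrt(np.mean((ests - v) ** 2)))
# Increasing the sensor density n drives the distortion down (~ n**-0.5 in
# RMSE, i.e. ~ 1/n in mean squared distortion), trading precision for density.
```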
Bayesian Cosmic Web Reconstruction: BARCODE for Clusters
We describe the Bayesian BARCODE formalism that has been designed towards the
reconstruction of the Cosmic Web in a given volume on the basis of the sampled
galaxy cluster distribution. It is based on the realization that the massive
compact clusters are responsible for the major share of the large-scale tidal
force field shaping the anisotropic, and in particular filamentary, features of
the Cosmic Web. Given the nonlinearity of the constraints imposed by the cluster
configurations, we resort to a state-of-the-art constrained reconstruction
technique to find a proper statistically sampled realization of the original
initial density and velocity field in the same cosmic region. Ultimately, the
subsequent gravitational evolution of these initial conditions towards the
implied Cosmic Web configuration can be followed on the basis of a proper
analytical model or an N-body computer simulation. The BARCODE formalism
includes an implicit treatment for redshift space distortions. This enables a
direct reconstruction on the basis of observational data, without the need for
a correction of redshift space artifacts. In this contribution we provide a
general overview of the connection between the Cosmic Web and clusters and a
description of the Bayesian BARCODE formalism. We conclude with a presentation
of its successful workings with respect to test runs based on a simulated large
scale matter distribution, in physical space as well as in redshift space.
Comment: 18 pages, 8 figures, Proceedings of IAU Symposium 308 "The Zeldovich
Universe: Genesis and Growth of the Cosmic Web", 23-28 June 2014, Tallinn,
Estonia
HYDRA: Hybrid Deep Magnetic Resonance Fingerprinting
Purpose: Magnetic resonance fingerprinting (MRF) methods typically rely on
dictionary matching to map the temporal MRF signals to quantitative tissue
parameters. Such approaches suffer from inherent discretization errors, as well
as high computational complexity as the dictionary size grows. To alleviate
these issues, we propose a HYbrid Deep magnetic ResonAnce fingerprinting
approach, referred to as HYDRA.
Methods: HYDRA involves two stages: a model-based signature restoration phase
and a learning-based parameter restoration phase. Signal restoration is
implemented using low-rank based de-aliasing techniques while parameter
restoration is performed using a deep nonlocal residual convolutional neural
network. The designed network is trained on synthesized MRF data simulated with
the Bloch equations and fast imaging with steady state precession (FISP)
sequences. In test mode, it takes a temporal MRF signal as input and produces
the corresponding tissue parameters.
Results: We validated our approach on both synthetic data and anatomical data
generated from a healthy subject. The results demonstrate that, in contrast to
conventional dictionary-matching based MRF techniques, our approach
significantly improves inference speed by eliminating the time-consuming
dictionary matching operation, and alleviates discretization errors by
outputting continuous-valued parameters. We further avoid the need to store a
large dictionary, thus reducing memory requirements.
Conclusions: Our approach demonstrates advantages in terms of inference
speed, accuracy, and storage requirements over competing MRF methods.
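To make the discretization issue concrete, here is a toy illustration of the dictionary-matching baseline using a hypothetical mono-exponential signal model (not the Bloch/FISP simulation the paper uses): the estimate snaps to the nearest grid value, incurring an error bounded by half the grid spacing, which HYDRA's continuous-valued regression avoids.

```python
import numpy as np

t = np.linspace(0.1, 2.0, 8)             # sampling times (arbitrary units)

def signature(T1):
    """Hypothetical mono-exponential signal model (stand-in for Bloch/FISP)."""
    return np.exp(-t / T1)

# Dictionary matching: precompute signatures on a discretized T1 grid and
# return the grid value whose signature is nearest to the measurement.
grid = np.linspace(0.5, 2.0, 16)         # grid spacing 0.1
dictionary = np.stack([signature(T1) for T1 in grid])

def match(measurement):
    return grid[np.argmin(np.linalg.norm(dictionary - measurement, axis=1))]

true_T1 = 1.234
est = match(signature(true_T1))          # snaps to the nearest grid value
disc_err = abs(est - true_T1)            # discretization error, ~0.034 here
```

Shrinking the grid spacing reduces this error but blows up the dictionary size, which is exactly the accuracy/complexity tension the abstract describes.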
Reactive explorers to unravel network topology
A procedure is developed and tested to recover the distribution of
connectivity of an a priori unknown network, by sampling the dynamics of an
ensemble made of reactive walkers. The relative weight between reaction and
relocation is gauged by a scalar control parameter, which can be adjusted at
will. Different equilibria are attained by the system, following the externally
imposed modulation, and reflecting the interplay between reaction and diffusion
terms. The information gathered on the observation node is used to predict the
stationary density as displayed by the system, via a direct implementation of
the celebrated Heterogeneous Mean Field (HMF) approximation. This knowledge
translates into a linear problem which can be solved to return the entries of
the sought distribution. A variant of the model is then considered which
consists in assuming a localized source where the reactive constituents are
injected, at a rate that can be adjusted as a stepwise function of time. The
linear problem obtained when operating in this setting allows one to recover a
fair estimate of the underlying system size. Numerical experiments are carried
out to test the predictive ability of the theory.
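The reduction to a linear problem can be illustrated schematically. In this sketch the kernel f(k, a) is a hypothetical placeholder (the paper derives the actual relation from the HMF approximation); it only shows how measurements taken at several values of the control parameter determine the degree-distribution entries through a least-squares solve:

```python
import numpy as np

# Degree classes and a toy power-law degree distribution to be recovered.
ks = np.array([1.0, 2.0, 3.0, 5.0, 8.0])
P_true = ks ** -1.5
P_true /= P_true.sum()

# Hypothetical linear model: each measurement, taken at one value of the
# control parameter a, is m(a) = sum_k f(k, a) * P(k). The kernel below
# is a placeholder, not the relation derived in the paper.
a_vals = np.linspace(0.1, 2.0, 25)
F = np.array([[k / (k + a) for k in ks] for a in a_vals])

m = F @ P_true                                   # noiseless "measurements"
P_rec, *_ = np.linalg.lstsq(F, m, rcond=None)    # solve the linear problem
```

With noiseless measurements and a well-conditioned kernel the sought distribution is recovered exactly; in practice the conditioning of F governs how noise in the observed densities propagates into the estimate.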