Bayesian changepoint analysis for atomic force microscopy and soft material indentation
Material indentation studies, in which a probe is brought into controlled
physical contact with an experimental sample, have long been a primary means by
which scientists characterize the mechanical properties of materials. More
recently, the advent of atomic force microscopy, which operates on the same
fundamental principle, has in turn revolutionized the nanoscale analysis of
soft biomaterials such as cells and tissues. This paper addresses the
inferential problems associated with material indentation and atomic force
microscopy, through a framework for the changepoint analysis of pre- and
post-contact data that is applicable to experiments across a variety of
physical scales. A hierarchical Bayesian model is proposed to account for
experimentally observed changepoint smoothness constraints and measurement
error variability, with efficient Monte Carlo methods developed and employed to
realize inference via posterior sampling for parameters such as Young's
modulus, a key quantifier of material stiffness. These results are the first to
provide the materials science community with rigorous inference procedures and
uncertainty quantification, via optimized and fully automated high-throughput
algorithms, implemented as the publicly available software package BayesCP. To
demonstrate the consistent accuracy and wide applicability of this approach,
results are shown for a variety of data sets from both macro- and
micro-materials experiments--including silicone, neurons, and red blood
cells--conducted by the authors and others.
Comment: 20 pages, 6 figures; submitted for publication
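The hierarchical model and the BayesCP package itself are not reproduced here, but the changepoint idea can be sketched. The following minimal Python example, built on textbook assumptions rather than the paper's actual specification, simulates a force-distance curve with a flat pre-contact baseline and a Hertzian post-contact response F = k(z - z0)^(3/2), then samples the contact point z0 with a random-walk Metropolis step; the stiffness k is profiled out by least squares purely for brevity.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic force-distance curve: flat zero baseline before contact at z0,
# Hertzian response F = k * (z - z0)^(3/2) afterwards (textbook spherical-tip form).
z = np.linspace(0.0, 10.0, 400)
z0_true, k_true, sigma = 4.0, 0.8, 0.3
force = k_true * np.clip(z - z0_true, 0.0, None) ** 1.5
y = force + rng.normal(0.0, sigma, z.size)

def log_post(z0):
    # Log-posterior of the contact point z0 under a flat prior on the z-range;
    # the stiffness k is profiled out by least squares to keep the sketch short.
    if not (z[0] < z0 < z[-1]):
        return -np.inf
    basis = np.clip(z - z0, 0.0, None) ** 1.5
    k_hat = (basis @ y) / max(basis @ basis, 1e-12)
    resid = y - k_hat * basis
    return -0.5 * (resid @ resid) / sigma ** 2

# Random-walk Metropolis over the changepoint location.
samples, z0, lp = [], 5.0, log_post(5.0)
for _ in range(5000):
    prop = z0 + rng.normal(0.0, 0.2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        z0, lp = prop, lp_prop
    samples.append(z0)

post = np.array(samples[1000:])  # discard burn-in
print(f"posterior contact point: {post.mean():.2f} +/- {post.std():.2f}")

In the paper's setting, the posterior over the post-contact parameters would then feed the Young's modulus estimate; the sketch only shows the changepoint-sampling skeleton.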
Massive MIMO is a Reality -- What is Next? Five Promising Research Directions for Antenna Arrays
Massive MIMO (multiple-input multiple-output) is no longer a "wild" or
"promising" concept for future cellular networks - in 2018 it became a reality.
Base stations (BSs) with 64 fully digital transceiver chains were commercially
deployed in several countries, the key ingredients of Massive MIMO have made it
into the 5G standard, the signal processing methods required to achieve
unprecedented spectral efficiency have been developed, and the limitation due
to pilot contamination has been resolved. Even the development of fully digital
Massive MIMO arrays for mmWave frequencies - once viewed as prohibitively
complicated and costly - is well underway. In a few years, Massive MIMO with
fully digital transceivers will be a mainstream feature at both sub-6 GHz and
mmWave frequencies. In this paper, we explain how the first chapter of the
Massive MIMO research saga has come to an end, while the story has just begun.
The coming wide-scale deployment of BSs with massive antenna arrays opens the
door to a brand new world where spatial processing capabilities are
omnipresent. In addition to mobile broadband services, the antennas can be used
for other communication applications, such as low-power machine-type or
ultra-reliable communications, as well as non-communication applications such
as radar, sensing and positioning. We outline five new Massive MIMO related
research directions: Extremely large aperture arrays, Holographic Massive MIMO,
Six-dimensional positioning, Large-scale MIMO radar, and Intelligent Massive
MIMO.
Comment: 20 pages, 9 figures, submitted to Digital Signal Processing
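The spectral-efficiency gains referred to above come largely from coherent combining over many antennas. As a textbook illustration (not taken from this paper), the Python sketch below estimates the average uplink spectral efficiency of a single user under maximum-ratio combining, perfect CSI, and i.i.d. Rayleigh fading, where the post-combining SNR grows linearly with the number of BS antennas M.

import numpy as np

rng = np.random.default_rng(1)
snr = 1.0  # 0 dB per-antenna SNR, an illustrative value

# With maximum-ratio combining, perfect CSI, i.i.d. Rayleigh fading, and a
# single user, the post-combining SNR is snr * ||h||^2, which grows linearly
# with the number of BS antennas M (the array gain).
for M in (1, 8, 64, 256):
    h = (rng.normal(size=(10000, M)) + 1j * rng.normal(size=(10000, M))) / np.sqrt(2)
    se = np.log2(1.0 + snr * np.sum(np.abs(h) ** 2, axis=1)).mean()
    print(f"M = {M:4d}: average spectral efficiency {se:5.2f} bit/s/Hz")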
Community detection and stochastic block models: recent developments
The stochastic block model (SBM) is a random graph model with planted
clusters. It is widely employed as a canonical model to study clustering and
community detection, and generally provides fertile ground for studying the
statistical and computational tradeoffs that arise in network and data
sciences.
This note surveys the recent developments that establish the fundamental
limits for community detection in the SBM, both with respect to
information-theoretic and computational thresholds, and for various recovery
requirements such as exact, partial and weak recovery (a.k.a. detection). The
main results discussed are the phase transitions for exact recovery at the
Chernoff-Hellinger threshold, the phase transition for weak recovery at the
Kesten-Stigum threshold, the optimal distortion-SNR tradeoff for partial
recovery, the learning of the SBM parameters and the gap between
information-theoretic and computational thresholds.
The note also covers some of the algorithms developed in the quest to achieve
these limits, in particular two-round algorithms via graph-splitting,
semi-definite programming, linearized belief propagation, and classical and
nonbacktracking spectral methods. A few open problems are also discussed.
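As a hedged illustration of the classical spectral methods mentioned above, the Python sketch below samples a symmetric two-community SBM and classifies nodes by the sign of the eigenvector of the second-largest adjacency eigenvalue. The parameters are illustrative and chosen comfortably above the Kesten-Stigum weak-recovery threshold (a - b)^2 > 2(a + b), where a and b are the average intra- and inter-community degrees; in sparser regimes near the threshold, the nonbacktracking spectrum is needed instead.

import numpy as np
import networkx as nx

n, a, b = 1000, 30.0, 6.0          # average intra/inter degrees (illustrative)
assert (a - b) ** 2 > 2 * (a + b)  # Kesten-Stigum condition for weak recovery

sizes = [n // 2, n // 2]
probs = [[a / n, b / n], [b / n, a / n]]
G = nx.stochastic_block_model(sizes, probs, seed=0)

# Classical spectral method: classify nodes by the sign of the eigenvector of
# the second-largest adjacency eigenvalue. This works in the fairly dense
# regime chosen here; down at the KS threshold the nonbacktracking matrix is
# required.
A = nx.to_numpy_array(G)
vals, vecs = np.linalg.eigh(A)     # eigenvalues in ascending order
guess = vecs[:, -2] > 0

truth = np.array([0] * sizes[0] + [1] * sizes[1])
overlap = max(np.mean(guess == truth), np.mean(guess != truth))
print(f"fraction of nodes correctly classified: {overlap:.2f}")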
Estimating the number of endmembers in hyperspectral images using the normal compositional model and a hierarchical Bayesian algorithm.
This paper studies a semi-supervised Bayesian unmixing algorithm for hyperspectral images. The algorithm is based on the normal compositional model recently introduced by Eismann and Stein, which assumes that each pixel of the image is a linear combination of an unknown number of pure materials, called endmembers. However, contrary to the classical linear mixing model, these endmembers are assumed to be random in order to model the uncertainty about them. This paper proposes to estimate the mixture coefficients of the normal compositional model (referred to as abundances) as well as their number using a reversible jump Bayesian algorithm. The performance of the proposed methodology is evaluated through simulations conducted on synthetic and real AVIRIS images.
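The contrast with the linear mixing model can be sketched generatively. In the following Python example (dimensions and variances are illustrative, and the paper's reversible jump sampler is not reproduced), each pixel is a convex combination of endmembers, but under the normal compositional model the endmember spectra are redrawn from a Gaussian for every pixel, injecting the extra variability that models endmember uncertainty.

import numpy as np

rng = np.random.default_rng(2)
L, R, N = 50, 3, 200  # spectral bands, endmembers, pixels (all illustrative)

mean_spectra = rng.uniform(0.1, 0.9, size=(R, L))  # endmember mean spectra
sigma2 = 1e-3                                      # endmember variance (NCM)

# Abundances: nonnegative, summing to one for each pixel (Dirichlet draw).
abund = rng.dirichlet(np.ones(R), size=N)          # shape (N, R)

# Linear mixing model: every pixel mixes the same fixed endmember spectra.
lmm = abund @ mean_spectra

# Normal compositional model: the endmember spectra are redrawn per pixel,
# E[n, r] ~ N(mean_spectra[r], sigma2 * I), modeling endmember uncertainty.
E = mean_spectra + np.sqrt(sigma2) * rng.normal(size=(N, R, L))
ncm = np.einsum("nr,nrl->nl", abund, E)

print("extra variance injected by the NCM:", float((ncm - lmm).var()))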