Optimization of Massive Full-Dimensional MIMO for Positioning and Communication
Massive Full-Dimensional multiple-input multiple-output (FD-MIMO) base
stations (BSs) have the potential to bring multiplexing and coverage gains by
means of three-dimensional (3D) beamforming. Key technical challenges for their
deployment include the presence of limited-resolution front ends and the
acquisition of channel state information (CSI) at the BSs. This paper
investigates the use of FD-MIMO BSs to provide simultaneously high-rate data
communication and mobile 3D positioning in the downlink. The analysis
concentrates on the problem of beamforming design by accounting for imperfect
CSI acquisition via Time Division Duplex (TDD)-based training and for the
finite resolution of analog-to-digital converter (ADC) and digital-to-analog
converter (DAC) at the BSs. Both unstructured beamforming and a
low-complexity Kronecker beamforming solution are considered, where
for the latter the beamforming vectors are decomposed into separate azimuth and
elevation components. The proposed algorithmic solutions are based on the
Bussgang theorem, rank relaxation, and successive convex approximation (SCA) methods.
Comprehensive numerical results demonstrate that the proposed schemes can
effectively cater to both data communication and positioning services,
providing only minor performance degradations as compared to the more
conventional cases in which either function is implemented. Moreover, the
proposed low-complexity Kronecker beamforming solutions are seen to guarantee a
limited performance loss in the presence of a large number of BS antennas.
Comment: 30 pages, 6 figures
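The Kronecker structure described in the abstract can be illustrated with a hedged sketch: for a uniform planar array, the full 3D beamforming vector factors into separate azimuth and elevation steering components via a Kronecker product, so far fewer coefficients need to be optimized. The array sizes, angles, and half-wavelength spacing below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def steering(n, phase_step):
    """Unit-norm uniform-linear-array steering vector (half-wavelength
    element spacing assumed, hence the pi factor)."""
    return np.exp(1j * np.pi * np.arange(n) * phase_step) / np.sqrt(n)

# Hypothetical 8 x 4 uniform planar array: 8 azimuth, 4 elevation elements.
az = steering(8, np.sin(0.3))    # azimuth component (illustrative angle)
el = steering(4, np.sin(-0.1))   # elevation component (illustrative angle)

# Kronecker-structured beamformer: the full 32-element vector is az (x) el,
# so only 8 + 4 coefficients are designed instead of 32.
w = np.kron(az, el)
```

The reduction from 32 to 12 design variables is what makes the low-complexity solution attractive as the number of BS antennas grows.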
Person re-identification by robust canonical correlation analysis
Person re-identification is the task of matching people across surveillance cameras at different times and locations. Due to significant view and pose changes across non-overlapping cameras, directly matching data from different views is a challenging problem. In this letter, we propose a robust canonical correlation analysis (ROCCA) to match people from different views in a coherent subspace. Given a small training set, as in most re-identification problems, direct application of canonical correlation analysis (CCA) may lead to poor performance due to inaccuracy in estimating the data covariance matrices. The proposed ROCCA, with shrinkage estimation and a smoothing technique, is simple to implement and can robustly estimate the data covariance matrices with limited training samples. Experimental results on two publicly available datasets show that the proposed ROCCA outperforms regularized CCA (RCCA), and achieves state-of-the-art matching results for person re-identification as compared to the most recent methods.
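A minimal sketch of the shrinkage idea behind robust CCA: blend each sample covariance with a scaled-identity target before whitening, so the estimates stay well conditioned with few samples. The shrinkage weight `alpha` and the overall structure are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def shrink_cov(X, alpha=0.2):
    """Shrinkage covariance: convex blend of the sample covariance with a
    scaled identity target (alpha is a hypothetical shrinkage weight)."""
    S = np.cov(X, rowvar=False)
    d = S.shape[0]
    return (1 - alpha) * S + alpha * (np.trace(S) / d) * np.eye(d)

def cca_pair(X, Y, alpha=0.2):
    """First canonical direction pair: SVD of the cross-covariance whitened
    by Cholesky factors of the shrunk covariances."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Cxy = Xc.T @ Yc / (len(X) - 1)
    Lx = np.linalg.cholesky(shrink_cov(X, alpha))
    Ly = np.linalg.cholesky(shrink_cov(Y, alpha))
    M = np.linalg.solve(Lx, Cxy) @ np.linalg.inv(Ly).T   # Lx^-1 Cxy Ly^-T
    U, s, Vt = np.linalg.svd(M)
    a = np.linalg.solve(Lx.T, U[:, 0])   # un-whiten the top singular pair
    b = np.linalg.solve(Ly.T, Vt[0])
    return a, b, s[0]
```

Without the shrinkage step, the Cholesky factorization can fail (or be badly conditioned) whenever the number of training samples is smaller than the feature dimension, which is exactly the small-training-set regime the abstract describes.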
Quantum field tomography
We introduce the concept of quantum field tomography, the efficient and
reliable reconstruction of unknown quantum fields based on data of correlation
functions. At the basis of the analysis is the concept of continuous matrix
product states, a complete set of variational states capable of capturing states
in quantum field theory. We develop a practical method, making use of and
extending tools from estimation theory used in the context of compressed sensing, such as
Prony methods and matrix pencils, allowing us to faithfully reconstruct quantum
field states based on low-order correlation functions. In the absence of a
phase reference, we highlight how specific higher order correlation functions
can still be predicted. We exemplify the functioning of the approach by
reconstructing randomised continuous matrix product states from their
correlation data and study the robustness of the reconstruction for different
noise models. We also apply the method to data generated by simulations based
on continuous matrix product states and using the time-dependent variational
principle. The presented approach is expected to open up a new window into
experimentally studying continuous quantum systems, such as encountered in
experiments with ultra-cold atoms on top of atom chips. By virtue of the
analogy with the input-output formalism in quantum optics, it also allows for
studying open quantum systems.
Comment: 31 pages, 5 figures, minor changes
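The matrix-pencil step mentioned above can be sketched on synthetic data: the poles of a short sum of complex exponentials (standing in for low-order correlation-function data) appear as the dominant eigenvalues of a shifted pair of Hankel matrices. All signal parameters here are invented for illustration and carry no physical meaning.

```python
import numpy as np

# Hypothetical signal: two decaying complex exponentials on a uniform grid.
poles_true = np.array([0.9 * np.exp(1j * 0.5), 0.7 * np.exp(-1j * 1.2)])
n = np.arange(40)
y = sum(p ** n for p in poles_true)

# Matrix pencil: a Hankel matrix H and its one-step shift share the signal
# poles as (generalized) eigenvalues; the pencil size M is a free choice.
M = 20
H = np.array([y[i:i + M + 1] for i in range(len(y) - M)])
Y0, Y1 = H[:, :-1], H[:, 1:]          # shifted Hankel pair
ev = np.linalg.eigvals(np.linalg.pinv(Y0) @ Y1)
# The two largest-magnitude eigenvalues are the estimated poles;
# the rest are numerically zero because the signal has rank 2.
poles_est = ev[np.argsort(-np.abs(ev))[:2]]
```

In the noise-free case the recovery is exact up to numerical precision; the abstract's robustness study concerns how this degrades under realistic noise models.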
Machine Learning and Integrative Analysis of Biomedical Big Data.
Recent developments in high-throughput technologies have accelerated the accumulation of massive amounts of omics data from multiple sources: genome, epigenome, transcriptome, proteome, metabolome, etc. Traditionally, data from each source (e.g., genome) is analyzed in isolation using statistical and machine learning (ML) methods. Integrative analysis of multi-omics and clinical data is key to new biomedical discoveries and advancements in precision medicine. However, data integration poses new computational challenges as well as exacerbating those associated with single-omics studies. Specialized computational approaches are required to effectively and efficiently perform integrative analysis of biomedical data acquired from diverse modalities. In this review, we discuss state-of-the-art ML-based approaches for tackling five specific computational challenges associated with integrative analysis: the curse of dimensionality, data heterogeneity, missing data, class imbalance, and scalability issues.
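As one concrete instance of the class-imbalance challenge listed above, a common baseline is random oversampling: resample every class with replacement up to the majority-class count. The sketch below is a generic baseline, not a method from the review.

```python
import numpy as np

def oversample(X, y, seed=0):
    """Random oversampling baseline for class imbalance: resample each
    class with replacement until all classes match the majority count."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    idx = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=target, replace=True)
        for c in classes
    ])
    return X[idx], y[idx]
```

More refined strategies (synthetic minority samples, class-weighted losses) follow the same pattern of reweighting the minority class's contribution to training.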
Designing structured tight frames via an alternating projection method
Tight frames, also known as general Welch-bound-equality sequences, generalize orthonormal systems. Numerous applications - including communications, coding, and sparse approximation - require finite-dimensional tight frames that possess additional structural properties. This paper proposes an alternating projection method that is versatile enough to solve a huge class of inverse eigenvalue problems (IEPs), which includes the frame design problem. To apply this method, one needs only to solve a matrix nearness problem that arises naturally from the design specifications. Therefore, it is fast and easy to develop versions of the algorithm that target new design problems. Alternating projection will often succeed even when algebraic constructions are unavailable. To demonstrate that alternating projection is an effective tool for frame design, the paper studies some important structural properties in detail. First, it addresses the most basic design problem: constructing tight frames with prescribed vector norms. Then, it discusses equiangular tight frames, which are natural dictionaries for sparse approximation. Finally, it examines tight frames whose individual vectors have low peak-to-average-power ratio (PAR), which is a valuable property for code-division multiple-access (CDMA) applications. Numerical experiments show that the proposed algorithm succeeds in each of these three cases. The appendices investigate the convergence properties of the algorithm.
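The alternating projection idea can be sketched for the most basic design problem mentioned above, unit-norm tight frames: alternate between the nearest tight frame (force all singular values to sqrt(N/d), so F F^T = (N/d) I) and the nearest unit-norm frame (rescale columns). Dimensions and iteration count below are illustrative choices, not from the paper.

```python
import numpy as np

def nearest_tight(F):
    """Spectral projection: set every singular value to sqrt(N/d), the
    value at which F F^T = (N/d) I holds for a tight frame."""
    d, N = F.shape
    U, _, Vt = np.linalg.svd(F, full_matrices=False)
    return np.sqrt(N / d) * U @ Vt

def unit_columns(F):
    """Structural projection: rescale each frame vector to unit norm."""
    return F / np.linalg.norm(F, axis=0, keepdims=True)

rng = np.random.default_rng(1)
F = rng.normal(size=(3, 7))        # hypothetical frame: d = 3, N = 7
for _ in range(200):               # alternate the two projections
    F = unit_columns(nearest_tight(F))
```

Each iteration solves exactly the kind of matrix nearness problem the abstract describes, which is why adapting the algorithm to a new structural constraint only requires swapping out one projection.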
Selection Bias in News Coverage: Learning it, Fighting it
News entities must select and filter the coverage they broadcast through
their respective channels since the set of world events is too large to be
treated exhaustively. The subjective nature of this filtering induces biases
due to, among other things, resource constraints, editorial guidelines,
ideological affinities, or even the fragmented nature of the information at a
journalist's disposal. The magnitude and direction of these biases are,
however, widely unknown. The absence of ground truth, the sheer size of the
event space, or the lack of an exhaustive set of absolute features to measure
make it difficult to observe the bias directly, to characterize the leaning's
nature and to factor it out to ensure a neutral coverage of the news. In this
work, we introduce a methodology to capture the latent structure of media's
decision process on a large scale. Our contribution is multi-fold. First, we
show media coverage to be predictable using personalization techniques, and
evaluate our approach on a large set of events collected from the GDELT
database. We then show that a personalized and parametrized approach not only
exhibits higher accuracy in coverage prediction, but also provides an
interpretable representation of the selection bias. Last, we propose a method
able to select a set of sources by leveraging the latent representation. These
selected sources provide a more diverse and egalitarian coverage, all while
retaining the most actively covered events.
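The personalized coverage-prediction idea can be illustrated with a toy model: a logistic low-rank factorization in which each source gets a latent vector that acts as an interpretable selection profile. The data, dimensions, and training loop below are synthetic stand-ins, not the paper's actual GDELT pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sources, n_events, k = 5, 40, 3
# Synthetic binary coverage matrix standing in for GDELT-style data
# (1 = source covered the event); all sizes are made up for illustration.
C = (rng.random((n_sources, n_events)) < 0.3).astype(float)

# Personalized model: P(source s covers event e) = sigmoid(u_s . v_e + b_s);
# the learned u_s is the per-source latent selection profile.
U = 0.1 * rng.normal(size=(n_sources, k))
V = 0.1 * rng.normal(size=(n_events, k))
b = np.zeros(n_sources)
lr = 0.5
for _ in range(500):               # full-batch gradient descent on log loss
    P = 1 / (1 + np.exp(-(U @ V.T + b[:, None])))
    G = P - C                      # gradient of logistic loss w.r.t. logits
    U -= lr * G @ V / n_events
    V -= lr * G.T @ U / n_sources
    b -= lr * G.mean(axis=1)
```

Comparing latent profiles across sources is one simple way to make a selection bias inspectable, and to pick a set of sources whose profiles jointly span the event space.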
Understanding Compressive Adversarial Privacy
Designing a data sharing mechanism without sacrificing too much privacy can
be considered as a game between data holders and malicious attackers. This
paper describes a compressive adversarial privacy framework that captures the
trade-off between the data privacy and utility. We characterize the optimal
data releasing mechanism through convex optimization when assuming that both
the data holder and attacker can only modify the data using linear
transformations. We then build a more realistic data releasing mechanism that
can rely on a nonlinear compression model while the attacker uses a neural
network. We demonstrate in a series of empirical applications that this
compressive adversarial privacy framework can protect sensitive
information.
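A heavily simplified instance of the linear releasing mechanism: project the data onto the orthogonal complement of a known sensitive direction before release, so a linear least-squares attacker recovers essentially nothing. This toy special case only illustrates the linear-transformation setting; it is not the paper's convex-optimization characterization.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 4
X = rng.normal(size=(n, d))
v = np.array([1.0, 0, 0, 0])      # hypothetical sensitive direction
s = X @ v                          # sensitive attribute = first feature

# Data holder's linear mechanism: remove the sensitive direction,
# then release the projected data.
P = np.eye(d) - np.outer(v, v)     # projector onto the orthogonal complement
Z = X @ P                          # released data

# Linear attacker: least-squares regression of s on the release.
w, *_ = np.linalg.lstsq(Z, s, rcond=None)
s_hat = Z @ w                      # attacker's best linear reconstruction
```

The remaining coordinates of `Z` are untouched, which is the utility side of the trade-off; the framework in the paper chooses the releasing map by optimizing this trade-off rather than fixing the sensitive direction by hand.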