
    Density of Spherically-Embedded Stiefel and Grassmann Codes

    The density of a code is the fraction of the coding space covered by packing balls centered around the codewords. This paper investigates the density of codes in the complex Stiefel and Grassmann manifolds equipped with the chordal distance. The choice of distance enables the treatment of the manifolds as subspaces of Euclidean hyperspheres. In this geometry, the densest packings are not necessarily equivalent to maximum-minimum-distance codes. Computing a code's density follows from computing: i) the normalized volume of a metric ball and ii) the kissing radius, the radius of the largest balls one can pack around the codewords without overlapping. First, the normalized volume of a metric ball is evaluated by asymptotic approximations. The volume of a small ball can be well-approximated by the volume of a locally-equivalent tangential ball. In order to properly normalize this approximation, the precise volumes of the manifolds induced by their spherical embedding are computed. For larger balls, a hyperspherical cap approximation is used, which is justified by a volume comparison theorem showing that the normalized volume of a ball in the Stiefel or Grassmann manifold is asymptotically equal to the normalized volume of a ball in its embedding sphere as the dimension grows to infinity. Then, bounds on the kissing radius are derived alongside corresponding bounds on the density. Unlike spherical codes or codes in flat spaces, the kissing radius of a Grassmann or Stiefel code cannot be exactly determined from its minimum distance. It is nonetheless possible to derive bounds on density as functions of the minimum distance. Stiefel and Grassmann codes have larger density than their image spherical codes when dimensions tend to infinity. Finally, the bounds on density lead to refinements of the standard Hamming bounds for Stiefel and Grassmann codes.
    Comment: Two-column version (24 pages, 6 figures, 4 tables). To appear in IEEE Transactions on Information Theory.
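
    As a rough illustration of the quantities involved (the notation below is mine, not necessarily the paper's): for a code C = {P_1, ..., P_N} in a manifold M with normalized volume measure mu, and with the kissing radius denoted by varrho, the density is

    \[
    \Delta(\mathcal{C}) \;=\; \mu\!\left(\bigcup_{i=1}^{N} B(P_i,\varrho)\right)
    \;=\; \sum_{i=1}^{N} \mu\big(B(P_i,\varrho)\big)
    \;=\; N\,\mu\big(B_{\varrho}\big),
    \]

    where the second equality holds because balls of kissing radius do not overlap, and the third uses the homogeneity of the Stiefel and Grassmann manifolds. For spherical codes the kissing radius is determined exactly by the minimum distance, whereas on the Stiefel and Grassmann manifolds it is not; this is why the paper bounds the kissing radius, and hence the density, in terms of the minimum distance.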

    Communication Over MIMO Broadcast Channels Using Lattice-Basis Reduction

    A simple scheme for communication over MIMO broadcast channels is introduced which adopts the lattice reduction technique to improve the naive channel inversion method. Lattice-basis reduction helps reduce the average transmitted energy by modifying the region which includes the constellation points. Simulation results show that the proposed scheme performs well and, compared to more complex methods (such as the perturbation method), has a negligible loss. Moreover, the proposed method is extended to the case of different rates for different users. The asymptotic behavior of the symbol error rate of the proposed method and the perturbation technique, and also the outage probability for the case of fixed-rate users, is analyzed. It is shown that the proposed method, based on LLL lattice reduction, achieves the optimum asymptotic slope of symbol error rate (called the precoding diversity). Also, the outage probability for the case of fixed sum-rate is analyzed.
    Comment: Submitted to IEEE Trans. on Info. Theory (Jan. 15, 2006), revised (Jun. 12, 2007).
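
    To make the mechanism concrete, here is a minimal sketch in the spirit of the abstract: the transmitter may shift the data point by a vector of the lattice generated by a scaled inverse channel, and an LLL-reduced basis makes a simple Babai-style rounding choice of that shift effective at lowering the transmit energy. This is not the paper's exact scheme: the constellation-region/modulo handling is omitted, the channel is a real-valued toy, and the lattice spacing tau is an assumption of the example.

```python
import numpy as np

def lll_reduce(B, delta=0.75):
    """Textbook LLL reduction of the lattice spanned by the columns of B.
    Returns the reduced basis B @ U and the unimodular transform U.
    Gram-Schmidt is recomputed from scratch after every update: slow but simple."""
    B = B.astype(float).copy()
    n = B.shape[1]
    U = np.eye(n, dtype=int)

    def gram_schmidt(B):
        Q = np.zeros_like(B)
        mu = np.zeros((n, n))
        for i in range(n):
            Q[:, i] = B[:, i]
            for j in range(i):
                mu[i, j] = (B[:, i] @ Q[:, j]) / (Q[:, j] @ Q[:, j])
                Q[:, i] -= mu[i, j] * Q[:, j]
        return Q, mu

    Q, mu = gram_schmidt(B)
    k = 1
    while k < n:
        for j in range(k - 1, -1, -1):           # size reduction
            q = int(np.rint(mu[k, j]))
            if q != 0:
                B[:, k] -= q * B[:, j]
                U[:, k] -= q * U[:, j]
                Q, mu = gram_schmidt(B)
        if Q[:, k] @ Q[:, k] >= (delta - mu[k, k - 1] ** 2) * (Q[:, k - 1] @ Q[:, k - 1]):
            k += 1                               # Lovasz condition holds
        else:
            B[:, [k - 1, k]] = B[:, [k, k - 1]]  # swap columns and step back
            U[:, [k - 1, k]] = U[:, [k, k - 1]]
            Q, mu = gram_schmidt(B)
            k = max(k - 1, 1)
    return B, U

# Toy comparison: naive channel inversion vs. lattice-reduction-aided precoding.
rng = np.random.default_rng(0)
n, tau = 4, 8                                    # tau: assumed lattice spacing
H = rng.standard_normal((n, n))
Hinv = np.linalg.inv(H)
u = rng.integers(-2, 3, size=n).astype(float)    # data from a small integer constellation

x_naive = Hinv @ u                               # plain channel inversion

Bred, U = lll_reduce(tau * Hinv)                 # reduced basis of the shift lattice
a = -np.rint(np.linalg.solve(Bred, Hinv @ u))    # Babai rounding against the reduced basis
x_lra = Hinv @ u + Bred @ a                      # shifted point needs less energy on average

print(np.linalg.norm(x_naive), np.linalg.norm(x_lra))
```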

    Of beta diversity, variance, evenness, and dissimilarity

    The amount of variation in species composition among sampling units, or beta diversity, has become a primary tool for connecting the spatial structure of species assemblages to ecological processes. Many different measures of beta diversity have been developed. Among them, the total variance in the community composition matrix has been proposed as a single-number estimate of beta diversity. In this study, I first show that this measure summarizes the compositional variation among sampling units after a nonlinear transformation of species abundances; therefore, it is not always adequate for estimating beta diversity. Next, I propose an alternative approach for calculating beta diversity in which the variance is replaced by a weighted measure of concentration (i.e., an inverse measure of evenness). The relationship between this new measure of beta diversity and so-called multiple-site dissimilarity measures is also discussed.
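
    For concreteness, the single-number estimate under discussion, the total variance of the (transformed) community composition matrix, can be computed as in the sketch below. This illustrates the existing measure that the abstract argues is not always adequate, not the evenness-based alternative proposed here; the Hellinger transformation is an assumption of the example, standing in for the nonlinear transformation of abundances mentioned above.

```python
import numpy as np

def bd_total(Y, hellinger=True):
    """Total variance of a community composition matrix (sampling units x species):
    one proposed single-number estimate of beta diversity. The Hellinger
    transformation applied first is an assumption of this sketch."""
    Y = np.asarray(Y, dtype=float)
    if hellinger:
        Y = np.sqrt(Y / Y.sum(axis=1, keepdims=True))  # rows need nonzero totals
    dev = Y - Y.mean(axis=0)       # deviations from species (column) means
    ss_total = (dev ** 2).sum()    # total sum of squares
    return ss_total / (Y.shape[0] - 1)

# Toy community table: 4 sampling units, 3 species.
Y = np.array([[10, 0, 2],
              [ 8, 1, 3],
              [ 0, 9, 5],
              [ 1, 7, 6]])
print(bd_total(Y))
```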

    On the Proximity Factors of Lattice Reduction-Aided Decoding

    Lattice reduction-aided decoding features reduced decoding complexity and near-optimum performance in multi-input multi-output communications. In this paper, a quantitative analysis of lattice reduction-aided decoding is presented. To this aim, the proximity factors are defined to measure the worst-case losses in distances relative to closest-point search (in an infinite lattice). Upper bounds on the proximity factors are derived, which are functions of the dimension n of the lattice alone. The study is then extended to dual-basis reduction. It is found that the bounds for dual-basis reduction may be smaller. Reasonably good bounds are derived in many cases. The constant bounds on proximity factors not only imply the same diversity order in fading channels, but also relate the error probabilities of (infinite) lattice decoding and lattice reduction-aided decoding.
    Comment: remove redundant figure.
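
    In rough terms (my notation, not necessarily the paper's exact statement): if, for every n-dimensional lattice, the distance loss of lattice reduction-aided decoding relative to closest-point search is bounded by a proximity factor F(n) depending on the dimension alone, then the error probabilities are related essentially by an SNR shift,

    \[
    P_e^{\mathrm{LRA}}(\mathrm{SNR}) \;\lesssim\; P_e^{\mathrm{CLP}}\!\left(\mathrm{SNR}/F(n)\right),
    \]

    i.e., a gap of at most 10 log10 F(n) dB. Because the shift does not depend on the channel realization, the slope of the error curve, and hence the diversity order, is preserved.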

    Profile identification via weighted related metric scaling: an application to dependent Spanish children

    Disability and dependency (lack of autonomy in performing common everyday actions) affect health status and quality of life; therefore they are significant public health issues. The main purpose of this study is to establish the existing relationship among different variables (continuous, categorical and binary) referring to children between 3 and 6 years old and their functional dependence in basic activities of daily living. We combine different types of information via weighted related metric scaling to obtain homogeneous profiles for dependent Spanish children. The redundant information between groups of variables is modeled with an interaction parameter that can be optimized according to several criteria. In this paper, the goal is to obtain maximum explained variability in a Euclidean configuration. Data come from the Survey about Disabilities, Personal Autonomy and Dependence Situations, EDAD 2008 (Spanish National Institute of Statistics, 2008).
    Keywords: ADL, disability, mixed-type data, public health, related metric scaling
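
    The geometric machinery can be sketched as follows: each group of variables (continuous, categorical, binary) yields its own distance matrix; the matrices are double-centered into Gram (inner-product) matrices and combined, with an interaction term that absorbs redundant information; a principal-coordinates decomposition of the combined matrix then gives the Euclidean configuration and its explained variability. The combination rule below follows the general related-metric-scaling recipe only as an illustrative stand-in: the weighting and the criterion actually optimized in the paper may differ, and the parameter theta is an assumption of the sketch.

```python
import numpy as np

def gram_from_distances(D):
    """Double-center a distance matrix into a Gram (inner-product) matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    return -0.5 * J @ (D ** 2) @ J

def sqrtm_psd(G):
    """Symmetric square root of a (numerically) positive semi-definite matrix."""
    w, V = np.linalg.eigh(G)
    w = np.clip(w, 0.0, None)
    return V @ np.diag(np.sqrt(w)) @ V.T

def combined_configuration(D1, D2, theta=0.5, dim=2):
    """Euclidean configuration from two distance matrices (e.g. one per variable
    type), combining their Gram matrices with an interaction term weighted by
    theta. Illustrative related-metric-scaling-style combination only."""
    G1, G2 = gram_from_distances(D1), gram_from_distances(D2)
    S1, S2 = sqrtm_psd(G1), sqrtm_psd(G2)
    G = G1 + G2 - theta * (S1 @ S2 + S2 @ S1)   # interaction removes redundancy
    w, V = np.linalg.eigh(G)
    order = np.argsort(w)[::-1]
    w, V = w[order], V[:, order]
    wpos = np.clip(w, 0.0, None)
    coords = V[:, :dim] * np.sqrt(wpos[:dim])
    explained = wpos[:dim].sum() / wpos.sum()   # explained variability
    return coords, explained

# Example: distances from two hypothetical blocks of variables for 6 children.
X1 = np.random.default_rng(2).standard_normal((6, 3))       # continuous block
X2 = np.random.default_rng(3).integers(0, 2, size=(6, 4))   # binary block
D1 = np.linalg.norm(X1[:, None] - X1[None], axis=-1)        # Euclidean distances
D2 = np.abs(X2[:, None] - X2[None]).mean(axis=-1)           # mismatch proportion
coords, explained = combined_configuration(D1, D2, theta=0.5)
print(explained)
```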

    Robust MMSE Precoding Strategy for Multiuser MIMO Relay Systems with Switched Relaying and Side Information

    In this work, we propose a minimum mean squared error (MMSE) robust base station (BS) precoding strategy based on switched relaying (SR) processing and limited transmission of side information for interference suppression in the downlink of multiuser multiple-input multiple-output (MIMO) relay systems. The BS and the MIMO relay station (RS) are both equipped with a codebook of interleaving matrices. For a given channel state information (CSI), the selection function at the BS chooses the optimum interleaving matrix from the codebook, based on two optimization criteria, to design the robust precoder. Prior to the payload transmission, the BS sends the index corresponding to the selected interleaving matrix to the RS, where the best interleaving matrix is selected to build the optimum relay processing matrix. The entries of the codebook are randomly generated unitary matrices. Simulation results show that the performance of the proposed techniques is significantly better than that of prior art in the case of imperfect CSI.
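
    To make the codebook-selection step concrete, here is a minimal single-hop stand-in; the two-hop BS-relay-user channel, the robustness to CSI errors, and the relay processing matrix are not reproduced. A codebook of random unitary interleaving matrices is generated once, and for a given channel the index minimizing a regularized sum-MSE objective is selected; that index is what would be signalled to the relay as side information. The selection metric and the regularization constant are assumptions of this sketch, standing in for the paper's two optimization criteria.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_unitary(n):
    """Random unitary interleaving matrix via QR of a complex Gaussian matrix."""
    A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    Q, _ = np.linalg.qr(A)
    return Q

def mmse_precoder(H, reg):
    """Regularized channel inversion (linear MMSE-style precoder), unnormalized."""
    k = H.shape[0]
    return H.conj().T @ np.linalg.inv(H @ H.conj().T + reg * np.eye(k))

def select_interleaver(H, codebook, reg):
    """Pick the codebook index whose interleaved channel yields the smallest
    regularized sum-MSE objective ||I - H T F||_F^2 + reg * ||F||_F^2."""
    best_idx, best_val = None, np.inf
    for idx, T in enumerate(codebook):
        Heff = H @ T                                    # interleaved effective channel
        F = mmse_precoder(Heff, reg)
        E = np.eye(H.shape[0]) - Heff @ F               # residual error matrix
        val = np.linalg.norm(E, "fro") ** 2 + reg * np.linalg.norm(F, "fro") ** 2
        if val < best_val:
            best_idx, best_val = idx, val
    return best_idx, best_val

n_users = n_tx = 4
codebook = [random_unitary(n_tx) for _ in range(8)]     # shared BS/RS codebook
H = rng.standard_normal((n_users, n_tx)) + 1j * rng.standard_normal((n_users, n_tx))
idx, val = select_interleaver(H, codebook, reg=0.1)
print("index signalled as side information:", idx)
```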