
    Universal Compressed Sensing

    In this paper, the problem of developing universal algorithms for compressed sensing of stochastic processes is studied. First, Rényi's notion of information dimension (ID) is generalized to analog stationary processes. This provides a measure of complexity for such processes and is connected to the number of measurements required for their accurate recovery. Then a minimum entropy pursuit (MEP) optimization approach is proposed, and it is proven to reliably recover any stationary process satisfying certain mixing conditions from a sufficient number of randomized linear measurements, without any prior information about the distribution of the process. It is proved that a Lagrangian-type approximation of the MEP optimization problem, referred to as the Lagrangian-MEP problem, is identical to a heuristic implementable algorithm proposed by Baron et al. It is shown that, for the right choice of parameters, the Lagrangian-MEP algorithm, in addition to having the same asymptotic performance as MEP optimization, is also robust to measurement noise. For memoryless sources with a discrete-continuous mixture distribution, the fundamental limits on the minimum number of measurements required by a non-universal compressed sensing decoder were characterized by Wu et al. For such sources, it is proved that there is no loss in universal coding, and both MEP and Lagrangian-MEP asymptotically achieve the optimal performance.
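    For readers unfamiliar with the quantity, Rényi's information dimension of a scalar source can be estimated numerically as the entropy of a finely quantized sample divided by the log of the number of quantization levels. The Python sketch below (names and parameters are ours, not the paper's) illustrates this for a memoryless discrete-continuous mixture, the class of sources for which the abstract states the fundamental limits are known; the estimate should land near the weight of the continuous component.

```python
import numpy as np

def quantized_entropy(x, m):
    """Empirical entropy (bits) of x quantized to a grid of step 1/m."""
    q = np.floor(x * m)
    _, counts = np.unique(q, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def estimate_information_dimension(x, m=2**12):
    """Estimate Renyi's information dimension as H([X]_m) / log2(m)."""
    return quantized_entropy(x, m) / np.log2(m)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 200_000, 0.3                        # P(nonzero) = 0.3 (illustrative)
    mask = rng.random(n) < p
    x = np.where(mask, rng.random(n), 0.0)     # Bernoulli(0.3)-uniform mixture
    # For this discrete-continuous mixture the ID equals the weight of the
    # continuous part (0.3); at finite m the estimate sits slightly above it
    # because of an H(mask)/log2(m) correction.
    print(f"estimated ID ~ {estimate_information_dimension(x):.3f} (true ID = {p})")
```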

    Compression-Based Compressed Sensing

    Modern compression algorithms exploit complex structures present in signals to describe them very efficiently. On the other hand, the field of compressed sensing is built upon the observation that "structured" signals can be recovered from an under-determined set of linear projections. Currently, there is a large gap between the complexity of the structures studied in the area of compressed sensing and those employed by state-of-the-art compression codes. Recent results in the literature on deterministic signals aim at bridging this gap by devising compressed sensing decoders that employ compression codes. This paper focuses on structured stochastic processes and studies the application of rate-distortion codes to compressed sensing of such signals. The performance of the previously proposed compressible signal pursuit (CSP) algorithm is studied in this stochastic setting. It is proved that in the very low distortion regime, as the blocklength grows to infinity, the CSP algorithm reliably and robustly recovers n instances of a stationary process from random linear projections, as long as their count is slightly more than n times the rate-distortion dimension (RDD) of the source. It is also shown that, under some regularity conditions, the RDD of a stationary process is equal to its information dimension (ID). This connection establishes the optimality of the CSP algorithm, at least for memoryless stationary sources, for which the fundamental limits are known. Finally, it is shown that the CSP algorithm combined with a family of universal variable-length fixed-distortion compression codes yields a family of universal compressed sensing recovery algorithms.
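    As a pointer for readers, one common normalization of the rate-distortion dimension referenced above (usually attributed to Kawabata and Dembo, and stated here from general knowledge rather than from the paper itself) is the low-distortion slope of the rate-distortion function under squared-error distortion:

```latex
% Per-symbol rate-distortion dimension under squared-error distortion
% (one common normalization; the paper may state it slightly differently):
\[
  \dim_R(\mathbf{X}) \;=\; \lim_{D \to 0} \frac{2\,R(D)}{\log\frac{1}{D}},
\]
% where R(D) is the per-symbol rate-distortion function of the stationary
% process. The abstract's claim is that, under regularity conditions,
% \dim_R equals the information dimension, so the CSP decoder succeeds with
% slightly more than n \cdot \dim_R(\mathbf{X}) random projections.
```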

    Outage Performance of Uplink Two-tier Networks Under Backhaul Constraints

    Multi-tier cellular communication networks constitute a promising approach to expanding the coverage of cellular networks and enabling them to offer higher data rates. In this paper, an uplink two-tier communication network is studied, in which macro users, femto users, and femto access points are distributed inside the coverage area of a macro base station according to Poisson point processes. Each femtocell is assumed to have a fixed backhaul constraint that limits the maximum number of femto and macro users it can serve. Under this backhaul constraint, the network adopts a special open access policy, in which each macro user is assigned either to its closest femto access point or to the macro base station, depending on the ratio of its distances to the two. Under this model, upper and lower bounds on the outage probabilities experienced by users served by femto access points are derived as functions of the distance between the macro base station and the femto access point serving them. Similarly, upper and lower bounds on the outage probabilities of the users served by the macro base station are obtained. The bounds in both cases are confirmed via simulation results.
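    A minimal simulation sketch of the access rule as we read it is given below: femto access points and macro users are dropped as Poisson point processes in a disc around the macro base station, and a user is offloaded to its closest femto access point only if the ratio of that distance to its distance from the base station falls below a threshold and the femtocell still has backhaul capacity. All densities, the threshold, and the backhaul cap are illustrative placeholders, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(1)

R_CELL = 500.0        # macrocell radius in metres (illustrative)
LAMBDA_FAP = 40e-6    # femto-AP density per m^2 (assumed)
LAMBDA_MU = 200e-6    # macro-user density per m^2 (assumed)
RATIO_THRESH = 0.5    # join the FAP if d_FAP / d_MBS < RATIO_THRESH (assumed)
BACKHAUL_CAP = 4      # max users one femtocell can serve (assumed)

def poisson_points(density, radius):
    """Sample a homogeneous PPP on a disc of the given radius centred at 0."""
    n = rng.poisson(density * np.pi * radius**2)
    r = radius * np.sqrt(rng.random(n))
    theta = 2 * np.pi * rng.random(n)
    return np.column_stack([r * np.cos(theta), r * np.sin(theta)])

faps = poisson_points(LAMBDA_FAP, R_CELL)
users = poisson_points(LAMBDA_MU, R_CELL)

load = np.zeros(len(faps), dtype=int)
offloaded = 0
for u in users:
    d_mbs = np.linalg.norm(u)                 # macro base station at the origin
    d_fap = np.linalg.norm(faps - u, axis=1)
    k = int(np.argmin(d_fap))
    if d_fap[k] / d_mbs < RATIO_THRESH and load[k] < BACKHAUL_CAP:
        load[k] += 1                          # open access: offload to the FAP
        offloaded += 1
print(f"{offloaded}/{len(users)} macro users offloaded to femto access points")
```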

    TRANSQLATION: TRANsformer-based SQL RecommendATION

    The exponential growth of data production emphasizes the importance of database management systems (DBMS) for managing vast amounts of data. However, the complexity of writing Structured Query Language (SQL) queries requires a diverse range of skills, which can be a challenge for many users. Different approaches have been proposed to address this challenge by helping SQL users mitigate their skill gaps. One of these approaches is to design recommendation systems that provide several suggestions to users for writing their next SQL queries. Despite the availability of such recommendation systems, they often have several limitations, such as lacking sequence-awareness, session-awareness, and context-awareness. In this paper, we propose TRANSQLATION, a session-aware and sequence-aware recommendation system that recommends the fragments of the subsequent SQL query in a user session. We demonstrate that TRANSQLATION outperforms existing works, achieving, on average, 22% higher recommendation accuracy when a large amount of data is available, and remaining effective even when training data is limited. We further demonstrate that considering contextual similarity is a critical aspect that can enhance the accuracy and relevance of recommendations in query recommendation systems.

    Acute respiratory distress syndrome: new definition, current and future therapeutic options.

    Since acute respiratory distress syndrome (ARDS) was first described in 1967, there has been a large number of studies addressing its pathogenesis and therapies. Despite this intense research activity, there are very few effective therapies for ARDS other than the use of lung protection strategies. This lack of therapeutic modalities is related not only to the complex pathogenesis of this syndrome but also to the insensitive and nonspecific criteria used to diagnose ARDS. This review article summarizes the key features of the new definition of ARDS and provides a brief overview of innovative therapeutic options being assessed in the management of ARDS.

    Mediastinal Synovial Sarcoma.

    The mediastinum is an uncommon site for synovial sarcoma, a rare soft tissue malignancy. An 18-year-old boy was admitted with right-sided chest pain of one year's duration. Based on the chest CT findings, a tru-cut biopsy of the mass was performed, and histopathology was initially reported as neurofibroma. The tumor was excised via posterolateral thoracotomy, and excisional biopsy established the diagnosis of primary synovial sarcoma arising from the mediastinum. The patient was started on adjuvant chemotherapy at the oncology tumor board's recommendation and received six cycles of adriamycin and ifosfamide. At last follow-up, he was free of tumor recurrence.

    Impaired local dynamic stability during treadmill walking predicts future falls in patients with multiple sclerosis:A prospective cohort study

    Background: Falling is a significant problem in patients with multiple sclerosis (MS), and the majority of falls occur during dynamic activities. Recently, evidence has emerged linking falls to the local stability of walking, based on dynamical systems theory, in the elderly as well as in patients with cerebral concussion. However, in patients with MS, this relationship has not been fully investigated. The aim of this study was to investigate the local stability of walking as a risk factor for falling in patients with MS. Methods: Seventy patients were assessed while walking at their preferred speed on a treadmill under single- and dual-task conditions. A cognitive task (backward counting) was used to assess the importance of dual tasking to fall risk. Trunk kinematics were collected using a cluster marker at the level of T7 and a 7-camera motion capture system. To quantify the local stability of walking, the maximal finite-time Lyapunov exponent was calculated from a 12-dimensional state-space reconstruction based on the 3-dimensional trunk linear and angular velocity time series. Participants were classified as fallers (≥1 fall) or non-fallers based on their prospective fall occurrence. Findings: 30 (43%) participants recorded ≥1 fall and were classified as fallers. Multiple logistic regression analysis revealed that short-term local dynamic stability in the single-task condition (P < 0.05, odds ratio = 2.214 (1.037–4.726)) was a significant fall predictor. Interpretation: The results may indicate that the assessment of the local stability of walking can identify patients who would benefit from gait retraining and fall prevention programs.
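    The short-term local dynamic stability measure mentioned above is, as we read it, a maximal finite-time Lyapunov exponent computed on a reconstructed trunk-velocity state space (presumably the six linear and angular velocity components plus one time-delayed copy of each). The sketch below shows a generic Rosenstein-style estimator of such an exponent; it is not the authors' exact pipeline, and the sampling rate, fitting horizon, and Theiler window are placeholders.

```python
import numpy as np

def max_lyapunov_rosenstein(state, fs, horizon_s=0.5, theiler=50):
    """
    Short-term maximal finite-time Lyapunov exponent from an already
    reconstructed state space (rows = time samples, columns = state dims),
    in the spirit of Rosenstein et al. (1993).
    """
    n, _ = state.shape
    horizon = int(horizon_s * fs)
    usable = n - horizon

    # Nearest neighbour of each point, excluding temporally close points.
    nn = np.empty(usable, dtype=int)
    for i in range(usable):
        d = np.linalg.norm(state[:usable] - state[i], axis=1)
        d[max(0, i - theiler):i + theiler + 1] = np.inf   # Theiler window
        nn[i] = int(np.argmin(d))

    # Average log divergence of the neighbour pairs as time advances.
    log_div = np.zeros(horizon)
    for k in range(horizon):
        sep = np.linalg.norm(state[np.arange(usable) + k] - state[nn + k], axis=1)
        log_div[k] = np.mean(np.log(sep + 1e-12))

    # Slope of the divergence curve = exponent per sample; rescale by fs or
    # by stride time to report per second or per stride, as studies usually do.
    slope = np.polyfit(np.arange(horizon), log_div, 1)[0]
    return slope
```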

    Atrial Fibrillation Detection from Wrist Photoplethysmography Signals Using Smartwatches

    Detection of atrial fibrillation (AF) from a wrist-worn photoplethysmogram (PPG) signal is important because the smartwatch form factor enables long-term continuous monitoring of arrhythmia in an easy and non-invasive manner. We have developed a novel method not only to detect AF from a smartwatch PPG signal, but also to determine whether the recorded PPG signal is corrupted by motion artifacts. We detect motion and noise artifacts based on the accelerometer signal and on a time-frequency analysis of the PPG signal using variable frequency complex demodulation. We then use the root mean square of successive differences and the sample entropy, calculated from the beat-to-beat intervals of the PPG signal, to distinguish AF from normal rhythm. Finally, we apply a premature atrial contraction detection algorithm to make AF identification more accurate and to reduce false alarms. Two separate datasets have been used in this study to test the efficacy of the proposed method, which shows a combined sensitivity, specificity, and accuracy of 98.18%, 97.43%, and 97.54%, respectively, across the datasets.
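    The rhythm-classification step described above reduces, in essence, to thresholding two irregularity statistics computed on the beat-to-beat intervals. The sketch below gives straightforward implementations of both statistics and a toy decision rule; the thresholds are illustrative placeholders, and the motion-artifact screening and premature-atrial-contraction correction stages are omitted.

```python
import numpy as np

def rmssd(ibi):
    """Root mean square of successive differences of beat-to-beat intervals (s)."""
    d = np.diff(ibi)
    return np.sqrt(np.mean(d ** 2))

def sample_entropy(x, m=2, r_factor=0.2):
    """Straightforward (unoptimized) sample entropy with r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * np.std(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= r) - 1          # exclude the self-match
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def looks_like_af(ibi, rmssd_thresh=0.1, sampen_thresh=1.0):
    """Flag a clean segment of beat-to-beat intervals as AF-like.
    Thresholds are illustrative placeholders, not the paper's values."""
    return rmssd(ibi) > rmssd_thresh and sample_entropy(ibi) > sampen_thresh
```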