
    A Distributed Approach to Interference Alignment in OFDM-based Two-tiered Networks

    Full text link
    In this contribution, we consider a two-tiered network and focus on the coexistence between the two tiers at the physical layer. We target our efforts on a Long Term Evolution-Advanced (LTE-A) orthogonal frequency division multiple access (OFDMA) macro-cell sharing the spectrum with a randomly deployed second tier of small-cells. In such networks, high levels of co-channel interference between the macro and small base stations (MBSs/SBSs) may largely limit the potential spectral efficiency gains provided by a frequency reuse factor of 1. To address this issue, we propose a novel cognitive interference alignment-based scheme to protect the macro-cell from cross-tier interference while mitigating co-tier interference in the second tier. Remarkably, only local channel state information (CSI) and autonomous operations are required in the second tier, resulting in a completely self-organizing approach for the SBSs. The optimal precoder that maximizes the spectral efficiency of the link between each SBS and its served user equipment is found by means of a distributed one-shot strategy. Numerical findings reveal non-negligible spectral efficiency enhancements with respect to traditional time division multiple access approaches at any signal-to-noise ratio (SNR) regime. Additionally, the proposed technique exhibits significant robustness to channel estimation errors, achieving remarkable results in the imperfect CSI case and yielding consistent performance enhancements to the network.
    Comment: 15 pages, 10 figures, accepted and to appear in IEEE Transactions on Vehicular Technology, Special Section on Self-Organizing Radio Networks, 2013. Authors' final version. Copyright transferred to IEEE.
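
    As a rough illustration of the kind of design problem the abstract describes, a spectral-efficiency-maximizing precoder with a cross-tier interference-nulling constraint can be written as below; the notation (H, G, F, P) is ours and is not taken from the paper.

```latex
\max_{\mathbf{F}} \; \log_2 \det\!\left( \mathbf{I} + \frac{1}{\sigma^2}\,
      \mathbf{H}\,\mathbf{F}\mathbf{F}^{\mathsf{H}}\,\mathbf{H}^{\mathsf{H}} \right)
\quad \text{s.t.} \quad
\mathbf{G}\,\mathbf{F} = \mathbf{0}, \qquad
\operatorname{tr}\!\left(\mathbf{F}\mathbf{F}^{\mathsf{H}}\right) \le P
```

    Here H is the channel from an SBS to its served user equipment, G the cross-tier channel from that SBS towards the protected macro receiver, F the linear precoder, and P the per-SBS power budget; the "distributed one-shot strategy" in the abstract solves a problem of this flavor using only local CSI.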

    Long-Range Communications in Unlicensed Bands: the Rising Stars in the IoT and Smart City Scenarios

    Full text link
    Connectivity is probably the most basic building block of the Internet of Things (IoT) paradigm. Up to now, the two main approaches to provide data access to the "things" have been based either on multi-hop mesh networks using short-range communication technologies in the unlicensed spectrum, or on long-range, legacy cellular technologies, mainly 2G/GSM, operating in the corresponding licensed frequency bands. Recently, these reference models have been challenged by a new type of wireless connectivity, characterized by low-rate, long-range transmission technologies in the unlicensed sub-GHz frequency bands, used to realize access networks with star topology that are referred to as Low-Power Wide Area Networks (LPWANs). In this paper, we introduce this new approach to providing connectivity in the IoT scenario, discussing its advantages over the established paradigms in terms of efficiency, effectiveness, and architectural design, in particular for typical Smart City applications.

    Time-varying Clock Offset Estimation in Two-way Timing Message Exchange in Wireless Sensor Networks Using Factor Graphs

    Full text link
    The problem of clock offset estimation in a two-way timing exchange regime is considered when the likelihood function of the observation time stamps is exponentially distributed. In order to capture the imperfections in node oscillators, which render a time-varying nature to the clock offset, a novel Bayesian approach to clock offset estimation is proposed using a factor graph representation of the posterior density. Message passing using the max-product algorithm yields a closed-form expression for the Bayesian inference problem.
    Comment: 4 pages, 2 figures, ICASSP 201
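
    For intuition, the static-offset special case of this setup admits a simple closed-form estimate under exponentially distributed link delays: with forward differences U_j = T2,j - T1,j and reverse differences V_j = T4,j - T3,j from the two-way exchange, the offset estimate is (min U - min V)/2. The sketch below simulates only this simplified static case; the paper itself addresses the harder time-varying offset via a factor graph and max-product message passing, which is not reproduced here, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simplified two-way timing exchange model: static offset, exponential delays.
N = 50          # number of two-way exchanges (assumption)
d = 1.0         # fixed propagation/processing delay, arbitrary units (assumption)
theta = 0.3     # true clock offset of node B relative to node A (assumption)
lam = 2.0       # rate of the exponential random delays (assumption)

X = rng.exponential(1.0 / lam, N)   # random delay A -> B
Y = rng.exponential(1.0 / lam, N)   # random delay B -> A

U = d + theta + X                   # U_j = T2,j - T1,j (forward timestamp differences)
V = d - theta + Y                   # V_j = T4,j - T3,j (reverse timestamp differences)

# Closed-form offset estimate for the static, exponential-delay case
theta_hat = (U.min() - V.min()) / 2.0
print(f"true offset = {theta:.3f}, estimate = {theta_hat:.3f}")
```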

    Engineering human microbiota for disease prevention and therapy

    Get PDF

    Platforms and Protocols for the Internet of Things

    Get PDF
    Building a general architecture for the Internet of Things (IoT) is a very complex task, exacerbated by the extremely large variety of devices, link layer technologies, and services that may be involved in such a system. In this paper, we identify the main blocks of a generic IoT architecture, describing their features and requirements, and analyze the most common approaches proposed in the literature for each block. In particular, we compare three of the most important communication technologies for IoT purposes, i.e., REST, MQTT, and AMQP, and we also analyze three IoT platforms: openHAB, Sentilo, and Parse. The analysis proves the importance of adopting an integrated approach that jointly addresses several issues and is able to flexibly accommodate the requirements of the various elements of the system. We also discuss a use case that illustrates the design challenges and the choices to make when selecting which protocols and technologies to use.
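
    As a concrete taste of one of the compared protocols, the sketch below publishes a single MQTT message with the paho-mqtt client; the broker hostname, topic, and payload are placeholders of our own, and TLS, authentication, and error handling are omitted.

```python
# Minimal MQTT publish example (illustrative only; broker/topic are placeholders).
# Requires the paho-mqtt package: pip install paho-mqtt
import json
from paho.mqtt import publish

payload = json.dumps({"sensor": "temp-01", "value": 21.4})  # hypothetical reading

publish.single(
    topic="city/district1/temperature",   # placeholder topic hierarchy
    payload=payload,
    qos=1,                                # at-least-once delivery
    hostname="broker.example.org",        # placeholder broker address
    port=1883,
)
```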

    Cognitive Interference Alignment for OFDM Two-tiered Networks

    Full text link
    In this contribution, we introduce an interference alignment scheme that allows the coexistence of an orthogonal frequency division multiplexing (OFDM) macro-cell and a cognitive small-cell, deployed in a two-tiered structure and transmitting over the same bandwidth. We derive the optimal linear strategy for the single-antenna secondary base station, maximizing the spectral efficiency of the opportunistic link and accounting for both the signal sub-space structure and the power loading strategy. Our analytical and numerical findings prove that the proposed precoder structure is optimal for the considered scenario for Rayleigh and exponentially decaying channels.
    Comment: 5 pages, 4 figures. Accepted and presented at the IEEE 13th International Workshop on Signal Processing Advances in Wireless Communications (SPAWC), 2012. Authors' final version. Copyright transferred to IEEE.
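
    For background on the "power loading strategy" mentioned above: in the classical single-link case, power is spread across the effective channel eigenmodes by water-filling. The sketch below is a generic water-filling routine given only as context; it is not the paper's specific precoder derivation, and the example gains are made-up values.

```python
import numpy as np

def waterfilling(gains, power, tol=1e-9):
    """Allocate `power` across parallel channels with gains `gains` (|h_i|^2 / sigma^2).

    Classical water-filling: p_i = max(0, mu - 1/g_i), with the water level mu
    found by bisection so that sum(p_i) == power.
    """
    inv = 1.0 / np.asarray(gains, dtype=float)
    lo, hi = inv.min(), inv.max() + power      # bracket for the water level
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - inv, 0.0).sum() > power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - inv, 0.0)

# Example: four eigenmodes with decaying gains (illustrative values)
p = waterfilling([2.0, 1.0, 0.5, 0.1], power=1.0)
print(p, p.sum())
```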

    Cognitive Orthogonal Precoder for Two-tiered Networks Deployment

    Full text link
    In this work, the problem of cross-tier interference in a two-tiered (macro-cell and cognitive small-cells) network, under the complete spectrum sharing paradigm, is studied. A new orthogonal precoding transmit scheme for the small base stations, called multi-user Vandermonde-subspace frequency division multiplexing (MU-VFDM), is proposed. MU-VFDM allows several cognitive small base stations to coexist with legacy macro-cell receivers by nulling the small- to macro-cell cross-tier interference, without any cooperation between the two tiers. This cleverly designed cascaded precoder structure not only cancels the cross-tier interference but also avoids co-tier interference within the small-cell network. The achievable sum rate of the small-cell network, satisfying the interference cancelation requirements, is evaluated for perfect and imperfect channel state information at the transmitter. Simulation results for the cascaded MU-VFDM precoder show performance comparable to that of the state-of-the-art dirty paper coding technique for a dense cellular layout. Finally, a comparison between MU-VFDM and a standard complete spectrum separation strategy is presented. Promising gains in terms of achievable sum rate are shown for the two-tiered network with respect to the traditional bandwidth management approach.
    Comment: 11 pages, 9 figures, accepted and to appear in IEEE Journal on Selected Areas in Communications: Cognitive Radio Series, 2013. Copyright transferred to IEEE.
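
    MU-VFDM builds its interference-nulling precoder from a Vandermonde structure induced by the OFDM cyclic prefix; that construction is not reproduced here. As a generic illustration of the underlying idea of transmitting only in the null space of the cross-tier channel, a simple SVD-based sketch with made-up dimensions:

```python
import numpy as np

rng = np.random.default_rng(1)

Nt = 8   # transmit dimensions at the small base station (assumption)
Nm = 4   # receive dimensions at the protected macro user (assumption)

# Cross-tier channel from the small cell towards the macro receiver
G = (rng.standard_normal((Nm, Nt)) + 1j * rng.standard_normal((Nm, Nt))) / np.sqrt(2)

# Orthonormal basis of the null space of G via the SVD:
# the columns of V beyond rank(G) span {x : G x = 0}
_, s, Vh = np.linalg.svd(G)
rank = int(np.sum(s > 1e-10))
F = Vh.conj().T[:, rank:]          # precoder: Nt x (Nt - rank) null-space basis

# Any signal precoded by F causes (numerically) zero cross-tier interference
print("||G F|| =", np.linalg.norm(G @ F))                  # ~1e-15
print("dimensions left for the small cell:", F.shape[1])
```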

    LoRa scalability : a simulation model based on interference measurements

    Get PDF
    LoRa is a long-range, low-power, low-bit-rate, single-hop wireless communication technology. It is intended to be used in Internet of Things (IoT) applications involving battery-powered devices with low throughput requirements. A LoRaWAN network consists of multiple end nodes that communicate with one or more gateways, which act as transparent bridges towards a common network server. The number of end devices and their throughput requirements have an impact on the performance of a LoRaWAN network. This study investigates the scalability, in terms of the number of end devices per gateway, of single-gateway LoRaWAN deployments. First, we determine the intra-technology interference behavior with two physical end nodes, by checking the impact of an interfering node on a transmitting node. Measurements show that even under concurrent transmission, one of the packets can be received under certain conditions. Based on these measurements, we create a simulation model for assessing the scalability of a single-gateway LoRaWAN network. We show that when the number of nodes increases to 1000 per gateway, losses reach up to 32%; in the same scenario, pure ALOHA would suffer around 90% losses. However, when the duty cycle of the application layer is lower than the allowed radio duty cycle of 1%, losses are even lower. We also show network scalability simulation results for some IoT use cases based on real data.
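
    To put the pure ALOHA comparison into formulas: under the textbook pure ALOHA model, a transmission succeeds with probability exp(-2G), where G is the aggregate offered load in packets per packet duration. The sketch below evaluates this bound for a few node counts and an assumed per-node duty cycle; it does not reproduce the paper's simulation, which additionally models LoRa capture effects and spreading-factor quasi-orthogonality.

```python
import numpy as np

def pure_aloha_loss(num_nodes, duty_cycle):
    """Textbook pure ALOHA loss: aggregate load G = num_nodes * duty_cycle
    (each node keeps the channel busy for a fraction `duty_cycle` of the time),
    success probability exp(-2G)."""
    G = num_nodes * duty_cycle
    return 1.0 - np.exp(-2.0 * G)

for n in (100, 500, 1000):
    # 0.1% application duty cycle per node: an illustrative value, well below
    # the 1% regulatory radio duty cycle mentioned in the abstract
    print(n, f"{pure_aloha_loss(n, 0.001):.2%}")
```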

    Extending the LoRa modulation to add further parallel channels and improve the LoRaWAN network performance

    Full text link
    In this paper we present a new modulation, called DLoRa, similar in principle to the conventional LoRa modulation and compatible with it in terms of bandwidth and numerology. DLoRa departs from conventional LoRa in that its chirps use a decreasing instantaneous frequency instead of an increasing one. Furthermore, we describe a software environment to accurately evaluate the "isolation" of the different virtual channels created by both LoRa and DLoRa when using different spreading factors. Our results agree with those reported in the literature for the conventional LoRa modulation and show that it is possible to double the number of channels by using LoRa and DLoRa simultaneously. The doubled number of available subchannels is the key to improving the network-level performance of LoRa-based networks.
    Comment: This work was submitted on Feb. 1, 2020 to the European Wireless 2020 conference for possible presentation and subsequent publication by the IEEE.
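
    A minimal way to see the up-chirp/down-chirp distinction is to generate both symbol types and dechirp them. The sketch below uses a simplified discrete-time chirp model, not a bit-exact LoRa waveform and not the authors' DLoRa implementation: an up-chirp's instantaneous frequency increases by one bin per chip and wraps, while the down-chirp's decreases.

```python
import numpy as np

SF = 7                     # spreading factor (typical LoRa value, assumed here)
N = 2 ** SF                # chips (and frequency bins) per symbol
n = np.arange(N)

def chirp(sym, up=True):
    """Simplified LoRa-style chirp for symbol value `sym` in [0, N).

    up=True  : instantaneous frequency increases by one bin per chip (LoRa-like)
    up=False : instantaneous frequency decreases (DLoRa-like down-chirp)
    """
    f = (sym + n) % N                      # instantaneous frequency in bins
    if not up:
        f = (N - f) % N
    phase = 2 * np.pi * np.cumsum(f) / N   # integrate frequency to get phase
    return np.exp(1j * phase)

def demod(rx, up=True):
    """Dechirp with the matching base chirp (symbol 0), FFT, and pick the peak."""
    dechirped = rx * np.conj(chirp(0, up=up))
    peak = int(np.argmax(np.abs(np.fft.fft(dechirped))))
    return peak if up else (N - peak) % N  # down-chirps produce a mirrored tone

sym = 42
print(demod(chirp(sym, up=True),  up=True))    # -> 42 (LoRa-like up-chirp)
print(demod(chirp(sym, up=False), up=False))   # -> 42 (DLoRa-like down-chirp)
```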