
    Distributed probabilistic-data-association-based soft reception employing base station cooperation in MIMO-aided multiuser multicell systems

    Intercell cochannel interference (CCI) mitigation is investigated in the context of cellular systems relying on dense frequency reuse (FR). A distributed base-station (BS)-cooperation-aided soft reception scheme using the probabilistic data association (PDA) algorithm and soft combining (SC) is proposed for the uplink of multiuser multicell MIMO systems. The realistic 19-cell hexagonal cellular model relying on unity FR is considered, where both the BSs and the mobile stations (MSs) are equipped with multiple antennas. Local-cooperation-based message passing is used instead of a global message passing chain, for the sake of reducing the backhaul traffic. The PDA algorithm is employed as a low-complexity solution for producing soft information, which facilitates the employment of SC at the individual BSs to generate the final soft decision metric. Our simulations and analysis demonstrate that, despite its low additional complexity and backhaul traffic, the proposed distributed PDA-aided SC (DPDA-SC) reception scheme significantly outperforms the conventional noncooperative benchmarkers. Furthermore, since only the index of the possible discrete value of the quantized converged soft information has to be exchanged for SC in practice, the proposed DPDA-SC scheme is relatively robust to quantization errors in the exchanged soft information. As a beneficial result, the backhaul traffic is dramatically reduced at the cost of negligible performance degradation
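
    As a minimal illustration of the soft-combining step described above (a hedged sketch, not the authors' exact DPDA-SC scheme), the snippet below assumes each cooperating BS has already produced per-bit log-likelihood ratios (LLRs) with a PDA-style detector, quantizes them to a small index set before exchange, and lets the serving BS sum the de-quantized values to form the final soft decision metric. The 16-level quantizer and all names (LEVELS, quantize, soft_combine) are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of soft combining (SC) with quantized soft-information exchange,
# not the authors' exact DPDA-SC scheme. Each cooperating BS exchanges only the
# index of a quantized LLR level; the serving BS sums the de-quantized LLRs and
# slices the sign of the combined metric for the hard decision.

LEVELS = np.linspace(-8.0, 8.0, 16)           # assumed 16-level (4-bit) LLR quantizer

def quantize(llr):
    """Index of the nearest quantizer level; only this index crosses the backhaul."""
    return int(np.argmin(np.abs(LEVELS - llr)))

def soft_combine(llr_local, exchanged_indices):
    """Add the local LLR to the de-quantized LLRs received from cooperating BSs."""
    return llr_local + sum(LEVELS[i] for i in exchanged_indices)

# Toy example: three BSs observe the same coded bit with different reliabilities.
llrs = [2.1, -0.4, 3.7]                        # per-BS PDA soft outputs for one bit
indices = [quantize(l) for l in llrs[1:]]      # serving BS keeps llrs[0] locally
combined = soft_combine(llrs[0], indices)
bit_hat = 0 if combined > 0 else 1             # final hard decision
print(indices, round(combined, 2), bit_hat)
```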

    Data-precoded algorithm for multiple-relay-assisted systems

    A data-precoded relay-assisted (RA) scheme is proposed for a system cooperating with multiple relay nodes (RNs), each equipped with either a single antenna or a two-antenna array. Because of the half-duplex constraint at the relays, classical RA systems using distributed space-time/frequency coding algorithms require a higher-order constellation than a continuous-link transmission from the base station to the user terminal, which implies a penalty in power efficiency. The proposed precoding algorithm exploits the relation between QPSK and 4^L-QAM by alternately transmitting through L relays, achieving full diversity while significantly reducing the power penalty. The algorithm targets situations where a direct path (DP) is unavailable or of poor quality, and it is a promising solution for extending coverage or increasing system capacity. We present an analytical derivation of the gain obtained with the data-precoded algorithm over distributed space-frequency block code (SFBC) schemes. Furthermore, the pairwise error probability of the proposed algorithm is derived analytically and confirmed with numerical results. We evaluate the performance of the proposed scheme and compare it with the equivalent distributed SFBC scheme employing 16-QAM and with non-cooperative schemes, for several link-quality scenarios and scheme configurations, highlighting the advantages of the proposed scheme
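
    The QPSK-to-4^L-QAM relation mentioned above can be made concrete with a standard constellation decomposition: a square 4^L-QAM symbol can be written as a weighted superposition of L QPSK symbols, which is what allows each relay to forward a plain QPSK stream while the destination still observes a higher-order constellation. The check below illustrates this general relation (not the paper's precoder) for the L = 2 case, i.e. that 2*q1 + q2 spans the 16-QAM grid.

```python
from itertools import product

# Illustration: a square 4**L-QAM constellation as a weighted sum of L QPSK symbols.
# For L = 2, every point 2*q1 + q2 with q1, q2 drawn from the QPSK set {+-1 +- 1j}
# lands on the 16-QAM grid {+-1, +-3} x {+-1, +-3}, and all 16 points are distinct.

qpsk = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]

superposed = {2 * q1 + q2 for q1, q2 in product(qpsk, qpsk)}
qam16 = {re + 1j * im for re in (-3, -1, 1, 3) for im in (-3, -1, 1, 3)}

print(len(superposed))        # 16 distinct points
print(superposed == qam16)    # True: the superposition is exactly the 16-QAM grid
```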

    Decoupled signal detection for the uplink of massive MIMO in 5G heterogeneous networks

    Massive multiple-input multiple-output (MIMO) systems are strong candidates for future fifth-generation (5G) heterogeneous cellular networks. For 5G, network densification with a large number of different classes of users and data service requirements is expected. Such a large number of connected devices needs to be separated in order to allow detection of the transmitted signals according to the different data requirements. In this paper, a decoupled signal detection (DSD) technique, which allows the uplink signals of each user class to be separated at the base station (BS), is proposed for massive MIMO systems. A mathematical signal model for massive MIMO systems with centralized and distributed antennas in heterogeneous networks is also developed. The performance of the proposed algorithm is evaluated and compared with existing detection schemes in a realistic scenario with distributed antennas. A sum-rate analysis and a computational cost study for DSD are also presented. Simulation results show excellent performance of the proposed algorithm when combined with linear and successive interference cancellation detection techniques
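
    One common way to realize the class separation described above is to project the received vector onto the orthogonal complement of the interfering class's channel subspace and then run a per-class linear detector; the sketch below follows that idea purely as an assumed illustration (the paper's actual DSD construction may differ), detecting one user class with an MMSE filter after nulling the other class.

```python
import numpy as np

# Hedged sketch, not the paper's exact DSD algorithm: decouple two user classes at the
# BS by projecting the received vector onto the orthogonal complement of the other
# class's channel subspace, then detect the class of interest with an MMSE filter.

rng = np.random.default_rng(0)
M, K1, K2 = 64, 4, 4                       # BS antennas, users in class 1 and class 2
H1 = rng.normal(size=(M, K1)) + 1j * rng.normal(size=(M, K1))
H2 = rng.normal(size=(M, K2)) + 1j * rng.normal(size=(M, K2))
x1 = np.sign(rng.normal(size=K1)) + 1j * np.sign(rng.normal(size=K1))   # QPSK symbols
x2 = np.sign(rng.normal(size=K2)) + 1j * np.sign(rng.normal(size=K2))
noise = 0.1 * (rng.normal(size=M) + 1j * rng.normal(size=M))            # variance 0.02
y = H1 @ x1 + H2 @ x2 + noise

# Projector onto the orthogonal complement of class 2's channel subspace.
P = np.eye(M) - H2 @ np.linalg.pinv(H2)
y1, G1 = P @ y, P @ H1

# MMSE detection of class-1 users on the decoupled observation.
sigma2 = 0.02
x1_hat = np.linalg.solve(G1.conj().T @ G1 + sigma2 * np.eye(K1), G1.conj().T @ y1)
print(np.sign(x1_hat.real) + 1j * np.sign(x1_hat.imag))                 # compare with x1
```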

    A Base-Station Frequency Band Sharing Using Coalition Process

    A new radio resource management (RRM) technique for enhancing the downlink performance of soft-frequency-reuse (SFR) based Long Term Evolution (LTE) systems is presented. In this RRM scheme, resources are dynamically allocated in distributed and centralized manners so that spectral efficiency is maximized across the entire network. To do this, a distinctive interference mapping technique is used to help determine whether the distributed or the centralized mode is appropriate for each base station. When a distributed approach is granted to a base station, it can use the whole spectrum, whereas when a centralized approach is imposed on a base station, it is only allotted a subset of the spectrum. The proposed scheme then utilizes the coalition concept in the sense that, once the allocation approach is determined, the individual base stations take control of the allocated resources. When coupled with proportional fairness scheduling, this RRM can also take advantage of multiuser diversity. Therefore, in order to implement the SFR approach effectively in LTE heterogeneous cellular networks (HetNets), all base stations have adaptive interference avoidance capability [5]. It is shown through mathematical analysis and computer simulations that this method offers significant improvements in terms of sum rate and quality of service by increasing the guaranteed data rate per user
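
    The proportional fairness scheduling that the RRM scheme is coupled with follows a well-known rule: in each transmission interval, serve the user whose instantaneous achievable rate is largest relative to its own average throughput, which is how multiuser diversity is exploited. The sketch below is a generic illustration of that rule with made-up rates, not the paper's implementation.

```python
import numpy as np

# Generic proportional-fair (PF) scheduler sketch with synthetic per-user rates.
# Each TTI the BS serves the user maximizing inst_rate / avg_tput, then updates the
# exponentially averaged throughput of every user.

rng = np.random.default_rng(1)
mean_rate = np.array([0.5, 1.0, 1.5, 2.0])     # per-user average channel quality
n_tti, beta = 1000, 0.05                       # beta ~ 1 / averaging window
avg_tput = np.full(mean_rate.size, 1e-3)       # small start value avoids division by zero

for _ in range(n_tti):
    inst_rate = rng.rayleigh(scale=mean_rate)  # fading: instantaneous achievable rates
    k = int(np.argmax(inst_rate / avg_tput))   # PF metric picks the served user
    served = np.zeros(mean_rate.size)
    served[k] = inst_rate[k]
    avg_tput = (1 - beta) * avg_tput + beta * served

print(np.round(avg_tput, 3))   # long-run throughputs balance fairness and channel quality
```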

    Improving hydrologic modeling of runoff processes using data-driven models

    Accurate rainfall–runoff simulation is essential for responding to natural disasters, such as floods and droughts, and for proper water resources management in a wide variety of fields, including hydrology, agriculture, and environmental studies. A hydrologic model aims to analyze the nonlinear and complex relationship between rainfall and runoff based on empirical equations and multiple parameters. To obtain reliable runoff simulations, three tasks must be considered: reasonably diagnosing the modeling performance, managing the uncertainties in the modeling outcome, and simulating runoff under various conditions. Recently, with advances in computing systems, technology, resources, and information, data-driven models have become widely used in fields such as language translation, image classification, and time-series analysis. In addition, as the spatial and temporal resolutions of observations improve, the applicability of data-driven models, which require massive amounts of data, is rapidly increasing. In hydrology, rainfall–runoff simulation requires various datasets, including meteorological, topographical, and soil properties, at multiple time steps from sub-hourly to monthly. This research investigates whether data-driven approaches can be effectively applied to runoff analysis; in particular, it explores whether data-driven models can 1) reasonably evaluate hydrologic models, 2) improve the modeling performance, and 3) predict hourly runoff using distributed forcing datasets. The details of these three research aspects are as follows.
    First, this research developed a hydrologic assessment tool using a hybrid framework, which combines two data-driven models, to evaluate the performance of a hydrologic model for runoff simulation. The National Water Model, a fully distributed hydrologic model, was used as the physics-based model. The developed assessment tool provides easy-to-understand performance ratings for the simulated hydrograph components, namely the rising and recession limbs, as well as for the entire hydrograph, against observed runoff data; four performance ratings were used. This is the first study to apply data-driven models to evaluate the performance of the National Water Model, and the results are expected to reasonably diagnose the model's ability to simulate runoff at short time steps.
    Second, correction of errors inherent in the predicted runoff is essential for efficient water management. Hydrologic models include various parameters that cannot be measured directly but can be adjusted to improve predictive performance; however, even a calibrated model still shows clear errors in predicting runoff. In this research, a data-driven model was applied as a post-processor to correct errors in the runoff predicted by the National Water Model and improve its predictive performance, using historic errors in runoff to predict new errors. This research shows that data-driven models, which can build algorithms based on the relationships between datasets, have strong potential for correcting errors and improving the predictive performance of hydrologic models.
    Finally, to simulate rainfall–runoff accurately, it is essential to consider various factors such as precipitation, soil properties, and runoff coming from upstream regions. With improvements in observation systems and resources, various types of forcing datasets, including remote-sensing-based data and data-assimilation-system products, are available for hydrologic analysis. In this research, various data-driven models with distributed forcing datasets were applied to perform hourly runoff predictions. The forcing datasets included different hydrologic factors, such as soil moisture, precipitation, land surface temperature, and base flow, obtained from a data assimilation system. The predicted results were evaluated in terms of seasonal and event-based performance and compared with those of the National Water Model. The results demonstrate that data-driven models for hourly runoff forecasting are effective and useful for short-term runoff prediction and for developing flood warning systems during the wet season
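
    The error-correction post-processor described in the second aspect can be summarized in a few lines: fit a data-driven model on historic runoff errors (observed minus simulated) and add the predicted error back onto the hydrologic model's output. The sketch below uses synthetic data and a simple lagged linear model as a stand-in for the data-driven model; the thesis's actual model choice and predictors may differ.

```python
import numpy as np

# Hedged sketch of data-driven error correction as a post-processor: learn the
# structure in historic errors and remove it from the simulated runoff.

rng = np.random.default_rng(0)
t = np.arange(2000.0)
observed = 5 + 3 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.3, t.size)    # synthetic runoff
simulated = 5 + 3 * np.sin(2 * np.pi * t / 365) - 0.8 * np.sin(2 * np.pi * t / 30) - 0.5
error = observed - simulated                       # historic errors used for training

# Features: the three previous errors predict the current error (simple autoregression).
lags = 3
cols = [error[i:i + len(error) - lags] for i in range(lags)]
X = np.column_stack(cols + [np.ones(len(error) - lags)])
y = error[lags:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

corrected = simulated[lags:] + X @ coef            # post-processed runoff
print("RMSE before:", round(float(np.sqrt(np.mean(error[lags:] ** 2))), 3))
print("RMSE after :", round(float(np.sqrt(np.mean((observed[lags:] - corrected) ** 2))), 3))
```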

    An Optimized Multi-Layer Resource Management in Mobile Edge Computing Networks: A Joint Computation Offloading and Caching Solution

    Nowadays, data caching is being adopted at an exponential rate as a high-speed data storage layer in mobile edge computing networks employing flow control methodologies. This study shows how to discover the best architecture for backhaul networks with caching capability using a distributed offloading technique. A continuous power flow analysis is used to obtain the optimum load constraints, wherein the power of macro base stations with various caching capacities is supplied either by an intelligent grid network or by renewable energy systems. This work proposes ubiquitous connectivity for users at the cell edge and offloading of the macro cells, so as to provide capabilities the macro cell itself cannot cope with, such as extreme changes in the required user data rate and energy efficiency. The offloading framework is then reformulated as a neural weighted framework that considers the convergence and Lyapunov stability requirements of mobile edge computing under Karush-Kuhn-Tucker (KKT) optimization constraints in order to obtain accurate solutions. Cell-layer performance is analyzed both at the boundary and at the center of the cells. The analytical and simulation results show that the suggested method outperforms other energy-saving techniques. Moreover, compared to other solutions studied in the literature, the proposed approach shows a two- to three-fold increase in both the throughput of the cell-edge users and the aggregate throughput per cluster
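
    As a small, loosely related illustration of the closed-form structure that Karush-Kuhn-Tucker conditions give in convex resource-allocation problems of this kind (an assumed textbook example, not the paper's formulation), the sketch below solves a sum-rate power allocation by water-filling, which is exactly the KKT solution of that problem.

```python
import numpy as np

# Water-filling: the KKT solution of  max sum_i log(1 + g_i * p_i)
# subject to  sum_i p_i <= P_total, p_i >= 0.
# Stationarity gives p_i = max(0, mu - 1/g_i); the water level mu is found by
# bisection so that the total-power constraint is met with equality.

def water_fill(gains, p_total, iters=60):
    lo, hi = 0.0, p_total + 1.0 / float(np.min(gains))   # bracket for the water level
    for _ in range(iters):
        mu = 0.5 * (lo + hi)
        if np.maximum(0.0, mu - 1.0 / gains).sum() > p_total:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.0, lo - 1.0 / gains)

gains = np.array([0.2, 1.0, 3.0, 8.0])            # per-channel SNR gains
p = water_fill(gains, p_total=4.0)
print(np.round(p, 3), round(float(p.sum()), 3))   # weak channels may get zero power
```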

    New Method of Measuring TCP Performance of IP Network using Bio-computing

    The performance of an Internet Protocol (IP) network can be measured via the Transmission Control Protocol (TCP), because TCP guarantees that data sent from one end of a connection actually reaches the other end in the same order it was sent, and otherwise reports an error. Several methods have been used to measure TCP performance, among them genetic algorithms, neural networks, and data mining; all of these have weaknesses and fail to measure TCP performance accurately. This paper proposes a new method of measuring TCP performance for a real-time IP network using Bio-computing, in particular molecular calculation, because it provides reliable results and can exploit the facilities of phylogenetic analysis. The new method is applied in real time to the Biological Kurdish Messenger (BIOKM) model, which is designed to measure TCP performance for two protocols: the File Transfer Protocol (FTP) and the Internet Relay Chat Daemon (IRCD). The resulting TCP performance is very close to that obtained from Little's law using the same model (BIOKM); i.e., the difference in the utilization (busy or traffic intensity) and idle-time percentages between the new Bio-computing-based method and Little's law is (nearly) 0.13%
    KEYWORDS: Bio-computing, TCP performance, Phylogenetic tree, Hybridized Model (Normalized), FTP, IRCD
    Comment: 17 pages, 10 figures, 5 tables
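
    The comparison against Little's law mentioned above is easy to make concrete: for a single TCP link modeled as an M/M/1 queue, utilization is rho = lambda/mu, the idle fraction is 1 - rho, and Little's law L = lambda * W ties the average number of in-flight segments to the arrival rate and the mean delay. The snippet below is a worked example with made-up rates, not the BIOKM measurements.

```python
# Worked example of the queueing quantities compared in the abstract
# (illustrative numbers only, not the BIOKM measurements).

arrival_rate = 80.0      # lambda: TCP segments arriving per second
service_rate = 100.0     # mu: segments the link can serve per second

utilization = arrival_rate / service_rate          # rho = lambda / mu (busy fraction)
idle_fraction = 1.0 - utilization                  # fraction of time the link is idle
mean_delay = 1.0 / (service_rate - arrival_rate)   # M/M/1 mean sojourn time W
in_flight = arrival_rate * mean_delay              # Little's law: L = lambda * W

print(f"utilization={utilization:.1%}, idle={idle_fraction:.1%}")
print(f"W={mean_delay * 1e3:.1f} ms, L={in_flight:.1f} segments")
```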