
    Wireless network power optimization using relay stations blossoming and withering technique

    Power consumption of wireless networks is increasing as demand for wireless data rates escalates in modern life. Base stations are the major power-consuming component in a wireless network, so the main challenge is to reduce the total power consumed while maintaining network coverage and capacity. In this paper, a new relay-switching perspective is introduced for the relay blossoming and withering algorithm. First, relay switching is considered as a function of time representing the rate of active relays. The effect of the rate of active relays, the arrival rate and the average load factor of relays on the total network power consumption is modelled. It is found that the rate-of-active-relays function that optimises the network power consumption obeys a linear first-order ordinary differential equation. The effect of different synthesised arrival-rate profiles on the rate of active relays is presented. Moreover, a relative relay-to-base-station capacity parameter is defined, and its effect on the power optimisation is investigated. Based on the solutions of the ordinary differential equation, an approximate fuzzy-based relay sleeping mode is introduced. The fuzzy-logic sleeping mode uses the arrival rate and its derivative as inputs. The solution of the differential equations shows that power savings of up to 45 and 30 per cent can be achieved in the sleeping and idling modes, respectively, compared with 42 per cent achieved by the fuzzy sleeping mode. A steeper arrival-rate slope results in less power saving.
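    The abstract states that the optimal rate of active relays obeys a linear first-order ordinary differential equation driven by the arrival rate, but gives no coefficients. A minimal sketch, assuming the hypothetical form dr/dt = λ(t) − μ·r(t) with an assumed relay decay constant μ, integrated with forward Euler:

```python
import math

def simulate_active_relays(lam, mu, r0, t_end, dt=0.01):
    """Forward-Euler integration of the assumed ODE dr/dt = lam(t) - mu * r(t),
    where r(t) is the rate of active relays and lam(t) the traffic arrival rate.
    The ODE form and coefficients are illustrative, not taken from the paper."""
    r, t = r0, 0.0
    history = [(t, r)]
    while t < t_end:
        r += dt * (lam(t) - mu * r)
        t += dt
        history.append((t, r))
    return history

# Synthesised sinusoidal arrival-rate profile (assumed, for illustration only)
profile = lambda t: 5.0 + 3.0 * math.sin(t)
trajectory = simulate_active_relays(profile, mu=1.0, r0=0.0, t_end=20.0)
```

    For a constant arrival rate the solution relaxes toward λ/μ with time constant 1/μ, which is the qualitative behaviour one would expect of such a first-order relay-activation law.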

    Rain attenuation and worst month statistics verification and modeling for 5G radio link system at 26 GHz in Malaysia

    The explosive daily dependence on wireless communication services necessitates research to establish ultrawideband communication systems with ultrahigh bit-rate transmission capabilities. The advent of fifth-generation (5G) microwave links transmitting in the millimeter-wave (mm-wave) frequency band is a promising technology to accommodate the escalating demand for wireless services. In this frequency band, however, the behavior of the transmission channel and its climatic properties is a major concern. This is of particular importance in tropical regions, where the climate is mainly rainy with large raindrop sizes and high rainfall rates that may interact destructively with the propagating signal and cause total signal attenuation. The International Telecommunication Union (ITU) introduced a global rain attenuation model to characterize the effect of rain on the propagating signal over a wide band of frequencies. The validity of this model in tropical regions is still an open research question. In this paper, real measurements are conducted at Universiti Teknologi Malaysia (UTM), Johor Bahru, Malaysia, to investigate the impact of rain on the propagation of mm-waves at 26 GHz over a 5G microwave radio link. Rainfall rate and rain attenuation data sets are collected for one year at a sampling rate of one sample per minute. Both data sets are used to estimate signal propagation conditions in comparison with the ITU model prediction. From the presented results, it is found that at 0.01% of the time and a rainfall rate of about 120 mm/hr, the propagated signal experiences a loss of 26.2 dB per kilometer traveled. In addition, there is a significant deviation between the empirical estimates of the worst-month parameters and the ITU worst-month prediction. Similarly, the rainfall rate and rain attenuation estimated through the ITU model show a large deviation from the measurements. Furthermore, more accurate empirical worst-month parameters are proposed that yield more accurate worst-month rainfall and rain attenuation predictions than the ITU model.
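    The ITU rain attenuation model referenced above rests on the power-law relation γ = k·R^α (dB/km) between specific attenuation γ and rainfall rate R (ITU-R P.838). The sketch below uses illustrative coefficients k and α chosen for demonstration, not the tabulated 26 GHz values:

```python
def specific_attenuation(rain_rate_mm_hr, k, alpha):
    """ITU-R P.838 power law: specific attenuation gamma = k * R**alpha, in dB/km."""
    return k * rain_rate_mm_hr ** alpha

def path_attenuation(rain_rate_mm_hr, k, alpha, path_km):
    """Total rain loss over a link of path_km kilometers; for simplicity this
    omits the ITU effective-path-length reduction factor."""
    return specific_attenuation(rain_rate_mm_hr, k, alpha) * path_km

# Illustrative coefficients (assumed, NOT the official 26 GHz table values)
k, alpha = 0.17, 1.0
gamma = specific_attenuation(120.0, k, alpha)  # dB/km at 120 mm/hr
```

    For a real link budget, the frequency- and polarization-dependent k and α must be taken from the ITU-R P.838 tables, and the effective path length from the ITU-R P.530 procedure.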