11 research outputs found

    Synthetic LiFi channel model using generative adversarial networks

    In this paper, we present our research on a synthetic light fidelity (LiFi) channel model built on a deep learning architecture called generative adversarial networks (GANs). LiFi research that requires the generation of many multipath channel impulse responses (CIRs) can benefit from our proposed model. For example, future autonomous (deep learning-based) network management systems that use LiFi as one of their high-speed wireless access technologies might require a dataset of many CIRs. In this paper, we use TimeGAN, a GAN architecture for time-series data, and show that modifications are necessary to adopt TimeGAN in our use case. Consequently, synthetic CIRs generated by our model can track the long-term dependency of LiFi multipath CIRs. The Kullback–Leibler divergence (KLD) is used to measure the difference between samples of synthetic and real CIRs, and we show that this difference is small. Lastly, we also present a simple demonstration of our model running on a small virtual machine hosted over the Internet.
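    Below is a minimal sketch (not the authors' implementation) of how a KLD between samples of real and synthetic CIRs could be estimated from histograms; the binning, the placeholder CIR arrays, and the function name are illustrative assumptions.

```python
import numpy as np

def kld_from_samples(real, synthetic, bins=64, eps=1e-12):
    """Estimate KL(real || synthetic) from two 1-D sample sets via histograms."""
    lo = min(real.min(), synthetic.min())
    hi = max(real.max(), synthetic.max())
    p, edges = np.histogram(real, bins=bins, range=(lo, hi))
    q, _ = np.histogram(synthetic, bins=edges)
    p = p.astype(float) + eps          # avoid log(0) in empty bins
    q = q.astype(float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Toy usage: flatten CIR taps (e.g., shape [num_CIRs, num_taps]) into 1-D samples.
rng = np.random.default_rng(0)
real_cirs = rng.exponential(scale=1.00, size=(1000, 32))       # placeholder "real" CIRs
synthetic_cirs = rng.exponential(scale=1.05, size=(1000, 32))   # placeholder GAN output
print(kld_from_samples(real_cirs.ravel(), synthetic_cirs.ravel()))
```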

    Intelligent subflow steering in MPTCP-based hybrid Wi-Fi and LiFi networks using model-augmented DRL

    A hybrid Wi-Fi and light fidelity (LiFi) network combines the best of both worlds: the ubiquitous coverage of Wi-Fi and the high peak data rate of LiFi, since the radio spectrum does not interfere with the light spectrum. This hybrid network might be realized using multipath TCP (MPTCP), where the Wi-Fi and LiFi paths can be employed simultaneously to potentially boost the total throughput of Wi-Fi while increasing resilience towards LiFi link failures due to, for example, blockage. However, naively implementing MPTCP in a hybrid Wi-Fi and LiFi network can yield unexpected results, such as a lower throughput than single-path TCP caused by head-of-line delay during the slow-start phase of TCP congestion control. Even though this problem can be avoided by improving the existing flow control or congestion control of TCP, these solutions still lack the intelligent decision making that could improve the adaptability of MPTCP. Therefore, in this paper, we propose a model-augmented deep reinforcement learning (DRL) approach to intelligently steer MPTCP subflows (i.e., TCP connections), evaluated in a close-to-reality scenario emulated by considering random orientation, random blockage, and random mobility of Wi-Fi-and-LiFi-enabled mobile devices. We show that a performance gain can be achieved over the state-of-the-art while maintaining ease of integration with existing MPTCP implementations.
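    As a loose illustration of reinforcement-learning-based subflow steering, the toy sketch below uses tabular epsilon-greedy Q-learning to pick a subflow configuration; the state, action, and reward definitions are assumptions and are far simpler than the model-augmented DRL agent and emulated scenario described in the paper.

```python
import random
from collections import defaultdict

# Toy stand-in for a learning-based subflow steering agent: a tabular
# epsilon-greedy Q-learner that decides which subflow configuration to favour.
ACTIONS = ["wifi_only", "lifi_only", "both"]          # assumed action set
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

q_table = defaultdict(lambda: {a: 0.0 for a in ACTIONS})

def choose_action(state):
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(q_table[state], key=q_table[state].get)

def update(state, action, reward, next_state):
    best_next = max(q_table[next_state].values())
    q_table[state][action] += ALPHA * (reward + GAMMA * best_next - q_table[state][action])

def toy_env_step(state, action):
    # Hypothetical environment: reward is the throughput obtained with the
    # chosen configuration, which collapses for LiFi when the link is blocked.
    blocked = state == "lifi_blocked"
    throughput = {"wifi_only": 1.0,
                  "lifi_only": 0.1 if blocked else 3.0,
                  "both": 1.1 if blocked else 3.5}[action]
    next_state = random.choice(["lifi_ok", "lifi_blocked"])
    return throughput, next_state

state = "lifi_ok"
for _ in range(5000):
    action = choose_action(state)
    reward, next_state = toy_env_step(state, action)
    update(state, action, reward, next_state)
    state = next_state

print({s: max(q_table[s], key=q_table[s].get) for s in ("lifi_ok", "lifi_blocked")})
```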

    Modeling the Random Orientation of Mobile Devices: Measurement, Analysis and LiFi Use Case

    Light-fidelity (LiFi) is a networked optical wireless communication (OWC) solution for high-speed indoor connectivity for fixed and mobile users. Unlike conventional radio frequency wireless systems, the OWC channel is not isotropic, meaning that the device orientation affects the channel gain significantly, particularly for mobile users. However, due to the lack of a proper model for device orientation, many studies have assumed that the receiver is vertically upward and fixed. In this paper, a novel model for device orientation based on experimental measurements of forty participants is proposed. It is shown that the probability density function (PDF) of the polar angle can be modeled by either a Laplace (for static users) or a Gaussian (for mobile users) distribution. In addition, a closed-form expression is obtained for the PDF of the cosine of the incidence angle, based on which the line-of-sight (LOS) channel gain of OWC channels is described. An approximation of this PDF based on the truncated Laplace distribution is proposed, and the accuracy of this approximation is confirmed by the Kolmogorov-Smirnov distance (KSD). Moreover, the statistics of the LOS channel gain are calculated and the random orientation of a user equipment (UE) is modeled as a random process. The influence of the random orientation on the signal-to-noise ratio (SNR) performance of OWC systems is evaluated. Finally, an orientation-based random waypoint (ORWP) mobility model is proposed by considering the random orientation of the UE during the user's movement. The performance of ORWP is assessed in terms of handover rate, and it is shown that it is important to take the random orientation into account.
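    The sketch below illustrates, under assumed parameters, the kind of check the paper describes: sample polar angles from a Gaussian model for a mobile user, form the cosine of the incidence angle for a simplistic overhead-LED geometry, fit a truncated Laplace distribution, and compute the Kolmogorov-Smirnov distance. None of the numbers are the measured values reported in the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Polar angle of the device, modelled as Gaussian for a mobile user (degrees).
# The mean and standard deviation here are illustrative assumptions.
theta = rng.normal(loc=41.0, scale=7.7, size=20000)
theta = np.clip(theta, 0.0, 90.0)

# Cosine of the incidence angle for a simplistic geometry where the LED is
# directly above the device, so the incidence angle equals the polar angle.
cos_psi = np.cos(np.radians(theta))

# Fit a Laplace distribution and truncate it to the support of the samples.
loc, scale = stats.laplace.fit(cos_psi)
a, b = cos_psi.min(), cos_psi.max()

def truncated_laplace_cdf(x):
    fa, fb = stats.laplace.cdf([a, b], loc=loc, scale=scale)
    return (stats.laplace.cdf(x, loc=loc, scale=scale) - fa) / (fb - fa)

# Kolmogorov-Smirnov distance between the empirical CDF and the fitted model.
xs = np.sort(cos_psi)
ecdf = np.arange(1, xs.size + 1) / xs.size
ksd = np.max(np.abs(ecdf - truncated_laplace_cdf(xs)))
print(f"KSD = {ksd:.4f}")
```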

    Wireless infrared-based LiFi uplink transmission with link blockage and random device orientation

    Light-fidelity (LiFi) is recognised as a promising technology for next-generation wireless access networks. However, limited research effort has been devoted to the uplink (UL) transmission system in LiFi networks. In this article, a wireless infrared (IR)-based LiFi UL system is investigated. In particular, we focus on the performance of a single static user under the influence of random device orientation and link blockage. Simulations and mathematical analysis are used to evaluate the UL system performance. Analytical expressions for the UL optical wireless channel and the signal-to-noise ratio (SNR) statistics under the effects of random device orientation and link blockage are derived. The results show that the effects of random orientation and link blockage may decrease the coverage probability by 10% to 40%, depending on the SNR threshold.
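    A rough Monte Carlo illustration of how coverage probability P(SNR ≥ threshold) could be estimated under random orientation and random blockage is sketched below; the channel model, blockage probability, and SNR figures are assumptions, not the paper's analytical derivation.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000

m = 1.0                 # assumed Lambertian order of the IR transmitter
p_block = 0.2           # assumed probability that the LOS link is blocked
snr_at_normal = 30.0    # assumed SNR in dB when the device points straight up

# Polar angle drawn from a Laplace model for a static user (degrees).
theta = np.abs(rng.laplace(loc=0.0, scale=10.0, size=N))
theta = np.clip(theta, 0.0, 90.0)
blocked = rng.random(N) < p_block

# LOS gain taken proportional to cos^(m+1) of the polar angle for a receiver
# directly under the access point (a deliberately simplified geometry).
gain = np.cos(np.radians(theta)) ** (m + 1)
snr_db = snr_at_normal + 20 * np.log10(np.maximum(gain, 1e-9))
snr_db[blocked] = -np.inf   # a blocked link provides no usable signal

for thr in (0, 10, 20):
    print(f"threshold {thr:2d} dB: coverage = {np.mean(snr_db >= thr):.3f}")
```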

    A Tree-based Mortality Prediction Model of COVID-19 from Routine Blood Samples

    COVID-19 was declared a global pandemic by the World Health Organization (WHO) in early 2020. Researchers have been working on formulating the best approaches and solutions to cure the disease and to help prevent such pandemics in the future. A lot of effort has been made to develop a fast and accurate early clinical assessment of the disease. Machine learning (ML) has proven helpful for research and applications in the health domain as a way to understand real-world phenomena through data analysis. In our experiment, we collected a retrospective blood sample data set from 1,000 COVID-19 patients in Jakarta, Indonesia, for the period of March to December 2020. We report our preliminary findings on the use of common blood test biomarkers in predicting COVID-19 patient mortality. This study took advantage of explainable machine learning to examine the data set. The contribution of this paper is to explain our findings on predicting COVID-19 mortality, including the role of the top 11 biomarkers found in our dataset. These findings can be generalized, especially in Indonesia, which is currently at the peak of its epidemic. We show that tree-based AI models perform well on predicting COVID-19 mortality while remaining easy to interpret, as they lend themselves to human scrutiny and allow clinicians to assess the findings and comment on their viability.
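    The sketch below shows the general shape of a tree-based, interpretable mortality classifier with feature importances; the biomarker names, synthetic data, and model configuration are placeholders, not the study's dataset or model.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
# Hypothetical biomarker names and a synthetic outcome, used only for shape.
biomarkers = ["crp", "d_dimer", "lymphocyte_count", "ldh", "urea"]
X = pd.DataFrame(rng.normal(size=(1000, len(biomarkers))), columns=biomarkers)
y = (X["crp"] + 0.5 * X["d_dimer"] - X["lymphocyte_count"]
     + rng.normal(scale=0.5, size=1000) > 0.8).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, max_depth=6, random_state=0)
model.fit(X_train, y_train)

print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
# Feature importances give a simple, clinician-readable ranking of biomarkers.
for name, score in sorted(zip(biomarkers, model.feature_importances_),
                          key=lambda t: -t[1]):
    print(f"{name:>18s}: {score:.3f}")
```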

    Studies of optical wireless communications: random orientation model, modulation, and hybrid WiFi and LiFi networks

    Cisco has predicted that by 2022, as a result of the emerging internet-of-things traffic, there will be an internet protocol (IP) traffic explosion. An increasing amount of radio frequency (RF) spectrum has been allocated to accommodate this mobile data surge, such as the recently allocated sub-6 GHz band for WiFi. A further prediction is that by the year 2035, all RF spectrum will be fully utilized. This means there is a need for additional spectrum, such as the optical spectrum. One vision of future wireless networks is to aggregate multiple wireless access technologies. For example, optical wireless communications (OWC), which have narrow coverage but a very high area spectral efficiency, can potentially complement RF communications to provide a higher peak data rate. A hybrid WiFi and light fidelity (LiFi) network is one of these examples. Compared to other OWC technologies, such as visible light communications or optical camera communications, LiFi supports bidirectional and multi-user communications. These are also the main features of WiFi, and WiFi is predicted to carry the majority of global mobile data traffic in the future. Based on this prediction, the primary question in this thesis is how much LiFi can support WiFi in efficiently handling mobile data traffic in hybrid WiFi and LiFi networks. This is answered by calculating an offloading efficiency, which is the ratio of the data transferred over LiFi to the total data. Even with the current advancements in LiFi, a few intermediate studies are needed.
    • The first contribution of this thesis is to model randomly-oriented mobile devices. A random orientation model is needed in order to capture users' behavior while they move and operate mobile devices. In addition, a random blockage model must be investigated, as LiFi signals can be blocked by opaque objects. The main purpose of considering the random orientation and random blockage models is to ensure that the offloading efficiency is evaluated under realistic assumptions.
    • The second contribution of this thesis falls under studies of single-carrier and multi-carrier modulations. The conventional pulse amplitude modulation with single-carrier frequency domain equalization (PAM-SCFDE) scheme is improved by adding non-linear filters or index modulation. In the low-to-moderate spectral efficiency region, up to 3 dB of gain can be achieved by means of these improvements. In the high spectral efficiency region, an orthogonal frequency division multiplexing (OFDM)-based system, which is based on the common mode of the physical (PHY) layer of the ongoing LiFi standardization, i.e., IEEE 802.11bb, is used. By exploiting the fact that the wireless optical channel has a low-pass filter characteristic, an in-phase and quadrature wavelength division multiplexing (IQ-WDM) system that uses the PHY of IEEE 802.11bb is proposed. Up to 2 dB of gain can be obtained by IQ-WDM compared to the common-mode PHY of IEEE 802.11bb.
    • For the final contribution of this thesis, the OFDM system from IEEE 802.11bb is abstracted to emulate large-scale networks and to calculate the offloading efficiency. A real transmission control protocol (TCP)/IP stack, which includes multipath TCP (MPTCP) to support multi-connectivity between WiFi and LiFi networks, is also deployed to emulate real traffic. By calculating the offloading efficiency with this methodology over many channel realizations, considering the proposed random orientation and blockage models, average offloading efficiencies of 64.54% and 75.85% are obtained for residential and enterprise scenarios, respectively. These results show the significant potential of OWC to complement RF communications.
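    A minimal sketch of the offloading-efficiency metric defined above, i.e., the fraction of bytes carried over LiFi out of all bytes carried by the hybrid network; the per-realization byte counts are placeholders standing in for the emulated MPTCP subflow statistics.

```python
def offloading_efficiency(bytes_over_lifi, bytes_over_wifi):
    """Fraction of traffic carried by LiFi out of the total traffic."""
    total = bytes_over_lifi + bytes_over_wifi
    return bytes_over_lifi / total if total > 0 else 0.0

# Example: averaging over several (placeholder) channel realizations.
realizations = [
    {"lifi": 8.2e9, "wifi": 3.1e9},
    {"lifi": 0.0,   "wifi": 5.4e9},   # LiFi fully blocked in this realization
    {"lifi": 6.7e9, "wifi": 2.2e9},
]
per_run = [offloading_efficiency(r["lifi"], r["wifi"]) for r in realizations]
print(f"average offloading efficiency: {sum(per_run) / len(per_run):.2%}")
```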

    Doubly Irregular Coded Slotted ALOHA for Massive Uncoordinated Multiway Relay Networks

    A massive uncoordinated multiway relay network (mu-mRN) is an mRN that can serve a massive number of users who expect to fully exchange information among themselves via a common relay. In this paper, we aim to improve the normalized throughput of the mu-mRN by using a multiuser detection (MUD) technique with capability K > 1. First, we present a network capacity bound of the mu-mRN with general K to investigate the theoretical limit of the network. Then, we search for optimal degree distributions for the MUD-based mu-mRN. Second, we aim to improve the normalized throughput by 10× over the maximum normalized throughput of conventional systems. To achieve this goal, we propose applying doubly irregular coded slotted ALOHA to the mu-mRN.
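    The toy simulation below illustrates the underlying random-access mechanism: irregular repetition slotted ALOHA decoded by successive interference cancellation, where a slot containing at most K unresolved packets can be decoded (MUD capability K). The frame size, load, and degree distribution are illustrative choices, not the optimized distributions from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_frame(n_users, n_slots, K, degrees=(2, 3), probs=(0.5, 0.5)):
    """Fraction of users decoded in one frame with MUD capability K."""
    slots = [set() for _ in range(n_slots)]
    for user in range(n_users):
        d = rng.choice(degrees, p=probs)                 # repetition degree
        for s in rng.choice(n_slots, size=d, replace=False):
            slots[s].add(user)

    resolved = set()
    progress = True
    while progress:                                       # iterative SIC peeling
        progress = False
        for s in range(n_slots):
            pending = slots[s] - resolved
            if 0 < len(pending) <= K:                     # MUD resolves up to K packets
                resolved |= pending
                progress = True
    return len(resolved) / n_users

for K in (1, 2, 4):
    rate = simulate_frame(n_users=150, n_slots=100, K=K)
    print(f"K={K}: fraction of users decoded = {rate:.2f}")
```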

    Vehicular Massive Multiway Relay Networks Applying Graph-Based Random Access

    Recent vehicular communication problems entail various aspects, such as rapidly changing topologies and a large number of users. We consider massive uncoordinated multiway relay networks (mu-mRNs), which can serve a massive number of users who expect to fully exchange information among themselves via a common multiway relay. A relay-based vehicular network is one potential application of the mu-mRN. In this paper, we are interested in improving the normalized throughput T of the network using a multiuser detection (MUD) capability of K > 1 based on graph-based random access, so as to flexibly adapt to topology changes. First, we present a network capacity bound of the mu-mRN with general K to investigate the theoretical limit of the network. Then, we search for optimal and practical degree distributions for each theoretical bound. Second, we aim to improve the normalized throughput by 10× over the maximum normalized throughput of conventional systems. To achieve this goal, we propose applying doubly irregular coded slotted ALOHA to the mu-mRN. We also propose an optimal practical encoding for the mu-mRN to closely approach the target with a finite number of users.

    An XGBoost Model for Age Prediction from COVID-19 Blood Test

    COVID-19 was declared a pandemic by the World Health Organization (WHO) in early 2020. Many studies have found that specific age groups have a higher risk of contracting the disease. The gold standard test for the disease is a condition-specific test based on the Reverse-Transcriptase Polymerase Chain Reaction (RT-PCR). We have previously shown that the results of a standard suite of non-specific blood tests can be used to indicate the presence of a COVID-19 infection with high likelihood. We continue our research in this area with a study of the connection between patients' routine blood test results and their age. Predicting a person's age from blood chemistry is not new in health science. Most often, such results are used to detect the signs of diseases associated with aging and to develop new medications. The experiment described here shows that the XGBoost algorithm can be used to predict patients' age from their routine blood tests. The performance evaluation is very satisfactory, with R² > 0.80 and a normalized RMSE below 0.1.
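    A minimal sketch of an XGBoost age regressor reporting R² and a range-normalized RMSE is given below; the synthetic features and hyperparameters are illustrative assumptions, not the study's data or tuned model.

```python
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(5)
n = 2000
X = rng.normal(size=(n, 10))                       # placeholder blood-test features
age = 45 + 10 * X[:, 0] - 5 * X[:, 1] + rng.normal(scale=5, size=n)

X_train, X_test, y_train, y_test = train_test_split(X, age, test_size=0.25, random_state=0)
model = xgb.XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X_train, y_train)
pred = model.predict(X_test)

rmse = mean_squared_error(y_test, pred) ** 0.5
nrmse = rmse / (y_test.max() - y_test.min())       # RMSE normalized by target range
print(f"R^2 = {r2_score(y_test, pred):.3f}, normalized RMSE = {nrmse:.3f}")
```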