
    Transmission Modeling with Smartphone-based Sensing

    Infectious disease spread is difficult to measure and model accurately. Even for well-studied pathogens, uncertainties remain regarding the dynamics of mixing behavior and how to balance simulation-generated estimates with empirical data. Smartphone-based sensing promises ready access to inferred proximate contacts, with which we can improve transmission models. This dissertation addresses the problem of informing transmission models with proximity contact data by breaking it down into three sub-questions. Firstly, can proximity contact data inform transmission models? To answer this question, an extended-Kalman-filter-enhanced System Dynamics Susceptible-Infectious-Removed (EKF-SD-SIR) model demonstrated the filtering approach as a framework for informing System Dynamics models with proximity contact data. This combination recurrently re-grounds system status as empirical data arrive throughout disease transmission simulations, simultaneously accounting for the accuracy of empirical data and the growth of simulation error between measurements, while supporting estimation of changing model parameters. However, as this investigation revealed, the filtering approach is limited by the quality and reliability of sensing-informed proximate contacts, which leads to the dissertation's second and third questions: investigating the impact of the temporal and spatial resolution of sensing-inferred proximity contact data on transmission models. GPS co-location and Bluetooth beaconing are two common measurement modalities for sensing proximity contacts, with different underlying technologies and tradeoffs. However, both modalities are prone to false positives and false negatives when used to detect proximate contacts, because unmeasured environmental influences bias the data. Will differences in sensing modalities impact transmission models informed by proximity contact data?
The second part of this dissertation compares GPS- and Bluetooth-inferred proximate contacts by assessing their impact on simulated attack rates in corresponding proximate-contact-informed agent-based Susceptible-Exposed-Infectious-Recovered (ABM-SEIR) models of four distinct contagious diseases. Results show that the proximate contacts inferred by these two measurement modalities differ and give rise to significantly different attack rates across multiple data collections and pathogens. While the advent of commodity mobile devices has eased the collection of proximity contact data, battery capacity and associated costs impose tradeoffs between the frequency and scanning duration used for proximate-contact detection. Choosing a balanced sensing regime involves specifying temporal resolutions and interpreting sensing data in light of circumstances such as the characteristics of a particular pathogen, the accompanying disease, and the underlying population. How will the temporal resolution of sensing impact transmission models informed by proximity contact data? Furthermore, how will circumstances alter the impact of temporal resolution? The third part of this dissertation investigates the impact of sensing regimes on findings from two sampling methods applied at widely varying inter-observation intervals, by synthetically downsampling proximity contact data from five contact network studies, each of which measured participant-participant contact every 5 minutes for four or more weeks. The impact of downsampling is evaluated through ABM-SEIR simulations at both the population and individual levels for 12 distinct contagious diseases and associated variants of concern. Studies in this part find that for epidemiological models employing proximity contact data, both the observation paradigm and the inter-observation interval used to collect proximity contact data affect the simulation results.
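The synthetic downsampling used in this part can be sketched as a duty-cycled filter over time-stamped contact records. The 60-minute period, 15-minute observation window, and the record layout below are illustrative assumptions, not the regimes evaluated in the dissertation:

```python
# Sketch: duty-cycled downsampling of proximity-contact observations.
# Hypothetical data layout: each record is (minute, person_a, person_b).

def duty_cycle_downsample(contacts, period_min, window_min):
    """Keep only contacts observed during the first `window_min` minutes
    of every `period_min`-minute cycle (the periodic observation regime)."""
    return [c for c in contacts if c[0] % period_min < window_min]

# One contact event every 5 minutes for two hours (the studies' 5-minute cadence).
contacts = [(m, "a", "b") for m in range(0, 120, 5)]
kept = duty_cycle_downsample(contacts, period_min=60, window_min=15)
# Only contacts at minutes 0, 5, 10 and 60, 65, 70 survive the downsampling.
```

Varying `period_min` and `window_min` over a grid, and feeding each thinned contact set into the ABM-SEIR simulation, is the kind of experiment the inter-observation-interval comparison describes.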
Moreover, the impact depends on population characteristics and on measures reflecting pathogen infectiousness (such as the basic reproduction number, R0). By comparing the performance of the two sampling methods, we found that in most cases, periodically observing for a certain duration yields proximity contact data that allows agent-based models to produce a reasonable estimate of the attack rate. However, higher-resolution data are preferred for modeling individual infection risk. Findings from this part of the dissertation represent a step toward an empirical basis for guidelines to inform data collection that is at once efficient and effective. This dissertation addresses the problem of informing transmission models with proximity contact data in three steps. Firstly, the demonstration of an EKF-SD-SIR model suggests that the filtering approach can improve System Dynamics transmission models by leveraging proximity contact data. In addition, experiments with the EKF-SD-SIR model revealed that the filtering approach is constrained by the limited quality and reliability of sensing-data-inferred proximate contacts. The following two parts of this dissertation investigate spatial-temporal factors that affect the quality and reliability of sensor-collected proximity contact data. In the second step, the impact of spatial resolution is illustrated by differences between two typical sensing modalities: Bluetooth beaconing versus GPS co-location. Experiments show that, in general, proximity contact data collected with Bluetooth beaconing lead to transmission model results different from those driven by proximity contact data collected with GPS co-location. Awareness of the differences between sensing modalities can aid researchers in incorporating proximity contact data into transmission models.
Finally, in the third step, the impact of temporal resolution is elucidated by investigating differences between the results of transmission models driven by proximity contact data collected at varying observation frequencies. These differences are evaluated under circumstances with alternative assumptions regarding the sampling method, the disease/pathogen type, and the underlying population. Experiments show that the impact of sensing regimes is influenced by the type of disease/pathogen and the underlying population, while periodic observation remains a reasonable choice across all situations. This dissertation demonstrated the value of a filtering approach to enhancing transmission models with sensor-collected proximity contact data, and explored spatial-temporal factors that affect the accuracy and reliability of such data. Furthermore, it suggested guidance for future sensor-based proximity contact data collection and highlighted needs and opportunities for further research on sensing-inferred proximity contact data for transmission models.
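The recurrent re-grounding performed by the filtering approach can be sketched, in much-simplified scalar form, as an SIR simulation whose infectious count is pulled toward each noisy observation via a Kalman-style gain. All parameter values (`beta`, `gamma`, noise variances, the observation schedule) are illustrative assumptions, and the scalar update below stands in for the full extended Kalman filter of the EKF-SD-SIR model:

```python
# Minimal sketch of a filter-corrected SIR model: run the model forward
# between measurements, then correct the infectious count when a noisy
# observation arrives. Parameter values are illustrative only.

def sir_step(s, i, r, beta, gamma, dt, n):
    """One Euler step of the SIR system dynamics."""
    new_inf = beta * s * i / n * dt
    new_rec = gamma * i * dt
    return s - new_inf, i + new_inf - new_rec, r + new_rec

def run_filtered_sir(obs, beta=0.3, gamma=0.1, dt=1.0, n=1000.0,
                     p=25.0, q=1.0, r_var=9.0):
    """obs[t] is a noisy count of infectious people, or None if unobserved."""
    s, i, r = n - 1.0, 1.0, 0.0
    trajectory = []
    for t in range(len(obs)):
        s, i, r = sir_step(s, i, r, beta, gamma, dt, n)
        p += q  # simulation error grows between measurements
        if obs[t] is not None:
            k = p / (p + r_var)          # scalar Kalman gain
            i = i + k * (obs[t] - i)     # re-ground state toward the data
            p = (1 - k) * p
        trajectory.append(i)
    return trajectory

# A single observation at t=2 pulls the simulated count toward the data.
traj = run_filtered_sir([None, None, 10.0, None])
```

The gain `k` balances accumulated model error `p` against measurement noise `r_var`, which is the tradeoff the dissertation describes between simulation error growth and empirical data accuracy.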

    Sensor Signal and Information Processing II

    In the current age of information explosion, newly invented technological sensors and software are tightly integrated with our everyday lives. Many sensor processing algorithms incorporate some form of computational intelligence as part of their core framework for problem solving. These algorithms have the capacity to generalize, discover knowledge for themselves, and learn new information whenever unseen data are captured. The primary aim of sensor processing is to develop techniques to interpret, understand, and act on the information contained in the data. The interest of this book is in developing intelligent signal processing to pave the way for smart sensors. This involves the mathematical advancement of nonlinear signal processing theory and its applications, extending far beyond traditional techniques. It bridges the boundary between theory and application, developing novel theoretically inspired methodologies targeting both longstanding and emergent signal processing applications. The topics range from phishing detection to the integration of terrestrial laser scanning, and from fault diagnosis to bio-inspired filtering. The book will appeal to established practitioners, along with researchers and students in the emerging field of smart sensor processing.

    Reliable Multicast transport of the video over the WiFi network

    Multicast transport is an efficient solution for delivering the same content to many receivers at the same time. This mode is mainly used to deliver real-time video streams. However, the conventional multicast transmissions of IEEE 802.11 do not use any feedback policy, so missing packets are definitively lost. This limits the reliability of multicast transport and impacts the quality of video applications. To resolve this issue, the IEEE 802.11v and 802.11aa amendments were recently defined. The former proposes the Direct Multicast Service (DMS). The latter introduces the Groupcast with Retries (GCR) service, which defines two retry policies: Block Ack (BACK) and Unsolicited Retry (UR). In this thesis, we evaluate and compare the performance of 802.11v/aa. Our simulation results show that all of the newly defined multicast policies incur significant transmission overhead. Besides, DMS has very limited scalability, and GCR-BACK is not appropriate for large multicast groups. We show that both DMS and GCR-BACK incur substantial transmission latencies when the number of multicast receivers increases. Furthermore, we investigate the loss factors in wireless networks. We show that device unavailability may be the principal cause of heavy packet losses and of their bursty nature. In particular, our results show that CPU overload may incur a loss rate of 100%, and that the delivery ratio may be limited to 35% when the 802.11 device is in power save mode. To avoid collisions and enhance the reliability of multicast transmissions, we define the Busy Symbol (BS) mechanism. Our results show that BS prevents collisions and ensures a very high delivery ratio for multicast packets.
To further enhance the reliability of this traffic, we define a new multicast protocol called Block Negative Acknowledgement (BNAK). Using this protocol, the AP transmits a block of multicast packets followed by a Block NAK Request (BNR). Upon reception of a BNR, a multicast member generates a Block NAK Response (BNAK) only if it missed some packets. A BNAK is transmitted after classical channel contention in order to avoid collisions with other feedback messages, and the retransmission request is acknowledged. Under the assumptions that 1) the receiver is located within the coverage area of the data rate in use, 2) collisions are avoided, and 3) the terminal has the required configuration, few retransmission requests are generated and bandwidth is saved. Our results show that BNAK has very high scalability and incurs very low delays. Furthermore, we define a rate adaptation scheme for BNAK. We show that the appropriate transmission rate is selected at the expense of a very limited overhead of less than 1%. Besides, our protocol is designed to support scalable video streaming. This capability is intended to address bandwidth fluctuation and to account for the heterogeneity of the receivers in a wireless network.
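The receiver-side BNAK decision described above can be sketched as follows, assuming integer sequence numbers and ignoring 802.11 frame formats and timing: on a Block NAK Request announcing a block's sequence range, the member responds only if it missed packets.

```python
# Illustrative sketch of the BNAK feedback logic at a multicast receiver
# (not the actual 802.11 frame formats or channel-access procedure).

def missing_in_block(received_seqs, block_start, block_len):
    """Sequence numbers absent from [block_start, block_start + block_len)."""
    expected = set(range(block_start, block_start + block_len))
    return sorted(expected - set(received_seqs))

def bnak_response(received_seqs, block_start, block_len):
    """Generate a BNAK only when at least one packet in the block is missing;
    a receiver that got the whole block stays silent, saving bandwidth."""
    missing = missing_in_block(received_seqs, block_start, block_len)
    return {"type": "BNAK", "missing": missing} if missing else None
```

The negative-acknowledgement design is what gives the protocol its scalability: under good channel conditions most members send nothing, so feedback traffic does not grow with group size.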

    Nuclear Power - Control, Reliability and Human Factors

    Advances in reactor designs, materials, and human-machine interfaces aim to ensure the safety and reliability of emerging reactor technologies, reducing the potential for high-consequence human errors such as those that have occurred in the past. New instrumentation and control technologies based on digital systems, together with novel sensors and measurement approaches, facilitate the safety, reliability, and economic competitiveness of nuclear power options. Autonomous operation scenarios are increasingly being considered for small modular systems. This book belongs to a series of books on nuclear power published by InTech. It consists of four major sections and contains twenty-one chapters on topics from key subject areas pertinent to instrumentation and control, operational reliability, system aging, and human-machine interfaces. The book targets a broad potential readership - students, researchers, and specialists in the field - who are interested in learning about nuclear power.

    Deep Learning and parallelization of Meta-heuristic Methods for IoT Cloud

    Healthcare 4.0 is one of the outcomes of the Fourth Industrial Revolution that is driving a major transformation in the medical field. It has brought facilities and advantages that have improved average life expectancy and reduced population mortality. This paradigm depends on intelligent medical devices (wearable devices, sensors), which generate massive amounts of data that need to be analyzed and treated with appropriate data-driven algorithms powered by Artificial Intelligence, such as machine learning and deep learning (DL). However, one of the most significant limits of DL techniques is the long time required for the training process. Meanwhile, the real-time application of DL techniques, especially in sensitive domains such as healthcare, is still an open question that needs to be addressed. On the other hand, meta-heuristics have achieved good results in optimizing machine learning models. The Internet of Things (IoT) integrates billions of smart devices that can communicate with one another with minimal human intervention. IoT technologies are crucial in enhancing several real-life smart applications that can improve quality of life. Cloud Computing has emerged as a key enabler for IoT applications because it provides scalable, on-demand, anytime-anywhere access to computing resources. In this thesis, we are interested in improving the efficacy and performance of computer-aided diagnosis systems in the medical field by decreasing model complexity and increasing data quality. To accomplish this, three contributions are proposed. First, we propose a computer-aided diagnosis system for neonatal seizure detection using meta-heuristics and a convolutional neural network (CNN) model, enhancing the system's performance by optimizing the CNN model. Secondly, we focus on the COVID-19 pandemic and propose a computer-aided diagnosis system for its detection.
In this contribution, we investigate the Marine Predator Algorithm to optimize the configuration of the CNN model and thereby improve the system's performance. In the third contribution, we aim to improve the performance of the computer-aided diagnosis system for COVID-19 by optimizing the data with different AI methods, such as Principal Component Analysis (PCA), the Discrete Wavelet Transform (DWT), and the Teager-Kaiser Energy Operator (TKEO). The proposed methods and the obtained results were validated through comparative studies using benchmark and public medical data.
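The role of the meta-heuristic in these contributions can be sketched with a seeded random-search baseline standing in for the Marine Predator Algorithm. The hyperparameter grid and the surrogate objective below are illustrative stand-ins; in the thesis the score of each candidate configuration would come from training and validating the CNN on the medical data.

```python
# Sketch: searching CNN hyperparameter configurations with a simple
# meta-heuristic baseline (random search). The objective is a hypothetical
# surrogate for validation accuracy, peaking at filters=48, lr=1e-3.
import random

def surrogate_score(cfg):
    # Stand-in for "train the CNN with cfg, return validation accuracy".
    return -((cfg["filters"] - 48) ** 2) / 1000 - abs(cfg["lr"] - 1e-3) * 100

def random_candidate(rng):
    return {"filters": rng.choice([16, 32, 48, 64]),
            "lr": rng.choice([1e-2, 1e-3, 1e-4])}

def random_search(n_iters=200, seed=0):
    """Keep the best-scoring configuration seen over n_iters random draws."""
    rng = random.Random(seed)
    best_cfg, best = None, float("-inf")
    for _ in range(n_iters):
        cfg = random_candidate(rng)
        score = surrogate_score(cfg)
        if score > best:
            best_cfg, best = cfg, score
    return best_cfg

best = random_search()
```

A population-based optimizer such as the Marine Predator Algorithm replaces the blind draws with guided moves through the search space, but the evaluate-and-keep-best loop is the same.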

    Energy Harvesting and Energy Storage Systems

    This book discusses recent developments in energy harvesting and energy storage systems. Sustainable development systems are based on three pillars: economic development, environmental stewardship, and social equity. One of the guiding principles for finding the balance between these pillars is to limit the use of non-renewable energy sources.

    Recent Developments in Smart Healthcare

    Medicine is undergoing a sector-wide transformation thanks to advances in computing and networking technologies. Healthcare is changing from reactive and hospital-centered to preventive and personalized, from disease-focused to well-being-centered. In essence, healthcare systems, as well as fundamental medical research, are becoming smarter. We anticipate significant improvements in areas ranging from molecular genomics and proteomics to decision support for healthcare professionals through big data analytics, to support for behavior change through technology-enabled self-management and social and motivational support. Furthermore, with smart technologies, healthcare delivery could also be made more efficient, higher quality, and lower cost. For this special issue, we received a total of 45 submissions and accepted 19 outstanding papers that span several topics in smart healthcare, including public health, health information technology (Health IT), and smart medicine.

    Intelligent Transportation Related Complex Systems and Sensors

    Building around innovative services related to different modes of transport and traffic management, intelligent transport systems (ITS) are being widely adopted worldwide to improve the efficiency and safety of transportation. They enable users to be better informed and to make safer, more coordinated, and smarter decisions about the use of transport networks. Current ITSs are complex systems made up of several components/sub-systems characterized by time-dependent interactions among themselves. Examples of these transportation-related complex systems include road traffic sensors, autonomous/automated cars, smart cities, smart sensors, virtual sensors, traffic control systems, smart roads, logistics systems, smart mobility systems, and many others emerging from niche areas. The efficient operation of these complex systems requires: i) efficient solutions to the issues of the sensors/actuators used to capture and control the physical parameters of these systems, as well as to the quality of the data collected from them; ii) tackling complexity using simulation and analytical modelling techniques; and iii) applying optimization techniques to improve the performance of these systems. The book includes twenty-four papers, which cover scientific concepts, frameworks, architectures, and various other ideas on analytics, trends, and applications of transportation-related data.