25 research outputs found

    Using RTT Variability for Adaptive Cross-Layer Approach to Multimedia Delivery in Heterogeneous Networks

    A holistic approach is needed for wider adoption of cross-layer design. A cross-layer design that assumes a particular wireless network condition, for instance, is of limited use in heterogeneous environments with diverse access network technologies and time-varying network performance. The first step toward a cross-layer approach is automatic detection of the underlying access network type, so that appropriate schemes can be applied without manual configuration. To address this issue, we investigate the characteristics of round-trip time (RTT) on wireless and wired networks. We conduct extensive experiments in diverse network environments and perform quantitative analyses of RTT variability. We show that, due to MAC-layer retransmissions, RTT on a wireless network exhibits a much larger mean and standard deviation, and a min-to-high-percentile spread at least 10 ms greater than that of wired networks. We also find that the impact of packet size on the wireless channel is particularly significant. Thus, through a simple set of tests, one can accurately classify whether a wireless network is involved in the path. We then propose effective adaptive cross-layer schemes for multimedia delivery over error-prone links, including limiting MAC-layer retransmissions, controlling the application-layer forward error correction (FEC) level, and selecting an optimal packet size. We analyze the interplay of these adaptive parameters under a given network condition, which enables us to find optimal cross-layer parameters when they are used concurrently.
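
    The detection criterion above lends itself to a simple check. The following Python sketch, with hypothetical probe values and an assumed choice of the 95th percentile as the "high" percentile, flags a path as wireless when the min-to-high-percentile RTT spread reaches the roughly 10 ms gap reported above; it illustrates the idea and is not the authors' measurement tool.

        import statistics

        def classify_access_link(rtt_samples_ms, spread_threshold_ms=10.0):
            """Classify a path as wireless or wired from RTT probe samples.

            The 10 ms spread threshold and the use of the 95th percentile are
            assumptions for illustration, based on the gap reported in the abstract.
            """
            samples = sorted(rtt_samples_ms)
            n = len(samples)
            rtt_min = samples[0]
            p95 = samples[min(n - 1, round(0.95 * (n - 1)))]  # "high" percentile
            spread = p95 - rtt_min
            return {
                "mean_ms": statistics.mean(samples),
                "stdev_ms": statistics.pstdev(samples),
                "spread_ms": spread,
                "link_type": "wireless" if spread >= spread_threshold_ms else "wired",
            }

        # Hypothetical RTT probes (ms) toward the same destination.
        wired_like = [12.1, 12.3, 12.2, 12.4, 12.6, 12.3, 12.5]
        wireless_like = [14.0, 15.2, 21.7, 14.3, 33.9, 16.1, 27.4]
        print(classify_access_link(wired_like)["link_type"])     # wired
        print(classify_access_link(wireless_like)["link_type"])  # wireless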

    Energy-Efficient Adaptive Geosource Multicast Routing for Wireless Sensor Networks

    We propose energy-efficient adaptive geosource multicast routing (EAGER) for wireless sensor networks (WSNs). It addresses the energy and scalability issues of previous location-based stateless multicast protocols in WSNs. EAGER is a novel stateless multicast protocol that optimizes location-based and source-based multicast approaches in several ways. First, it uses the receivers' geographic location information to save the cost of building a multicast tree; this information can be obtained during each receiver's membership establishment stage without flooding. Second, it reduces packet overhead, and in turn energy usage, by encoding receivers with small node IDs instead of potentially many bytes of location information and by dynamically using branch geographic information for common source-routing path segments. Third, it decreases computation overhead at each forwarding node by determining the multicast routing paths at a multicast node (or rendezvous point, RP). Our extensive simulation results validate that EAGER outperforms existing stateless multicast protocols in computation time, packet overhead, and energy consumption while maintaining the advantages of stateless protocols.
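
    The packet-overhead saving in the second point can be seen with back-of-the-envelope arithmetic. The byte counts below (2-byte node IDs versus 8-byte coordinate pairs) are assumptions for illustration, not values from the paper.

        # Illustrative only: header sizes are assumed, not taken from the paper.
        NODE_ID_BYTES = 2   # small network-local identifier
        COORD_BYTES = 8     # e.g., two 32-bit fixed-point coordinates (x, y)

        def multicast_header_bytes(num_receivers, per_receiver_bytes):
            """Per-packet overhead of listing the multicast receivers in the header."""
            return num_receivers * per_receiver_bytes

        for receivers in (4, 16, 64):
            id_based = multicast_header_bytes(receivers, NODE_ID_BYTES)
            loc_based = multicast_header_bytes(receivers, COORD_BYTES)
            print(f"{receivers:3d} receivers: {id_based:4d} B with node IDs vs "
                  f"{loc_based:4d} B with coordinates")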

    Network quality aware routing in error-prone wireless sensor networks

    We propose a network quality aware routing (NQAR) mechanism to enable delay-sensitive data delivery over error-prone wireless sensor networks. Unlike existing routing methods that select routes with the shortest arrival latency or the minimum hop count, the proposed scheme adaptively selects a route based on network qualities, including link errors and collisions, with minimal additional complexity. It is designed to avoid paths with potential noise and collisions that may cause many non-deterministic backoffs and retransmissions. We propose a generic framework to select a minimum-cost route that takes the packet loss rate and collision history into account. NQAR uses a data-centric approach to estimate the single-hop delay based on the processing time, propagation delay, packet loss rate, number of backoffs, and retransmission timeout between two neighboring nodes. This enables a source node to choose the path with the shortest expected end-to-end delay for sending delay-sensitive data. The experimental results show that NQAR reduces end-to-end transfer delay by up to approximately 50% compared with latency-based and hop-count-based directed diffusion in error-prone network environments. Moreover, NQAR shows better performance than those routing methods in terms of jitter, reachability, and network lifetime.
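
    The per-hop estimate can be sketched with a simple model. The code below assumes a geometric retransmission model with additive backoff and timeout penalties, and all link statistics are hypothetical; it illustrates how a source could compare candidate routes by expected end-to-end delay rather than reproducing the paper's exact cost function.

        def expected_hop_delay_ms(proc_ms, prop_ms, loss_rate, avg_backoffs,
                                  backoff_slot_ms, retx_timeout_ms):
            """Assumed single-hop delay model: base delay plus the expected penalty
            of retransmissions and MAC backoffs on a lossy, collision-prone link."""
            expected_tx = 1.0 / (1.0 - loss_rate)       # geometric retransmission model
            retx_penalty = (expected_tx - 1.0) * retx_timeout_ms
            backoff_penalty = avg_backoffs * backoff_slot_ms
            return proc_ms + prop_ms + backoff_penalty + retx_penalty

        def path_delay_ms(hops):
            """Expected end-to-end delay as the sum of per-hop estimates."""
            return sum(expected_hop_delay_ms(**hop) for hop in hops)

        # Two candidate routes with hypothetical link statistics.
        short_but_noisy = [
            dict(proc_ms=2, prop_ms=1, loss_rate=0.30, avg_backoffs=3,
                 backoff_slot_ms=5, retx_timeout_ms=40),
            dict(proc_ms=2, prop_ms=1, loss_rate=0.25, avg_backoffs=2,
                 backoff_slot_ms=5, retx_timeout_ms=40),
        ]
        longer_but_clean = [
            dict(proc_ms=2, prop_ms=1, loss_rate=0.05, avg_backoffs=1,
                 backoff_slot_ms=5, retx_timeout_ms=40)
            for _ in range(3)
        ]
        best = min((short_but_noisy, longer_but_clean), key=path_delay_ms)
        print("chosen route:", "3-hop clean" if best is longer_but_clean else "2-hop noisy")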


    An Integration Avenue of Ground Monitoring Based on Wireless Sensor Networks

    Since wireless sensor networks (WSNs) have great potential to provide diverse services to humans by monitoring things scattered throughout the real world, they are envisioned as one of the core enabling technologies for ubiquitous computing, which organizes and mediates both physical and social interactions anytime and anywhere. WSNs are being adopted in various fields, and the things within their zones are being monitored. However, existing WSNs are typically designed to observe specific zones or regional targets using small-scale, low-power, and short-range technologies. Seamless system integration at a global scale is still in its infancy due to the lack of fundamental integration technologies. In this paper, we present a global integration avenue for ground monitoring based on WSNs. The proposed avenue includes design, integration, and operational strategies for an IP-WSN-based territorial monitoring system to ensure compatibility, interoperability, and real-time operation. Specifically, we propose standardized sensing data formats using IP-WSN and database interfaces using the EPC sensor network, which enable spontaneous and systematic integration among legacy WSN systems. We also categorize network topologies according to topographic characteristics, thereby helping deploy sensor nodes in real environments. The proposed technology would thus be a milestone toward practically deployable global territorial monitoring systems.
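
    A shared sensing-data format is the kind of artifact such integration relies on. The sketch below serializes one hypothetical normalized record to JSON; the field names are illustrative and are not the schema defined in the paper.

        from dataclasses import dataclass, asdict
        from datetime import datetime, timezone
        import json

        @dataclass
        class SensingRecord:
            """Hypothetical normalized sensing record for a central database."""
            node_ipv6: str      # IP-WSN node address
            latitude: float     # deployment location
            longitude: float
            quantity: str       # e.g., "soil_moisture", "inclination"
            value: float
            unit: str
            observed_at: str    # ISO-8601 UTC timestamp

        record = SensingRecord(
            node_ipv6="2001:db8::a1",
            latitude=37.5665, longitude=126.9780,
            quantity="soil_moisture", value=23.4, unit="%",
            observed_at=datetime.now(timezone.utc).isoformat(),
        )
        print(json.dumps(asdict(record)))  # uniform payload any legacy WSN gateway could emit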

    Artificial intelligence for diagnosis and Gleason grading of prostate cancer: The PANDA challenge

    Through a community-driven competition, the PANDA challenge provides a curated, diverse dataset and a catalog of models for prostate cancer pathology, and represents a blueprint for evaluating AI algorithms in digital pathology. Artificial intelligence (AI) has shown promise for diagnosing prostate cancer in biopsies. However, results have been limited to individual studies, lacking validation in multinational settings. Competitions have been shown to be accelerators for medical imaging innovations, but their impact is hindered by a lack of reproducibility and independent validation. With this in mind, we organized the PANDA challenge, the largest histopathology competition to date, joined by 1,290 developers, to catalyze development of reproducible AI algorithms for Gleason grading using 10,616 digitized prostate biopsies. We validated that a diverse set of submitted algorithms reached pathologist-level performance on independent cross-continental cohorts, fully blinded to the algorithm developers. On United States and European external validation sets, the algorithms achieved agreements of 0.862 (quadratically weighted kappa; 95% confidence interval (CI), 0.840-0.884) and 0.868 (95% CI, 0.835-0.900) with expert uropathologists. Successful generalization across different patient populations, laboratories and reference standards, achieved by a variety of algorithmic approaches, warrants evaluating AI-based Gleason grading in prospective clinical trials.
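
    The agreement values above are quadratically weighted Cohen's kappa scores. The sketch below computes that metric on toy grade-group labels (not challenge data) to make the reported figures concrete.

        import numpy as np

        def quadratic_weighted_kappa(ratings_a, ratings_b, num_classes):
            """Quadratically weighted Cohen's kappa between two sets of ordinal labels."""
            a = np.asarray(ratings_a)
            b = np.asarray(ratings_b)
            observed = np.zeros((num_classes, num_classes))
            for i, j in zip(a, b):
                observed[i, j] += 1
            observed /= observed.sum()
            expected = np.outer(np.bincount(a, minlength=num_classes),
                                np.bincount(b, minlength=num_classes)).astype(float)
            expected /= expected.sum()
            idx = np.arange(num_classes)
            weights = (idx[:, None] - idx[None, :]) ** 2 / (num_classes - 1) ** 2
            return 1.0 - (weights * observed).sum() / (weights * expected).sum()

        # Toy ISUP grade groups (0 = benign, 1-5 = grade group) from an algorithm
        # and a pathologist; values are illustrative only.
        algorithm = [0, 1, 2, 2, 3, 4, 5, 1, 3, 5]
        pathologist = [0, 1, 2, 3, 3, 4, 5, 1, 2, 5]
        print(round(quadratic_weighted_kappa(algorithm, pathologist, num_classes=6), 3))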

    High performance cloud auditing and applications

    This book focuses on cloud security and high performance computing for cloud auditing. It discusses emerging challenges and techniques developed for high performance semantic cloud auditing, and presents the state of the art in cloud auditing, computing, and security techniques, with a focus on technical aspects and the feasibility of auditing in federated cloud computing environments. In summer 2011, the United States Air Force Research Laboratory (AFRL) CyberBAT Cloud Security and Auditing Team initiated the exploration of the cloud security challenges and future cloud auditing research directions covered in this book. This work was supported by United States government funds from the Air Force Office of Scientific Research (AFOSR), the AFOSR Summer Faculty Fellowship Program (SFFP), the Air Force Research Laboratory (AFRL) Visiting Faculty Research Program (VFRP), the National Science Foundation (NSF), and the National Institutes of Health (NIH). All chapters were partially supported by the AFOSR Information Operations and Security Program extramural and intramural funds (AFOSR/RSL Program Manager: Dr. Robert Herklotz). Key features:
    · Contains surveys of cyber threats and security issues in cloud computing and presents secure cloud architectures
    · Presents in-depth cloud auditing techniques, federated cloud security architectures, cloud access control models, and access-assured information sharing technologies
    · Outlines a wide range of challenges and provides solutions to manage and control very large and complex data sets