265 research outputs found

    On a Hybrid Preamble/Soft-Output Demapper Approach for Time Synchronization for IEEE 802.15.6 Narrowband WBAN

    In this paper, we present a maximum likelihood (ML) based time synchronization algorithm for Wireless Body Area Networks (WBAN). The proposed technique exploits soft information retrieved from the soft demapper for the time delay estimation. The algorithm has low complexity and is adapted to the frame structure specified by the IEEE 802.15.6 standard for narrowband systems. Simulation results show good performance that approaches the theoretical mean square error limit given by the Cramér-Rao Bound (CRB).
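
    The paper's hybrid preamble/soft-demapper metric is not reproduced here, but the core idea of ML timing acquisition can be illustrated with a minimal sketch, assuming a known preamble and a plain correlation metric maximized over candidate delays (illustrative only, not the paper's algorithm):

        import numpy as np

        def estimate_time_offset(rx, preamble, max_offset):
            """Return the delay that maximizes correlation with a known preamble.

            Generic ML-style timing estimator for illustration; the hybrid
            preamble/soft-demapper metric of the paper is not reproduced.
            """
            best_offset, best_metric = 0, -np.inf
            for d in range(max_offset + 1):
                segment = rx[d:d + len(preamble)]
                metric = np.abs(np.vdot(preamble, segment))  # |correlation| at delay d
                if metric > best_metric:
                    best_offset, best_metric = d, metric
            return best_offset

        # Toy usage: a BPSK preamble delayed by 7 samples in Gaussian noise.
        rng = np.random.default_rng(0)
        preamble = rng.choice([-1.0, 1.0], size=32)
        rx = np.concatenate([rng.normal(size=7), preamble + 0.3 * rng.normal(size=32),
                             rng.normal(size=20)])
        print(estimate_time_offset(rx, preamble, max_offset=20))  # expected: 7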

    A comprehensive survey of wireless body area networks on PHY, MAC, and network layers solutions

    Recent advances in microelectronics and integrated circuits, system-on-chip design, wireless communication, and intelligent low-power sensors have allowed the realization of the Wireless Body Area Network (WBAN). A WBAN is a collection of low-power, miniaturized, invasive/non-invasive, lightweight wireless sensor nodes that monitor the human body's functions and the surrounding environment. It supports a number of innovative and interesting applications such as ubiquitous healthcare, entertainment, interactive gaming, and military applications. In this paper, the fundamental mechanisms of WBANs, including architecture and topology, wireless implant communication, low-power Medium Access Control (MAC), and routing protocols, are reviewed. A comprehensive study of the technologies proposed for WBANs at the Physical (PHY), MAC, and Network layers is presented, and many useful solutions are discussed for each layer. Finally, numerous WBAN applications are highlighted.

    Analysis of aircraft microwave measurements of the ocean surface

    A data system was developed to process the microwave measurements obtained by the NASA CV-990 aircraft during the 1972 Meteorological Expedition, from calibrated brightness temperature through computation of estimated parameters. A primary objective of the study was the implementation of an integrated software system at the computing facility of NASA/GSFC and its application to the 1972 data. A single test case involving measurements away from and over a heavy rain cell was chosen to examine the effect of clouds on the ability to infer ocean surface parameters. The results indicate substantial agreement with those of the theoretical study; namely, the values obtained for the surface properties are consistent with available ground-truth information and are reproducible except within the heaviest portions of the rain cell, where nonlinear (saturation) effects become apparent. Finally, uncorrected instrumental effects introduce systematic errors that may limit the accuracy of the method.

    Real-time signal detection and classification algorithms for body-centered systems

    The main motivation for body-centered communication systems is the ability to acquire and process biometric signals in order to monitor, and even treat, a medical condition, whether it is caused by a disease or concerns an athlete's performance. Since these systems are based on sensing and processing, signal processing algorithms are a fundamental part of them. This thesis focuses on the real-time signal processing algorithms used both to monitor these parameters and to extract the relevant information from the acquired signals. The first part introduces the types of signals and sensors used in body-centered systems. Two specific applications of body-centered systems are then developed, together with the algorithms they use. The first application is blood glucose control in patients with diabetes; here, a detection method based on pattern classification of erroneous measurements obtained with the commercial continuous glucose monitor "Minimed CGMS" is developed. The second application is the monitoring of neural signals. Recent discoveries in this field have demonstrated enormous therapeutic possibilities (for example, patients with total paralysis who are able to communicate with their environment thanks to the monitoring and interpretation of signals from their neurons) as well as entertainment applications. In this work, algorithms for the detection, classification, and compression of neural spikes have been developed and evaluated together with wireless transmission techniques that enable cable-free monitoring. Finally, a chapter is devoted to the wireless transmission of signals in body-centered systems, studying the channel conditions that the body environment presents for signal transmission. Traver Sebastiá, L. (2012). Real-time signal detection and classification algorithms for body-centered systems [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/16188
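
    The thesis' own detection, classification, and compression algorithms are not reproduced in this abstract. One of the building blocks it mentions, real-time neural spike detection, is commonly approached by amplitude thresholding against a robust noise estimate; a minimal sketch under that assumption follows (the threshold rule and refractory period are placeholders, not the thesis' method):

        import numpy as np

        def detect_spikes(signal, k=5.0, refractory=30):
            """Flag sample indices where |signal| exceeds k times a robust noise estimate.

            Illustrative baseline only (median-based noise estimate plus a
            refractory period); this is not the detector developed in the thesis.
            """
            sigma = np.median(np.abs(signal)) / 0.6745   # robust estimate of the noise std
            threshold = k * sigma
            spikes, last = [], -refractory
            for i, x in enumerate(np.abs(signal)):
                if x > threshold and i - last >= refractory:
                    spikes.append(i)
                    last = i
            return spikes

        # Toy usage: Gaussian noise with two injected "spikes".
        rng = np.random.default_rng(1)
        sig = rng.normal(scale=1.0, size=1000)
        sig[200] += 12.0
        sig[600] -= 12.0
        print(detect_spikes(sig))  # expected: [200, 600]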

    Early detection of Myocardial Infarction using WBAN

    Cardiovascular diseases are the leading cause of death in the world, and Myocardial Infarction (MI) is the most serious among them. Monitoring patients for early detection of MI is important in order to alert medical assistance and improve the patients' vital prognosis. With the development of wearable sensor devices with wireless transmission capabilities, there is a need for real-time applications that can accurately detect MI non-invasively. In this paper, we propose a new approach for early detection of MI using wireless body area networks. The proposed approach analyzes the patient's electrocardiogram (ECG) in real time and extracts from each ECG cycle the ST elevation, a significant indicator of an upcoming MI. We use the sequential change-point detection algorithm CUmulative SUM (CUSUM) to detect deviations in the ST-elevation time series early and to raise an alarm for healthcare professionals. Experimental results on the ECGs of real patients show that the proposed approach can detect MI with low delay and high accuracy.
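
    A minimal sketch of a one-sided CUSUM change-point detector applied to an ST-elevation series may help make the approach concrete; the drift and threshold parameters and the toy data are placeholders, not values from the paper:

        def cusum_alarm(series, target_mean, drift, threshold):
            """One-sided CUSUM: return the first index where the upward cumulative
            deviation from target_mean exceeds threshold, or None if no alarm is raised.

            The (drift, threshold) parameters are illustrative placeholders,
            not the values used in the paper.
            """
            s = 0.0
            for i, x in enumerate(series):
                s = max(0.0, s + (x - target_mean) - drift)
                if s > threshold:
                    return i  # alarm: sustained upward shift in ST elevation
            return None

        # Toy usage: ST elevation (mV) stable around 0.05, shifting upward at index 20.
        st = [0.05] * 20 + [0.25] * 10
        print(cusum_alarm(st, target_mean=0.05, drift=0.02, threshold=0.5))  # alarm at index 22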

    A Priority-based Fair Queuing (PFQ) Model for Wireless Healthcare System

    Healthcare is a very active research area, primarily due to the increase in the elderly population, which leads to a growing number of emergency situations that require urgent action. In recent years, some wireless networked medical devices have been equipped with different sensors to measure and report a patient's vital signs remotely; the most important are the heart rate (ECG), pressure, and glucose sensors. However, the strict requirements and real-time nature of medical applications dictate the extreme importance of appropriate Quality of Service (QoS) and of fast, accurate delivery of a patient's measurements in a reliable e-health ecosystem. As the older adult population (65 years and above) grows due to the advances in medicine and medical care of the last two decades, high QoS and a reliable e-health ecosystem have become a major challenge in healthcare, especially for patients who require continuous monitoring and attention. Predictions indicate that the elderly population in developing countries will reach approximately 2 billion by 2050, and the available medical staff will be unable to cope with this growth and with emergency cases that need immediate intervention. On the other side, limitations in communication network capacity, congestion, and the enormous increase in devices, applications, and IoT traffic on the available networks add an extra layer of challenges to the e-health ecosystem, such as time constraints and the quality of the measurements and signals reaching healthcare centres. This research therefore tackles the delay and jitter parameters in e-health M2M wireless communication and succeeds in reducing them in comparison with currently available models. The novelty of this research lies in a new priority queuing model, Priority-based Fair Queuing (PFQ), in which a new priority level built on the concept of a Patient's Health Record (PHR) is developed and integrated with the Priority Parameter (PP) values of each sensor to add a second level of priority. Results and data analysis of the PFQ model under different scenarios simulating a real M2M e-health environment reveal that PFQ outperforms the widely used models First In First Out (FIFO) and Weighted Fair Queuing (WFQ). The PFQ model improves the transmission of ECG sensor data in emergency cases by decreasing delay and jitter by 83.32% and 75.88% respectively in comparison to FIFO, and by 46.65% and 60.13% with respect to WFQ. Similarly, for the pressure sensor the improvements are 82.41% and 71.5% in comparison to FIFO and 68.43% and 73.36% with respect to WFQ. Data transmission is also improved for the glucose sensor, by 80.85% and 64.7% in comparison to FIFO and 92.1% and 83.17% with respect to WFQ. However, data transmission in non-emergency cases using the PFQ model is negatively impacted and shows higher delay and jitter than FIFO and WFQ, since PFQ tends to give higher priority to emergency cases. Thus, a derivative of the PFQ model, Priority-based Fair Queuing with Tolerated Delay (PFQ-TD), has been developed to balance data transmission between emergency and non-emergency cases by allowing a tolerated delay in emergency cases. PFQ-TD succeeds in balancing this trade-off fairly, reducing the total average delay and jitter of emergency and non-emergency cases across all sensors and keeping them within acceptable standards. PFQ-TD improves the overall average delay and jitter in emergency and non-emergency cases across all sensors by 41% and 84% respectively in comparison to the PFQ model.
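
    The thesis' scheduler is not published in this abstract, so the following is only a minimal sketch of the two-level priority idea described above: a patient-record-derived emergency flag dominates, then the per-sensor priority parameter, with arrival order as tie-breaker. The field names and the exact ordering rule are assumptions, not the PFQ specification:

        import heapq
        import itertools

        class PriorityFairQueue:
            """Toy two-level priority queue: an emergency status (e.g. derived from a
            patient health record) dominates, then a per-sensor priority value, with
            FIFO order as tie-breaker. Illustrative only; not the thesis' PFQ model."""

            def __init__(self):
                self._heap = []
                self._counter = itertools.count()  # FIFO tie-breaker

            def enqueue(self, packet, sensor_priority, emergency=False):
                # Lower tuple sorts first: emergencies before routine traffic,
                # then higher sensor priority, then arrival order.
                key = (0 if emergency else 1, -sensor_priority, next(self._counter))
                heapq.heappush(self._heap, (key, packet))

            def dequeue(self):
                return heapq.heappop(self._heap)[1] if self._heap else None

        # Toy usage: an emergency ECG sample jumps ahead of routine glucose data.
        q = PriorityFairQueue()
        q.enqueue("glucose reading", sensor_priority=1)
        q.enqueue("ECG sample", sensor_priority=3, emergency=True)
        q.enqueue("pressure reading", sensor_priority=2)
        print(q.dequeue())  # ECG sample
        print(q.dequeue())  # pressure reading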

    Performance assessment of real-time data management on wireless sensor networks

    Technological advances in recent years have allowed the maturing of Wireless Sensor Networks (WSNs), which aim to perform environmental monitoring and data collection. This kind of network is composed of hundreds, thousands, or possibly even millions of tiny smart computers known as wireless sensor nodes, which may be battery powered and equipped with sensors, a radio transceiver, a Central Processing Unit (CPU), and some memory. However, due to their small size and the requirement for low-cost nodes, sensor node resources such as processing power, storage, and especially energy are very limited. Once the sensors take their measurements from the environment, the problem of storing and querying the data arises. In fact, the sensors have restricted storage capacity, and the ongoing interaction between sensors and the environment produces huge amounts of data. Techniques for data storage and querying in WSNs can be based on either external or local storage. External storage, called the warehousing approach, is a centralized scheme in which the data gathered by the sensors are periodically sent to a central database server where user queries are processed. Local storage, on the other hand, called the distributed approach, exploits the computation capabilities of the sensors, which act as local databases. The data is stored both in a central database server and in the devices themselves, enabling queries on either. WSNs are used in a wide variety of applications, which may perform various operations on the collected sensor data. For certain applications, however, such as real-time applications, the sensor data must closely reflect the current state of the targeted environment. The environment changes constantly, while the data is collected at discrete moments in time. As such, the collected data has a temporal validity and, as time advances, becomes less accurate, until it no longer reflects the state of the environment. Thus, applications such as industrial automation, aviation, and sensor networks must query and analyze the data within a bounded time in order to make decisions and react efficiently. In this context, the design of efficient real-time data management solutions is necessary to deal with both time constraints and energy consumption. This thesis studies real-time data management techniques for WSNs. In particular, it focuses on the challenges of handling real-time data storage and querying in WSNs and on efficient real-time data management solutions for them. First, the main requirements of real-time data management are identified and the real-time data management solutions for WSNs available in the literature are presented. Second, in order to provide an energy-efficient real-time data management solution, the techniques used to manage data and queries in WSNs based on the distributed paradigm are studied in depth. Many research works argue that the distributed approach is the most energy-efficient way of managing data and queries in WSNs, rather than warehousing; in addition, this approach can provide quasi real-time query processing, because the most current data is retrieved from the network. Third, based on these two studies and considering the complexity of developing, testing, and debugging this kind of complex system, a model for a simulation framework of real-time database management on WSNs using the distributed approach, together with its implementation, is proposed. This framework helps to explore various real-time database techniques for WSNs before deployment, saving money and time; the model can further be improved by adding the simulation of protocols, or part of the simulator can be placed on another available simulator. To validate the model, a case study considering both real-time and energy constraints is discussed. Fourth, a new architecture that combines statistical modeling techniques with the distributed approach, together with a query processing algorithm to optimize real-time user query processing, is proposed. This combination enables a query processing algorithm based on admission control that uses the error tolerance and a probabilistic confidence interval as admission parameters. Experiments based on real-world as well as synthetic data sets demonstrate that the proposed solution optimizes real-time query processing, saving more energy while meeting low latency requirements.
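
    A minimal sketch of the admission-control idea described above: a query is answered from a local statistical model when the model's confidence interval is tight enough for the user's error tolerance, and forwarded to the sensor network otherwise. The Gaussian model, the 95% z-value, and the parameter names are assumptions, not the thesis' algorithm:

        import math

        def admit_query(model_mean, model_std, n_samples, error_tolerance, confidence_z=1.96):
            """Decide whether a cached statistical model can answer a query.

            Returns (answer, from_model): the model's estimate is used when its
            confidence-interval half-width is within the user's error tolerance;
            otherwise the query should be forwarded to the sensor network.
            The Gaussian model and 95% z-value are illustrative assumptions.
            """
            half_width = confidence_z * model_std / math.sqrt(n_samples)
            if half_width <= error_tolerance:
                return model_mean, True      # answer locally, saving radio energy
            return None, False               # refresh data from the sensors

        # Toy usage: a temperature model built from 25 recent readings.
        print(admit_query(model_mean=21.4, model_std=0.5, n_samples=25, error_tolerance=0.3))
        # (21.4, True): half-width 0.196 <= 0.3, so the model answers the query.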

    Statistical Review of Health Monitoring Models for Real-Time Hospital Scenarios

    Health Monitoring System Models (HMSMs) need speed, efficiency, and security to work. Cascading components ensure data collection, storage, communication, retrieval, and privacy in these models. Researchers have proposed many methods to design such models, which vary in scalability, multidomain efficiency, flexibility, usage and deployment, computational complexity, cost of deployment, security level, feature usability, and other performance metrics. HMSM designers therefore struggle to find the best models for their application-specific deployments: they must test and validate different models, which increases design time and cost and affects deployment feasibility. This article discusses the application-specific advantages, feature-specific limitations, context-specific nuances, and deployment-specific future research scopes of secure HMSMs in order to reduce model-selection ambiguity. Models based on the Internet of Things (IoT), Machine Learning Models (MLMs), blockchain, hashing methods, encryption methods, distributed computing configurations, and bioinspired designs have better Quality of Service (QoS) and security than their counterparts, and researchers can find application-specific models among them. This article compares these models in terms of deployment cost, attack mitigation performance, scalability, computational complexity, and monitoring applicability; this comparative analysis helps readers choose HMSMs for context-specific application deployments. The article also devises a set of performance metrics, called Health Monitoring Model Metrics (HM3), to compare the performance of various models based on accuracy, precision, delay, scalability, computational complexity, energy consumption, and security.
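
    The article's HM3 formula is not given in this abstract, so the following is only a generic weighted-scoring sketch of how models might be ranked across metrics of the kind HM3 covers; the weights, metric names, normalization, and numbers are illustrative assumptions, not the article's definition:

        def rank_models(models, weights):
            """Rank monitoring models by a weighted sum of min-max normalized metrics.

            Generic illustration of multi-metric comparison; the weights, metric names,
            and normalization are assumptions, not the HM3 definition from the article.
            Metrics listed in lower_is_better are inverted before weighting.
            """
            lower_is_better = {"delay", "complexity", "energy"}
            names = list(weights)
            lo = {m: min(v[m] for v in models.values()) for m in names}
            hi = {m: max(v[m] for v in models.values()) for m in names}
            scores = {}
            for model, vals in models.items():
                s = 0.0
                for m in names:
                    span = hi[m] - lo[m] or 1.0
                    norm = (vals[m] - lo[m]) / span
                    if m in lower_is_better:
                        norm = 1.0 - norm
                    s += weights[m] * norm
                scores[model] = round(s, 3)
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        # Toy usage with made-up numbers for three model families.
        models = {
            "IoT+ML":      {"accuracy": 0.95, "delay": 120, "energy": 0.8, "complexity": 3},
            "Blockchain":  {"accuracy": 0.92, "delay": 400, "energy": 1.5, "complexity": 5},
            "Bioinspired": {"accuracy": 0.90, "delay": 150, "energy": 0.6, "complexity": 4},
        }
        weights = {"accuracy": 0.4, "delay": 0.3, "energy": 0.2, "complexity": 0.1}
        print(rank_models(models, weights))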