
    MedLAN: Compact mobile computing system for wireless information access in emergency hospital wards

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. As the need for faster, safer and more efficient healthcare delivery increases, medical consultants seek new ways of implementing a high-quality telemedical system using innovative technology. Until now, teleconsultation (the most common application of telemedicine) was performed by transferring the patient from the Accident and Emergency (A&E) ward to a specially equipped room, or by moving large and heavy machinery to the place where the patient resided. Both of these solutions were impractical, uneconomical and potentially dangerous. At the same time, wireless networks have become increasingly useful in point-of-care areas such as hospitals because of their ease of use, low cost of installation and increased flexibility. This thesis presents an integrated system called MedLAN dedicated to use inside A&E hospital wards. Its purpose is to wirelessly support high-quality live video, audio, high-resolution still images and network support from anywhere there is WLAN coverage. It is capable of transmitting all of the above to a consultant residing either inside or outside the hospital, or even at an external location, through the use of the Internet. To implement this, it makes use of the existing IEEE 802.11b wireless technology. Initially, this thesis demonstrates that for specific scenarios (such as when using WLANs), DICOM specifications should be adjusted to accommodate the reduced WLAN bandwidth. Near-lossless compression has been used to send still images through the WLANs, and the results have been evaluated by a number of consultants to decide whether they retain their diagnostic value. The thesis further suggests improvements to the existing 802.11b protocol. In particular, as the typical hospital environment suffers from heavy RF reflections, it suggests that an alternative method of modulation (OFDM) can be embedded in the 802.11b hardware to reduce the multipath effect, increase the throughput and thus improve the video quality sent by the MedLAN system. Finally, realising that trust between a patient and a doctor is fundamental, this thesis proposes a series of simple actions aimed at securing the MedLAN system. Additionally, a concrete security system is suggested that encapsulates the existing WEP security protocol over IPSec.
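
    A small back-of-the-envelope sketch of the bandwidth reasoning above. The image size, compression ratio and effective 802.11b throughput below are illustrative assumptions only; real figures depend on the imaging modality and on how much of the nominal 11 Mbit/s rate survives the ward's RF environment.

        def transfer_time_s(image_bytes: float, compression_ratio: float,
                            effective_throughput_mbps: float) -> float:
            """Seconds needed to send one compressed still image over the WLAN."""
            compressed_bits = image_bytes * 8 / compression_ratio
            return compressed_bits / (effective_throughput_mbps * 1e6)

        # Hypothetical example: a 10 MB still image, 3:1 near-lossless compression,
        # and ~5 Mbit/s of effective 802.11b throughput.
        print(f"{transfer_time_s(10e6, 3.0, 5.0):.1f} s per image")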

    Evaluation of international standards for ECG-recording and storage for use in tele-medical services

    This report clarifies which of the international standards for ECG recording can be used in tele-medical services, where the recordings are transmitted over wireless telecommunication facilities and finally stored as information integrated into the patient's Electronic Health Record (EHR). Some principles for the recording, transmission and storage of digital vital-signs parameters are highlighted, and important aspects of wireless communication of recorded signals from biomedical sensors are described, in order to understand the significance of, and differences between, the storage formats to be used. Even though most of the relevant standards are not yet ratified (the last meeting in ISO TC 251 WH6 was held in October 2005), the actual international standards SCP-ECG, MFER, FDAXML and DICOM are defined and already widely adopted. In this report, these standards are briefly described and evaluated with respect to their possible use in tele-medical services, and recommendations are given in order to obtain a reliable and secure communication solution. Requirements for the integration of the ECG file formats into the EHR are briefly described, and recommendations are given for the standards to be used in future solutions.

    Description of Implementations of the Clinical Testbed Applications [83 pages]

    An Adaptive Framework for Real-Time ECG Transmission in Mobile Environments

    Wireless electrocardiogram (ECG) monitoring involves the measurement of ECG signals and their timely transmission over wireless networks to remote healthcare professionals. However, fluctuations in wireless channel conditions pose quality-of-service challenges for real-time ECG monitoring services in a mobile environment. We present an adaptive framework for the layered coding and transmission of ECG data that can cope with a time-varying wireless channel. The ECG is segmented into layers of differing importance with respect to the quality of the reconstructed signal. Exploiting this layered structure, we have devised a simple and efficient real-time scheduling algorithm based on the earliest-deadline-first (EDF) policy, which decides the order in which packets containing ECG data are transmitted or retransmitted at any given time for the delivery of scalable ECG data over a lossy channel. The algorithm takes into account the differing priorities of packets in each layer, which prevents the perceived quality of the reconstructed ECG signal from degrading abruptly as channel conditions worsen, while using the available bandwidth efficiently. Extensive simulations demonstrate this improvement in perceived quality.
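
    A minimal sketch of the kind of priority-aware EDF scheduler described above. The layer numbering, deadline handling and drop rule here are illustrative assumptions rather than the paper's exact algorithm.

        import heapq
        from dataclasses import dataclass, field

        @dataclass(order=True)
        class Packet:
            deadline: float          # absolute deadline in seconds; primary EDF key
            layer: int               # 0 = base layer (most important), higher = enhancement
            payload: bytes = field(compare=False, default=b"")

        class LayeredEDFScheduler:
            """Earliest-deadline-first queue that breaks deadline ties by layer importance."""

            def __init__(self):
                self._heap = []

            def enqueue(self, packet: Packet) -> None:
                heapq.heappush(self._heap, packet)

            def next_packet(self, now: float):
                """Pop the most urgent packet; expired enhancement-layer packets are dropped."""
                while self._heap:
                    packet = heapq.heappop(self._heap)
                    if packet.deadline < now and packet.layer > 0:
                        continue     # too late to be useful and not base layer: drop it
                    return packet
                return None

    Under such a policy, when the channel degrades, enhancement-layer packets miss their deadlines and are discarded first, so the base layer (and hence a usable, if coarser, reconstruction) still gets through.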

    Design of a secure architecture for the exchange of biomedical information in m-Health scenarios

    The paradigm of m-Health (mobile health) advocates the massive integration of advanced mobile communications, network and sensor technologies in healthcare applications and systems to foster the deployment of a new, user/patient-centered healthcare model. This model enables the empowerment of users in the management of their health (e.g. by increasing their health literacy, promoting healthy lifestyles and preventing diseases), better home-based healthcare delivery for elderly and chronic patients, and important savings for healthcare systems through the reduction of hospitalizations in number and duration. It is a fact that many m-Health applications demand high availability of biomedical information from their users (for further accurate analysis, e.g. by fusion of various signals) to guarantee high quality of service, which on the other hand increases the potential attack surface. It is therefore not surprising that security (and privacy) is commonly cited among the most important barriers to the success of m-Health. As a non-functional requirement of m-Health applications, security has received less attention than other technical issues that were more pressing at earlier development stages, such as reliability, efficiency, interoperability or usability. Another fact that has contributed to delaying the enforcement of robust security policies is that guaranteeing a certain security level implies costs that can be very relevant and that span different dimensions. These include budget (e.g. the demand for extra hardware for user authentication), performance (e.g. lower efficiency and interoperability due to the addition of security elements) and usability (e.g. cumbersome configuration of devices and applications due to security options). Therefore, security solutions that aim to satisfy all the stakeholders in the m-Health context (users/patients, medical staff, technical staff, systems and device manufacturers, regulators, etc.) shall be robust and, at the same time, minimize their associated costs. This thesis details a proposal, composed of four interrelated blocks, to integrate appropriate levels of security in m-Health architectures in a cost-efficient manner. The first block designs a global scheme that provides different security and interoperability levels according to how critical the m-Health applications to be implemented are. This scheme consists of three layers tailored to the m-Health domains and their constraints, whose security countermeasures defend against the threats of their associated m-Health applications. Next, the second block addresses the security extension of those standard protocols that enable the acquisition, exchange and/or management of biomedical information (and are thus used by many m-Health applications) but do not meet the security levels described in the former scheme. These extensions are materialized for the biomedical standards ISO/IEEE 11073 PHD and SCP-ECG.
    Then, the third block proposes new ways of enhancing the security of biomedical tests, which are the centerpiece of many clinical m-Health applications, by means of novel codings. Finally, the fourth block, which is parallel to the others, selects generic security methods (for user authentication and cryptographic protection) whose integration in the other blocks is optimal, and also develops novel signal-based methods (embedding and keytagging) for strengthening the security of biomedical tests. The layer-based extensions of the standards ISO/IEEE 11073 PHD and SCP-ECG can be considered robust, cost-efficient and respectful of their original features and contents. The former adds no attributes to the data information model, four new frames to the service model (and extends four with new sub-frames), and only one new sub-state to the communication model. Furthermore, a lightweight architecture consisting of a personal health device mounting a 9 MHz processor and an aggregator mounting a 1 GHz processor is enough to transmit a 3-lead electrocardiogram in real time while implementing the top security layer. The extra requirements associated with this extension are an initial configuration of the health device and the aggregator, tokens for identification/authentication of users if these devices are to be shared, and the implementation of certain IHE profiles in the aggregator to enable the integration of measurements in healthcare systems. As regards the extension of SCP-ECG, it only adds a new section with selected security elements and syntax in order to protect the rest of the file contents and provide proper role-based access control. The overhead introduced in the protected SCP-ECG is typically 2-13% of the regular file size, and the extra delays to protect a newly generated SCP-ECG file and to access it for interpretation are respectively 2-10% and 5% of the regular delays. As regards the signal-based security techniques developed, the embedding method is the basis for the proposal of a generic coding for tests composed of biomedical signals, periodic measurements and contextual information. This coding has been adjusted and evaluated with electrocardiogram- and electroencephalogram-based tests, proving the objective clinical quality of the coded tests, the capacity of the coding-access system to operate in real time (overall delays of 2 s for electrocardiograms and 3.3 s for electroencephalograms) and its high usability. Despite the embedding of security measures and metadata to enable m-Health services, the compression ratios obtained by this coding range from approximately 3 in real-time transmission to approximately 5 in offline operation. Complementarily, keytagging permits associating information with images (and other signals) by means of keys in a secure and non-distorting fashion, which has been used to implement security measures such as image authentication, integrity control and location of tampered areas, private captioning with role-based access control, traceability and copyright protection. The tests conducted indicate a remarkable robustness-capacity tradeoff that permits implementing all these measures simultaneously, and the compatibility of keytagging with JPEG2000 compression, maintaining this tradeoff while keeping the overall keytagging delay at only approximately 120 ms for any image size, evidencing the scalability of this technique.
    As a general conclusion, it has been demonstrated and illustrated with examples that there are various, complementary and structured ways to contribute to the implementation of suitable security levels for m-Health architectures at a moderate cost in budget, performance, interoperability and usability. The m-Health landscape is permanently evolving along all its dimensions, and this thesis aims for its security to evolve with it. Furthermore, the lessons learned herein may offer further guidance for the elaboration of more comprehensive and updated security schemes, for the extension of other biomedical standards featuring low emphasis on security or privacy, and for the improvement of the state of the art regarding signal-based protection methods and applications.
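
    As a loose illustration of the role-based protection that the SCP-ECG extension above provides, the sketch below encrypts one file section with a content key and wraps that key once per role. The section layout and key management here are illustrative assumptions, not the thesis's actual design.

        # Requires the 'cryptography' package.
        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        def protect_section(section: bytes, role_keys: dict[str, bytes]) -> dict:
            """Encrypt a section once, then wrap the content key for each role."""
            content_key = AESGCM.generate_key(bit_length=256)
            nonce = os.urandom(12)
            ciphertext = AESGCM(content_key).encrypt(nonce, section, None)
            wrapped = {role: AESGCM(k).encrypt(nonce, content_key, role.encode())
                       for role, k in role_keys.items()}
            return {"nonce": nonce, "ciphertext": ciphertext, "wrapped_keys": wrapped}

        def access_section(blob: dict, role: str, role_key: bytes) -> bytes:
            """Holders of a role key unwrap the content key and decrypt the section."""
            content_key = AESGCM(role_key).decrypt(blob["nonce"],
                                                   blob["wrapped_keys"][role], role.encode())
            return AESGCM(content_key).decrypt(blob["nonce"], blob["ciphertext"], None)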

    Efficient and secured wireless monitoring systems for detection of cardiovascular diseases

    Cardiovascular disease (CVD) is the number one killer of the modern era. The majority of deaths associated with CVD can be prevented entirely if the person struck by CVD is treated urgently. This thesis is our effort to minimize the delay associated with existing tele-cardiology applications. We harnessed the computational power of modern mobile phones to detect abnormality in the electrocardiogram (ECG). If an abnormality is detected, our innovative ECG compression algorithm running on the patient's mobile phone compresses and encrypts the ECG signal and then performs efficient transmission towards the doctors or hospital services. According to the literature, we have achieved the highest reported compression ratio, 20.06 (95% compression), on ECG signals without any loss of information. Our 3-layer permutation-cipher-based ECG encoding mechanism can raise the security strength substantially above that of conventional AES or DES algorithms. If, in the near future, a grid of supercomputers can compare a trillion trillion trillion (10^36) combinations of one ECG segment (comprising 500 ECG samples) per second for ECG morphology matching, it will take approximately 9.333 × 10^970 years to enumerate all the combinations. After receiving the compressed ECG packets, the doctor's mobile phone or the hospital server authenticates the patient using our proposed set of ECG-biometric-based authentication mechanisms. Once authenticated, the patients are diagnosed with our faster ECG diagnosis algorithms. In a nutshell, this thesis contains a set of algorithms that can save a CVD-affected patient's life by harnessing the power of mobile computation and wireless communication.
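
    A small worked version of the brute-force argument above. The key-space size is an illustrative assumption, chosen here so that the result lands near the figure quoted; the thesis's exact combinatorial model for its 3-layer permutation cipher is not reproduced.

        from math import log10

        SECONDS_PER_YEAR = 365.25 * 24 * 3600       # about 3.156e7 seconds
        RATE = 1e36                                 # assumed comparisons per second

        log10_keyspace = 1014.5                     # i.e. roughly 10^1014 candidate keys (assumption)

        log10_years = log10_keyspace - log10(RATE) - log10(SECONDS_PER_YEAR)
        print(f"Exhaustive search would take roughly 10^{log10_years:.0f} years")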

    Cloud and mobile infrastructure monitoring for latency and bandwidth sensitive applications

    This PhD thesis studies cloud computing infrastructures from the networking perspective to assess the feasibility of applications that have gained increasing popularity over recent years, including multimedia and telemedicine applications, which demand low, bounded latency and sufficient bandwidth. I focus in particular on telemedicine, where remote imaging applications (for example, telepathology or telesurgery) need a low and stable latency for the remote transmission of images and for the remote control of the equipment involved. Another important telemedicine use case is remote computation, which involves offloading image processing to support diagnosis; in this case too, bandwidth and latency requirements must be enforced to ensure timely results, although they are less strict than in the previous scenario. Nowadays, the capability of gaining access to IT resources rapidly and on demand, according to a pay-as-you-go model, has made cloud computing a key enabler for innovative multimedia and telemedicine services. However, the limited visibility of cloud performance, together with security concerns, still hinders the adoption of cloud infrastructures. To ensure that the requirements of applications running on the cloud are satisfied, proper methodologies need to be designed and evaluated for each metric of interest. Moreover, some kinds of applications have specific requirements that cannot be satisfied by the current cloud infrastructure. In particular, since cloud computing involves communication with remote servers, two problems arise: first, the core network infrastructure can be overloaded by the massive amount of data that has to flow through it for clients to reach the datacenters; second, the latency resulting from this remote interaction between clients and servers increases. For these and many other use cases, also beyond the field of telemedicine, the Edge and Fog computing paradigms were introduced. In these new paradigms, IT resources are deployed not only in the core cloud datacenters but also at the edge of the network, either in the telecom operator's access network or even on other users' devices. The proximity of resources to end users alleviates the burden on the core network and at the same time reduces latency towards users. Indeed, the latency from users to remote cloud datacenters encompasses delays from the access and core networks, as well as the intra-datacenter delay. It is therefore expected to be higher than the latency required to reach edge servers, which in the envisioned paradigm are deployed in the access network, that is, near the end users; the edge latency is thus expected to be only a fraction of the overall cloud delay. Moreover, edge and central resources can be used in conjunction, so attention to core cloud monitoring remains of capital importance even once edge architectures reach widespread adoption, which is not yet the case. While a lot of research has addressed the monitoring of several network-related metrics, such as bandwidth, latency, jitter and packet loss, less attention has been given to the monitoring of latency in cloud and edge cloud infrastructures. In particular, while some works target cloud latency monitoring, their evaluations lack a fine-grained analysis of latency that considers spatial and temporal trends.
    Furthermore, the widespread adoption of mobile devices and the Internet of Things paradigm further accelerate the shift towards the cloud for the additional benefits it can provide in this context, allowing energy savings and augmenting the computation capabilities of these devices, and creating a new scenario denoted as the mobile cloud. This scenario poses additional challenges because of its bandwidth constraints, accentuating the need for tailored methodologies that can verify whether the crucial requirements of the aforementioned applications can be met by the current infrastructure. In this sense, there is still a shortage of works monitoring bandwidth-related metrics in mobile networks, especially works performing in-the-wild assessments targeting actual mobile networks and operators. Moreover, even the few works testing real scenarios typically consider only one provider in one country for a limited period of time, lacking an in-depth assessment of bandwidth variability over space and time. In this thesis, I therefore consider monitoring methodologies for challenging scenarios, focusing on the latency perceived by customers of public cloud providers and on the bandwidth of mobile broadband networks. Indeed, as described, achieving low latency is a critical requirement for core cloud infrastructures, while providing enough bandwidth is still challenging in mobile networks compared to wired settings, even with the adoption of 4G mobile broadband networks; this issue is expected to be overcome only with the widespread availability of 5G connections (with half of total traffic expected to come from 5G networks by 2026). Therefore, in the research activities carried out during my PhD, I focused on monitoring latency and bandwidth in cloud and mobile infrastructures, assessing to what extent the current public cloud infrastructure and mobile networks make multimedia and telemedicine applications (as well as others with similar requirements) feasible.
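
    A minimal sketch of the kind of fine-grained latency probe such a monitoring methodology builds on. The endpoint, sample count and interval are illustrative assumptions; a real campaign would sweep providers, regions and time of day to capture spatial and temporal trends.

        import socket
        import statistics
        import time

        def tcp_connect_latency_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
            """Return one TCP connection setup time in milliseconds."""
            start = time.perf_counter()
            with socket.create_connection((host, port), timeout=timeout):
                pass
            return (time.perf_counter() - start) * 1000.0

        def probe(host: str, samples: int = 30, interval_s: float = 1.0) -> dict:
            """Collect timestamped samples so spatial and temporal trends can be analysed later."""
            rtts = []
            for _ in range(samples):
                rtts.append((time.time(), tcp_connect_latency_ms(host)))
                time.sleep(interval_s)
            values = [rtt for _, rtt in rtts]
            return {"host": host,
                    "median_ms": statistics.median(values),
                    "p95_ms": statistics.quantiles(values, n=20)[18],   # ~95th percentile
                    "samples": rtts}

        # Example with a hypothetical endpoint:
        # print(probe("compute.example-cloud-region.example.com"))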

    Design and evaluation of novel authentication, authorization and border protection mechanisms for modern information security architectures

    In recent years, people's real and digital lives have become more intertwined than ever, which has made users' information invaluable both to companies and to attackers. Meanwhile, the consequences of the improper use of this information are increasingly worrying. The number of security breaches keeps growing every day, and information security architectures, if properly designed, are the safest bet for breaking this upward trend. This thesis contributes to three of the fundamental pillars of any information security architecture (authentication, authorization and security of data in transit), improving the security and privacy provided to the information involved. First, authentication aims to verify that users are who they claim to be. As with other tasks that require user interaction, in authentication it is essential to keep the balance between security and usability. For this reason, we have designed an authentication methodology based on the photoplethysmogram (PPG). In the proposed methodology, each user's model contains a set of isolated cycles of their PPG signal, while the Manhattan distance is used to compute the distance between models. This methodology has been evaluated with special attention to long-term results. The results obtained show that the impressive error rates achievable in the short term (EER values below 1%) grow quickly as the time between model creation and evaluation increases (the EER rises to 20% during the first 24 hours and remains stable from then on). Although the error rates found in the long term may be too high for PPG to be used as a reliable standalone authentication alternative, it can be used in a complementary way (e.g. as a second authentication factor) alongside other authentication alternatives, enhancing them with interesting properties such as liveness detection. After successful authentication, the authorization process determines whether the action requested from the system should be allowed or not. As new data protection laws indicate, users are the real owners of their information, and they should therefore have the means to manage their digital information effectively. The OAuth framework, which allows users to authorize a third-party application to access their protected resources, can be considered the first solution along these lines. In this framework, the user's authorization is embodied in an access token that the third party must present every time it wishes to access one of the user's resources. To unleash its full potential, we have extended this framework from three different perspectives. First, we have proposed a protocol that allows the authorization server to verify that the user is present every time the third-party application requests access to one of their resources. This check is performed through transparent authentication based on the biometric signals acquired by smartwatches and/or fitness bands, and it can mitigate the serious consequences of access-token exfiltration in many situations.
    Second, we have developed a new protocol to authorize third-party applications to access user data when these applications are not web applications but are served through messaging platforms. The proposed protocol does not only deal with usability aspects (allowing the authorization process to be performed through the same interface the user was already using to consume the service, i.e. the messaging platform) but also addresses the security issues that arise from this new scenario. Finally, we have presented a protocol in which the user requesting access to the protected resources is not their owner. This new mechanism relies on a new OAuth grant type for the interaction between the authorization server and both users, and on an OPA profile for the definition and evaluation of access policies. On an access attempt, the resource owner can be interactively asked to approve the access, thereby enabling user-to-user delegation. After successful authentication and authorization, the user obtains access to the protected resource. Security of data in transit is in charge of protecting information while it is transmitted from the user's device to the resource server and vice versa. Encryption, while keeping information safe from prying eyes, also prevents security appliances from fulfilling their role; for example, firewalls are unable to inspect encrypted information in search of threats. However, exposing users' information to such appliances could pose a privacy problem in certain scenarios. For this reason, we have proposed a method based on Secure Multi-Party Computation (SMC) that allows network functions to be performed without compromising the privacy of the traffic. This approach leverages the parallelism intrinsic to network scenarios, as well as the adaptive use of different representations of the network function to match execution to the state of the network at any given time. In our tests we analysed the secure decryption of traffic using the ChaCha20 algorithm, showing that we are able to evaluate the traffic while introducing very low latencies (under 3 ms) when the network load remains sufficiently low, and that we can process up to 1.89 Gbps at the cost of increased latency. Taking all this into account, and despite the performance penalty traditionally associated with Secure Computation applications, we have presented an efficient and flexible method that could bring the secure evaluation of network functions to real-world scenarios.
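
    A minimal sketch of the PPG matching step described above: a user model is a set of isolated PPG cycles, and a probe cycle is scored by its Manhattan distance to the model. The cycle length, aggregation rule and threshold are illustrative assumptions, not the thesis's exact parameters.

        import numpy as np

        def manhattan(a: np.ndarray, b: np.ndarray) -> float:
            """L1 distance between two equal-length, amplitude-normalised PPG cycles."""
            return float(np.sum(np.abs(a - b)))

        def match_score(model_cycles: np.ndarray, probe_cycle: np.ndarray) -> float:
            """Average distance from the probe cycle to every enrolled cycle (lower = closer)."""
            return float(np.mean([manhattan(c, probe_cycle) for c in model_cycles]))

        def authenticate(model_cycles: np.ndarray, probe_cycle: np.ndarray,
                         threshold: float = 5.0) -> bool:
            """Accept if the score stays below a threshold tuned to the target EER operating point."""
            return match_score(model_cycles, probe_cycle) < threshold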

    An inference system framework for personal sensor devices in mobile health and internet of things networks

    Future healthcare directions include individuals being monitored in real time during day-to-day activity using wearable sensors. This thesis solves a critical requirement, that of intelligently managing when body sensors should alert doctors of changes to a person's health status, bringing existing research closer to live health monitoring.