
    How to Choose the Relevant MAC Protocol for Wireless Smart Parking Urban Networks?

    Parking sensor networks are being rapidly deployed around the world and are regarded as one of the first urban services implemented in smart cities. To provide the best network performance, the MAC protocol must be adaptive enough to cope with the traffic intensity and variation of parking sensors. In this paper, we study the heavy-tailed parking and vacant time models from SmartSantander and then apply this traffic model in simulations of four kinds of MAC protocols: contention-based, schedule-based, and two hybrid versions of them. The results show that the packet interarrival time is no longer heavy-tailed once traffic from a group of parking sensors is aggregated, and that the choice of an appropriate MAC protocol depends strongly on the network configuration. Moreover, the information delay is bounded by traffic and MAC parameters, which are important criteria when timely messages are required.
    Comment: The 11th ACM International Symposium on Performance Evaluation of Wireless Ad Hoc, Sensor, and Ubiquitous Networks (2014)
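
    As a rough illustration of the aggregation effect described above (not taken from the paper), the following Python sketch draws heavy-tailed Pareto parking/vacancy durations for each sensor and merges the resulting event streams; the distribution, its parameters, and the sensor count are arbitrary assumptions, but the merged interarrival times show a much lighter tail than a single sensor's.

```python
import numpy as np

rng = np.random.default_rng(0)

def sensor_event_times(n_events, shape=1.5, scale=10.0):
    """Event times (minutes) of one parking sensor, with heavy-tailed
    (Pareto) occupancy/vacancy durations; parameters are assumptions."""
    durations = scale * (rng.pareto(shape, n_events) + 1.0)
    return np.cumsum(durations)

# Merge the event streams of a whole group of sensors, as a gateway sees them.
n_sensors, n_events = 50, 2000
merged = np.sort(np.concatenate(
    [sensor_event_times(n_events) for _ in range(n_sensors)]))

# Compare the tail of a single sensor's interarrival times with the aggregate's.
single = np.diff(sensor_event_times(n_events))
aggregate = np.diff(merged)
for name, x in (("single sensor", single), ("50-sensor aggregate", aggregate)):
    print(f"{name:20s} mean={x.mean():8.2f}  p99={np.percentile(x, 99):8.2f}  "
          f"max={x.max():10.2f}")
```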

    On Statistical QoS Provisioning for Smart Grid

    The current power system is in transition from the traditional power grid to the Smart Grid. A key advantage of the Smart Grid is its integration of advanced communication technologies, which can provide real-time, system-wide, two-way information links. Since the communication system and the power system are deeply coupled within the Smart Grid, Quality of Service (QoS) performance analysis becomes much more complex than in either system alone. To address this challenge, effective rate theory is studied and extended in this thesis, and a new H-transform-based framework is proposed. Various scenarios are investigated using the proposed effective rate framework, including both independent and correlated fading channels. With the effective rate as a connection between the communication system and the power system, an analysis of power grid observability under communication constraints is performed. Case studies show that the effective rate provides a cross-layer analytical framework within the communication system, while its statistical characterisation of the communication delay has the potential to serve as a general coupling point between the communication system and the power system, especially when real-time applications are considered. Beyond the theoretical QoS performance analysis within the Smart Grid, a new Software Defined Smart Grid testbed is proposed in this thesis. This testbed provides a versatile evaluation and development environment for Smart Grid QoS performance studies. It exploits the Real Time Digital Simulator (RTDS) to emulate different power grid configurations and a Software Defined Radio (SDR) environment to implement the communication system. A data acquisition and actuator module is developed, which emulates various Intelligent Electronic Devices (IEDs). The implemented prototype demonstrates that the proposed testbed has the potential to evaluate real-time Smart Grid applications such as real-time voltage stability control.
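
    For context, the effective rate (effective capacity) of a fading channel under a QoS exponent theta is commonly defined as EC(theta) = -(1/(theta T)) ln E[exp(-theta T R)]. The following Python sketch estimates it by Monte Carlo for a Rayleigh block-fading link; the SNR, block length, and theta values are assumptions, and this is the generic definition rather than the H-transform framework developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def effective_rate(theta, snr_db=10.0, block_len=1.0, n_samples=200_000):
    """Monte Carlo estimate of the effective rate
        EC(theta) = -(1 / (theta * T)) * ln E[exp(-theta * T * R)]
    for Rayleigh block fading with R = log2(1 + snr * |h|^2).
    SNR, block length, and theta values are illustrative assumptions."""
    snr = 10.0 ** (snr_db / 10.0)
    gain = rng.exponential(1.0, n_samples)      # |h|^2 under Rayleigh fading
    rate = np.log2(1.0 + snr * gain)            # instantaneous rate per block
    return -np.log(np.mean(np.exp(-theta * block_len * rate))) / (theta * block_len)

# A stricter delay constraint (larger theta) lowers the rate the link can
# sustain, moving it from the ergodic capacity toward the worst-case rate.
for theta in (0.01, 0.1, 1.0, 10.0):
    print(f"theta={theta:5.2f}  effective rate ~ {effective_rate(theta):.3f} bit/s/Hz")
```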

    Adrenal function after induction therapy for acute lymphoblastic leukemia in children (short title: Adrenal function in ALL)

    Prednisolone used in the induction phase of treatment for acute lymphoblastic leukemia (ALL) may suppress the hypothalamic-pituitary-adrenal axis and require hydrocortisone substitution. In this retrospective analysis, we reviewed altogether 371 ACTH stimulation tests of 352 children after a uniform NOPHO (Nordic Society of Pediatric Hematology and Oncology) ALL induction. Both low- and standard-dose ACTH tests were used. Full recovery of adrenal function was defined by both normal basal and normal stimulated cortisol levels. Sixty-two percent of patients had normal adrenal function at a median of 15 days after tapering of prednisolone. Both low basal and low stimulated cortisol levels were detected in 13% of patients. The median time to normal adrenal function was 31 days (95% CI 28-34), 24 days (95% CI 18-30), and 12 days (95% CI 10-14), respectively, for patients stratified by basal cortisol level at first ACTH testing (highest stratum above 183 nmol/L). Patients with fluconazole prophylaxis had higher median baseline cortisol levels than patients without prophylaxis (207 nmol/L, range 21-839 nmol/L vs. 153 nmol/L, range 22-832 nmol/L, P = 0.003). Conclusion: These data can be used to reduce unnecessary substitution or testing, and also to guarantee hydrocortisone substitution for those at risk.
    What is Known:
    • These data can be used to reduce unnecessary hydrocortisone substitution or ACTH testing.
    • Our data help to guarantee hydrocortisone substitution for those at risk of adrenal insufficiency.
    What is New:
    • Full recovery of adrenal function after ALL induction is detected in 62% of patients as early as 15 days after tapering of prednisolone.
    • Both basal and stimulated cortisol testing are required to detect full adrenal recovery.
    • Recovery of adrenal function extends over 3-4 weeks after tapering of prednisolone in patients with low basal cortisol levels.
    (Peer reviewed)

    Probabilistic Model for Laser Damage to the Human Retina

    Understanding how lasers interact with media during propagation is a prominent field of physics. The subject area known as laser bioeffects explores laser interactions with biological cells, tissues, organs, and bodies. This research includes laser applications used in medicine, establishes safe exposure limits for industry and academia, and generally studies the many effects of laser light on living creatures. The bioeffects community relies heavily on deterministic modeling and simulation tools to support experimental research into damage thresholds and laser effects. However, recent laser applications require a probabilistic approach to support risk management and analysis methodologies. Some probabilistic models exist, but their assumptions are largely biased by sampling and reporting techniques. This research focuses on building the first population-based probabilistic model for retinal damage, using a statistical model of the optical properties and dimensions of the human eye. Simulated population distributions are used as input to propagation and thermal damage models for analysis. The results of this research are intended to provide a foundation for future probabilistic models and applications. The document is formatted as two separate papers. The first develops the statistical eye model based on human covariance data: "An Analysis of the Influences of Biological Variance, Measurement Error, and Uncertainty on Retinal Photothermal Damage Threshold Studies"; it examines trends in the wavelength and time dependencies of damage thresholds. The second paper, "Biological Variance-Based Dose Response Model for 514 to 1064 Nanometer Laser Exposures", applies the statistical eye model to the creation of a dose-response model. The model can be used to establish the design space in the development of future laser systems and provides the foundation for a true population-based risk analysis tool for safety standards development.
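
    As a loose illustration of how population variance yields a dose-response curve (not the models developed in these papers), the sketch below samples per-eye damage thresholds from an assumed lognormal distribution and reports the fraction of simulated eyes damaged at each exposure level.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed population of per-eye retinal damage thresholds (arbitrary units):
# a lognormal spread stands in for biological variance in ocular optics,
# pigmentation, and retinal geometry. It is not the papers' fitted model.
n_eyes = 100_000
thresholds = rng.lognormal(mean=np.log(1.0), sigma=0.4, size=n_eyes)

def damage_probability(exposure):
    """Fraction of the simulated population whose threshold is exceeded."""
    return float(np.mean(thresholds <= exposure))

# Dose-response curve: probability of retinal damage versus exposure level.
for exposure in (0.3, 0.5, 1.0, 2.0, 3.0):
    print(f"exposure={exposure:4.1f}  P(damage)={damage_probability(exposure):.3f}")

# ED50 (exposure damaging half the population) is simply the median threshold.
print(f"ED50 ~ {np.median(thresholds):.2f}")
```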

    Analysis and evaluation of in-home networks based on HomePlug-AV power line communications

    Not long ago, in-home networks (also called domestic networks) were only used to share a printer among a few computers. Nowadays, due to the huge number of devices at home with communication capabilities, this definition has become much broader. In a current in-home network we can find anything from mobile phones with wireless connectivity to NAS (Network Attached Storage) devices sharing multimedia content with high-definition televisions or computers. When installing a communications network in a home, two objectives are mainly pursued: reducing cost and providing high flexibility to support future network requirements. A network based on Power Line Communications (PLC) technology can fulfill these objectives: since it uses the low-voltage wiring already available at home, it is very easy to install and expand, providing a cost-effective solution for home environments. There are different PLC standards, HomePlug-AV (HomePlug Audio-Video, or simply HPAV) being the most widely used nowadays. This standard achieves transmission rates of up to 200 Mbps through the electrical wiring of a typical home. The main objective of this thesis is to provide new ideas to improve the performance of PLC-based in-home networks, using the HPAV standard as a starting point. A network based on this technology uses a centralized architecture, in which most of the network intelligence is concentrated in a single device, the Central Coordinator (CCo). Hence, most of the modifications proposed in this work aim to improve this particular device, which can even become a multi-technology central manager able to combine interfaces of different technologies to improve network performance. First, a detailed analysis of HPAV performance in scenarios typically found in a home environment is presented. It was carried out both through simulation and through experiments with real devices. For the simulation results, an HPAV simulator was designed that implements the physical (PHY) and medium access control (MAC) layers of the standard, together with a traffic modeling module implementing the services most commonly found in a home network. This simulation tool was used both for these initial measurements and to evaluate the modifications to the standard proposed later in this work. This analysis provided two main results. First, it was found that when a realistic PHY model is used together with the CSMA/CA MAC protocol, the simulation results differ considerably from those obtained with previously published mathematical models of this protocol. Hence, a new model that accounts for these effects was proposed. Second, several areas of the technology that could be improved were identified. The rest of the thesis then focuses on addressing these weaknesses. The first weakness addressed is related to unicast data transmission.
The PLC medium is frequency selective and time variant, and it varies markedly with location and with the connected loads. Even on a single link, the channel capacities in the two directions can be very asymmetric. In such environments, using TCP as the transport protocol presents serious problems, since it sets several of its parameters according to the Round Trip Time (RTT). As an alternative, the use of Fountain codes for reliable data transmission in these environments was proposed. These codes allow information to be transmitted without a feedback channel, thereby overcoming the problems caused by channel variability. Several experiments comparing both solutions showed that, in PLC-based networks, Fountain codes outperform a TCP-based application when transferring files reliably. In addition, Fountain codes were also used in another application. In home environments it is very common to find more than one technology available for deploying a network (Wi-Fi, Ethernet, PLC, etc.). An application able to aggregate interfaces of different technologies would therefore be very useful, as it could provide higher bandwidth, fault tolerance, and load balancing. The Linux kernel contains a driver (bonding) that allows Ethernet interfaces to be aggregated. However, it is not prepared to aggregate asymmetric interfaces, and even less so variable-capacity technologies such as PLC or Wi-Fi. In this work, a modification of this driver is presented that uses Fountain codes to solve the problems that may arise when asymmetric interfaces are aggregated. Furthermore, multicast communications in the current HPAV standard perform poorly. This is because, although the PLC medium is broadcast by nature, the Orthogonal Frequency Division Multiplexing (OFDM) modulation used at the PHY layer is always point to point. Therefore, multicast communications are carried out as successive point-to-point transmissions to the different members of the group, and the effective multicast throughput degrades sharply as the number of receivers increases. In this work, two alternative algorithms are proposed. The first consists of using a common tone map for all the multicast group members, corresponding to the modulation parameters of the client with the worst channel conditions. This algorithm has traditionally been discarded in OFDM systems because of its poor performance; however, in contrast to other technologies (such as wireless), the channel responses within a given PLC network exhibit significant correlation, which reduces the differences among users and improves the performance of this algorithm. In addition, a second technique that uses an optimization algorithm to maximize the multicast bit rate is evaluated, and it proves suitable when the number of multicast clients is high. Finally, because of the properties of the PLC medium, cross-layer techniques are attracting great interest. These algorithms are based on sharing information between layers of the OSI model to improve system behavior. In this work, an extension of the HPAV CSMA/CA algorithm is proposed that modifies the protocol parameters using PHY-layer information and the QoS requirements of the upper-layer services. In this way, priority access to the channel can be given to nodes with QoS problems, improving overall network performance.
This algorithm has been evaluated through simulation in a typical home environment, with very promising results. Universidad Politécnica de Cartagena
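
    As a loose illustration of the first multicast scheme described above (a sketch with assumed numbers, not the thesis implementation), the following Python fragment builds a common tone map by taking, on each OFDM carrier, the lowest bit loading among the multicast receivers, and compares its cost with serial unicast transmissions. The carrier count, bit loadings, and channel correlation are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed per-carrier bit loadings (bits per carrier) for each multicast
# receiver; a shared component models the channel correlation typically
# observed within a single in-home PLC network. All numbers are illustrative.
n_carriers, n_receivers = 917, 6
common = rng.integers(2, 9, n_carriers)
per_rx = np.clip(common + rng.integers(-1, 2, (n_receivers, n_carriers)), 0, 10)

# Common tone map: on each carrier, use the modulation that the worst
# receiver in the group can still decode.
common_tone_map = per_rx.min(axis=0)

# Cost comparison (OFDM symbols per delivered payload bit):
# one multicast transmission versus one unicast transmission per receiver.
symbols_per_bit_multicast = 1.0 / common_tone_map.sum()
symbols_per_bit_unicast = float((1.0 / per_rx.sum(axis=1)).sum())

print(f"common tone map carries {common_tone_map.sum()} bits per OFDM symbol")
print("serial unicast needs ~"
      f"{symbols_per_bit_unicast / symbols_per_bit_multicast:.1f}x more symbols")
```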

    Proximity as a Service for the Use Case of Access Enhancement via Cellular Network-Assisted Mobile Device-to-Device

    Device-to-Device (D2D) communication is a way to treat User Equipments (UEs) not merely as terminals but as part of the network (helpers) for service provisioning. We propose a generic framework, Proximity as a Service (PaaS), formulate the helper selection problem, and design a heuristic helper selection policy, ContAct based Proximity (CAP), which increases service connectivity and continuity. Design of Experiments (DOE) is a statistical methodology that rigorously designs and conducts an experiment and maximizes the information obtained from it. We apply DOE to explore the relationship (analytic expression) between four inputs (factors) and four metrics (responses). Since different factors have different regression levels, a unified four-level full factorial experiment and cubic multiple regression analysis were carried out. Multiple regression equations are provided to estimate the individual contributions of the factors and the interactions between them. Results show that transmission range and user density are dominant and monotonically increasing factors, but the transmission range should be restricted because of interference and energy efficiency. After obtaining explicit closed-form expressions between factors and responses, optimal values of the key factors are derived. A methodology (the epsilon-constraint method) for solving the multiple-objective optimization problem is provided, and a Pareto-optimal set of factors is found through iteration. The fluctuation across iterations is small, and a specific solution can be chosen based on the particular scenario (city center or countryside, with different user densities). The optimization methodology informs the operator's design rules, helping to find the optimal networking solution.
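
    As a rough illustration of the epsilon-constraint idea mentioned above (not the paper's actual regression models or factors), the sketch below keeps one response as the objective, turns another into a constraint f2 >= eps, and sweeps eps over a factor grid to trace out a Pareto-optimal set.

```python
import numpy as np

# Illustrative stand-ins for two fitted response surfaces over two factors
# (say, transmission range r and user density d); they are NOT the paper's
# regression equations, just smooth functions with the described trends.
def connectivity(r, d):
    return 1.0 - np.exp(-0.8 * r * d)           # improves with range and density

def efficiency(r, d):
    return np.exp(-0.5 * r) * (1.0 + 0.2 * d)   # penalised by long range

# Factor grid (design space) to search over.
R, D = np.meshgrid(np.linspace(0.1, 3.0, 60), np.linspace(0.1, 2.0, 40))
f1, f2 = connectivity(R, D), efficiency(R, D)

# epsilon-constraint sweep: maximise f1 subject to f2 >= eps, then vary eps
# to trace out a Pareto-optimal set of factor settings.
pareto = []
for eps in np.linspace(f2.min(), f2.max(), 12):
    feasible = f2 >= eps
    i, j = np.unravel_index(np.argmax(np.where(feasible, f1, -np.inf)), f1.shape)
    pareto.append((R[i, j], D[i, j], f1[i, j], f2[i, j]))

for r, d, a, b in pareto:
    print(f"range={r:.2f} density={d:.2f} -> connectivity={a:.3f} efficiency={b:.3f}")
```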

    Natural experiments: An overview of methods, approaches, and contributions to public health intervention research

    Population health interventions are essential to reduce health inequalities and tackle other public health priorities, but they are not always amenable to experimental manipulation. Natural experiment (NE) approaches are attracting growing interest as a way of providing evidence in such circumstances. One key challenge in evaluating NEs is selective exposure to the intervention. Studies should be based on a clear theoretical understanding of the processes that determine exposure. Even if the observed effects are large and rapidly follow implementation, confidence in attributing these effects to the intervention can be improved by carefully considering alternative explanations. Causal inference can be strengthened by including additional design features alongside the principal method of effect estimation. NE studies often rely on existing (including routinely collected) data. Investment in such data sources and in the infrastructure for linking exposure and outcome data is essential if the potential for such studies to inform decision making is to be realized.