Measuring the impact of ICNIRP vs. stricter-than-ICNIRP exposure limits on QoS and EMF from cellular networks
The installation of new equipment (Base Stations, BSs) during the planning phase of a cellular network (including 5G BSs) is governed by exposure limits in terms of allowable ElectroMagnetic Field (EMF) levels. The exposure limits can be defined either by (i) international bodies (e.g., ICNIRP) or by (ii) national regulations imposing limits stricter than (i). In this work, we compare the impact of ICNIRP vs. stricter-than-ICNIRP exposure regulations on the Quality of Service (QoS) and EMF. To this aim, we perform a large-scale measurement campaign in one scenario in Spain, subject to ICNIRP regulations, and another in Italy, subject to EMF limits stricter than the ICNIRP ones. Both scenarios are characterized by similar exposure conditions, comparable user density, and common 4G performance targets of the operators. Results, obtained by measuring QoS and EMF at selected locations, reveal that the QoS in the scenario subject to strict EMF limits is heavily degraded compared to the one in which ICNIRP-based limits are enforced. Clearly, the scenario with strict EMF limits presents a lower level of exposure over the territory compared to the one imposing ICNIRP limits.
Joint energy efficiency and load balancing optimization in hybrid IP/SDN networks
Software-defined networking (SDN) is a paradigm that provides flexibility and programmability to computer networks. By introducing SDN nodes in a legacy IP network topology, network operators benefit from greater control over the infrastructure. However, this migration is not a fast or straightforward process. Furthermore, to provide an adequate quality of service in hybrid IP/SDN networks, the coordination of both the IP and SDN paradigms is fundamental. In this paper, this coordination is used to solve two optimization problems that are typically solved separately: (i) traffic load balancing and (ii) power consumption minimization. These problems have opposing objectives, and thus their joint consideration implies striking a balance between them. Therefore, this paper proposes the Hybrid Spreading Load Algorithm (HSLA), a heuristic that jointly addresses the problems of balancing traffic, by minimizing link utilization, and of minimizing the network's power consumption in a hybrid IP/SDN network. HSLA is evaluated over differently sized topologies using different methods to select which nodes are migrated from IP to SDN. These evaluations reveal that alternative approaches addressing only one of the objectives are outperformed by HSLA.
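The tension between the two objectives can be made concrete with a weighted-sum objective. The sketch below is purely illustrative (it is not the HSLA heuristic from the abstract); the link loads, capacities, power figures, and weight `alpha` are hypothetical, and a link is assumed to draw power only when it carries traffic.

```python
def joint_objective(loads, capacities, link_power, alpha=0.5):
    """Weighted sum of maximum link utilization and normalized power draw.

    alpha=1 optimizes only load balancing; alpha=0 only power consumption.
    All inputs are per-link lists of equal length.
    """
    utilizations = [l / c for l, c in zip(loads, capacities)]
    max_util = max(utilizations)
    # Only links carrying traffic are assumed to consume power.
    active_power = sum(p for l, p in zip(loads, link_power) if l > 0)
    total_power = sum(link_power)
    return alpha * max_util + (1 - alpha) * (active_power / total_power)

# Two hypothetical routings of the same demand over three identical links:
balanced = joint_objective([40, 40, 40], [100, 100, 100], [10, 10, 10], alpha=0.5)
consolidated = joint_objective([80, 40, 0], [100, 100, 100], [10, 10, 10], alpha=0.5)
```

With `alpha=0.5` the balanced routing scores better (lower objective), while a power-leaning weight such as `alpha=0.2` favors consolidating traffic and switching a link off, which is exactly the trade-off a joint heuristic must navigate.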
Dynamic in-network classification for service function chaining ready SDN networks
The Service Function Chaining (SFC) paradigm consists of steering traffic flows through an ordered set of Service Functions (SFs) so as to realize complex end-to-end services. The SFC architecture introduces all the logical functions that need to be developed in order to provide the required service. The SFC overlay infrastructure can be built on top of many different underlay network technologies. The high flexibility and centralized control of Software Defined Networking (SDN) make SDN networks a perfect underlay on which to build the SFC architecture. Due to the limited size of Ternary Content Addressable Memory (TCAM), SDN switches can host only a limited number of flow rules. This constraint is particularly penalizing for the SFC classifier function, since it must manage a large number of different flows. The limitation imposed by the TCAM size on the SFC classifier can become a bottleneck for the number of SFC requests that the SDN-based SFC architecture can handle. In this paper we define the Dynamic Chain Request Classification Offloading (D-CRCO) problem as that of maximizing the number of accepted SFC requests, with the possibility of: i) implementing the SFC classifier also in a node internal to the SDN-based SFC domain, and ii) installing classification rules in a reactive fashion. Furthermore, we propose the Dynamic Nearest Node (DNN) heuristic to solve the D-CRCO problem. Performance evaluation shows that by using the DNN heuristic it is possible to triple the number of accepted requests with respect to existing solutions.
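Why offloading the classifier to internal nodes raises the number of accepted requests can be illustrated with a toy admission model. This sketch is not the paper's DNN heuristic: the greedy first-fit placement, the per-request rule count, and the TCAM capacities are all hypothetical.

```python
def admit_requests(n_requests, rules_per_request, tcam_free):
    """Greedily place each request's classification rules on the first
    candidate node with enough free TCAM entries; return the accepted count.

    tcam_free is a mutable list of free TCAM entries per candidate node.
    """
    accepted = 0
    for _ in range(n_requests):
        for node, free in enumerate(tcam_free):
            if free >= rules_per_request:
                tcam_free[node] -= rules_per_request
                accepted += 1
                break  # request placed; move to the next one
    return accepted

# Classification confined to a single edge node vs. offloaded to two
# additional internal nodes (hypothetical capacities of 100 entries each):
edge_only = admit_requests(100, 4, [100])
with_offload = admit_requests(100, 4, [100, 100, 100])
```

Under these assumed numbers, adding two internal classifier nodes triples the admitted requests (25 vs. 75), mirroring the kind of gain the abstract reports, though the real heuristic must also account for path stretch and reactive rule installation.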
Live Demonstration: Neuromorphic Sensory Integration for Combining Sound Source Localization and Collision Avoidance
The brain is able to solve complex tasks in real time by combining different sensory cues with previously acquired knowledge. Inspired by the brain, we designed a neuromorphic demonstrator which combines auditory and visual input to find the obstacle-free direction closest to the sound source. The system consists of two event-based sensors (the eDVS for vision and the NAS for audition) mounted onto a pan-tilt unit and a spiking neural network implemented on the SpiNNaker platform. By combining the different sensory information, the demonstrator is able to point at the sound source direction while avoiding obstacles in real time.
Assessment of Platelet REACtivity After Transcatheter Aortic Valve Replacement: The REAC-TAVI Trial
The REAC-TAVI (Assessment of platelet REACtivity after Transcatheter Aortic Valve Implantation) trial enrolled patients with aortic stenosis (AS) undergoing transcatheter aortic valve replacement (TAVR) pre-treated with aspirin + clopidogrel, and aimed to compare the efficacy of clopidogrel and ticagrelor in suppressing high platelet reactivity (HPR) after TAVI. Current recommendations support short-term use of aspirin + clopidogrel for patients with severe AS undergoing TAVR despite the lack of compelling evidence. This was a prospective, randomized, multicenter investigation. Platelet reactivity was measured at 6 different time points with the VerifyNow assay (Accriva Diagnostics, San Diego, California). HPR was defined as P2Y12 reaction units (PRU) ≥208. Patients with HPR before TAVR were randomized to either aspirin + ticagrelor or aspirin + clopidogrel for 3 months. Patients without HPR continued with aspirin + clopidogrel (registry cohort). The primary endpoint was non-HPR status (PRU <208) in ≥70% of patients treated with ticagrelor at 90 days post-TAVR. A total of 68 patients were included. Of these, 48 (71%) had HPR (PRU 273 ± 09) and were randomized to aspirin + ticagrelor (n = 24, PRU 277 ± 08) or continued with aspirin + clopidogrel (n = 24, PRU 269 ± 49). The remaining 20 patients (29%) without HPR (PRU 133 ± 12) were included in the registry. Overall, platelet reactivity across all the study time points after TAVR was lower in patients randomized to ticagrelor than in those treated with clopidogrel, including those enrolled in the registry (p < 0.001). The primary endpoint was achieved in 100% of patients with ticagrelor compared with 21% with clopidogrel (p < 0.001). Interestingly, 33% of patients who were clopidogrel responders at baseline developed HPR during the first month after TAVR. HPR to clopidogrel is present in a considerable number of patients with AS undergoing TAVR.
Ticagrelor achieves a better and faster effect, providing sustained suppression of HPR in these patients. (Platelet Reactivity After TAVI: A Multicenter Pilot Study [REAC-TAVI]; NCT02224066)
X chromosome inactivation does not necessarily determine the severity of the phenotype in Rett syndrome patients
Rett syndrome (RTT) is a severe neurological disorder usually caused by mutations in the MECP2 gene. Since the MECP2 gene is located on the X chromosome, X chromosome inactivation (XCI) could play a role in the wide range of phenotypic variation among RTT patients; however, classical methylation-based protocols to evaluate XCI cannot determine whether the preferentially inactivated X chromosome carries the mutant or the wild-type allele. Therefore, we developed an allele-specific methylation-based assay to evaluate methylation at the loci of several recurrent MECP2 mutations. We analyzed the XCI patterns in the blood of 174 RTT patients but did not find a clear correlation between XCI and the clinical presentation. We also compared XCI in blood and brain cortex samples of two patients and found differences between the XCI patterns in these tissues. However, RTT being mainly a neurological disease complicates the establishment of a correlation between XCI in blood and the clinical presentation of the patients. Furthermore, we analyzed MECP2 transcript levels and found differences from the levels expected according to XCI. Many factors other than XCI could affect the RTT phenotype, and in combination these could influence the clinical presentation of RTT patients to a greater extent than slight variations in the XCI pattern.
Association between loop diuretic dose changes and outcomes in chronic heart failure: observations from the ESC-EORP Heart Failure Long-Term Registry
Aims. Guidelines recommend down-titration of loop diuretics (LD) once euvolaemia is achieved. In outpatients with heart failure (HF), we investigated LD dose changes in daily cardiology practice, agreement with guideline recommendations, predictors of successful LD down-titration, and the association between dose changes and outcomes.
Methods and results. We included 8130 HF patients from the ESC-EORP Heart Failure Long-Term Registry. Among patients whose dose was decreased, a successful decrease was defined as a decrease not followed by death, HF hospitalization, New York Heart Association class deterioration, or a subsequent increase in LD dose. Mean age was 66 ± 13 years; 71% were men; 62% had HF with reduced ejection fraction, 19% HF with mid-range ejection fraction, and 19% HF with preserved ejection fraction. Median [interquartile range (IQR)] LD dose was 40 (25–80) mg. LD dose was increased in 16%, decreased in 8.3%, and unchanged in 76% of patients. Median (IQR) follow-up was 372 (363–419) days. Diuretic dose increase (vs. no change) was associated with HF death [hazard ratio (HR) 1.53, 95% confidence interval (CI) 1.12–2.08; P = 0.008] and nominally with cardiovascular death (HR 1.25, 95% CI 0.96–1.63; P = 0.103). Decrease of diuretic dose (vs. no change) was associated with nominally lower HF mortality (HR 0.59, 95% CI 0.33–1.07; P = 0.083) and cardiovascular mortality (HR 0.62, 95% CI 0.38–1.00; P = 0.052). Among patients whose LD dose was decreased, higher systolic blood pressure [odds ratio (OR) 1.11 per 10 mmHg increase, 95% CI 1.01–1.22; P = 0.032] and absence of (i) sleep apnoea (OR 0.24, 95% CI 0.09–0.69; P = 0.008), (ii) peripheral congestion (OR 0.48, 95% CI 0.29–0.80; P = 0.005), and (iii) moderate/severe mitral regurgitation (OR 0.57, 95% CI 0.37–0.87; P = 0.008) were independently associated with successful decrease.
Conclusion. Diuretic dose was unchanged in 76% and decreased in 8.3% of outpatients with chronic HF. LD dose increase was associated with worse outcomes, while the LD dose decrease group showed a trend toward better outcomes compared with the no-change group. Higher systolic blood pressure and absence of (i) sleep apnoea, (ii) peripheral congestion, and (iii) moderate/severe mitral regurgitation were independently associated with successful dose decrease.
Coronavirus Gene 7 Counteracts Host Defenses and Modulates Virus Virulence
The transmissible gastroenteritis virus (TGEV) genome contains three accessory genes: 3a, 3b and 7. Gene 7 is only present in members of coronavirus genus α1, and encodes a hydrophobic protein of 78 aa. To study gene 7 function, a recombinant TGEV virus lacking gene 7 was engineered (rTGEV-Δ7). Both the mutant and the parental (rTGEV-wt) viruses showed the same growth and viral RNA accumulation kinetics in tissue culture. Nevertheless, cells infected with rTGEV-Δ7 virus showed an increased cytopathic effect caused by enhanced apoptosis mediated by caspase activation. Macromolecular synthesis analysis showed that rTGEV-Δ7 virus infection led to host translational shut-off and increased cellular RNA degradation compared with rTGEV-wt infection. Increased phosphorylation of eukaryotic translation initiation factor 2α (eIF2α) and enhanced nuclease activity, most likely attributable to RNase L, were observed in rTGEV-Δ7 virus-infected cells. These results suggested that the removal of gene 7 promoted an intensified dsRNA-activated host antiviral response. In protein 7, a conserved sequence motif that potentially mediates binding to the protein phosphatase 1 catalytic subunit (PP1c), a key regulator of the cell's antiviral defenses, was identified. We postulated that TGEV protein 7 may counteract the host antiviral response through its association with PP1c. In fact, pull-down assays demonstrated the interaction of TGEV protein 7, but not of a protein 7 mutant lacking the PP1c binding motif, with PP1. Moreover, the interaction between protein 7 and PP1 was required, during infection, for eIF2α dephosphorylation and inhibition of cellular RNA degradation. Inoculation of newborn piglets with rTGEV-Δ7 and rTGEV-wt viruses showed that rTGEV-Δ7 virus presented accelerated growth kinetics and pathology compared with the parental virus. Overall, the results indicated that gene 7 counteracted host cell defenses and modified TGEV persistence, increasing TGEV survival.
Therefore, the acquisition of gene 7 by the TGEV genome has most likely provided a selective advantage to the virus.
Risk Factors Associated with Adverse Fetal Outcomes in Pregnancies Affected by Coronavirus Disease 2019 (COVID-19): A Secondary Analysis of the WAPM study on COVID-19
To evaluate the strength of association between maternal and pregnancy characteristics and the risk of adverse perinatal outcomes in pregnancies with laboratory-confirmed COVID-19. Secondary analysis of a multinational cohort study of all consecutive pregnant women with laboratory-confirmed COVID-19 from February 1, 2020 to April 30, 2020, from 73 centers in 22 different countries. A confirmed case of COVID-19 was defined as a positive result on real-time reverse-transcriptase-polymerase-chain-reaction (RT-PCR) assay of nasal and pharyngeal swab specimens. The primary outcome was a composite adverse fetal outcome, defined as the presence of either abortion (pregnancy loss before 22 weeks of gestation), stillbirth (intrauterine fetal death after 22 weeks of gestation), neonatal death (death of a live-born infant within the first 28 days of life), or perinatal death (either stillbirth or neonatal death). Logistic regression analysis was performed to evaluate parameters independently associated with the primary outcome. Logistic regression results were reported as odds ratios (OR) with 95% confidence intervals (CI). Mean gestational age at diagnosis was 30.6 ± 9.5 weeks, with 8.0% of women diagnosed in the first, 22.2% in the second, and 69.8% in the third trimester of pregnancy. There were six miscarriages (2.3%), six intrauterine deaths (IUD) (2.3%), and 5 (2.0%) neonatal deaths, with an overall rate of perinatal death of 4.2% (11/265), thus resulting in 17 cases experiencing and 226 not experiencing the composite adverse fetal outcome. No congenital anomalies were found at antenatal or postnatal evaluation in either the stillbirths or the neonatal deaths. Furthermore, none of the cases experiencing IUD had signs of impending demise at arterial or venous Doppler. Neonatal deaths were all considered prematurity-related adverse events. Of the 250 live-born neonates, one (0.4%) was found positive on RT-PCR pharyngeal swabs performed after delivery.
The mother tested positive during the third trimester of pregnancy. The newborn was asymptomatic and had a negative RT-PCR test after 14 days of life. At logistic regression analysis, gestational age at diagnosis (OR: 0.85, 95% CI 0.8–0.9 per week increase; p<0.001), birthweight (OR: 1.17, 95% CI 1.09–1.12.7 per 100 g decrease; p=0.012) and maternal ventilatory support, including either need for oxygen or CPAP (OR: 4.12, 95% CI 2.3–7.9; p=0.001), were independently associated with the composite adverse fetal outcome. Early gestational age at infection, maternal ventilatory support and low birthweight are the main determinants of adverse perinatal outcomes in fetuses with maternal COVID-19 infection. Conversely, the risk of vertical transmission seems negligible.