78 research outputs found
High-speed, in-band performance measurement instrumentation for next generation IP networks
Facilitating always-on instrumentation of Internet traffic for the purposes of performance measurement is crucial to enable accountability of resource usage and automated network control, management and optimisation. This has proven infeasible to date due to the lack of native measurement mechanisms that can form an integral part of the network's main forwarding operation. However, the Internet Protocol version 6 (IPv6) specification enables the efficient encoding and processing of optional per-packet information as a native part of the network layer, and this constitutes a strong reason for IPv6 to be adopted as the ubiquitous next generation Internet transport.
In this paper we present a very high-speed hardware implementation of in-line measurement, a truly native traffic instrumentation mechanism for the next generation Internet, which facilitates performance measurement of the actual data-carrying traffic at small timescales between two points in the network. This system is designed to operate as part of the routers' fast path and to incur minimal impact on network operation, even while instrumenting traffic between the edges of very high capacity links. Our results show that the implementation can be easily accommodated by current FPGA technology, and real Internet traffic traces verify that the overhead incurred by instrumenting every packet over a 10 Gb/s operational backbone link carrying a typical workload is indeed negligible.
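The core idea, per-packet measurement data carried natively in IPv6 extension headers, can be illustrated in software. The following minimal Python sketch packs a one-way-delay timestamp into an IPv6 Destination Options extension header as a TLV option; the option type value and the 64-bit nanosecond encoding are illustrative assumptions, not the encoding used by the authors' FPGA implementation.

```python
import struct
import time

# Hypothetical option type for an in-line measurement timestamp TLV;
# the real encoding used by the paper's hardware is not reproduced here.
TIMESTAMP_OPT_TYPE = 0x1E  # assumption: experimental-use option type
PADN_OPT_TYPE = 0x01       # standard PadN padding option

def build_dest_opts_header(next_header: int, ts_ns: int) -> bytes:
    """Build an IPv6 Destination Options extension header carrying a
    single 8-byte timestamp option, padded to a multiple of 8 octets."""
    # Option TLV: type (1) + length (1) + 64-bit timestamp (8) = 10 bytes
    ts_tlv = struct.pack("!BBQ", TIMESTAMP_OPT_TYPE, 8, ts_ns)
    # Fixed part (2 bytes) + TLV (10 bytes) = 12; pad with PadN (4 bytes) to 16
    padn = struct.pack("!BB2x", PADN_OPT_TYPE, 2)
    options = ts_tlv + padn
    hdr_ext_len = (2 + len(options)) // 8 - 1  # 8-octet units, first 8 excluded
    return struct.pack("!BB", next_header, hdr_ext_len) + options

# Example: timestamp a packet whose payload is UDP (next header 17)
header = build_dest_opts_header(17, time.time_ns())
assert len(header) % 8 == 0
print(header.hex())
```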
Forecasting foreign exchange rates with adaptive neural networks using radial basis functions and particle swarm optimization
The motivation for this paper is to introduce a hybrid Neural Network architecture combining Particle Swarm Optimization and an Adaptive Radial Basis Function network (ARBF-PSO), a time-varying leverage trading strategy based on Glosten, Jagannathan and Runkle (GJR) volatility forecasts, and a Neural Network fitness function for financial forecasting purposes. This is done by benchmarking the ARBF-PSO results against those of three different Neural Network architectures, a Nearest Neighbors algorithm (k-NN), an autoregressive moving average model (ARMA), a moving average convergence/divergence model (MACD), and a naïve strategy. More specifically, the trading and statistical performance of all models is investigated in a forecast simulation of the EUR/USD, EUR/GBP and EUR/JPY ECB exchange rate fixing time series over the period January 1999 to March 2011, using the last two years for out-of-sample testing.
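As a rough illustration of the two building blocks named in the title, the sketch below fits a Gaussian radial basis function forecaster whose centres and widths are tuned by a plain particle swarm loop on in-sample squared error. The swarm size, inertia and acceleration constants are generic textbook choices, and the lagged-return setup is a toy example, not the ARBF-PSO model or data of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_activations(params, X, n_centres):
    """Gaussian RBF hidden layer; params = flattened centres + widths."""
    d = X.shape[1]
    centres = params[:n_centres * d].reshape(n_centres, d)
    widths = np.abs(params[n_centres * d:]) + 1e-6
    dist2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-dist2 / (2.0 * widths ** 2))

def fitness(params, X, y, n_centres):
    """In-sample MSE with linear output weights fitted by least squares."""
    H = rbf_activations(params, X, n_centres)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return np.mean((H @ w - y) ** 2)

def pso(X, y, n_centres=5, n_particles=20, iters=100):
    """Basic global-best particle swarm over centres and widths."""
    dim = n_centres * X.shape[1] + n_centres
    pos = rng.normal(size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([fitness(p, X, y, n_centres) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos += vel
        f = np.array([fitness(p, X, y, n_centres) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Toy usage: three lagged returns as inputs, next return as target
returns = rng.normal(scale=0.005, size=500)
X = np.column_stack([returns[i:-(3 - i)] for i in range(3)])
y = returns[3:]
best_params, mse = pso(X, y)
print(f"in-sample MSE: {mse:.2e}")
```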
LightningNet: Distributed Graph-based Cellular Network Performance Forecasting for the Edge
The cellular network plays a pivotal role in providing Internet access, since it is the only global-scale infrastructure with ubiquitous mobility support. To manage and maintain large-scale networks, mobile network operators require timely information, or even accurate performance forecasts. In this paper, we propose LightningNet, a lightweight and distributed graph-based framework for forecasting cellular network performance, which can capture the spatio-temporal dependencies that arise in the network traffic. LightningNet achieves a steady performance increase over state-of-the-art forecasting techniques, while maintaining a similar resource usage profile. Our architecture also excels in that it is specifically designed to support IoT and edge devices, giving it a further advantage over the current state of the art, as indicated by our performance experiments with NVIDIA Jetson.
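A minimal sketch of the spatio-temporal idea (not the LightningNet architecture itself): a single graph-convolution step mixes each cell's recent traffic window with that of its neighbours before a linear read-out predicts the next value. The adjacency matrix, window length and weights below are untrained placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

def normalised_adjacency(A):
    """Symmetrically normalised adjacency with self-loops: D^-1/2 (A+I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def forecast_step(A_norm, X_window, W_gc, w_out):
    """One-step forecast: spatial graph convolution, then a linear
    read-out over each node's temporal features."""
    H = np.maximum(A_norm @ X_window @ W_gc, 0.0)  # spatial mixing + ReLU
    return H @ w_out                               # (n_nodes,) next-step prediction

# Toy example: 4 base stations on a line graph, window of 6 past samples
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X_window = rng.random((4, 6))          # e.g. normalised traffic volumes per cell
W_gc = rng.normal(size=(6, 8)) * 0.1   # graph-conv weights (placeholders)
w_out = rng.normal(size=8) * 0.1       # read-out weights (placeholders)

print(forecast_step(normalised_adjacency(A), X_window, W_gc, w_out))
```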
Meta-analysis of Tpeak–Tend and Tpeak–Tend/QT ratio for risk stratification in congenital long QT syndrome
Background and objectives: Congenital long QT syndrome (LQTS) predisposes affected individuals to ventricular tachycardia/fibrillation (VT/VF), potentially resulting in sudden cardiac death. The Tpeak–Tend interval and the Tpeak–Tend/QT ratio, electrocardiographic markers of dispersion of ventricular repolarization, have been proposed for risk stratification, but their predictive values in LQTS have been controversial. A systematic review and meta-analysis was conducted to examine the value of Tpeak–Tend intervals and Tpeak–Tend/QT ratios in predicting arrhythmic and mortality outcomes in congenital LQTS. Method: PubMed and Embase databases were searched until 9th May 2017, identifying 199 studies. Results: Five studies on long QT syndrome were included in the final meta-analysis. Tpeak–Tend intervals were longer (mean difference [MD]: 13 ms, standard error [SE]: 4 ms, P = 0.002; I² = 34%) in congenital LQTS patients with adverse events (syncope, ventricular arrhythmias or sudden cardiac death) than in LQTS patients without such events. By contrast, Tpeak–Tend/QT ratios were not significantly different between the two groups (MD: 0.02, SE: 0.02, P = 0.26; I² = 0%). Conclusion: This meta-analysis showed that the Tpeak–Tend interval is significantly longer in individuals who are at elevated risk of adverse events in congenital LQTS, offering incremental value for risk stratification.
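Pooled figures like the mean difference and I² above come from standard fixed-effect (inverse-variance) formulas, reproduced below for reference. The per-study values are made-up placeholders, not the five studies actually pooled in this meta-analysis.

```python
import numpy as np

# Illustrative per-study mean differences (ms) and standard errors --
# placeholder values, NOT the studies included in the meta-analysis above.
md = np.array([10.0, 18.0, 7.0, 15.0, 12.0])
se = np.array([6.0, 8.0, 9.0, 7.0, 5.0])

w = 1.0 / se**2                          # inverse-variance weights
md_pooled = np.sum(w * md) / np.sum(w)   # fixed-effect pooled mean difference
se_pooled = np.sqrt(1.0 / np.sum(w))     # standard error of the pooled estimate
Q = np.sum(w * (md - md_pooled) ** 2)    # Cochran's Q
df = len(md) - 1
I2 = max(0.0, (Q - df) / Q) * 100        # I² heterogeneity, as a percentage

print(f"pooled MD = {md_pooled:.1f} ms, SE = {se_pooled:.1f} ms")
print(f"Q = {Q:.2f}, I^2 = {I2:.0f}%")
```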
The Tpeak–Tend interval as an electrocardiographic risk marker of arrhythmic and mortality outcomes: a systematic review and meta-analysis
Background: The Tpeak–Tend interval, an electrocardiographic marker reflecting transmural dispersion of repolarization, has been used to predict ventricular tachycardia/fibrillation (VT/VF) and sudden cardiac death (SCD) in different clinical settings. Objective: This systematic review and meta-analysis evaluated the significance of the Tpeak–Tend interval in predicting arrhythmic and/or mortality endpoints. Methods: PubMed, Embase, Cochrane Library and CINAHL Plus databases were searched through 30th November 2016. Results: Of the 854 studies identified initially, 33 observational studies involving 155,856 patients were included in our meta-analysis. Tpeak–Tend interval prolongation (mean cut-off: 103.3 ± 17.4 ms) was a significant predictor of the arrhythmic or mortality outcomes (odds ratio (OR): 1.14, 95% CI: 1.11 to 1.17, p < 0.001). When different endpoints were analyzed, the ORs were as follows: VT/VF (1.10, 95% CI: 1.06 to 1.13, p < 0.0001), SCD (1.27, 95% CI: 1.17 to 1.39, p < 0.0001), cardiovascular death (1.40, 95% CI: 1.19 to 1.64, p < 0.0001), and all-cause mortality (4.56, 95% CI: 0.62 to 33.68, p < 0.0001). Subgroup analysis for each disease revealed that the risk of VT/VF or death was highest for Brugada syndrome (OR: 5.68, 95% CI: 1.57 to 20.53, p < 0.01), followed by hypertension (OR: 1.52, 95% CI: 1.26 to 1.85, p < 0.0001), heart failure (OR: 1.07, 95% CI: 1.04 to 1.11, p < 0.0001) and ischemic heart disease (OR: 1.06, 95% CI: 1.02 to 1.10, p = 0.001). In the general population, a prolonged Tpeak–Tend interval also predicted arrhythmic or mortality outcomes (OR: 1.59, 95% CI: 1.21 to 2.09, p < 0.001). Conclusion: The Tpeak–Tend interval is a useful risk stratification tool in different diseases and in the general population.
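For readers re-using these estimates (for example in a further pooled analysis), the standard error of a log odds ratio can be recovered from a reported 95% CI as (ln(upper) − ln(lower)) / (2 × 1.96). The short check below applies this to the Brugada syndrome subgroup figures quoted above.

```python
import math

# Brugada syndrome subgroup as reported above: OR 5.68 (95% CI 1.57 to 20.53)
or_point, ci_low, ci_high = 5.68, 1.57, 20.53

log_or = math.log(or_point)
se_log_or = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)  # SE on the log scale
z = log_or / se_log_or                                           # Wald z statistic

print(f"log(OR) = {log_or:.3f}, SE = {se_log_or:.3f}, z = {z:.2f}")
# Sanity check: re-derive the 95% CI from log(OR) +/- 1.96 * SE
print(f"reconstructed CI: {math.exp(log_or - 1.96 * se_log_or):.2f} "
      f"to {math.exp(log_or + 1.96 * se_log_or):.2f}")
```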
Riociguat treatment in patients with chronic thromboembolic pulmonary hypertension: Final safety data from the EXPERT registry
Objective: The soluble guanylate cyclase stimulator riociguat is approved for the treatment of adult patients with pulmonary arterial hypertension (PAH) and inoperable or persistent/recurrent chronic thromboembolic pulmonary hypertension (CTEPH) following Phase
A novel approach for dynamic specification testing of high-resolution ΣΔ analogue-to-digital converters
Modern day dynamic specification testing of high-resolution mixed-signal devices, such as the ΣΔ analogue-to-digital converter (ADC), has revealed a multitude of challenges. These intricacies are mainly associated with the size of the output data record that needs to be captured in order to perform dynamic specification analysis. The unavoidable consequence is significantly long test times, which have a direct impact upon manufacturing cost. In addition, the continuous scaling down of transistor dimensions and power supply range minimisation means that the performance of high-resolution ADCs can be significantly influenced by noise interference and internal non-idealities. Therefore, an accurate and cost-efficient dynamic specification testing technique remains greatly in demand. This thesis identifies the main bottlenecks associated with modern day industrial dynamic specification testing through the off-chip use of the Fast Fourier Transform (FFT). These are mainly split into two categories: (1) the FFT algorithm requires significantly large data sets, which lead to long data capture, transfer and processing times; and (2) the FFT algorithm is prohibitively large for on-chip realisation when data sets for high-resolution ΣΔ dynamic specification testing are considered.
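The record-size problem described above can be seen in a few lines: FFT-based SINAD/ENOB estimation of a coherently sampled sine needs a long enough record that the spread quantisation noise floor is resolved below the signal. The sketch below estimates SINAD and ENOB for an ideal N-bit quantiser; the resolution, record length and sampling setup are illustrative, not the thesis's test conditions.

```python
import numpy as np

def sinad_enob(record, signal_bin):
    """Estimate SINAD and ENOB from the FFT of a coherently sampled sine."""
    spectrum = np.abs(np.fft.rfft(record)) ** 2
    spectrum[0] = 0.0                            # discard the DC bin
    signal_power = spectrum[signal_bin]
    noise_power = spectrum.sum() - signal_power  # everything else: noise + distortion
    sinad_db = 10.0 * np.log10(signal_power / noise_power)
    return sinad_db, (sinad_db - 1.76) / 6.02    # standard ENOB formula

# Illustrative setup: ideal 16-bit quantiser, coherent sampling (M cycles in N samples)
n_bits, N, M = 16, 2**16, 977                    # M prime, so coprime with N
t = np.arange(N)
sine = 0.9 * np.sin(2 * np.pi * M * t / N)       # near full-scale input (FS = +/-1)
lsb = 2.0 / 2**n_bits
quantised = np.round(sine / lsb) * lsb           # ideal mid-tread quantisation

sinad_db, enob = sinad_enob(quantised, M)
print(f"SINAD = {sinad_db:.1f} dB, ENOB = {enob:.2f} bits (record length N = {N})")
```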