
    Interventions for hyperhidrosis in secondary care: a systematic review and value-of-information analysis

    Background: Hyperhidrosis is uncontrollable excessive sweating that occurs at rest, regardless of temperature. The symptoms of hyperhidrosis can significantly affect quality of life. The management of hyperhidrosis is uncertain and variable. Objective: To establish the expected value of undertaking additional research to determine the most effective interventions for the management of refractory primary hyperhidrosis in secondary care. Methods: A systematic review and economic model, including a value-of-information (VOI) analysis. Treatments to be prescribed by dermatologists and minor surgical treatments for hyperhidrosis of the hands, feet and axillae were reviewed; as endoscopic thoracic sympathectomy (ETS) is incontestably an end-of-line treatment, it was not reviewed further. Fifteen databases (e.g. CENTRAL, PubMed and PsycINFO), conference proceedings and trial registers were searched from inception to July 2016. Systematic review methods were followed. Pairwise meta-analyses were conducted for comparisons between botulinum toxin (BTX) injections and placebo for axillary hyperhidrosis, but otherwise, owing to evidence limitations, data were synthesised narratively. A decision-analytic model assessed the cost-effectiveness and VOI of five treatments (iontophoresis, medication, BTX, curettage, ETS) in 64 different sequences for axillary hyperhidrosis only. Results and conclusions: Fifty studies were included in the effectiveness review: 32 randomised controlled trials (RCTs), 17 non-RCTs and one large prospective case series. Most studies were small, rated as having a high risk of bias and poorly reported. The interventions assessed in the review were iontophoresis, BTX, anticholinergic medications, curettage and newer energy-based technologies that damage the sweat gland (e.g. laser, microwave). There is moderate-quality evidence of a large statistically significant effect of BTX on axillary hyperhidrosis symptoms, compared with placebo. 
There was weak but consistent evidence for iontophoresis for palmar hyperhidrosis. Evidence for other interventions was of low or very low quality. For axillary hyperhidrosis, the cost-effectiveness results indicated that the sequence iontophoresis, BTX, medication, curettage and ETS was the most cost-effective (probability 0.8), with an incremental cost-effectiveness ratio of £9304 per quality-adjusted life-year. Uncertainty associated with study bias was not reflected in the economic results. Patients and clinicians attending an end-of-project workshop were satisfied with the sequence of treatments for axillary hyperhidrosis identified as being cost-effective. All patient advisors considered the Hyperhidrosis Quality of Life Index superior to the other tools commonly used in hyperhidrosis research for assessing quality of life. Limitations: The evidence for the clinical effectiveness and safety of second-line treatments for primary hyperhidrosis is limited. There was therefore insufficient evidence to draw conclusions for most of the interventions assessed, and the cost-effectiveness analysis was restricted to hyperhidrosis of the axilla. Future work: Based on anecdotal evidence and inference from the evidence for the axillae, participants agreed that a trial of BTX (with anaesthesia) compared with iontophoresis for palmar hyperhidrosis would be most useful. The VOI analysis indicates that further research into the effectiveness of existing medications might be worthwhile, but it is unclear whether such trials are of clinical importance. Research establishing a robust estimate of the annual incidence of axillary hyperhidrosis in the UK population would reduce the uncertainty in future VOI analyses.
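
The decision-analytic step described above rests on two standard quantities: the incremental cost-effectiveness ratio (ICER) and the expected value of perfect information (EVPI). A minimal Monte Carlo sketch follows, using placeholder costs and QALYs rather than the study's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sim, wtp = 10_000, 20_000  # simulations; willingness to pay per QALY in GBP (assumed)

# Hypothetical costs (GBP) and QALYs for two treatment strategies, sampled to
# mimic parameter uncertainty; the numbers are illustrative, not study data.
cost = rng.normal([6_000.0, 9_000.0], [1_000.0, 1_500.0], size=(n_sim, 2))
qaly = rng.normal([1.0, 1.4], [0.15, 0.20], size=(n_sim, 2))

nb = wtp * qaly - cost  # net monetary benefit per simulation and strategy
evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()  # per-patient EVPI
icer = (cost[:, 1].mean() - cost[:, 0].mean()) / (qaly[:, 1].mean() - qaly[:, 0].mean())
print(f"ICER: £{icer:,.0f}/QALY, EVPI: £{evpi:,.0f}/patient")
```

Population EVPI is the per-patient figure multiplied by the discounted number of patients affected, which is why a robust estimate of UK incidence would directly tighten the VOI analysis.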

    Smoothed functional principal components analysis

    Unlike classical principal component analysis (PCA) for multivariate data, estimating functional principal components requires some smoothing or regularization. There are different approaches to smoothed functional principal components: kernel-based smoothed PCA operates on smoothed functional data, while Silverman's method directly estimates smoothed principal components using the smoothed covariance operator. Silverman's method has many theoretical and practical advantages, and powerful tools from Hilbert space theory can be used to study its theoretical properties. This research mainly focuses on studying the asymptotic properties of Silverman's method in an abstract Hilbert space by exploiting general perturbation results; the asymptotic properties of the kernel-based method can also be studied using perturbation theory. We obtain the results using general theory on the perturbation of eigenvalues and eigenvectors of the covariance operator. Consistency and asymptotic distributions are derived under mild conditions. For simplicity of presentation, we restrict our attention to the first smoothed functional principal component.
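
On a discrete grid, Silverman's approach reduces to a generalized eigenproblem: maximize u'Su subject to the roughness-penalized norm u'(I + aD'D)u = 1, where S is the sample covariance matrix and D a second-difference matrix. A numpy sketch on simulated curves (the grid size, smoothing parameter, and test signal are all illustrative choices, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
n_curves, n_grid, alpha = 200, 50, 1e-2  # sample size, grid points, smoothing parameter

# Simulate noisy functional data: a smooth sine signal plus white noise (toy example).
t = np.linspace(0.0, 1.0, n_grid)
scores = rng.normal(size=(n_curves, 1))
X = scores * np.sin(2 * np.pi * t) + 0.3 * rng.normal(size=(n_curves, n_grid))

S = np.cov(X, rowvar=False)               # discretized sample covariance operator
D = np.diff(np.eye(n_grid), n=2, axis=0)  # second-difference (roughness) matrix
B = np.eye(n_grid) + alpha * D.T @ D      # penalized inner-product matrix

# Silverman-style smoothed PCA: solve S u = lam * B u by Cholesky reduction.
L = np.linalg.cholesky(B)
Linv = np.linalg.inv(L)
lam, V = np.linalg.eigh(Linv @ S @ Linv.T)
u1 = Linv.T @ V[:, -1]                    # first smoothed principal component
u1 /= np.linalg.norm(u1)
```

The leading smoothed component should recover (up to sign) the smooth sine direction generating the data.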

    An Unusual Presentation of Bilateral Anterior Optic Neuritis

    Vogt-Koyanagi-Harada (VKH) disease is a granulomatous inflammatory disorder that can be bilateral or unilateral and is associated with a variety of ocular findings, including serous retinal detachments, multifocal retinal pigment epithelial detachments, optic disc swelling, and vitritis.

    Granger Causality-Based Forecasting Model for Rainfall at Ratnapura Area, Sri Lanka: A Deep Learning Approach

    Rainfall forecasting, especially extreme rainfall forecasting, is one of the crucial tasks in weather forecasting, since it has a direct impact on accompanying devastating events such as flash floods and fast-moving landslides. However, obtaining rainfall forecasts with high accuracy, especially for extreme rainfall occurrences, is a challenging task. This study focuses on developing a forecasting model that is capable of forecasting rainfall, including extreme rainfall values. The rainfall forecasting was achieved through the sequence learning capability of the Long Short-Term Memory (LSTM) method. The identification of the optimal set of features for the LSTM model was conducted using Random Forest and Granger causality tests. That best set of features was then fed into Stacked LSTM, Bidirectional LSTM, and Encoder-Decoder LSTM models to obtain three-day-ahead forecasts of rainfall from the past fourteen days' values of the selected features. Of the three models, the best was selected through post hoc residual analysis and additional validation approaches. The entire approach was illustrated using rainfall and weather-related measurements obtained from the gauging station located in the city of Ratnapura, Sri Lanka. Twenty-three features were originally collected, including relative humidity, sunshine hours, and mean sea level pressure. The performances of the three models were compared using RMSE. The Bidirectional LSTM model outperformed the other methods (RMSE < 5 mm and MAE < 3 mm), and this model has the capability to forecast extreme rainfall values with high accuracy.
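
The Granger-causality screening step amounts to an F-test comparing an autoregression of rainfall on its own lags against one augmented with lags of a candidate feature. A pure-numpy sketch on simulated series follows (the lag order, coefficients, and data are placeholders; in practice a library routine such as statsmodels' grangercausalitytests would be used):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 500, 2  # series length and lag order (both illustrative)

# Toy data: feature x Granger-causes the rainfall proxy y through its first lag.
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.4 * y[t - 1] + 0.8 * x[t - 1] + 0.2 * rng.normal()

def lagged(s, p):
    """Matrix whose columns are s lagged by 1..p, aligned with s[p:]."""
    n = len(s)
    return np.column_stack([s[p - k:n - k] for k in range(1, p + 1)])

Y = y[p:]
Xr = np.column_stack([np.ones(n - p), lagged(y, p)])  # restricted: own lags only
Xu = np.column_stack([Xr, lagged(x, p)])              # unrestricted: plus lags of x

rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
rss_r, rss_u = rss(Xr), rss(Xu)
F = ((rss_r - rss_u) / p) / (rss_u / (n - p - Xu.shape[1]))
print(f"Granger F-statistic for x -> y: {F:.1f}")
```

A large F-statistic relative to the F(p, n-p-k) reference distribution indicates that lags of the feature improve the rainfall forecast, so the feature is retained.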

    A Fusion of Deep Learning and Time Series Regression for Flood Forecasting: An Application to the Ratnapura Area Based on the Kalu River Basin in Sri Lanka

    Flooding is the most frequent natural hazard that accompanies hardships for millions of civilians and substantial economic losses. In Sri Lanka, fluvial floods cause the highest damage to lives and properties. Ratnapura, which is in the Kalu River Basin, is the area most vulnerable to frequent flood events in Sri Lanka due to inherent weather patterns and its geographical location. However, flood-related studies conducted based on the Kalu River Basin and its most vulnerable cities have received minimal attention from researchers. Therefore, it is crucial to develop a robust and reliable dynamic flood forecasting system to issue accurate and timely early flood warnings to vulnerable victims. Modeling the water level at the initial stage and then classifying the results into pre-defined flood risk levels facilitates more accurate forecasts of upcoming susceptibilities, since direct flood classification often produces less accurate predictions due to the heavily imbalanced nature of the data. Thus, this study introduces a novel hybrid model that combines a deep learning technique with a traditional Linear Regression model to first forecast water levels and then detect rare but destructive flood events (i.e., major and critical floods) with high accuracy, from 1 to 3 days ahead. Initially, the water level of the Kalu River at Ratnapura was forecasted 1 to 3 days ahead by employing a Vanilla Bi-LSTM model. Similarly, rainfall at the same location was forecasted 1 to 3 days ahead by applying another Bi-LSTM model. To further improve the forecasting accuracy of the water level, the forecasted water level at day t was combined with the forecasted rainfall for the same day by applying a Time Series Regression model, resulting in a hybrid model. This improvement is imperative mainly because water level forecasts obtained for a longer lead time may change with the real-time appearance of heavy rainfall.
Nevertheless, this important phenomenon has often been neglected in past studies related to modeling water levels. The performances of the models were compared by examining their ability to accurately forecast flood risks, especially at critical levels. The combined model with Bi-LSTM and Time Series Regression outperformed the single Vanilla Bi-LSTM model by forecasting actionable flood events (minor and critical) occurring in the testing period with accuracies of 80%, 80%, and 100% for 1- to 3-day-ahead forecasting, respectively. Moreover, the results overall evidenced lower RMSE and MAE values (<0.4 m MSL) for three-day-ahead water level forecasts. Therefore, this enhanced approach enables more trustworthy, impact-based flood forecasting for the Ratnapura area in the Kalu River Basin. The same modeling approach could be applied to obtain flood risk levels caused by rivers across the globe.
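
The combination step can be sketched as an ordinary regression of the observed water level on the two forecasts. The snippet below uses synthetic stand-ins for the Bi-LSTM outputs; all coefficients and distributions are assumptions for illustration, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300  # days in a toy training window (synthetic, not the Kalu River data)

# Placeholder forecasts: a water-level forecast and a rainfall forecast for day t,
# plus an observed level that responds more strongly to rain than the forecast does.
rain_fc = rng.gamma(2.0, 5.0, size=n)
level_fc = 2.0 + 0.02 * rain_fc + 0.1 * rng.normal(size=n)
level_obs = 2.0 + 0.05 * rain_fc + 0.1 * rng.normal(size=n)

# Time Series Regression step: correct the level forecast using the rain forecast.
X = np.column_stack([np.ones(n), level_fc, rain_fc])
beta, *_ = np.linalg.lstsq(X, level_obs, rcond=None)
level_hybrid = X @ beta

rmse = lambda e: np.sqrt(np.mean(e ** 2))
print(rmse(level_obs - level_fc), rmse(level_obs - level_hybrid))
```

Because the regression can always reproduce the raw forecast (coefficients 0, 1, 0), the hybrid's in-sample error never exceeds the single model's, and it improves most exactly when heavy forecast rainfall signals a rising river.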

    A Novel Hybrid Spatiotemporal Missing Value Imputation Approach for Rainfall Data: An Application to the Ratnapura Area, Sri Lanka

    Meteorological time series, such as rainfall data, show spatiotemporal characteristics and often contain missing values. Discarding missing values, or modeling data with missing values, negatively affects the accuracy of the final predictions. Hence, accurately estimating missing values by considering the spatiotemporal variations in the data has become a crucial step in eco-hydrological modeling. The multi-layer perceptron (MLP) is a promising tool for modeling temporal variation, while spatial kriging (SK) is a promising tool for capturing spatial variation. Therefore, in this study, we propose a novel hybrid approach that combines the multi-layer perceptron method and spatial kriging to impute missing values in rainfall data. The proposed approach was tested using spatiotemporal data collected from a set of nearby rainfall gauging stations in the Ratnapura area, Sri Lanka. The collected rainfall data contain missing values over consecutive, relatively long periods, scattered discontinuously among stations across five years. The proposed hybrid model captures the temporal and spatial variability of the rainfall data through MLP and SK, respectively, and integrates the predictions obtained from both with a novel optimal weight allocation method. The performance of the model was compared with the individual approaches: MLP, SK, and spatiotemporal kriging. The results indicate that the novel hybrid approach outperforms spatiotemporal kriging and the other two pure approaches.
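
One simple way to realize such a weight allocation (an illustrative rule, not necessarily the paper's exact method) is to weight each predictor by the inverse of its validation error variance. The data and both predictors below are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(4)
truth = rng.normal(10.0, 3.0, size=400)  # synthetic rainfall values (toy data)

# Placeholder predictions: an MLP-like temporal estimate and a kriging-like
# spatial estimate, each unbiased but with different error variances.
pred_mlp = truth + rng.normal(0.0, 1.0, size=truth.size)
pred_sk = truth + rng.normal(0.0, 2.0, size=truth.size)

# Inverse-error-variance weighting: the more accurate predictor gets more weight.
var_mlp = np.var(truth - pred_mlp)
var_sk = np.var(truth - pred_sk)
w = var_sk / (var_mlp + var_sk)          # weight placed on the MLP prediction
pred_hybrid = w * pred_mlp + (1 - w) * pred_sk

rmse = lambda p: np.sqrt(np.mean((truth - p) ** 2))
print(rmse(pred_mlp), rmse(pred_sk), rmse(pred_hybrid))
```

For independent, unbiased predictors this weight minimizes the variance of the combined estimate, so the hybrid imputation is at least as accurate as either component alone.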