55 research outputs found

    Urinary catecholamine excretion, cardiovascular variability, and outcomes in tetanus

    Severe tetanus is characterized by muscle spasm and cardiovascular system disturbance. The pathophysiology of muscle spasm is relatively well understood and involves inhibition of central inhibitory synapses by tetanus toxin. That of cardiovascular disturbance is less clear, but is believed to relate to disinhibition of the autonomic nervous system. The clinical syndrome of autonomic nervous system dysfunction (ANSD) seen in severe tetanus is characterized principally by changes in heart rate and blood pressure, which have been linked to increased circulating catecholamines. Previous studies have described varying relationships between catecholamines and signs of ANSD in tetanus, but are limited by confounders and by the assays used. In this study, we aimed to characterize in detail the relationships between catecholamines (adrenaline and noradrenaline), cardiovascular parameters (heart rate and blood pressure), and clinical outcomes (ANSD, requirement for mechanical ventilation, and length of intensive care unit stay) in adults with tetanus, and to examine whether intrathecal antitoxin administration affected subsequent catecholamine excretion. Noradrenaline and adrenaline were measured by ELISA from 24-h urine collections taken on day 5 of hospitalization in 272 patients enrolled in a 2 × 2 factorial, blinded randomized controlled trial in a Vietnamese hospital. Catecholamine results from 263 patients were available for analysis. After adjustment for potential confounders (i.e., age, sex, intervention treatment, and medications), there were indications of non-linear relationships between urinary catecholamines and heart rate. Adrenaline and noradrenaline were associated with subsequent development of ANSD and with length of ICU stay.
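
    The non-linear, confounder-adjusted association described above can be explored with a regression spline. The Python sketch below is purely illustrative: the data are simulated and the column names (noradrenaline, heart_rate, age, sex) are hypothetical; it is not the study's analysis code.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Simulated stand-in data; hypothetical units and column names.
        rng = np.random.default_rng(0)
        n = 263
        df = pd.DataFrame({
            "noradrenaline": rng.lognormal(mean=3.0, sigma=0.6, size=n),
            "age": rng.integers(18, 80, size=n),
            "sex": rng.choice(["male", "female"], size=n),
        })
        df["log_na"] = np.log(df["noradrenaline"])
        df["heart_rate"] = (70 + 10 * df["log_na"] - 1.2 * df["log_na"] ** 2
                            + 0.05 * df["age"] + rng.normal(0, 8, size=n))

        # A cubic B-spline for the catecholamine term allows a non-linear dose-response;
        # age and sex enter as simple adjustment covariates.
        model = smf.ols("heart_rate ~ bs(log_na, df=3) + age + C(sex)", data=df).fit()
        print(model.summary())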

    Hospital-acquired colonization and infections in a Vietnamese intensive care unit.

    Data concerning intensive care unit (ICU)-acquired bacterial colonization and infections are scarce from low- and middle-income countries (LMICs). ICU patients in these settings are at high risk of becoming colonized and infected with antimicrobial-resistant organisms (AROs). We conducted a prospective observational study at the Ho Chi Minh City Hospital for Tropical Diseases, Vietnam, from November 2014 to January 2016 to assess ICU-acquired colonization and infections, focusing on the five major pathogens in our setting: Staphylococcus aureus (S. aureus), Escherichia coli (E. coli), Klebsiella spp., Pseudomonas spp. and Acinetobacter spp., among adult patients with more than 48 hours of ICU stay. We found that 61.3% (223/364) of ICU patients became colonized with AROs: 44.2% (161/364) with rectal ESBL-producing E. coli and Klebsiella spp.; 30.8% (40/130) with endotracheal carbapenemase-producing Acinetobacter spp.; and 14.3% (52/364) with nasal methicillin-resistant S. aureus. The incidence rate of ICU patients becoming colonized with AROs was 9.8 (223/2,276) per 100 patient-days. The Charlson Comorbidity Index score was a significant risk factor for ARO colonization. The proportion of ICU patients with hospital-acquired infections (HAIs) was 23.4% (85/364), and the incidence rate of ICU patients contracting HAIs was 2.3 (85/3,701) per 100 patient-days. Vascular catheterization (central venous, arterial and hemofiltration catheters) was significantly associated with hospital-acquired bloodstream infection. Of the 77 patients who developed ICU-acquired infections with one of the five specified bacteria, 44 (57.1%) had prior colonization with the same organism. Vietnamese ICU patients have a high colonization rate with AROs and a high risk of subsequent infections. Future research should focus on monitoring colonization and on developing preventive measures that may halt the spread of AROs in ICU settings.
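
    The incidence rates quoted above are simply events divided by the total patient-days at risk, scaled to 100 patient-days. A minimal sketch of the arithmetic, using the counts reported in the abstract:

        # Incidence rate = events / person-time at risk, scaled to a convenient denominator.
        def incidence_per_100_patient_days(events, patient_days):
            return 100 * events / patient_days

        # Counts taken from the abstract above.
        print(incidence_per_100_patient_days(223, 2276))  # ARO colonization: ~9.8 per 100 patient-days
        print(incidence_per_100_patient_days(85, 3701))   # hospital-acquired infection: ~2.3 per 100 patient-days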

    Implementation of point-of-care testing of C-reactive protein concentrations to improve antibiotic targeting in respiratory illness in Vietnamese primary care: a pragmatic cluster-randomised controlled trial

    Background In previous trials, point-of-care testing of C-reactive protein (CRP) concentrations safely reduced antibiotic use in non-severe acute respiratory infections in primary care. However, these trials were done in a research-oriented context with close support from research staff, which could have influenced prescribing practices. To better inform the potential for scaling up point-of-care testing of CRP in respiratory infections, we aimed to do a pragmatic trial of the intervention in a routine care setting. Methods We did a pragmatic, cluster-randomised controlled trial at 48 commune health centres in Viet Nam between June 1, 2020, and May 12, 2021. Eligible centres served populations of more than 3000 people, handled 10–40 respiratory infections per week, had licensed prescribers on site, and maintained electronic patient databases. Centres were randomly allocated (1:1) to provide point-of-care CRP testing plus routine care or routine care only. Randomisation was stratified by district and by baseline prescription level (ie, the proportion of patients with suspected acute respiratory infections to whom antibiotics were prescribed in 2019). Eligible patients were aged 1–65 years and visiting the commune health centre for a suspected acute respiratory infection with at least one focal sign or symptom and symptoms lasting less than 7 days. The primary endpoint was the proportion of patients prescribed an antibiotic at first attendance in the intention-to-treat population. The per-protocol analysis included only people who underwent CRP testing. Secondary safety outcomes included time to resolution of symptoms and frequency of hospitalisation. This trial is registered with ClinicalTrials.gov, NCT03855215. Findings 48 commune health centres were enrolled and randomly assigned, 24 to the intervention group (n=18 621 patients) and 24 to the control group (n=21 235). 17 345 (93·1%) patients in the intervention group were prescribed antibiotics, compared with 20 860 (98·2%) in the control group (adjusted relative risk 0·83 [95% CI 0·66–0·93]). Only 2606 (14%) of 18 621 patients in the intervention group underwent CRP testing and were included in the per-protocol analysis. When analyses were restricted to this population, larger reductions in prescribing were noted in the intervention group compared with the control group (adjusted relative risk 0·64 [95% CI 0·60–0·70]). Time to resolution of symptoms (hazard ratio 0·70 [95% CI 0·39–1·27]) and frequency of hospitalisation (nine in the intervention group vs 17 in the control group; adjusted relative risk 0·52 [95% CI 0·23–1·17]) did not differ between groups. Interpretation Use of point-of-care CRP testing efficaciously reduced prescription of antibiotics in patients with non-severe acute respiratory infections in primary health care in Viet Nam without compromising patient recovery. The low uptake of CRP testing suggests that barriers to implementation and compliance need to be addressed before scale-up of the intervention. Funding Australian Government, UK Government, and the Foundation for Innovative New Diagnostics.
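
    For orientation, a crude (unadjusted) relative risk with a Wald confidence interval can be computed directly from the intention-to-treat counts above; the published estimate additionally adjusts for the cluster design and stratification, so it will differ from this crude figure. A minimal sketch:

        import math

        def relative_risk(events_int, n_int, events_ctrl, n_ctrl, z=1.96):
            """Crude relative risk with a Wald 95% CI on the log scale."""
            p1, p0 = events_int / n_int, events_ctrl / n_ctrl
            rr = p1 / p0
            se_log = math.sqrt(1 / events_int - 1 / n_int + 1 / events_ctrl - 1 / n_ctrl)
            lo = math.exp(math.log(rr) - z * se_log)
            hi = math.exp(math.log(rr) + z * se_log)
            return rr, lo, hi

        # Intention-to-treat counts from the abstract (antibiotic prescriptions / patients).
        print(relative_risk(17345, 18621, 20860, 21235))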

    Wearable devices for remote monitoring of hospitalized patients with COVID-19 in Vietnam

    Patients with severe COVID-19 disease require monitoring with pulse oximetry as a minimal requirement. In many low- and middle-income countries, this has been challenging due to lack of staff and equipment. Wearable pulse oximeters potentially offer an attractive means to address this need, owing to their low cost, battery operability and capacity for remote monitoring. Between July and October 2021, Ho Chi Minh City experienced its first major wave of SARS-CoV-2 infection, leading to an unprecedented demand for monitoring in hospitalized patients. We assessed the feasibility of continuous remote monitoring for patients with COVID-19 under these circumstances, implementing two different systems using wearable pulse oximeter devices in a stepwise manner across four departments.

    Epidemiology of forest malaria in Central Vietnam: the hidden parasite reservoir

    Background: After successfully reducing the malaria burden to pre-elimination levels over the past two decades, the national malaria programme in Vietnam has recently switched from control to elimination. However, in forested areas of Central Vietnam malaria elimination is likely to be jeopardized by the high occurrence of asymptomatic and submicroscopic infections, as shown by previous reports. This paper presents the results of a malaria survey carried out in a remote forested area of Central Vietnam where we evaluated malaria prevalence and risk factors for infection. Methods: After a full census (four study villages = 1,810 inhabitants), the study population was screened for malaria infections by standard microscopy and, if needed, treated according to national guidelines. An additional blood sample on filter paper was also taken in a random sample of the population for later polymerase chain reaction (PCR) and more accurate estimation of the actual burden of malaria infections. The risk factor analysis for malaria infections was done using survey multivariate logistic regression as well as the classification and regression tree method (CART). Results: A total of 1,450 individuals were screened. Malaria prevalence by microscopy was 7.8% (ranging from 3.9 to 10.9% across villages), mostly Plasmodium falciparum (81.4%) or Plasmodium vivax (17.7%) mono-infections; a large majority (69.9%) were asymptomatic. By PCR, the prevalence was estimated at 22.6% (ranging from 16.4 to 42.5%), with a higher proportion of P. vivax mono-infections (43.2%). The proportion of sub-patent infections increased with increasing age and with decreasing prevalence across villages. The main risk factors were young age, village, house structure, and absence of a bed net. Conclusion: This study confirmed that in Central Vietnam a substantial part of the human malaria reservoir is hidden. Additional studies are urgently needed to assess the contribution of this hidden reservoir to the maintenance of malaria transmission. Such evidence will be crucial for guiding elimination strategies.
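
    The risk-factor analysis described above combined survey multivariate logistic regression with CART. The sketch below is illustrative only: the data are simulated, the variable names (infected, age, village, house_type, bednet) are hypothetical, and an ordinary logit plus a decision tree stand in for the survey-weighted versions used in the study.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from sklearn.tree import DecisionTreeClassifier

        # Simulated stand-in for the survey: one row per screened individual.
        rng = np.random.default_rng(2)
        n = 1450
        df = pd.DataFrame({
            "age": rng.integers(1, 80, size=n),
            "village": rng.choice(["V1", "V2", "V3", "V4"], size=n),
            "house_type": rng.choice(["open", "closed"], size=n),
            "bednet": rng.integers(0, 2, size=n),
        })
        logit_p = -1.5 - 0.02 * df["age"] - 0.8 * df["bednet"] + 0.5 * (df["house_type"] == "open")
        df["infected"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        # Multivariate logistic regression for PCR-detected infection.
        logit = smf.logit("infected ~ age + C(village) + C(house_type) + bednet", data=df).fit()
        print(logit.summary())

        # CART on the same predictors; categorical variables are one-hot encoded first.
        X = pd.get_dummies(df[["age", "village", "house_type", "bednet"]], drop_first=True)
        tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=20).fit(X, df["infected"])
        print(dict(zip(X.columns, tree.feature_importances_)))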

    Ventilator-associated respiratory infection in a resource-restricted setting: impact and etiology.

    BACKGROUND: Ventilator-associated respiratory infection (VARI) is a significant problem in resource-restricted intensive care units (ICUs), but differences in casemix and etiology mean that VARI in resource-restricted ICUs may be different from that found in resource-rich units. Data from these settings are vital to plan preventative interventions and assess their cost-effectiveness, but few are available. METHODS: We conducted a prospective observational study in four Vietnamese ICUs to assess the incidence and impact of VARI. Patients ≥ 16 years old and expected to be mechanically ventilated > 48 h were enrolled in the study and followed daily for 28 days following ICU admission. RESULTS: Four hundred fifty eligible patients were enrolled over 24 months and, after exclusions, 374 patients' data were analyzed. A total of 92/374 cases of VARI (21.7/1000 ventilator-days) were diagnosed; 37 (9.9%) of these met ventilator-associated pneumonia (VAP) criteria (8.7/1000 ventilator-days). Patients with any VARI, VAP, or VARI without VAP experienced increased hospital and ICU stay, ICU cost, and antibiotic use (p < 0.01 for all). This was also true for all VARI regardless of tetanus status (p < 0.01 for all). There was no increased risk of in-hospital death in patients with VARI compared to those without (VAP HR 1.58, 95% CI 0.75-3.33, p = 0.23; VARI without VAP HR 0.40, 95% CI 0.14-1.17, p = 0.09). In patients with positive endotracheal aspirate cultures, most VARI was caused by Gram-negative organisms; the most frequent were Acinetobacter baumannii (32/73, 43.8%), Klebsiella pneumoniae (26/73, 35.6%), and Pseudomonas aeruginosa (24/73, 32.9%). Of the patients with positive cultures for these organisms, 40/68 (58.8%) had carbapenem-resistant isolates. Patients with carbapenem-resistant VARI had significantly greater ICU costs than patients with carbapenem-susceptible isolates (6053 USD (IQR 3806-7824) vs 3131 USD (IQR 2108-7551), p = 0.04) and, after correction for adequacy of initial antibiotics and APACHE II score, showed a trend towards increased risk of in-hospital death (HR 2.82, 95% CI 0.75-6.75, p = 0.15). CONCLUSIONS: VARI in a resource-restricted setting has limited impact on mortality, but shows significant association with increased patient costs, length of stay, and antibiotic use, particularly when caused by carbapenem-resistant bacteria. Evidence-based interventions to reduce VARI in these settings are urgently needed.
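
    The mortality comparisons above are hazard ratios from time-to-event models. A minimal, illustrative Cox regression using the lifelines package is sketched below; the data are simulated and the column names are hypothetical, and in the actual analysis VARI onset would ideally enter as a time-varying covariate to avoid immortal-time bias.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        # Simulated stand-in cohort with hypothetical column names (not the study data).
        rng = np.random.default_rng(3)
        n = 374
        df = pd.DataFrame({
            "days_in_hospital": rng.integers(3, 60, size=n),
            "died": rng.integers(0, 2, size=n),
            "vap": rng.integers(0, 2, size=n),
            "vari_without_vap": rng.integers(0, 2, size=n),
            "apache_ii": rng.integers(5, 35, size=n),
        })

        # Cox proportional-hazards model: hazard ratios for in-hospital death by VARI/VAP
        # status, adjusted for APACHE II score, analogous to the estimates quoted above.
        cph = CoxPHFitter()
        cph.fit(df, duration_col="days_in_hospital", event_col="died")
        cph.print_summary()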

    The Uncertainty of Gis-based Interpolation Methods in Constructing Shallow Groundwater Distribution Map: A Case Study at Pleiku City, Gia Lai Province

    Four interpolation methods were applied to map the shallow groundwater level in Pleiku city: inverse distance weighting, tension spline, universal kriging, and ordinary kriging. Cross-validation showed that ordinary kriging was the best interpolation method, giving the lowest RMSE and the highest R2 values, and it was therefore selected to assess spatial and seasonal change in the shallow groundwater level. Based on the levels interpolated by ordinary kriging, the study area divides into a northern and a southern part: in the northern part the groundwater is shallower, with depths generally less than about 15 m, while in the southern part most groundwater depths exceed 18 m. Groundwater levels rise in the rainy season over an area accounting for 72.6% of the natural area. Conversely, groundwater levels decline in the rainy season in some regions, concentrated around the Bien Ho lake area and two regions in the southern part of the city, which together account for 27.4% of the natural area.
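
    Interpolator comparisons of this kind are commonly made with leave-one-out cross-validation: each well is withheld in turn, predicted from the remaining wells, and RMSE and R2 are computed over the held-out predictions. The numpy-only sketch below uses inverse distance weighting as the example interpolator (a kriging model, e.g. from pykrige, would slot into the same loop); the coordinates and depths are made up, not the study's data.

        import numpy as np

        def idw_predict(xy_train, z_train, xy_test, power=2.0):
            """Inverse-distance-weighted prediction at xy_test from observed wells."""
            d = np.linalg.norm(xy_test[:, None, :] - xy_train[None, :, :], axis=2)
            d = np.maximum(d, 1e-9)                 # guard against zero distances
            w = 1.0 / d ** power
            return (w * z_train).sum(axis=1) / w.sum(axis=1)

        def loo_cv(xy, z, predict):
            """Leave-one-out cross-validation returning RMSE and R^2."""
            preds = np.empty_like(z)
            for i in range(len(z)):
                mask = np.arange(len(z)) != i
                preds[i] = predict(xy[mask], z[mask], xy[i:i + 1])[0]
            resid = z - preds
            rmse = float(np.sqrt(np.mean(resid ** 2)))
            r2 = float(1 - np.sum(resid ** 2) / np.sum((z - z.mean()) ** 2))
            return rmse, r2

        # Hypothetical well coordinates (m) and groundwater depths (m), not the study data.
        rng = np.random.default_rng(1)
        xy = rng.uniform(0, 10_000, size=(60, 2))
        z = 12 + 0.0006 * xy[:, 0] + rng.normal(0, 1.0, size=60)
        print(loo_cv(xy, z, idw_predict))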

    Characterizing the spatial distribution of coral reefs in the South-Central Coast region of Viet Nam using Planetscope imagery

    This study aims to understand the spatial distribution of coral reefs in the central region of Viet Nam. We classified live coral cover in Son Tra Peninsula (ST) and Cu Lao Cham Island (CLC) in the South-Central Coast Region of Viet Nam using the Maximum Likelihood Classifier on 3 m Planetscope imagery. Confusion matrices and the accuracy of the classifier were assessed using field data (1,543 and 1,560 photographs in ST and CLC, respectively). The results showed that reef width ranged from 30 to 300 m across the study sites, and we were able to detect live coral cover across a depth gradient of 2 to 6 m below the sea surface. The overall accuracies of the classifier (and Kappa coefficients) were 76.78% (0.76) and 78.08% (0.78) for ST and CLC, respectively. We found that 60.25% of the coral reefs in ST were unhealthy, with live coral cover of less than 50%, while 25.75% and 11.46% of those in CLC were in good and excellent condition, respectively. This study demonstrates the feasibility of using Planetscope imagery to monitor shallow coral reefs of small islands at a high spatial resolution of 3 m. The results provide valuable information for coral reef protection and conservation.
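
    Both the overall accuracy and the Kappa coefficient reported above are derived from the confusion matrix of classified versus field-observed classes. A small sketch of the computation (the matrix shown is invented for illustration, not the study's data):

        import numpy as np

        def overall_accuracy_and_kappa(cm):
            """Overall accuracy and Cohen's kappa from a square confusion matrix."""
            cm = np.asarray(cm, dtype=float)
            total = cm.sum()
            observed = np.trace(cm) / total                                  # proportion correctly classified
            expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2  # chance agreement
            return observed, (observed - expected) / (1 - expected)

        # Hypothetical 3-class matrix: rows = field reference photos, columns = classified pixels.
        cm = [[120, 10, 5],
              [12, 95, 8],
              [6, 9, 135]]
        print(overall_accuracy_and_kappa(cm))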

    A novel solution of enhanced loss function using deep learning in sleep stage classification : predict and diagnose patients with sleep disorders

    Sleep stage classification is important to accurately predict and diagnose patients with sleep disorders. Although various deep learning approaches have been implemented to classify sleep stages, they have limitations that affect the accuracy and processing time of the classification model. The aim of this research is to enhance the accuracy and minimize the training time of the deep learning classification model. The proposed system consists of a one-dimensional convolutional neural network (CNN) with an enhanced loss function to improve the accuracy of scoring five different sleep classes. Preprocessing, feature extraction and classification are the main components of the proposed system. Initially, EEG signals are fed to an adaptive filter for preprocessing, in order to remove any noise in the signal. Thereafter, features are extracted through multiple convolutional and pooling layers, and finally classification is done by a fully connected layer using softmax activation with the enhanced loss function. The proposed solution was tested on data samples from multiple datasets with five sleep classes. Based on the obtained results, the proposed solution was found to achieve an accuracy of 96.26%, which is almost 4.2% higher than the state-of-the-art accuracy of 92.76%. Furthermore, the processing time was reduced by 11 milliseconds compared with the state-of-the-art solution. The proposed system focuses on classifying sleep stages into five classes using EEG signals with a deep learning approach. It enhances the loss function in order to minimize errors in the prediction of sleep classes and improves the accuracy of the model. Furthermore, the training time of the model has also been reduced by applying batch normalization inside the model. In the future, larger datasets of different sleep disorder patients with varying features can be used for training and implementing the proposed solution. The datasets can also be pre-processed using additional techniques to refine the data before feeding them to the neural network model.
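
    The abstract does not specify the architecture or the exact form of the enhanced loss, so the Keras sketch below is only a generic skeleton for five-class sleep scoring from single-channel EEG: a 1D CNN with batch normalization and a per-class weighted cross-entropy standing in for the enhanced loss function. Layer sizes, the epoch length, and the class weights are assumptions, not the paper's configuration.

        import tensorflow as tf

        NUM_CLASSES = 5
        EPOCH_SAMPLES = 3000                                     # assumed 30-s EEG epoch at 100 Hz
        CLASS_WEIGHTS = tf.constant([1.0, 1.5, 1.0, 2.0, 1.5])   # illustrative stand-in weights

        def weighted_categorical_crossentropy(y_true, y_pred):
            """Stand-in for the enhanced loss: per-class weighted cross-entropy."""
            y_pred = tf.clip_by_value(y_pred, 1e-7, 1.0)
            return -tf.reduce_sum(CLASS_WEIGHTS * y_true * tf.math.log(y_pred), axis=-1)

        model = tf.keras.Sequential([
            tf.keras.Input(shape=(EPOCH_SAMPLES, 1)),            # single-channel EEG epoch
            tf.keras.layers.Conv1D(32, kernel_size=50, strides=6, activation="relu"),
            tf.keras.layers.BatchNormalization(),                # speeds up training, as noted in the abstract
            tf.keras.layers.MaxPooling1D(pool_size=8),
            tf.keras.layers.Conv1D(64, kernel_size=8, activation="relu"),
            tf.keras.layers.BatchNormalization(),
            tf.keras.layers.MaxPooling1D(pool_size=4),
            tf.keras.layers.GlobalAveragePooling1D(),
            tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
        ])
        model.compile(optimizer="adam", loss=weighted_categorical_crossentropy, metrics=["accuracy"])
        model.summary()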

    A novel solution of using deep learning for left ventricle detection : enhanced feature extraction

    Background and aim: deep learning algorithms have not been used successfully for left ventricle (LV) detection in echocardiographic images because of overfitting and the vanishing gradient problem. This research aims to increase accuracy and improve the processing time of left ventricle detection by reducing overfitting and the vanishing gradient problem. Methodology: the proposed system consists of an enhanced deep convolutional neural network with an extra convolutional layer and a dropout layer to address overfitting and vanishing gradients. Data augmentation was used to increase the accuracy of feature extraction for left ventricle detection. Results: four pathological groups of datasets were used for training and evaluation of the model: heart failure without infarction, heart failure with infarction, hypertrophy, and healthy. The proposed model provided an accuracy of 94% in left ventricle detection across all groups, compared with other current systems. The results showed that the processing time was reduced from 0.45 s to 0.34 s on average. Conclusion: the proposed system enhances accuracy and decreases processing time in left ventricle detection, and addresses the issue of overfitting of the data.
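
    The two concrete design elements named above are an extra convolutional layer plus dropout against overfitting, and data augmentation of the echocardiographic frames. The Keras sketch below shows how such pieces are typically wired together, framed here as a simple frame-level LV presence classifier for illustration; the image size, layer widths, and augmentation ranges are assumptions, not the paper's configuration.

        import tensorflow as tf

        IMG_SHAPE = (112, 112, 1)                    # assumed grayscale echocardiographic frames

        # On-the-fly data augmentation; these layers are active only during training.
        augment = tf.keras.Sequential([
            tf.keras.layers.RandomRotation(0.05),
            tf.keras.layers.RandomTranslation(0.05, 0.05),
            tf.keras.layers.RandomZoom(0.1),
        ])

        model = tf.keras.Sequential([
            tf.keras.Input(shape=IMG_SHAPE),
            augment,
            tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
            tf.keras.layers.MaxPooling2D(),
            tf.keras.layers.Conv2D(64, 3, activation="relu", padding="same"),
            tf.keras.layers.MaxPooling2D(),
            tf.keras.layers.Conv2D(128, 3, activation="relu", padding="same"),  # the "extra" conv layer
            tf.keras.layers.MaxPooling2D(),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(128, activation="relu"),
            tf.keras.layers.Dropout(0.5),                    # dropout to curb overfitting
            tf.keras.layers.Dense(1, activation="sigmoid"),  # LV present / absent in the frame
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
        model.summary()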