70 research outputs found
Characteristics of Fatal Cases of Pandemic Influenza A (H1N1) from September 2009 to January 2010 in Saurashtra Region, India
Background: India reported its first case of 2009 pandemic influenza A (H1N1) virus infection in May 2009, and the Saurashtra region its first in August 2009. We describe the characteristics of fatal cases of 2009 influenza A (H1N1) infection reported in the Saurashtra region. Methods: From September 2009 to January 2010, we observed 71 fatal cases infected with the 2009 influenza A (H1N1) virus and admitted to different hospitals in Rajkot city. Real-time reverse-transcriptase polymerase chain reaction (RT-PCR) testing was used to confirm infection; the clinico-epidemiological features were observed and documented. Results: The median age of the 71 deceased patients was 29 years, and 57.7% were female. The median time from onset of illness to diagnosis of influenza A (H1N1) was 5 days, and 57.7% of patients were referred from a general practitioner (OR=0.42, CI=0.24-0.74). The median hospital stay was 3 days. All admitted patients received oseltamivir, but only 16.9% received it within 2 days of onset of illness. The most common symptoms were cough (97.2%), fever (93%), sore throat, and shortness of breath. Co-morbid conditions were present in almost half of the patients who died, the most common of which was pregnancy (OR=0.15, CI=0.04-0.52). Radiological pneumonia was reported in 98% of patients. Conclusion: Residing in an urban area, delayed referral from a general practitioner, and the presence of a co-existing condition, especially pregnancy, were responsible for mortality among influenza A (H1N1)-positive patients.
Bacteriological analysis of bile in cholecystectomy patients
Background: Cholecystectomy is currently a frequently performed operation. The presence of gallstones within either the gallbladder or the biliary tree is associated with bacterial colonization of the bile. Acute cholangitis spans a continuous clinical spectrum and can progress from a local biliary infection to advanced disease with sepsis and multiple organ dysfunction syndrome. Therefore, it is important to know the microbiological flora of the gallbladder before prophylactic antibiotics are given. Aims & objectives: To evaluate the microbiological profile of bile from the gallbladder in patients undergoing cholecystectomy, and to determine the appropriate antibiotic for preoperative prophylaxis in cholecystectomy patients based on that profile. Methods: This was a prospective study carried out in SSG Hospital. A total of 78 patients who underwent cholecystectomy and met the inclusion criteria were included in the study. Before cholecystectomy, 3 cc of bile was aspirated from the gallbladder of each patient and transported to the laboratory in a sterile test tube. Each specimen was evaluated to determine whether it was sterile or contained bacteria, which types of bacteria were present, whether the amount of isolate was significant, and the sensitivity of the isolates to antibiotics. Results: 19 patients had a positive bile culture, in which Escherichia coli was the most commonly isolated bacterium (63.16% of positive bile cultures and 15.38% of all patients); bile was sterile in 59 patients (75.64%). Other organisms isolated were Pseudomonas (3.85%), Klebsiella (2.56%), and coagulase-negative Staphylococcus and Staphylococcus viridans (1.28% each). A positive bile culture was a more common finding in patients with acute cholecystitis in this study (50% of these patients were bile culture positive). Post-operative wound infection was more common (15.79%) in the group of patients with an organism isolated from bile.
There was a strong correlation between bile culture and wound culture (75%). Conclusions: Sensitivity to third- and fourth-generation cephalosporins was higher than to aminoglycosides in both acute and chronic cholecystitis. In this study, levofloxacin also showed good sensitivity against the organisms isolated from bile. Piperacillin-tazobactam likewise showed good sensitivity against the isolated organisms and was more effective against Pseudomonas. Resistance to second-generation cephalosporins and aminoglycosides has increased. For preoperative prophylaxis, third- and fourth-generation cephalosporins and levofloxacin show better promise and may be used as the first line of preoperative prophylaxis in patients with acute or chronic cholecystitis undergoing cholecystectomy.
Novel subtype of mucopolysaccharidosis caused by arylsulfatase K (ARSK) deficiency
BACKGROUND: Mucopolysaccharidoses (MPS) are monogenic metabolic disorders that significantly affect the skeleton. Eleven enzyme defects in the lysosomal degradation of glycosaminoglycans (GAGs) have been assigned to the known MPS subtypes (I-IX). Arylsulfatase K (ARSK) is a recently characterised lysosomal hydrolase involved in GAG degradation that removes the 2-O-sulfate group from 2-sulfoglucuronate. Knockout of Arsk in mice was consistent with mild storage pathology, but no human phenotype has yet been described. METHODS: In this study, we report four affected individuals of two unrelated consanguineous families with homozygous variants c.250C>T, p.(Arg84Cys) and c.560T>A, p.(Leu187Ter) in ARSK, respectively. Functional consequences of the two ARSK variants were assessed by mutation-specific ARSK constructs derived by site-directed mutagenesis, which were ectopically expressed in HT1080 cells. Urinary GAG excretion was analysed by dimethylene blue and electrophoresis, as well as liquid chromatography/mass spectrometry (LC-MS)/MS analysis. RESULTS: The phenotypes of the affected individuals include MPS features, such as short stature, coarse facial features and dysostosis multiplex. Reverse phenotyping in two of the four individuals revealed additional cardiac and ophthalmological abnormalities. Mild elevation of dermatan sulfate was detected in the two subjects investigated by LC-MS/MS. Human HT1080 cells expressing the ARSK-Leu187Ter construct exhibited absent protein levels by western blot, and cells with the ARSK-Arg84Cys construct showed markedly reduced enzyme activity in an ARSK-specific enzymatic assay against 2-O-sulfoglucuronate-containing disaccharides as analysed by C18-reversed-phase chromatography followed by MS. CONCLUSION: Our work provides a detailed clinical and molecular characterisation of a novel subtype of mucopolysaccharidosis, which we suggest to designate subtype X
Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome : Insights from the LUNG SAFE study
Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we wished to determine the prevalence and the outcomes associated with hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled criteria of ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of 2005 patients that met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) out of 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
Effect of surgical experience and spine subspecialty on the reliability of the AO Spine Upper Cervical Injury Classification System
OBJECTIVE
The objective of this paper was to determine the interobserver reliability and intraobserver reproducibility of the AO Spine Upper Cervical Injury Classification System based on surgeon experience (< 5 years, 5–10 years, 10–20 years, and > 20 years) and surgical subspecialty (orthopedic spine surgery, neurosurgery, and "other" surgery).
METHODS
A total of 11,601 assessments of upper cervical spine injuries were evaluated based on the AO Spine Upper Cervical Injury Classification System. Reliability and reproducibility scores were obtained twice, with a 3-week time interval. Descriptive statistics were utilized to examine the percentage of accurately classified injuries, and Pearson’s chi-square or Fisher’s exact test was used to screen for potentially relevant differences between study participants. Kappa coefficients (κ) determined the interobserver reliability and intraobserver reproducibility.
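The kappa statistic used here can be illustrated with a minimal sketch of unweighted Cohen's kappa for a single pair of raters. This is an assumption for illustration only: the study's many-rater design may use a pairwise or Fleiss-style generalisation, which the abstract does not specify, and the labels `A1`, `B2`, `C1` are hypothetical injury categories rather than the classification system's actual codes.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa: chance-corrected agreement for two raters."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of cases where the raters assign the same label.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical classifications of 8 injuries by two surgeons.
a = ["A1", "B2", "C1", "A1", "B2", "C1", "A1", "B2"]
b = ["A1", "B2", "C1", "A1", "C1", "C1", "A1", "B1"]
print(round(cohens_kappa(a, b), 3))  # prints 0.652
```

A value of 0.652 falls in the "substantial" band (0.61-0.80) of the conventional Landis-Koch interpretation, the same banding the results below use for "substantial", "moderate", and "fair" agreement.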
RESULTS
The intraobserver reproducibility was substantial for surgeon experience level (< 5 years: 0.74 vs 5–10 years: 0.69 vs 10–20 years: 0.69 vs > 20 years: 0.70) and surgical subspecialty (orthopedic spine: 0.71 vs neurosurgery: 0.69 vs other: 0.68). Furthermore, the interobserver reliability was substantial for all surgical experience groups on assessment 1 (< 5 years: 0.67 vs 5–10 years: 0.62 vs 10–20 years: 0.61 vs > 20 years: 0.62), and only surgeons with > 20 years of experience did not have substantial reliability on assessment 2 (< 5 years: 0.62 vs 5–10 years: 0.61 vs 10–20 years: 0.61 vs > 20 years: 0.59). Orthopedic spine surgeons and neurosurgeons had substantial interobserver reliability on both assessment 1 (0.64 vs 0.63) and assessment 2 (0.62 vs 0.63), while other surgeons had moderate reliability on assessment 1 (0.43) and fair reliability on assessment 2 (0.36).
CONCLUSIONS
The international reliability and reproducibility scores for the AO Spine Upper Cervical Injury Classification System demonstrated substantial intraobserver reproducibility and interobserver reliability regardless of surgical experience and spine subspecialty. These results support the global application of this classification system.
Phenotype and genotype in patients with Larsen syndrome: clinical homogeneity and allelic heterogeneity in seven patients
Finishing the euchromatic sequence of the human genome
The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.
Global burden of 288 causes of death and life expectancy decomposition in 204 countries and territories and 811 subnational locations, 1990–2021: a systematic analysis for the Global Burden of Disease Study 2021
BACKGROUND Regular, detailed reporting on population health by underlying cause of death is fundamental for public health decision making. Cause-specific estimates of mortality and the subsequent effects on life expectancy worldwide are valuable metrics to gauge progress in reducing mortality rates. These estimates are particularly important following large-scale mortality spikes, such as the COVID-19 pandemic. When systematically analysed, mortality rates and life expectancy allow comparisons of the consequences of causes of death globally and over time, providing a nuanced understanding of the effect of these causes on global populations. METHODS The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 cause-of-death analysis estimated mortality and years of life lost (YLLs) from 288 causes of death by age-sex-location-year in 204 countries and territories and 811 subnational locations for each year from 1990 until 2021. The analysis used 56 604 data sources, including data from vital registration and verbal autopsy as well as surveys, censuses, surveillance systems, and cancer registries, among others. As with previous GBD rounds, cause-specific death rates for most causes were estimated using the Cause of Death Ensemble model (a modelling tool developed for GBD to assess the out-of-sample predictive validity of different statistical models and covariate permutations, and to combine those results to produce cause-specific mortality estimates), with alternative strategies adapted to model causes with insufficient data, substantial changes in reporting over the study period, or unusual epidemiology. YLLs were computed as the product of the number of deaths for each cause-age-sex-location-year and the standard life expectancy at each age. As part of the modelling process, uncertainty intervals (UIs) were generated using the 2·5th and 97·5th percentiles from a 1000-draw distribution for each metric.
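The YLL and uncertainty-interval definitions above reduce to simple arithmetic. The sketch below uses entirely hypothetical age bands, death counts, standard life expectancies, and a normal spread standing in for the real draw-level estimates (none of these numbers come from GBD data):

```python
import random

# Hypothetical standard life expectancy (years remaining) at each age band.
standard_le = {0: 88.9, 25: 64.7, 50: 40.6, 75: 18.9}

# Hypothetical deaths from one cause in each age band.
deaths_by_age = {0: 120, 25: 340, 50: 910, 75: 2200}

# YLLs: deaths in each age band times the standard life expectancy at that age.
ylls = sum(deaths_by_age[age] * standard_le[age] for age in deaths_by_age)
print(round(ylls, 1))  # prints 111192.0

# 95% uncertainty interval: 2.5th and 97.5th percentiles of a 1000-draw
# distribution (here a normal spread stands in for real draw-level estimates).
random.seed(1)
draws = sorted(random.gauss(ylls, 0.05 * ylls) for _ in range(1000))
lower, upper = draws[24], draws[974]
```

The same percentile rule applied per metric gives the UIs quoted throughout the findings, such as "94·0 deaths (95% UI 89·2-100·0)".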
We decomposed life expectancy by cause of death, location, and year to show cause-specific effects on life expectancy from 1990 to 2021. We also used the coefficient of variation and the fraction of population affected by 90% of deaths to highlight concentrations of mortality. Findings are reported in counts and age-standardised rates. Methodological improvements for cause-of-death estimates in GBD 2021 include the expansion of the under-5 age group into four new age groups, enhanced methods to account for stochastic variation of sparse data, and the inclusion of COVID-19 and other pandemic-related mortality (which includes excess mortality associated with the pandemic, excluding COVID-19, lower respiratory infections, measles, malaria, and pertussis). For this analysis, 199 new country-years of vital registration cause-of-death data, 5 country-years of surveillance data, 21 country-years of verbal autopsy data, and 94 country-years of other data types were added to those used in previous GBD rounds. FINDINGS The leading causes of age-standardised deaths globally were the same in 2019 as they were in 1990; in descending order, these were ischaemic heart disease, stroke, chronic obstructive pulmonary disease, and lower respiratory infections. In 2021, however, COVID-19 replaced stroke as the second-leading age-standardised cause of death, with 94·0 deaths (95% UI 89·2-100·0) per 100 000 population. The COVID-19 pandemic shifted the rankings of the leading five causes, lowering stroke to the third-leading and chronic obstructive pulmonary disease to the fourth-leading position. In 2021, the highest age-standardised death rates from COVID-19 occurred in sub-Saharan Africa (271·0 deaths [250·1-290·7] per 100 000 population) and Latin America and the Caribbean (195·4 deaths [182·1-211·4] per 100 000 population).
The lowest age-standardised death rates from COVID-19 were in the high-income super-region (48·1 deaths [47·4-48·8] per 100 000 population) and southeast Asia, east Asia, and Oceania (23·2 deaths [16·3-37·2] per 100 000 population). Globally, life expectancy steadily improved between 1990 and 2019 for 18 of the 22 investigated causes. Decomposition of global and regional life expectancy showed the positive effect that reductions in deaths from enteric infections, lower respiratory infections, stroke, and neonatal disorders, among others, have had on survival over the study period. However, a net reduction of 1·6 years occurred in global life expectancy between 2019 and 2021, primarily due to increased death rates from COVID-19 and other pandemic-related mortality. Life expectancy was highly variable between super-regions over the study period, with southeast Asia, east Asia, and Oceania gaining 8·3 years (6·7-9·9) overall, while having the smallest reduction in life expectancy due to COVID-19 (0·4 years). The largest reduction in life expectancy due to COVID-19 occurred in Latin America and the Caribbean (3·6 years). Additionally, 53 of the 288 causes of death were highly concentrated in locations with less than 50% of the global population as of 2021, and these causes of death have become progressively more concentrated since 1990, when only 44 causes showed this pattern. The concentration phenomenon is discussed heuristically with respect to enteric and lower respiratory infections, malaria, HIV/AIDS, neonatal disorders, tuberculosis, and measles. INTERPRETATION Long-standing gains in life expectancy and reductions in many of the leading causes of death have been disrupted by the COVID-19 pandemic, the adverse effects of which were spread unevenly among populations. Despite the pandemic, there has been continued progress in combatting several notable causes of death, leading to improved global life expectancy over the study period.
Each of the seven GBD super-regions showed an overall improvement from 1990 to 2021, obscuring the negative effect in the years of the pandemic. Additionally, our findings regarding regional variation in causes of death driving increases in life expectancy hold clear policy utility. Analyses of shifting mortality trends reveal that several causes, once widespread globally, are now increasingly concentrated geographically. These changes in mortality concentration, alongside further investigation of changing risks, interventions, and relevant policy, present an important opportunity to deepen our understanding of mortality-reduction strategies. Examining patterns in mortality concentration might reveal areas where successful public health interventions have been implemented. Translating these successes to locations where certain causes of death remain entrenched can inform policies that work to improve life expectancy for people everywhere. FUNDING Bill & Melinda Gates Foundation
Effectiveness of a national quality improvement programme to improve survival after emergency abdominal surgery (EPOCH): a stepped-wedge cluster-randomised trial
Background: Emergency abdominal surgery is associated with poor patient outcomes. We studied the effectiveness of a national quality improvement (QI) programme to implement a care pathway to improve survival for these patients. Methods: We did a stepped-wedge cluster-randomised trial of patients aged 40 years or older undergoing emergency open major abdominal surgery. Eligible UK National Health Service (NHS) hospitals (those that had an emergency general surgical service, a substantial volume of emergency abdominal surgery cases, and contributed data to the National Emergency Laparotomy Audit) were organised into 15 geographical clusters and commenced the QI programme in a random order, based on a computer-generated random sequence, over an 85-week period with one geographical cluster commencing the intervention every 5 weeks from the second to the 16th time period. Patients were masked to the study group, but it was not possible to mask hospital staff or investigators. The primary outcome measure was mortality within 90 days of surgery. Analyses were done on an intention-to-treat basis. This study is registered with the ISRCTN registry, number ISRCTN80682973. Findings: Treatment took place between March 3, 2014, and Oct 19, 2015. 22 754 patients were assessed for eligibility. Of 15 873 eligible patients from 93 NHS hospitals, primary outcome data were analysed for 8482 patients in the usual care group and 7374 in the QI group. Eight patients in the usual care group and nine patients in the QI group were not included in the analysis because of missing primary outcome data. The primary outcome of 90-day mortality occurred in 1210 (16%) patients in the QI group compared with 1393 (16%) patients in the usual care group (HR 1·11, 0·96–1·28). Interpretation: No survival benefit was observed from this QI programme to implement a care pathway for patients undergoing emergency abdominal surgery.
Future QI programmes should ensure that teams have both the time and resources needed to improve patient care. Funding: National Institute for Health Research Health Services and Delivery Research Programme