
    Myocarditis following COVID-19 vaccine: incidence, presentation, diagnosis, pathophysiology, therapy, and outcomes put into perspective. A clinical consensus document supported by the Heart Failure Association of the European Society of Cardiology (ESC) and the ESC Working Group on Myocardial and Pericardial Diseases

    Over 10 million doses of COVID-19 vaccines based on RNA technology, viral vectors, recombinant protein, and inactivated virus have been administered worldwide. Although generally very safe, post-vaccine myocarditis can result from adaptive humoral and cellular, cardiac-specific inflammation within days and weeks of vaccination. Rates of vaccine-associated myocarditis vary by age and sex, with the highest rates in males between 12 and 39 years. The clinical course is generally mild, with rare cases of left ventricular dysfunction, heart failure, and arrhythmias. Mild cases are likely underdiagnosed, as cardiac magnetic resonance imaging (CMR) is not commonly performed even in suspected cases, and not at all in asymptomatic and mildly symptomatic patients. Hospitalization of symptomatic patients with electrocardiographic changes and increased plasma troponin levels is considered necessary in the acute phase to monitor for arrhythmias and potential decline in left ventricular function. In addition to evaluation of symptoms, electrocardiographic changes, and elevated troponin levels, CMR is the best non-invasive diagnostic tool, with endomyocardial biopsy restricted to severe cases with heart failure and/or arrhythmias. Management beyond guideline-directed treatment of heart failure and arrhythmias includes non-specific measures to control pain. Anti-inflammatory drugs such as non-steroidal anti-inflammatory drugs and corticosteroids have been used in more severe cases, with only anecdotal evidence for their effectiveness. In all age groups studied, the overall risks of SARS-CoV-2 infection-related hospitalization and death are far greater than the risks from post-vaccine myocarditis. This consensus statement serves as a practical resource for physicians to understand, diagnose, and manage affected patients in their clinical practice. Furthermore, it is intended to stimulate research in this area.

    Assessing socioeconomic health care utilization inequity in Israel: impact of alternative approaches to morbidity adjustment

    <p>Background</p> <p>The ability to accurately detect differential resource use between persons of different socioeconomic status relies on the accuracy of health-needs adjustment measures. This study tests different approaches to morbidity adjustment in explaining health care utilization inequity.</p> <p>Methods</p> <p>A representative sample was selected of 10 percent (~270,000) of adult enrolees of Clalit Health Services, Israel's largest health care organization. The Johns Hopkins University Adjusted Clinical Groups<sup>®</sup> were used to assess each person's overall morbidity burden based on one year's (2009) diagnostic information. The odds of above-average health care resource use (primary care visits, specialty visits, diagnostic tests, or hospitalizations) were tested using multivariate logistic regression models, separately adjusting for levels of health need using data on age and gender, comorbidity (using the Charlson Comorbidity Index), or morbidity burden (using the Adjusted Clinical Groups). Model fit was assessed using tests of the Area Under the Receiver Operating Characteristic Curve and the Akaike Information Criterion.</p> <p>Results</p> <p>Low socioeconomic status was associated with higher morbidity burden (1.5-fold difference). Adjusting for health needs using age and gender or the Charlson index, persons of low socioeconomic status had greater odds of above-average resource use for all types of services examined (primary care and specialist visits, diagnostic tests, or hospitalizations). In contrast, after adjustment for overall morbidity burden (using Adjusted Clinical Groups), low socioeconomic status was no longer associated with greater odds of specialty care or diagnostic tests (OR: 0.95, CI: 0.94-0.99; and OR: 0.91, CI: 0.86-0.96, for specialty visits and diagnostic tests, respectively). Tests of model fit showed that adjustment using the comprehensive morbidity burden measure provided a better fit than age and gender or the Charlson Index.</p> <p>Conclusions</p> <p>Identification of socioeconomic differences in health care utilization is an important step in disparity reduction efforts. By adjusting for health needs with a comprehensive, diagnoses-based morbidity burden measure, this study showed relative underutilization of specialist and diagnostic services, thus identifying inequity in health resource use that could not be detected with less comprehensive forms of health-needs adjustment.</p>
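The study's model comparison rests on the Akaike Information Criterion, which trades off fit (log-likelihood) against model complexity (parameter count). A minimal sketch of that comparison, using illustrative log-likelihoods and parameter counts (not figures from the study):

```python
import math

def aic(log_likelihood: float, n_params: int) -> float:
    """Akaike Information Criterion: AIC = 2k - 2*ln(L); lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fits of three logistic models predicting above-average
# resource use, mirroring the study's three adjustment strategies.
models = {
    "age_sex":  {"ll": -152000.0, "k": 3},  # age + gender only
    "charlson": {"ll": -149500.0, "k": 4},  # + Charlson Comorbidity Index
    "acg":      {"ll": -141000.0, "k": 8},  # + ACG morbidity burden categories
}

for name, m in models.items():
    print(f"{name}: AIC = {aic(m['ll'], m['k']):.1f}")

# The preferred model is the one with the lowest AIC.
best = min(models, key=lambda name: aic(models[name]["ll"], models[name]["k"]))
```

With these illustrative numbers the richer morbidity-burden model wins despite its extra parameters, which is the pattern the abstract reports.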

    A Prediction Model to Prioritize Individuals for a SARS-CoV-2 Test Built from National Symptom Surveys

    Background: The gold standard for COVID-19 diagnosis is detection of viral RNA through PCR. Due to global limitations in testing capacity, effective prioritization of individuals for testing is essential. Methods: We devised a model estimating the probability of an individual to test positive for COVID-19 based on answers to 9 simple questions that have been associated with SARS-CoV-2 infection. Our model was devised from a subsample of a national symptom survey that was answered over 2 million times in Israel in its first 2 months and a targeted survey distributed to all residents of several cities in Israel. Overall, 43,752 adults were included, from which 498 self-reported as being COVID-19 positive. Findings: Our model was validated on a held-out set of individuals from Israel where it achieved an auROC of 0.737 (CI: 0.712–0.759) and auPR of 0.144 (CI: 0.119–0.177) and demonstrated its applicability outside of Israel in an independently collected symptom survey dataset from the US, UK, and Sweden. Our analyses revealed interactions between several symptoms and age, suggesting variation in the clinical manifestation of the disease in different age groups. Conclusions: Our tool can be used online and without exposure to suspected patients, thus suggesting worldwide utility in combating COVID-19 by better directing the limited testing resources through prioritization of individuals for testing, thereby increasing the rate at which positive individuals can be identified. Moreover, individuals at high risk for a positive test result can be isolated prior to testing. Funding: E.S. is supported by the Crown Human Genome Center, Larson Charitable Foundation New Scientist Fund, Else Kroener Fresenius Foundation, White Rose International Foundation, Ben B. and Joyce E. 
Eisenberg Foundation, Nissenbaum Family, Marcos Pinheiro de Andrade and Vanessa Buchheim, Lady Michelle Michels, and Aliza Moussaieff, and grants funded by the Minerva Foundation with funding from the Federal German Ministry for Education and Research, and by the European Research Council and the Israel Science Foundation. H.R. is supported by the Israeli Council for Higher Education (CHE) via the Weizmann Data Science Research Center and by a research grant from Madame Olga Klein – Astrachan.
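A symptom-based screener of this kind is typically a logistic model over binary question answers, producing a probability used to rank individuals for testing. A minimal sketch with hypothetical question names and coefficients (the published model's nine questions and fitted weights are not reproduced here):

```python
import math

# Illustrative coefficients for a 9-question screener; all names and
# weights are assumptions for this sketch, not the study's fitted model.
COEFS = {
    "intercept": -4.0,
    "cough": 0.4, "fever": 0.9, "sore_throat": 0.2,
    "shortness_of_breath": 0.7, "loss_of_taste_or_smell": 2.0,
    "fatigue": 0.3, "headache": 0.1, "known_contact": 1.2,
    "age_over_60": 0.3,
}

def predicted_probability(answers: dict) -> float:
    """Logistic model: p = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    z = COEFS["intercept"] + sum(
        COEFS[q] * int(bool(v)) for q, v in answers.items() if q in COEFS
    )
    return 1.0 / (1.0 + math.exp(-z))

# Rank-ordering by this probability is what drives test prioritization.
p = predicted_probability({"fever": 1, "loss_of_taste_or_smell": 1,
                           "known_contact": 1})
```

The ranking induced by such scores is what the auROC and auPR figures in the abstract evaluate on held-out respondents.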

    The association between socio-demographic characteristics and adherence to breast and colorectal cancer screening: Analysis of large sub populations

    <p>Abstract</p> <p>Background</p> <p>Populations having lower socioeconomic status, as well as ethnic minorities, have demonstrated lower utilization of preventive screening, including tests for early detection of breast and colorectal cancer.</p> <p>The objective</p> <p>To explore socio-demographic disparities in adherence to screening recommendations for early detection of cancer.</p> <p>Methods</p> <p>The study was conducted by Maccabi Healthcare Services, an Israeli HMO (health plan) providing healthcare services to 1.9 million members. Utilization of breast cancer (BC) and colorectal cancer (CC) screening were analyzed by socio-economic ranks (SERs), ethnicity (Arab vs non-Arab), immigration status and ownership of voluntarily supplemental health insurance (VSHI).</p> <p>Results</p> <p>Data on 157,928 and 303,330 adults, eligible for BC and CC screening, respectively, were analyzed. Those having lower SER, Arabs, immigrants from Former Soviet Union countries and non-owners of VSHI performed fewer cancer screening examinations compared with those having higher SER, non-Arabs, veterans and owners of VSHI (p < 0.001). Logistic regression model for BC Screening revealed a positive association with age and ownership of VSHI and a negative association with being an Arab and having a lower SER. The model for CC screening revealed a positive association with age and ownership of VSHI and a negative association with being an Arab, having a lower SER and being an immigrant. The model estimated for BC and CC screening among females revealed a positive association with age and ownership of VSHI and a negative association with being an Arab, having a lower SER and being an immigrant.</p> <p>Conclusion</p> <p>Patients from low socio-economic backgrounds, Arabs, immigrants and those who do not own supplemental insurance do fewer tests for early detection of cancer. 
    These sub-populations should be considered priority populations for targeted intervention programs and improved resource allocation.</p>

    Influenza, Winter Olympiad, 2002

    Prospective surveillance for influenza was performed during the 2002 Salt Lake City Winter Olympics. Oseltamivir was administered to patients with influenza-like illness and confirmed influenza, while their close contacts were given oseltamivir prophylactically. Influenza A/B was diagnosed in 36 of 188 patients, including 13 athletes. Prompt management limited the spread of this outbreak.

    Antiviral resistance during pandemic influenza: implications for stockpiling and drug use

    <p>Abstract</p> <p>Background</p> <p>The anticipated extent of antiviral use during an influenza pandemic can have adverse consequences for the development of drug resistance and rationing of limited stockpiles. The strategic use of drugs is therefore a major public health concern in planning for effective pandemic responses.</p> <p>Methods</p> <p>We employed a mathematical model that includes both sensitive and resistant strains of a virus with pandemic potential, and applies antiviral drugs for treatment of clinical infections. Using estimated parameters in the published literature, the model was simulated for various sizes of stockpiles to evaluate the outcome of different antiviral strategies.</p> <p>Results</p> <p>We demonstrated that the emergence of highly transmissible resistant strains has no significant impact on the use of available stockpiles if treatment is maintained at low levels or the reproduction number of the sensitive strain is sufficiently high. However, moderate to high treatment levels can result in a more rapid depletion of stockpiles, leading to run-out, by promoting wide-spread drug resistance. We applied an antiviral strategy that delays the onset of aggressive treatment for a certain amount of time after the onset of the outbreak. Our results show that if high treatment levels are enforced too early during the outbreak, a second wave of infections can potentially occur with a substantially larger magnitude. However, a timely implementation of wide-scale treatment can prevent resistance spread in the population, and minimize the final size of the pandemic.</p> <p>Conclusion</p> <p>Our results reveal that conservative treatment levels during the early stages of the outbreak, followed by a timely increase in the scale of drug-use, will offer an effective strategy to manage drug resistance in the population and avoid run-out. 
For a 1918-like strain, the findings suggest that pandemic plans should consider stockpiling antiviral drugs to cover at least 20% of the population.</p>
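The dynamics described above can be sketched as a toy two-strain compartmental model in which treatment shortens the infectious period of the sensitive strain but occasionally selects for resistance. All rates below are illustrative assumptions, not the paper's fitted parameters:

```python
def simulate(treatment_fraction: float, days: int = 300, dt: float = 0.05) -> float:
    """Euler-integrated SIR with drug-sensitive (Is) and resistant (Ir) strains.

    Returns the final epidemic size (fraction of the population ever infected).
    Parameter values are illustrative only.
    """
    S, Is, Ir = 0.999, 0.001, 0.0        # susceptible, infected (by strain)
    beta_s, beta_r = 0.4, 0.35           # transmission rates (resistant pays a cost)
    gamma, gamma_t = 0.2, 0.33           # recovery rates: untreated vs treated
    mu = 0.02                            # per-treated-case resistance emergence

    for _ in range(int(days / dt)):
        f = treatment_fraction
        new_s = beta_s * S * Is * dt                     # new sensitive infections
        new_r = beta_r * S * Ir * dt                     # new resistant infections
        rec_s = ((1 - f) * gamma + f * (1 - mu) * gamma_t) * Is * dt
        emerge = f * mu * gamma_t * Is * dt              # treated cases turning resistant
        rec_r = gamma * Ir * dt                          # treatment fails on resistant strain
        S -= new_s + new_r
        Is += new_s - rec_s - emerge
        Ir += new_r + emerge - rec_r
    return 1.0 - S

final_untreated = simulate(0.0)
final_high_tx = simulate(0.8)
```

In this sketch, raising `treatment_fraction` suppresses the sensitive strain but feeds the resistant one, which is the trade-off the paper's delayed-escalation strategy is designed to manage.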

    Outbreak of Pneumonia in the Setting of Fatal Pneumococcal Meningitis among US Army Trainees: Potential Role of Chlamydia pneumoniae Infection

    <p>Abstract</p> <p>Background</p> <p>Compared to the civilian population, military trainees are often at increased risk for respiratory infections. We investigated an outbreak of radiologically-confirmed pneumonia that was recognized after 2 fatal cases of serotype 7F pneumococcal meningitis were reported in a 303-person military trainee company (Alpha Company).</p> <p>Methods</p> <p>We reviewed surveillance data on pneumonia and febrile respiratory illness at the training facility; conducted chart reviews for cases of radiologically-confirmed pneumonia; and administered surveys and collected nasopharyngeal swabs from trainees in the outbreak battalion (Alpha and Hotel Companies), associated training staff, and trainees newly joining the battalion.</p> <p>Results</p> <p>Among Alpha and Hotel Company trainees, the average weekly attack rates of radiologically-confirmed pneumonia were 1.4% and 1.2% (most other companies at FLW: 0-0.4%). The pneumococcal carriage rate among all Alpha Company trainees was 15% with a predominance of serotypes 7F and 3. <it>Chlamydia pneumoniae </it>was identified from 31% of specimens collected from Alpha Company trainees with respiratory symptoms.</p> <p>Conclusion</p> <p>Although the etiology of the outbreak remains unclear, the identification of both <it>S. pneumoniae </it>and <it>C. pneumoniae </it>among trainees suggests that both pathogens may have contributed either independently or as cofactors to the observed increased incidence of pneumonia in the outbreak battalion and should be considered as possible etiologies in outbreaks of pneumonia in the military population.</p>

    Impact of two interventions on timeliness and data quality of an electronic disease surveillance system in a resource limited setting (Peru): a prospective evaluation

    <p>Abstract</p> <p>Background</p> <p>A timely detection of outbreaks through surveillance is needed in order to prevent future pandemics. However, current surveillance systems may not be prepared to accomplish this goal, especially in resource limited settings. As data quality and timeliness are attributes that improve outbreak detection capacity, we assessed the effect of two interventions on such attributes in Alerta, an electronic disease surveillance system in the Peruvian Navy.</p> <p>Methods</p> <p>40 Alerta reporting units (18 clinics and 22 ships) were included in a 12-week prospective evaluation project. After a short refresher course on the notification process, units were randomly assigned to either a phone, visit or control group. Phone group sites were called three hours before the biweekly reporting deadline if they had not sent their report. Visit group sites received supervision visits on weeks 4 & 8, but no phone calls. The control group sites were not contacted by phone or visited. Timeliness and data quality were assessed by calculating the percentage of reports sent on time and percentage of errors per total number of reports, respectively.</p> <p>Results</p> <p>Timeliness improved in the phone group from 64.6% to 84% in clinics (+19.4 [95% CI, +10.3 to +28.6]; p < 0.001) and from 46.9% to 77.3% on ships (+30.4 [95% CI, +16.9 to +43.8]; p < 0.001). Visit and control groups did not show significant changes in timeliness. Error rates decreased in the visit group from 7.1% to 2% in clinics (-5.1 [95% CI, -8.7 to -1.4]; p = 0.007), but only from 7.3% to 6.7% on ships (-0.6 [95% CI, -2.4 to +1.1]; p = 0.445). Phone and control groups did not show significant improvement in data quality.</p> <p>Conclusion</p> <p>Regular phone reminders significantly improved timeliness of reports in clinics and ships, whereas supervision visits led to improved data quality only among clinics. 
    Further investigations are needed to establish the cost-effectiveness and optimal use of each of these strategies.</p>
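The reported effects are differences between two proportions with Wald-style 95% confidence intervals. A minimal sketch of that computation, using assumed report counts chosen to roughly match the clinic phone-group percentages (not the study's raw data):

```python
import math

def diff_of_proportions_ci(x1: int, n1: int, x2: int, n2: int, z: float = 1.96):
    """Wald 95% CI for the difference p2 - p1 between two proportions.

    Returns the difference and CI bounds in percentage points.
    """
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    d = p2 - p1
    return 100 * d, (100 * (d - z * se), 100 * (d + z * se))

# Assumed counts: 62/96 on-time reports before, 81/96 after (illustrative).
d_pct, (lo, hi) = diff_of_proportions_ci(62, 96, 81, 96)
print(f"difference: +{d_pct:.1f} points (95% CI {lo:.1f} to {hi:.1f})")
```

With larger samples the interval narrows, which is why the study's clinic and ship estimates carry CIs of different widths.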