41 research outputs found

    The 18-Year Risk of Cancer, Angioedema, Insomnia, Depression, and Erectile Dysfunction in Association With Antihypertensive Drugs: Post-Trial Analyses From ALLHAT-Medicare Linked Data

    PURPOSE: This study aimed to determine the 18-year risk of cancer, angioedema, insomnia, depression, and erectile dysfunction in association with antihypertensive drug use. METHODS: This is a post-trial passive follow-up study of Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial (ALLHAT) participants enrolled between 1994 and 1998, conducted by linking their follow-up data with Medicare claims data through 2017 for subjects who were free of the outcomes at baseline on 1 January 1999. The main outcomes were the occurrence of cancer, angioedema, insomnia, depression, and erectile dysfunction. RESULTS: The 18-year cumulative incidence rate of cancer other than non-melanoma skin cancer from Medicare inpatient claims was 23.9% for chlorthalidone, 23.4% for amlodipine, and 25.3% for lisinopril. Based on the adjusted hazard ratios, there were no statistically significant differences in the 18-year risk of cancer, depression, or erectile dysfunction among the three drugs. The adjusted 18-year risk of angioedema was higher in those receiving lisinopril than in those receiving amlodipine (hazard ratio: 1.63, 95% CI: 1.14-2.33) or chlorthalidone (1.33, 1.00-1.79), whereas the adjusted 18-year risk of insomnia was statistically significantly lower in those receiving lisinopril than in those receiving amlodipine (0.90, 0.81-1.00). CONCLUSIONS: The 18-year risk of angioedema was significantly higher in patients receiving lisinopril than in those receiving amlodipine or chlorthalidone; the risk of insomnia was significantly lower in patients receiving lisinopril than in those receiving amlodipine; and the risk of cancer, depression, and erectile dysfunction (in men) did not differ significantly among the three drug groups.
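The comparisons above rest on whether each adjusted hazard ratio's 95% CI excludes 1. When only the HR and its CI are reported, an approximate Wald z statistic and p-value can be recovered by assuming the interval was built symmetrically on the log scale, which is the usual Cox-model construction. This is an illustrative sketch, not the study's code:

```python
import math

def hr_significance(hr, lo, hi):
    """Approximate Wald z and two-sided p-value from a hazard ratio
    and its 95% CI, assuming the CI is log(HR) +/- 1.96 * SE."""
    se = (math.log(hi) - math.log(lo)) / (2 * 1.959964)
    z = math.log(hr) / se
    # two-sided normal tail probability via the error function
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Lisinopril vs amlodipine angioedema estimate from the abstract:
# HR 1.63 (95% CI 1.14-2.33)
z, p = hr_significance(1.63, 1.14, 2.33)
```

Applied to the angioedema estimate, this recovers p of roughly 0.007, consistent with the reported statistical significance.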

    Mortality and Morbidity Among Individuals With Hypertension Receiving a Diuretic, ACE Inhibitor, or Calcium Channel Blocker: A Secondary Analysis of a Randomized Clinical Trial

    IMPORTANCE: The long-term relative risk of antihypertensive treatments with regard to mortality and morbidity is not well understood. OBJECTIVE: To determine the long-term posttrial risk of primary and secondary outcomes among trial participants who were randomized to either a thiazide-type diuretic, calcium channel blocker (CCB), or angiotensin-converting enzyme (ACE) inhibitor with up to 23 years of follow-up. DESIGN, SETTING, AND PARTICIPANTS: This prespecified secondary analysis of the Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial (ALLHAT), a multicenter, randomized, double-blind, active-controlled clinical trial, followed up with participants aged 55 years or older with a diagnosis of hypertension and at least 1 other coronary heart disease risk factor for up to 23 years, from February 23, 1994, to December 31, 2017. Trial participants were linked with administrative databases for posttrial mortality (N = 32 804) and morbidity outcomes (n = 22 754). Statistical analysis was performed from January 2022 to October 2023. INTERVENTIONS: Participants were randomly assigned to receive a thiazide-type diuretic (n = 15 002), a CCB (n = 8898), or an ACE inhibitor (n = 8904) for planned in-trial follow-up of approximately 4 to 8 years and posttrial passive follow-up for up to 23 years. MAIN OUTCOMES AND MEASURES: The primary end point was mortality due to cardiovascular disease (CVD). Secondary outcomes included all-cause mortality, combined fatal and nonfatal (morbidity) CVD, and both mortality and morbidity for coronary heart disease, stroke, heart failure, end-stage renal disease, and cancer.
RESULTS: A total of 32 804 participants (mean [SD] age, 66.9 [7.7] years; 17 411 men [53.1%]; and 11 772 Black participants [35.9%]) were followed up for all-cause mortality and a subgroup of 22 754 participants (mean [SD] age, 68.7 [7.2] years; 12 772 women [56.1%]; and 8199 Black participants [36.0%]) were followed up for fatal or nonfatal CVD through 2017 (mean [SD] follow-up, 13.7 [6.7] years; maximum follow-up, 23.9 years). Cardiovascular disease mortality rates per 100 persons were 23.7, 21.6, and 23.8 in the diuretic, CCB, and ACE inhibitor groups, respectively, at 23 years after randomization (adjusted hazard ratio [AHR], 0.97 [95% CI, 0.89-1.05] for CCB vs diuretic; AHR, 1.06 [95% CI, 0.97-1.15] for ACE inhibitor vs diuretic). The long-term risks of most secondary outcomes were similar among the 3 groups. Compared with the diuretic group, the ACE inhibitor group had a 19% increased risk of stroke mortality (AHR, 1.19 [95% CI, 1.03-1.37]) and an 11% increased risk of combined fatal and nonfatal hospitalized stroke (AHR, 1.11 [95% CI, 1.03-1.20]). CONCLUSIONS AND RELEVANCE: In this secondary analysis of a randomized clinical trial in an adult population with hypertension and coronary heart disease risk factors, CVD mortality was similar among all 3 groups. ACE inhibitors increased the risk of stroke outcomes by 11% compared with diuretics, and this effect persisted well beyond the trial period. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT00000542.
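Long-term follow-up results like these are typically summarized with survival curves before fitting regression models. As a minimal illustration of the underlying Kaplan-Meier estimator (the trial's own analyses used Cox proportional hazards models; the inputs below are toy values, not ALLHAT data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  : follow-up time per participant
    events : 1 if the event (e.g. CVD death) occurred, 0 if censored
    Returns (time, survival probability) pairs at each event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = at_t = 0
        while i < len(data) and data[i][0] == t:   # group ties at time t
            at_t += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk         # KM product-limit step
            curve.append((t, surv))
        n_at_risk -= at_t                          # remove events + censored
    return curve

# Toy cohort: 5 participants, follow-up in years, 1 = death, 0 = censored
curve = kaplan_meier([2, 3, 3, 5, 8], [1, 0, 1, 1, 0])
```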

    An Assessment of Outpatient Clinic Room Ventilation Systems and Possible Relationship to Disease Transmission

    BACKGROUND: With healthcare shifting to the outpatient setting, this study examined whether outpatient clinics operating in business occupancy settings were conducting procedures in rooms with ventilation rates above, at, or below thresholds defined in the American National Standards Institute/American Society of Heating, Refrigerating and Air-Conditioning Engineers/American Society for Health Care Engineering Standard 170 for Ventilation in Health Care Facilities, and whether lower ventilation rates and building characteristics increase the risk of disease transmission. METHODS: Ventilation rates were measured in 105 outpatient clinic rooms categorized by services rendered. Building characteristics were evaluated as determinants of ventilation rates, and risk of disease transmission was estimated using the Gammaitoni-Nucci model. RESULTS: When compared to Standard 170, 10% of clinic rooms assessed did not meet the minimum requirement for general exam rooms, 39% did not meet the requirement for treatment rooms, 83% did not meet the requirement for aerosol-generating procedures, and 88% did not meet the requirement for procedure rooms or minor surgical procedures. CONCLUSIONS: Lower-than-standard air changes per hour were observed and could lead to an increased risk of disease spread when conducting advanced procedures and evaluating persons of interest for emerging infectious diseases. These findings are pertinent during the SARS-CoV-2 pandemic, as working guidelines are established for the healthcare community.
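The Gammaitoni-Nucci model extends the Wells-Riley framework to time-varying quanta concentrations in a well-mixed room, which is how lower air changes per hour (ACH) translate into higher infection risk. A hedged sketch of the transient form with zero initial quanta; all parameter values are illustrative assumptions, not measurements from the study:

```python
import math

def infection_risk(ach, volume_m3, duration_h, quanta_per_h=10.0,
                   breathing_m3_h=0.5, n_infectors=1):
    """Probability of infection for one susceptible occupant in a
    well-mixed room (transient Wells-Riley / Gammaitoni-Nucci style),
    assuming zero quanta at the start of the visit and removal by
    ventilation only."""
    q_flow = ach * volume_m3   # outdoor-air flow, m3/h
    lam = ach                  # removal rate constant, 1/h
    # inhaled dose = breathing rate * time-integral of quanta concentration
    dose = (n_infectors * quanta_per_h * breathing_m3_h / q_flow) * (
        duration_h - (1 - math.exp(-lam * duration_h)) / lam)
    return 1 - math.exp(-dose)

# Same 30-minute visit in a 30 m3 exam room at two ventilation rates:
low_ach_risk = infection_risk(ach=2, volume_m3=30, duration_h=0.5)
high_ach_risk = infection_risk(ach=12, volume_m3=30, duration_h=0.5)
```

Under these assumptions the poorly ventilated room carries roughly 2-3 times the risk of the well-ventilated one for the same visit, which is the qualitative point the abstract makes.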

    Design for a Cluster-Randomized Controlled Trial to Evaluate the Effects of the CATCH Healthy Smiles School-Based Oral Health Promotion Intervention Among Elementary School Children

    BACKGROUND: The top two oral diseases (tooth decay and gum disease) are preventable, yet dental caries is the most common childhood disease, with 68% of children entering kindergarten having tooth decay. CATCH Healthy Smiles is a coordinated school health program to prevent cavities among students in kindergarten, 1st, and 2nd grade, and is based on the framework of the Coordinated Approach to Child Health (CATCH), an evidence-based coordinated school health program. CATCH has undergone several cluster-randomized controlled trials (CRCTs) demonstrating sustainable long-term effectiveness in addressing the factors surrounding children, improving eating and physical activity behaviors, and reducing obesity prevalence among low-income, ethnically diverse children. The aim of this paper is to describe the design of the CATCH Healthy Smiles CRCT to determine the effectiveness of an oral health school-based behavioral intervention in reducing the incidence of dental caries among children. METHODS: In this CRCT, 30 schools serving low-income, ethnically diverse children in the greater Houston area are recruited and randomized into intervention and comparison groups. From these schools, 1020 kindergarten children (n = 510 children from 15 schools per group) will be recruited and followed through 2nd grade. The intervention, consisting of four components (classroom curriculum, toothbrushing routine, family outreach, and schoolwide coordinated activities), will be implemented for three years in the intervention schools, whereas the control schools will be offered free training and materials to implement a sun safety curriculum in the meantime. Outcome evaluation will be conducted at four time points throughout the study period, each consisting of three components: dental assessment, child anthropometric measures, and parent survey.
The dental assessment will use the International Caries Detection and Assessment System (ICDAS) to measure the primary outcome of this study: incidence of dental caries in primary teeth as measured at the tooth-surface level (dfs). The parent self-report survey measures secondary outcomes of this study, such as oral health-related behavioral and psychosocial factors. A modified crude caries increment (mCCI) will be used to calculate the primary outcome of the CATCH Healthy Smiles CRCT, and a two-tailed test of the null hypothesis will be conducted to evaluate the intervention effect, accounting for between- and within-cluster variances by computing the weighted average of the mCCI ratios by cluster. CONCLUSION: If found to be effective, a platform for scalability, sustainability, and dissemination of CATCH already exists, opening a new line of research in school oral health. CLINICAL TRIALS IDENTIFIER: ClinicalTrials.gov - NCT04632667.
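One plausible reading of the cluster-weighted step described above is a size-weighted average of per-cluster mCCI ratios. The sketch below uses that reading with hypothetical per-school ratios and enrollment counts; the trial's actual weights and variance handling may differ:

```python
def weighted_cluster_average(ratios, sizes):
    """Weighted average of per-cluster summary ratios, weighting each
    cluster (school) by its number of enrolled children. A minimal
    sketch of pooling cluster-level summaries, not the trial's code."""
    total = sum(sizes)
    return sum(r * n for r, n in zip(ratios, sizes)) / total

# Hypothetical per-school mCCI ratios and enrolled-child counts
avg_ratio = weighted_cluster_average([0.8, 1.1, 0.9], [30, 40, 34])
```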

    Occupation and Risk of Traumatic Brain Injury in the Millennium Cohort Study

    INTRODUCTION: Traumatic brain injury (TBI) is an occupational health hazard of military service. Few studies have examined differences among military occupational categories (MOCs), which take into consideration the physical demands and job requirements across occupational groups. METHODS: This study was approved by the University of Texas Health Science Center at Houston Institutional Review Board. Data for this cross-sectional study were obtained from the Naval Health Research Center's Millennium Cohort Study, an ongoing DoD study. Univariate analyses were employed to calculate frequencies and proportions for all variables. Bivariate analyses included unadjusted odds ratios (ORs) and 95% CIs for the association between all variables and TBI. Multivariable logistic regression was used to calculate adjusted ORs and 95% CIs to assess the association between MOC and TBI, adjusted for potential confounders: sex, race/ethnicity, rank, military status, branch of service, before-service TBI, and panel. Logistic regression models estimated odds of TBI for each MOC, and stratified models estimated odds separately for enlisted and officer MOCs. RESULTS: Approximately 27% of all participants reported experiencing a service-related TBI. All MOCs were statistically significantly associated with increased odds of service-related TBI, with increases ranging from 16% to 45%, except for Health Care MOCs (OR: 1.01, 95% CI 0.91-1.13). Service members in Infantry/Tactical Operations had the highest odds (OR: 1.45, 95% CI 1.31-1.61) of service-related TBI as compared to Administration & Executives. Among enlisted service members, approximately 28% reported experiencing a service-related TBI.
Among enlisted-specific MOCs, the odds of TBI were elevated for those serving in Infantry, Gun Crews, Seamanship (OR: 1.79, 95% CI 1.58-2.02), followed by Electrical/Mechanical Equipment Repairers (OR: 1.23, 95% CI 1.09-1.38), Service & Supply Handlers (OR: 1.21, 95% CI 1.08-1.37), Other Technical & Allied Specialists (OR: 1.21, 95% CI 1.02-1.43), Health Care Specialists (OR: 1.19, 95% CI 1.04-1.36), and Communications & Intelligence (OR: 1.16, 95% CI 1.02-1.31), compared to Functional Support & Administration. Among officer service members, approximately 24% reported experiencing a service-related TBI. After adjustment, the odds of TBI were significant for those serving as Health Care Officers (OR: 0.65, 95% CI: 0.52-0.80) and Intelligence Officers (OR: 1.27, 95% CI: 1.01-1.61). CONCLUSIONS: A strength of this analysis is the breakdown of MOC associations with TBI stratified by enlisted and officer ranks, which has not been previously reported. Given the significantly increased odds of service-related TBI reporting within enlisted ranks, further exploration into the location (deployed versus non-deployed) and mechanism (e.g., blast, training, sports) of these injuries is needed. Understanding injury patterns within these military occupations is necessary to improve TBI identification, treatment, and, foremost, prevention. Results highlight the importance of examining specific occupational categories rather than relying on gross categorizations, which do not account for shared knowledge, skills, and abilities within occupations. The quantification of risk among enlisted MOCs suggests a need for further research into the causes of TBI.
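The unadjusted bivariate ORs described in the methods come from 2x2 tables of occupational category by TBI status. A minimal sketch of that calculation with a Wald confidence interval, using hypothetical counts rather than study data:

```python
import math

def odds_ratio(a, b, c, d, z=1.959964):
    """Unadjusted odds ratio with a 95% Wald CI from a 2x2 table:

                     TBI   no TBI
        exposed        a        b
        unexposed      c        d
    """
    or_ = (a * d) / (b * c)
    # SE of log(OR) from the cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, not data from the study
or_, lo, hi = odds_ratio(120, 280, 90, 310)
```

An OR whose interval excludes 1 (as here) would be reported as statistically significant, mirroring how the MOC results above are read.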

    Lifetime Traumatic Brain Injury and Risk of Post-Concussive Symptoms in the Millennium Cohort Study

    Traumatic brain injury (TBI) is prevalent among active-duty military service members, with studies reporting that up to 23% have experienced at least one TBI and that 10-60% of those service members report at least one subsequent repeat TBI. TBI has been associated with an increased risk of cumulative effects and long-term neurobehavioral symptoms, impacting operational readiness in the short term and overall health in the long term. However, the association between multiple TBIs and post-concussive symptoms (PCS), defined as symptoms that follow a concussion or TBI, has not been adequately examined in the military. Previous studies in military populations are limited by methodological issues, including small sample sizes, the use of non-probability sampling, or failure to include the total number of TBIs. To overcome these limitations, we examined the association between the total lifetime number of TBIs and the total number of PCS among U.S. active-duty military service members who participated in the Millennium Cohort Study. A secondary data analysis was conducted using the Millennium Cohort Study's 2014 survey.

    Examining Social Vulnerability and the Association With COVID-19 Incidence in Harris County, Texas

    Studies have investigated the association between social vulnerability and SARS-CoV-2 incidence. However, few studies have examined small geographic units such as census tracts, examined geographic regions with large Hispanic and Black populations, controlled for testing rates, and incorporated stay-at-home measures into their analyses. Understanding the relationship between social vulnerability and SARS-CoV-2 incidence is critical to understanding the interplay between social determinants and implementing risk-mitigation guidelines to curtail the spread of infectious diseases. The objective of this study was to examine the relationship between the CDC's Social Vulnerability Index (SVI) and SARS-CoV-2 incidence while controlling for testing rates and the proportion of those who stayed completely at home among 783 Harris County, Texas census tracts. SARS-CoV-2 incidence data were collected between May 15 and October 1, 2020. The SVI and its themes were the primary exposures. Median percent time at home was used as a covariate to measure the effect of staying at home on the association between social vulnerability and SARS-CoV-2 incidence. Data were analyzed using Kruskal-Wallis tests and negative binomial regressions (NBR) controlling for testing rates and staying at home. Results showed that a unit increase in the SVI score and in each SVI theme was associated with a significant increase in SARS-CoV-2 incidence. The incidence risk ratio (IRR) was 1.090 (95% CI, 1.082, 1.098) for the overall SVI; 1.107 (95% CI, 1.098, 1.115) for minority status/language; 1.090 (95% CI, 1.083, 1.098) for socioeconomic status; 1.060 (95% CI, 1.050, 1.071) for household composition/disability; and 1.057 (95% CI, 1.047, 1.066) for housing type/transportation. When controlling for staying at home, the association between the SVI themes and SARS-CoV-2 incidence remained significant.
In the NBR model that included all four SVI themes, only the socioeconomic and minority status/language themes remained significantly associated with SARS-CoV-2 incidence. Community-level infections were not explained by a community's inability to stay at home. These findings suggest that community-level social vulnerability, such as socioeconomic status, language barriers, use of public transportation, and housing density, may play a role in the risk of SARS-CoV-2 infection regardless of the ability of some communities to stay at home because of the need to work or for other reasons.
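NBR coefficients are estimated on the log scale, so each IRR and its CI are exponentiated coefficients. A small sketch of that back-transformation; the standard error below is an illustrative assumption chosen so the point estimate matches the abstract's overall SVI IRR of 1.090:

```python
import math

def irr_from_coef(beta, se, z=1.959964):
    """Incidence risk ratio and 95% CI from a negative-binomial
    regression coefficient: IRR = exp(beta), CI = exp(beta +/- z*se)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Coefficient back-solved from the reported IRR of 1.090;
# the SE of 0.0037 is an assumption, not a published value.
irr, lo, hi = irr_from_coef(math.log(1.090), 0.0037)
```

With that assumed SE, the interval comes out close to the reported (1.082, 1.098), illustrating how the published estimates map back to model coefficients.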

    Machine Learning Automated Detection of Large Vessel Occlusion From Mobile Stroke Unit Computed Tomography Angiography

    BACKGROUND: Prehospital automated large vessel occlusion (LVO) detection in Mobile Stroke Units (MSUs) could accelerate identification and treatment of patients with LVO acute ischemic stroke. Here, we evaluate the performance of a machine learning (ML) model on CT angiograms (CTAs) obtained from 2 MSUs to detect LVO. METHODS: Patients evaluated on MSUs in Houston and Los Angeles with out-of-hospital CTAs were identified. Anterior circulation LVO was defined as an occlusion of the intracranial internal carotid artery, middle cerebral artery (M1 or M2), or anterior cerebral artery vessels and was determined by an expert human reader. A ML model to detect LVO was trained and tested on independent data sets consisting of in-hospital CTAs and then tested on MSU CTA images. Model performance was determined using area under the receiver-operator curve statistics. RESULTS: Among 68 patients with out-of-hospital MSU CTAs, 40% had an LVO. The most common occlusion location was the middle cerebral artery M1 segment (59%), followed by the internal carotid artery (30%) and middle cerebral artery M2 (11%). Median time from last known well to CTA imaging was 88.0 (interquartile range, 59.5-196.0) minutes. After training on 870 in-hospital CTAs, the ML model performed well in identifying LVO in a separate in-hospital data set of 441 images, with an area under the receiver-operator curve of 0.84 (95% CI, 0.80-0.87). ML algorithm analysis time was under 1 minute. The performance of the ML model on the MSU CTA images was comparable, with an area under the receiver-operator curve of 0.80 (95% CI, 0.71-0.89). There was no significant difference in performance between the Houston and Los Angeles MSU CTA cohorts. CONCLUSIONS: In this study of patients evaluated on MSUs in 2 cities, a ML algorithm was able to accurately and rapidly detect LVO using prehospital CTA acquisitions.
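The area under the receiver-operator curve reported above equals the Mann-Whitney probability that a randomly chosen LVO-positive study receives a higher model score than a randomly chosen negative one (ties counted as one half). A self-contained sketch with toy scores, not the model's actual outputs:

```python
def auc(scores_pos, scores_neg):
    """AUC computed directly as the Mann-Whitney win probability:
    fraction of (positive, negative) pairs where the positive scores
    higher, counting ties as 0.5."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy model scores for LVO-positive vs LVO-negative CTAs
a = auc([0.9, 0.8, 0.7, 0.4], [0.6, 0.3, 0.2])
```

This pairwise form is quadratic in the sample size; production implementations use a rank-based equivalent, but the value computed is the same.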

    Case Growth Analysis to Inform Local Response to the COVID-19 Epidemic in a Diverse US Community

    Early detection of new outbreak waves is critical for effective and sustained response to the COVID-19 pandemic. We conducted a growth rate analysis using local community and inpatient records from seven hospital systems to characterize distinct phases in SARS-CoV-2 outbreak waves in the Greater Houston area. We determined the transition times from rapid spread of infection in the community to surge in the number of inpatients in local hospitals. We identified 193,237 residents who tested positive for SARS-CoV-2 via molecular testing from April 8, 2020 to June 30, 2021, and 30,031 residents admitted within local healthcare institutions with a positive SARS-CoV-2 test, including emergency cases. We detected two distinct COVID-19 waves: May 12, 2020-September 6, 2020 and September 27, 2020-May 15, 2021; each encompassing four growth phases: lagging, exponential/rapid growth, deceleration, and stationary/linear. Our findings showed that, during early stages of the pandemic, the surge in the number of daily cases in the community preceded that of inpatients admitted to local hospitals by 12-36 days. Rapid decline in hospitalized cases was an early indicator of transition to deceleration in the community. Our real-time analysis informed local pandemic response in one of the largest U.S. metropolitan areas, providing an operationalized framework to support robust real-world surveillance for outbreak preparedness.
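A growth-rate analysis of this kind typically fits a log-linear model to case counts within a phase; the slope is the exponential growth rate, and log(2)/r gives the doubling time. A minimal sketch on synthetic counts (not the Houston data):

```python
import math

def growth_rate(daily_cases):
    """Exponential growth rate r from an ordinary least-squares fit of
    log(cases) = a + r*t, plus the implied doubling time in days."""
    t = list(range(len(daily_cases)))
    y = [math.log(c) for c in daily_cases]
    n = len(t)
    tbar = sum(t) / n
    ybar = sum(y) / n
    num = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
    den = sum((ti - tbar) ** 2 for ti in t)
    r = num / den
    return r, math.log(2) / r

# Synthetic counts doubling every 3 days
cases = [100 * 2 ** (d / 3) for d in range(10)]
r, doubling = growth_rate(cases)
```

On real counts the fit would be applied within a detected phase (e.g. the exponential-growth window), since the rate changes across the lagging, growth, deceleration, and stationary phases described above.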

    Accuracy of Optical Spectroscopy for the Detection of Cervical Intraepithelial Neoplasia Without Colposcopic Tissue Information: A Step Toward Automation for Low-Resource Settings

    Optical spectroscopy has been proposed as an accurate and low-cost alternative for the detection of cervical intraepithelial neoplasia. We previously published an algorithm using optical spectroscopy as an adjunct to colposcopy and found good accuracy (sensitivity = 1.00 [95% confidence interval (CI) = 0.92 to 1.00], specificity = 0.71 [95% CI = 0.62 to 0.79]). Those results used measurements taken by expert colposcopists as well as the colposcopy diagnosis. In this study, we trained and tested an algorithm for the detection of cervical intraepithelial neoplasia (i.e., identifying those patients who had a histology reading of CIN 2 or worse) that did not include the colposcopic diagnosis. Furthermore, we explored the interaction between spectroscopy and colposcopy, examining the importance of probe-placement expertise. The colposcopic-diagnosis-independent spectroscopy algorithm had a sensitivity of 0.98 (95% CI = 0.89 to 1.00) and a specificity of 0.62 (95% CI = 0.52 to 0.71). The difference in the partial area under the ROC curves between spectroscopy with and without the colposcopic diagnosis was statistically significant at the patient level (p = 0.05) but not at the site level (p = 0.13). The results suggest that the device has high accuracy over a wide range of provider accuracy and hence could plausibly be implemented by providers with limited training.
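Sensitivity and specificity intervals like those above are confidence intervals for binomial proportions. A sketch using the Wilson score interval with hypothetical counts; the paper's exact interval method is not stated, so this is illustrative only:

```python
import math

def wilson_ci(successes, n, z=1.959964):
    """Wilson score 95% CI for a binomial proportion, e.g. sensitivity
    = true positives / all disease-positive cases. Better behaved than
    the Wald interval when the proportion is near 0 or 1."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# Hypothetical counts: 44 of 45 CIN 2+ cases detected (sensitivity ~0.98)
lo, hi = wilson_ci(44, 45)
```

Note how, near a proportion of 1.0, the interval is asymmetric around the point estimate, which matches the shape of the intervals reported in the abstract.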