    Post-Vasectomy Semen Analysis: Optimizing Laboratory Procedures and Test Interpretation through a Clinical Audit and Global Survey of Practices

    Purpose: The success of vasectomy is determined by the outcome of a post-vasectomy semen analysis (PVSA). This article describes a step-by-step procedure for performing PVSA accurately, reports data from patients who underwent PVSA between 2015 and 2021, and presents results from an international online survey of clinical practice. Materials and methods: We present a detailed step-by-step protocol for performing and interpreting PVSA testing, along with recommendations for proficiency testing, competency assessment for performing PVSA, and clinical and laboratory scenarios. Moreover, we analyzed 1,114 PVSAs performed at the Cleveland Clinic's Andrology Laboratory and conducted an online survey to understand clinician responses to PVSA results in various countries. Results: Our clinical experience showed that 92.1% of patients passed PVSA, with 7.9% requiring further testing. A total of 78 experts from 19 countries participated in the survey, and the majority reported using time from vasectomy, rather than the number of ejaculations, as the criterion for requesting PVSA. A high percentage of respondents reported permitting unprotected intercourse only if PVSA samples showed azoospermia, while in the presence of a few non-motile sperm the majority of respondents suggested using alternative contraception followed by another PVSA. In the presence of motile sperm, the majority of participants asked for further PVSA testing. Repeat vasectomy was mainly recommended if motile sperm were observed after multiple PVSAs. A large percentage reported recommending a second PVSA because of the possibility of legal action. Conclusions: Our results highlighted varying clinical practices around the globe, with controversy over the significance of non-motile sperm in the PVSA sample. Our data suggest that less stringent AUA guidelines would help improve test compliance. A large longitudinal multi-center study would clarify various doubts related to the timing and interpretation of PVSA and would also help us understand, and perhaps predict, recanalization and the potential for future failure of a vasectomy.

    Global, regional, and national burden of stroke and its risk factors, 1990-2019: a systematic analysis for the Global Burden of Disease Study 2019

    Background: Regularly updated data on stroke and its pathological types, including data on their incidence, prevalence, mortality, disability, risk factors, and epidemiological trends, are important for evidence-based stroke care planning and resource allocation. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) aims to provide a standardised and comprehensive measurement of these metrics at global, regional, and national levels. Methods: We applied GBD 2019 analytical tools to calculate stroke incidence, prevalence, mortality, disability-adjusted life-years (DALYs), and the population attributable fraction (PAF) of DALYs (with corresponding 95% uncertainty intervals [UIs]) associated with 19 risk factors, for 204 countries and territories from 1990 to 2019. These estimates were provided for ischaemic stroke, intracerebral haemorrhage, subarachnoid haemorrhage, and all strokes combined, and stratified by sex, age group, and World Bank country income level. Findings: In 2019, there were 12·2 million (95% UI 11·0–13·6) incident cases of stroke, 101 million (93·2–111) prevalent cases of stroke, 143 million (133–153) DALYs due to stroke, and 6·55 million (6·00–7·02) deaths from stroke. Globally, stroke remained the second-leading cause of death (11·6% [10·8–12·2] of total deaths) and the third-leading cause of death and disability combined (5·7% [5·1–6·2] of total DALYs) in 2019. From 1990 to 2019, the absolute number of incident strokes increased by 70·0% (67·0–73·0), prevalent strokes increased by 85·0% (83·0–88·0), deaths from stroke increased by 43·0% (31·0–55·0), and DALYs due to stroke increased by 32·0% (22·0–42·0). During the same period, age-standardised rates of stroke incidence decreased by 17·0% (15·0–18·0), mortality decreased by 36·0% (31·0–42·0), prevalence decreased by 6·0% (5·0–7·0), and DALYs decreased by 36·0% (31·0–42·0). However, among people younger than 70 years, prevalence rates increased by 22·0% (21·0–24·0) and incidence rates increased by 15·0% (12·0–18·0). In 2019, the age-standardised stroke-related mortality rate was 3·6 (3·5–3·8) times higher in the World Bank low-income group than in the World Bank high-income group, and the age-standardised stroke-related DALY rate was 3·7 (3·5–3·9) times higher in the low-income group than the high-income group. Ischaemic stroke constituted 62·4% of all incident strokes in 2019 (7·63 million [6·57–8·96]), while intracerebral haemorrhage constituted 27·9% (3·41 million [2·97–3·91]) and subarachnoid haemorrhage constituted 9·7% (1·18 million [1·01–1·39]). In 2019, the five leading risk factors for stroke were high systolic blood pressure (contributing to 79·6 million [67·7–90·8] DALYs or 55·5% [48·2–62·0] of total stroke DALYs), high body-mass index (34·9 million [22·3–48·6] DALYs or 24·3% [15·7–33·2]), high fasting plasma glucose (28·9 million [19·8–41·5] DALYs or 20·2% [13·8–29·1]), ambient particulate matter pollution (28·7 million [23·4–33·4] DALYs or 20·1% [16·6–23·0]), and smoking (25·3 million [22·6–28·2] DALYs or 17·6% [16·4–19·0]). Interpretation: The annual number of strokes and deaths due to stroke increased substantially from 1990 to 2019, despite substantial reductions in age-standardised rates, particularly among people older than 70 years. The highest age-standardised stroke-related mortality and DALY rates were in the World Bank low-income group. The fastest-growing risk factor for stroke between 1990 and 2019 was high body-mass index. 
    Without urgent implementation of effective primary prevention strategies, the stroke burden will probably continue to grow across the world, particularly in low-income countries. Funding: Bill & Melinda Gates Foundation.
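
    The population attributable fraction (PAF) reported above relates risk-factor exposure to burden. As a simplified, single-exposure illustration (the GBD estimation itself integrates over continuous exposure distributions rather than using a single prevalence and relative risk), Levin's formula gives:

        PAF = p(RR − 1) / [1 + p(RR − 1)]

    where p is the exposure prevalence and RR the relative risk; attributable burden is then PAF × total burden. As a rough consistency check against the figures above, 55·5% of the 143 million total stroke DALYs is 0·555 × 143 ≈ 79·4 million, close to the 79·6 million DALYs attributed to high systolic blood pressure (the small difference reflects rounding of the reported estimates).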

    Intraperitoneal drain placement and outcomes after elective colorectal surgery: international matched, prospective, cohort study

    Despite current guidelines, intraperitoneal drain placement after elective colorectal surgery remains widespread. Drains were not associated with earlier detection of intraperitoneal collections, but were associated with prolonged hospital stay and an increased risk of surgical-site infection. Background: Many surgeons routinely place intraperitoneal drains after elective colorectal surgery. However, enhanced recovery after surgery guidelines recommend against their routine use owing to a lack of clear clinical benefit. This study aimed to describe international variation in intraperitoneal drain placement and the safety of this practice. Methods: COMPASS (COMPlicAted intra-abdominal collectionS after colorectal Surgery) was a prospective, international cohort study that enrolled consecutive adults undergoing elective colorectal surgery (February to March 2020). The primary outcome was the rate of intraperitoneal drain placement. Secondary outcomes included: rate and time to diagnosis of postoperative intraperitoneal collections; rate of surgical-site infections (SSIs); time to discharge; and 30-day major postoperative complications (Clavien-Dindo grade at least III). After propensity score matching, multivariable logistic regression and Cox proportional hazards regression were used to estimate the independent association of the secondary outcomes with drain placement. Results: Overall, 1805 patients from 22 countries were included (798 women, 44.2 per cent; median age 67.0 years). The drain insertion rate was 51.9 per cent (937 patients). After matching, drains were not associated with reduced rates (odds ratio (OR) 1.33, 95 per cent c.i. 0.79 to 2.23; P = 0.287) or earlier detection (hazard ratio (HR) 0.87, 0.33 to 2.31; P = 0.780) of collections. Although not associated with worse major postoperative complications (OR 1.09, 0.68 to 1.75; P = 0.709), drains were associated with delayed hospital discharge (HR 0.58, 0.52 to 0.66; P < 0.001) and an increased risk of SSI (OR 2.47, 1.50 to 4.05; P < 0.001). Conclusion: Intraperitoneal drain placement after elective colorectal surgery is not associated with earlier detection of postoperative collections, but prolongs hospital stay and increases the risk of SSI.
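
    The propensity-score-matched comparison described above can be sketched in a few lines. This is not the COMPASS analysis code: the data are synthetic, and the covariates, caliper, and 1:1 nearest-neighbour scheme are assumptions chosen for illustration.

        # Minimal propensity-score matching sketch (illustrative; not the COMPASS code).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 1000
        X = rng.normal(size=(n, 3))                           # assumed baseline covariates
        drain = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))   # treatment depends on covariates
        ssi = rng.binomial(1, 0.10 + 0.05 * drain)            # synthetic outcome with a built-in effect

        # 1. Estimate propensity scores P(drain = 1 | X) by logistic regression.
        ps = LogisticRegression().fit(X, drain).predict_proba(X)[:, 1]

        # 2. 1:1 nearest-neighbour matching without replacement, caliper 0.05 (assumed).
        treated = np.where(drain == 1)[0]
        control = list(np.where(drain == 0)[0])
        pairs = []
        for t in treated:
            d = np.abs(ps[control] - ps[t])
            j = int(np.argmin(d))
            if d[j] < 0.05:
                pairs.append((t, control.pop(j)))

        # 3. Crude odds ratio for the outcome in the matched sample.
        t_idx, c_idx = map(np.array, zip(*pairs))
        p1, p0 = ssi[t_idx].mean(), ssi[c_idx].mean()
        print(f"{len(pairs)} pairs; SSI OR (drain vs no drain): "
              f"{(p1 / (1 - p1)) / (p0 / (1 - p0)):.2f}")

    The study itself went further, fitting multivariable logistic and Cox proportional hazards models on the matched cohort; this sketch stops at the crude matched comparison.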

    Sperm vitality and necrozoospermia: diagnosis, management, and results of a global survey of clinical practice

    Sperm vitality testing is a basic semen examination that has been described in the World Health Organization (WHO) Laboratory Manual for the Examination and Processing of Human Semen since its first edition, 40 years ago. Several methods can be used to test sperm vitality, such as the eosin-nigrosin (E-N) stain or the hypoosmotic swelling (HOS) test. In the 6th (2021) edition of the WHO Laboratory Manual, sperm vitality assessment is mainly recommended when total motility is less than 40%. A motile spermatozoon is, by definition, alive; however, under certain conditions an immotile spermatozoon can also be alive. Therefore, differentiating between asthenozoospermia (a pathological decrease in sperm motility) and necrozoospermia (a pathological decrease in sperm vitality) is important in directing further investigation and management of infertile patients. The causes of necrozoospermia are diverse and can be local or general, testicular or extra-testicular. The andrological management of necrozoospermia depends on its etiology; however, no standardized treatment is currently available and practice varies among clinicians. In this study, we report the results of a global survey conducted to understand current practices regarding how physicians order sperm vitality tests, as well as management practices for necrozoospermia. Laboratory and clinical scenarios are presented to guide the reader in the management of necrozoospermia, with the overall objective of establishing a benchmark that spans from the diagnosis of necrozoospermia by sperm vitality testing to its clinical management.
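
    The decision logic implied above (vitality testing triggered by low motility, then necrozoospermia distinguished from asthenozoospermia by the vitality result) can be summarised in a short sketch. The 40% motility trigger is the WHO 6th-edition recommendation cited above; the 54% vitality cut-off is an assumed illustrative reference limit, not a prescription from this article.

        # Illustrative triage sketch only; thresholds as noted in the lead-in.
        def assess(total_motility_pct: float, vitality_pct: float | None = None) -> str:
            if total_motility_pct >= 40:
                return "vitality testing not routinely indicated"
            if vitality_pct is None:
                return "low motility: perform a vitality test (e.g. E-N stain or HOS)"
            if vitality_pct >= 54:  # assumed lower reference limit for vitality
                return "suggests asthenozoospermia: sperm alive but immotile"
            return "suggests necrozoospermia: pathological decrease in sperm vitality"

        print(assess(25))        # low motility -> order a vitality test
        print(assess(25, 70))    # alive but immotile -> asthenozoospermia
        print(assess(25, 30))    # mostly dead sperm -> necrozoospermia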

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
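
    The primary analysis above is a bayesian cumulative logistic (proportional-odds) model over ordered counts of organ support–free days. A minimal sketch of the underlying link, with made-up cutpoints for three ordered categories, shows how an odds ratio below 1 shifts probability mass toward worse outcomes (ORs greater than 1 represent improvement, as stated above); it is not the trial's analysis code.

        # Proportional-odds sketch: category probabilities under a cumulative
        # logistic model. Cutpoints and the 3-category outcome are assumed;
        # the 0.77 odds ratio echoes the ACE-inhibitor estimate above.
        import numpy as np

        def category_probs(cutpoints: np.ndarray, log_or: float) -> np.ndarray:
            """P(Y = k), where higher k is a better ordered outcome."""
            cdf = 1 / (1 + np.exp(-(cutpoints - log_or)))  # P(Y <= k) at each cutpoint
            return np.diff(np.concatenate([cdf, [1.0]]), prepend=0.0)

        cuts = np.array([-1.0, 0.5])                       # assumed cutpoints
        print("control:", category_probs(cuts, 0.0).round(3))
        print("treated:", category_probs(cuts, np.log(0.77)).round(3))

    With OR = 0.77 the treated distribution places more mass in the lower (worse) categories, which matches the direction of the posterior probabilities of harm reported above; in the trial these odds ratios carry full posterior distributions rather than point values.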

    Algal and aquatic plant carbon concentrating mechanisms in relation to environmental change

    Carbon dioxide concentrating mechanisms (also known as inorganic carbon concentrating mechanisms; both abbreviated as CCMs) presumably evolved under conditions of low CO2 availability. However, the timing of their origin is unclear, since there are no sound estimates from molecular clocks, and even if there were, there are no proxies for the functioning of CCMs. Accordingly, we cannot use previous episodes of high CO2 (e.g. the Palaeocene-Eocene Thermal Maximum) to indicate how organisms with CCMs responded. Present and predicted environmental change, in terms of increased CO2 and temperature, is leading to increased CO2 and HCO3- and decreased CO32- and pH in surface seawater, as well as decreasing the depth of the upper mixed layer and increasing the degree of isolation of this layer with respect to nutrient flux from deeper waters. The outcome of these forcing factors is to increase the availability of inorganic carbon, photosynthetically active radiation (PAR) and ultraviolet B radiation (UVB) to aquatic photolithotrophs, and to decrease the supply of the nutrients combined nitrogen and phosphorus, and of any non-aeolian iron. The influence of these variations on CCM expression has been examined, to varying degrees, as acclimation by extant organisms. Increased PAR increases CCM expression in terms of CO2 affinity, while increased UVB has a range of effects in the organisms examined; little relevant information is available on increased temperature. Decreased combined-nitrogen supply generally increases CO2 affinity, decreased iron availability increases CO2 affinity, and decreased phosphorus supply has varying effects in the organisms examined. There are few data sets showing interactions among the observed changes, and even less information on genetic (adaptation) changes in response to the forcing factors. In freshwaters, phytoplankton species composition may alter with environmental change, with consequences for the frequency of species with and without CCMs. The available information gives less predictive power for the effect of these forcing factors on CCM expression than for their overall effects on growth. CCMs are currently not part of models of how global environmental change has altered, and is likely to further alter, algal and aquatic plant primary productivity.