    Refusing to treat – is it legal? Is it justifiable? Is it ethical?

    Historically, when clinicians wanted to know whether certain conduct was ethical, they consulted the guidelines set out in the Hippocratic oath. While adherence to the oath may “represent an expression of the professions’ ethical obligations”, and be useful in promoting their commitment to “abide by these norms”, this assumption is open to question.1 Different practitioners may see and interpret the codes in different ways, depending on their personal ethos as well as the specific time and situation under consideration. At the same time, ethical material can and should be reformed and, when needed, rewritten under optimal cool, calm conditions. Changes should be based on “contributions from those with a variety of perspectives who have access to as much available knowledge as possible” and not implemented as a result of immediate pressures where there may be distorting circumstances.1 Perhaps the best way to judge their value is to debate how well the code addresses the issue at hand in terms of its “comprehensiveness, clarity and consistency”.1 This paper uses an actual patient scenario as a basis on which to pose some clinically and ethically related queries and postulate possible solutions.

    Clinical anatomy of the anterior cruciate ligament and pre-operative prediction of ligament length

    BACKGROUND: Ligament grafts used in anterior cruciate ligament (ACL) reconstruction need to be the correct length for proper functioning. If the graft length is incorrect, the patient risks knee instability, loss of range of motion, or failure of graft fixation. Predicting the length of the ACL in advance facilitates an easier and more time-efficient reconstruction. Apart from examining the morphological properties of the ACL, this study aimed to determine whether an individual's epicondylar width can be used to predict ACL length and thereby assist in restoring the normal anatomy of the ACL. METHODS: Ninety-one adult cadavers were studied. Patellar ligament (PL) length, ACL length, ACL width and the maximum femoral epicondylar width (FECW) were measured. RESULTS: The morphology of the ACL and PL was determined. The results revealed that FECW was the most reliable predictor of ACL length. A linear regression formula was developed to determine ACL length from measured maximum FECW. CONCLUSIONS: ACL and PL morphology compared well with the results of previous studies. An individual's FECW can be used to predict ACL length pre-operatively. These results could improve pre-operative planning of ACL reconstruction.
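    The prediction step described above, a least-squares linear regression of ACL length on maximum FECW, can be sketched as follows. The measurements and fitted coefficients below are hypothetical illustrations, not the study's data or its published formula.

    ```python
    import numpy as np

    # Hypothetical cadaveric measurements in millimetres:
    # maximum femoral epicondylar width (FECW) and ACL length.
    fecw = np.array([75.0, 78.5, 80.0, 82.5, 85.0, 88.0])
    acl_length = np.array([29.0, 30.5, 31.0, 32.5, 33.5, 35.0])

    # Fit acl_length = slope * fecw + intercept by least squares.
    slope, intercept = np.polyfit(fecw, acl_length, 1)

    def predict_acl_length(fecw_mm: float) -> float:
        """Predict ACL length (mm) from an individual's maximum FECW (mm)."""
        return slope * fecw_mm + intercept
    ```

    In pre-operative use, the surgeon would measure FECW on imaging and read off the predicted ACL length to guide graft sizing.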

    Predictors of persistently positive Mycobacterium-tuberculosis-specific interferon-gamma responses in the serial testing of health care workers

    Background: Data on the performance of Mycobacterium-tuberculosis-specific interferon-(IFN)-γ release assays (IGRAs) in the serial testing of health care workers (HCWs) are limited. The objective of the present study was to determine the frequency of IGRA conversions and reversions and to identify predictors of persistent IGRA positivity among serially tested German HCWs in the absence of recent extensive tuberculosis (TB) exposure. Methods: In this observational cohort study, HCWs were prospectively recruited within occupational safety and health measures and underwent a tuberculin skin test (TST) and the IGRA QuantiFERON®-TB Gold In-Tube (QFT-GIT) at baseline. The QFT-GIT was repeated after a median of 18 weeks. QFT-GIT conversions (and reversions) were defined as baseline IFN-γ < 0.35 IU/ml and follow-up IFN-γ ≥ 0.35 IU/ml (and vice versa). Predictors of persistently positive QFT-GIT results were calculated by logistic regression analysis. Results: In total, 18 (9.9%) and 15 (8.2%) of 182 analyzed HCWs were QFT-GIT-positive at baseline and at follow-up, respectively. We observed strong overall agreement between baseline and follow-up QFT-GIT results (κ = 0.70). Reversions (6/18, 33.3%) occurred more frequently than conversions (3/162, 1.9%). Age and positive prior and recent TST results independently predicted persistent QFT-GIT positivity. Furthermore, the chance of having persistently positive QFT-GIT results rose by about 3% with each additional 0.1 IU/ml increase in the baseline IFN-γ response (adjusted odds ratio 1.03, 95% confidence interval 1.01-1.04). No active TB cases were detected within an observational period of more than two years. Conclusions: The QFT-GIT's utility for serial testing was limited by a substantial proportion of reversions. This shortcoming could be overcome by implementing a borderline zone for the interpretation of QFT-GIT results. However, further studies are needed to clearly define the within-subject variability of the QFT-GIT and to confirm that increasing age, concordantly positive TST results, and the extent of baseline IFN-γ responses may predict the persistence of QFT-GIT positivity over time in serially tested HCWs with only a low or medium TB screening risk in a TB low-incidence setting.
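    The conversion and reversion definitions above (baseline IFN-γ < 0.35 IU/ml with follow-up ≥ 0.35 IU/ml, and vice versa) translate directly into code. The borderline zone shown is a hypothetical example of the kind of zone the authors suggest, not a threshold taken from the study.

    ```python
    CUTOFF = 0.35              # IU/ml positivity threshold from the study's definition
    BORDERLINE = (0.20, 0.70)  # hypothetical borderline zone, illustrative only

    def classify_serial_qft(baseline: float, follow_up: float) -> str:
        """Classify a pair of serial QFT-GIT IFN-γ responses (IU/ml)."""
        base_pos = baseline >= CUTOFF
        fup_pos = follow_up >= CUTOFF
        if base_pos and fup_pos:
            return "persistently positive"
        if not base_pos and not fup_pos:
            return "persistently negative"
        if not base_pos and fup_pos:
            return "conversion"
        return "reversion"

    def in_borderline_zone(value: float) -> bool:
        """Flag responses whose interpretation a borderline zone would defer."""
        low, high = BORDERLINE
        return low <= value < high
    ```

    Results falling in such a zone would be reported as indeterminate rather than counted as conversions or reversions, which is how a borderline zone could absorb within-subject assay variability.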

    A many-analysts approach to the relation between religiosity and well-being

    The relation between religiosity and well-being is one of the most researched topics in the psychology of religion, yet the directionality and robustness of the effect remain debated. Here, we adopted a many-analysts approach to assess the robustness of this relation based on a new cross-cultural dataset (N=10,535 participants from 24 countries). We recruited 120 analysis teams to investigate (1) whether religious people self-report higher well-being, and (2) whether the relation between religiosity and self-reported well-being depends on perceived cultural norms of religion (i.e., whether it is considered normal and desirable to be religious in a given country). In a two-stage procedure, the teams first created an analysis plan and then executed their planned analysis on the data. For the first research question, all but 3 teams reported positive effect sizes with credible/confidence intervals excluding zero (median reported β=0.120). For the second research question, this was the case for 65% of the teams (median reported β=0.039). While most teams applied (multilevel) linear regression models, there was considerable variability in the choice of items used to construct the independent variables, the dependent variable, and the included covariates.
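    The summary statistics reported above (a median effect size across teams, plus a count of intervals excluding zero) can be sketched as follows. The team results below are invented for illustration and do not reproduce the study's data.

    ```python
    import statistics

    # Hypothetical per-team results: standardized effect (beta) with
    # 95% credible/confidence interval bounds. Illustrative values only.
    team_results = [
        {"beta": 0.15, "ci_low": 0.10, "ci_high": 0.20},
        {"beta": 0.12, "ci_low": 0.05, "ci_high": 0.19},
        {"beta": 0.09, "ci_low": 0.02, "ci_high": 0.16},
        {"beta": -0.01, "ci_low": -0.06, "ci_high": 0.04},
    ]

    # Median reported effect size across teams.
    median_beta = statistics.median(r["beta"] for r in team_results)

    # Teams whose interval excludes zero (entirely positive or negative).
    excluding_zero = sum(1 for r in team_results
                         if r["ci_low"] > 0 or r["ci_high"] < 0)
    ```

    Aggregating by the median rather than the mean keeps the summary robust to the occasional outlying analysis choice, which matters when 120 teams each pick their own model.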

    Design and baseline characteristics of the finerenone in reducing cardiovascular mortality and morbidity in diabetic kidney disease trial

    Background: Among people with diabetes, those with kidney disease have exceptionally high rates of cardiovascular (CV) morbidity and mortality and progression of their underlying kidney disease. Finerenone is a novel, nonsteroidal, selective mineralocorticoid receptor antagonist that has been shown to reduce albuminuria in type 2 diabetes (T2D) patients with chronic kidney disease (CKD), with only a low risk of hyperkalemia. However, the effect of finerenone on CV and renal outcomes has not yet been investigated in long-term trials. Patients and Methods: The Finerenone in Reducing CV Mortality and Morbidity in Diabetic Kidney Disease (FIGARO-DKD) trial aims to assess the efficacy and safety of finerenone compared with placebo in reducing clinically important CV and renal outcomes in T2D patients with CKD. FIGARO-DKD is a randomized, double-blind, placebo-controlled, parallel-group, event-driven trial running in 47 countries with an expected duration of approximately 6 years. FIGARO-DKD randomized 7,437 patients with an estimated glomerular filtration rate ≥ 25 mL/min/1.73 m² and albuminuria (urinary albumin-to-creatinine ratio ≥ 30 to ≤ 5,000 mg/g). The study has at least 90% power to detect a 20% reduction in the risk of the primary outcome (overall two-sided significance level alpha = 0.05), the composite of time to first occurrence of CV death, nonfatal myocardial infarction, nonfatal stroke, or hospitalization for heart failure. Conclusions: FIGARO-DKD will determine whether an optimally treated cohort of T2D patients with CKD at high risk of CV and renal events will experience cardiorenal benefits with the addition of finerenone to their treatment regimen. Trial Registration: EudraCT number: 2015-000950-39; ClinicalTrials.gov identifier: NCT02545049.
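    The stated design (at least 90% power to detect a 20% risk reduction at two-sided alpha = 0.05 in an event-driven trial) is the kind of specification usually sized with Schoenfeld's events formula for the log-rank test. The sketch below applies that generic approximation with 1:1 allocation; it is an illustration, not the trial's actual statistical analysis plan.

    ```python
    from statistics import NormalDist
    from math import log

    def required_events(hr: float, power: float = 0.90,
                        alpha: float = 0.05, alloc: float = 0.5) -> float:
        """Schoenfeld's approximation for the number of primary-endpoint
        events needed to detect hazard ratio `hr` in a two-arm trial."""
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
        z_beta = NormalDist().inv_cdf(power)
        return (z_alpha + z_beta) ** 2 / (alloc * (1 - alloc) * log(hr) ** 2)

    # A 20% risk reduction corresponds to a target hazard ratio of 0.80.
    events = required_events(hr=0.80)
    ```

    Because the required number of events, not the sample size, drives power in such a trial, follow-up continues until roughly that many primary-endpoint events have accrued.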

    Evacetrapib and Cardiovascular Outcomes in High-Risk Vascular Disease

    BACKGROUND: The cholesteryl ester transfer protein inhibitor evacetrapib substantially raises the high-density lipoprotein (HDL) cholesterol level, reduces the low-density lipoprotein (LDL) cholesterol level, and enhances cellular cholesterol efflux capacity. We sought to determine the effect of evacetrapib on major adverse cardiovascular outcomes in patients with high-risk vascular disease. METHODS: In a multicenter, randomized, double-blind, placebo-controlled phase 3 trial, we enrolled 12,092 patients who had at least one of the following conditions: an acute coronary syndrome within the previous 30 to 365 days, cerebrovascular atherosclerotic disease, peripheral vascular arterial disease, or diabetes mellitus with coronary artery disease. Patients were randomly assigned to receive either evacetrapib at a dose of 130 mg or matching placebo, administered daily, in addition to standard medical therapy. The primary efficacy end point was the first occurrence of any component of the composite of death from cardiovascular causes, myocardial infarction, stroke, coronary revascularization, or hospitalization for unstable angina. RESULTS: At 3 months, a 31.1% decrease in the mean LDL cholesterol level was observed with evacetrapib versus a 6.0% increase with placebo, and a 133.2% increase in the mean HDL cholesterol level was seen with evacetrapib versus a 1.6% increase with placebo. After 1363 of the planned 1670 primary end-point events had occurred, the data and safety monitoring board recommended that the trial be terminated early because of a lack of efficacy. After a median of 26 months of evacetrapib or placebo, a primary end-point event occurred in 12.9% of the patients in the evacetrapib group and in 12.8% of those in the placebo group (hazard ratio, 1.01; 95% confidence interval, 0.91 to 1.11; P=0.91). 
CONCLUSIONS: Although the cholesteryl ester transfer protein inhibitor evacetrapib had favorable effects on established lipid biomarkers, treatment with evacetrapib did not result in a lower rate of cardiovascular events than placebo among patients with high-risk vascular disease. (Funded by Eli Lilly; ACCELERATE ClinicalTrials.gov number, NCT01687998.)

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Analytical methods for virus detection in water and food

    Potential ways to address the issues that relate to the techniques for analyzing food and environmental samples for the presence of enteric viruses are discussed. It is not the authors’ remit to produce or recommend standard or reference methods but to address specific issues in the analytical procedures. Foods of primary importance are bivalve molluscs, particularly oysters, clams, and mussels; salad crops such as lettuce, green onions and other greens; and soft fruits such as raspberries and strawberries. All types of water, not only drinking water but also recreational water (fresh, marine, and swimming pool), river water (irrigation water), and raw and treated sewage, are potential vehicles for virus transmission. Well over 100 different enteric viruses could be food or water contaminants; however, with few exceptions, most well-characterized foodborne or waterborne viral outbreaks are restricted to hepatitis A virus (HAV) and calicivirus, essentially norovirus (NoV). Target viruses for analytical methods include, in addition to NoV and HAV, hepatitis E virus (HEV), enteroviruses (e.g., poliovirus), adenovirus, rotavirus, astrovirus, and any other relevant virus likely to be transmitted by food or water. A survey of the currently available methods for detection of viruses in food and environmental matrices was conducted, gathering information on protocols for extraction of viruses from various matrices and on the various specific detection techniques for each virus type. G. Sánchez is the recipient of a JAE doctor grant from the “Consejo Superior de Investigaciones Científicas” (CSIC). Rembuluwani Netshikweta acknowledges a post-graduate bursary from the Poliomyelitis Research Foundation, South Africa.