Pharmacological Management of Obesity: An Endocrine Society Clinical Practice Guideline
OBJECTIVE: To formulate clinical practice guidelines for the pharmacological management of obesity. PARTICIPANTS: An Endocrine Society-appointed Task Force of experts, a methodologist, and a medical writer. This guideline was co-sponsored by the European Society of Endocrinology and The Obesity Society. EVIDENCE: This evidence-based guideline was developed using the Grading of Recommendations, Assessment, Development, and Evaluation (GRADE) system to describe the strength of recommendations and the quality of evidence. CONSENSUS PROCESS: One group meeting, several conference calls, and e-mail communications enabled consensus. Committees and members of the Endocrine Society, the European Society of Endocrinology, and The Obesity Society reviewed and commented on preliminary drafts of these guidelines. Two systematic reviews were conducted to summarize some of the supporting evidence. CONCLUSIONS: Weight loss is a pathway to health improvement for patients with obesity-associated risk factors and comorbidities. Medications approved for chronic weight management can be useful adjuncts to lifestyle change for patients who have been unsuccessful with diet and exercise alone. Many medications commonly prescribed for diabetes, depression, and other chronic diseases have weight effects, either to promote weight gain or produce weight loss. Knowledgeable prescribing of medications, choosing whenever possible those with favorable weight profiles, can aid in the prevention and management of obesity and thus improve health
Effect of diet composition and weight loss on resting energy expenditure in the POUNDS LOST study
Weight loss reduces energy expenditure, but it is unclear whether dietary macronutrient composition affects this reduction. We hypothesized that energy expenditure might be modulated by the macronutrient composition of the diet. The POUNDS LOST study, a prospective, randomized controlled trial in 811 overweight/obese people who were randomized in a 2×2 design to diets containing 20en% or 40en% fat and 15en% or 25en% protein (i.e., diets with 65%, 55%, 45%, and 35% carbohydrate), provided the data to test this hypothesis. Resting energy expenditure (REE) was measured at baseline, 6 months, and 24 months using a ventilated hood. During the first 6 months, REE declined by 99.5±8.0 kcal/d in men and 55.2±10.6 kcal/d in women. This decline was related to the weight loss, and there was no difference between the diets. REE had returned to baseline by 24 months, but body weight was still 60% below baseline. Measured REE at 6 months was significantly lower than predicted (−18.2±6.7 kcal/d), a result of significant reductions from baseline in the low-fat diets (65% or 55% carbohydrate), but not in the high-fat diet groups. By 24 months the difference had reversed, with measured REE slightly but significantly higher than predicted (21.8±10.1 kcal/d). In conclusion, we found that REE fell significantly after weight loss but was not related to diet composition. Adaptive thermogenesis was evident at 6 months, but not at 24 months
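The four carbohydrate levels follow arithmetically from the 2×2 fat-by-protein design (assuming, as the percentages imply, that the 15en%/25en% factor is protein): carbohydrate en% is simply the remainder after fat and protein. A minimal sketch:

```python
# Sketch of the POUNDS LOST 2x2 diet design arithmetic (not study code).
# Assumption: the 15en%/25en% factor is protein, so carbohydrate is the
# remainder of total energy after fat and protein.
def carb_percent(fat_en: int, protein_en: int) -> int:
    """Energy % from carbohydrate = 100 - fat en% - protein en%."""
    return 100 - fat_en - protein_en

# The four arms as (fat en%, protein en%) pairs:
arms = [(20, 15), (20, 25), (40, 15), (40, 25)]
carbs = [carb_percent(f, p) for f, p in arms]
print(carbs)  # [65, 55, 45, 35], matching the four carbohydrate levels cited
```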
A Phase II Basket Trial of Dual Anti-CTLA-4 and Anti-PD-1 Blockade in Rare Tumors (DART SWOG 1609) in Patients with Nonpancreatic Neuroendocrine Tumors.
Purpose: Immune checkpoint blockade has improved outcomes across tumor types; little is known about the efficacy of these agents in rare tumors. We report the results of the (nonpancreatic) neuroendocrine neoplasm cohort of SWOG S1609 dual anti-CTLA-4 and anti-PD-1 blockade in rare tumors (DART). Patients and Methods: We performed a prospective, open-label, multicenter phase II clinical trial of ipilimumab plus nivolumab across multiple rare tumor cohorts, with the (nonpancreatic) neuroendocrine cohort reported here. Response assessment by grade was not prespecified. The primary endpoint was overall response rate [ORR; RECIST v1.1; complete response (CR) and partial response (PR)]; secondary endpoints included progression-free survival (PFS), overall survival (OS), stable disease >6 months, and toxicity. Results: Thirty-two eligible patients received therapy; 18 (56%) had high-grade disease. The most common primary sites were gastrointestinal (47%; N = 15) and lung (19%; N = 6). The overall ORR was 25% [95% confidence interval (CI) 13-64%; CR, 3%, N = 1; PR, 22%, N = 7]. Patients with high-grade neuroendocrine carcinoma had an ORR of 44% (8/18 patients) versus 0% in low/intermediate-grade tumors (0/14 patients; P = 0.004). The 6-month PFS was 31% (95% CI, 19%-52%); median OS was 11 months (95% CI, 6-∞). The most common toxicities were hypothyroidism (31%), fatigue (28%), and nausea (28%); alanine aminotransferase elevation (9%) was the most common grade 3/4 immune-related adverse event, and there were no grade 5 events. Conclusions: Ipilimumab plus nivolumab demonstrated a 44% ORR in patients with nonpancreatic high-grade neuroendocrine carcinoma, with 0% ORR in low/intermediate-grade disease
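The response-rate arithmetic reported above can be checked directly. A minimal sketch (not trial code): under RECIST, ORR is simply responders (CR + PR) over evaluable patients.

```python
# Sketch of the ORR arithmetic from the DART neuroendocrine cohort abstract.
def orr(responders: int, n: int) -> float:
    """Overall response rate = responders (CR + PR) / evaluable patients."""
    return responders / n

overall = orr(1 + 7, 32)   # CR (N = 1) + PR (N = 7) of 32 -> 0.25, the 25% ORR
high_grade = orr(8, 18)    # 8/18 responders -> ~0.44, the 44% ORR reported
low_grade = orr(0, 14)     # 0/14 responders -> 0.0
print(round(overall, 2), round(high_grade, 2), low_grade)
```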
Hope in dirt: report of the Fort Apache Workshop on Forensic Sedimentology Applications to Cultural Property Crime, 15—19 October 2018
A 2018 workshop on the White Mountain Apache Tribe lands in Arizona examined ways to enhance investigations into cultural property crime (CPC) through applications of rapidly evolving methods from archaeological science. CPC (also termed looting or graverobbing) refers to unauthorized damage to, removal of, or trafficking in materials possessing blends of communal, aesthetic, and scientific values. The Fort Apache workshop integrated four generally partitioned domains of CPC expertise: (1) theories of perpetrators' motivations and methods; (2) recommended practice in sustaining public and community opposition to CPC; (3) tactics and strategies for documenting, investigating, and prosecuting CPC; and (4) forensic sedimentology, the use of biophysical sciences to link sediments from implicated persons and objects to crime scenes. Forensic sedimentology served as the touchstone for dialogues among experts in criminology, archaeological sciences, law enforcement, and heritage stewardship. Field visits to CPC crime scenes and workshop deliberations identified pathways toward integrating CPC theory and practice with forensic sedimentology's potent battery of analytic methods
Drug-gene interactions of antihypertensive medications and risk of incident cardiovascular disease: a pharmacogenomics study from the CHARGE consortium
Background
Hypertension is a major risk factor for a spectrum of cardiovascular diseases (CVD), including myocardial infarction, sudden death, and stroke. In the US, over 65 million people have high blood pressure and a large proportion of these individuals are prescribed antihypertensive medications. Although large long-term clinical trials conducted in the last several decades have identified a number of effective antihypertensive treatments that reduce the risk of future clinical complications, responses to therapy and protection from cardiovascular events vary among individuals.
Methods
Using a genome-wide association study among 21,267 participants with pharmaceutically treated hypertension, we explored the hypothesis that genetic variants might influence or modify the effectiveness of common antihypertensive therapies on the risk of major cardiovascular outcomes. The classes of drug treatments included angiotensin-converting enzyme inhibitors, beta-blockers, calcium channel blockers, and diuretics. In the setting of the Cohorts for Heart and Aging Research in Genomic Epidemiology (CHARGE) consortium, each study performed array-based genome-wide genotyping, imputed to HapMap Phase II reference panels, and used additive genetic models in proportional hazards or logistic regression models to evaluate drug-gene interactions for each of four therapeutic drug classes. We used meta-analysis to combine study-specific interaction estimates for approximately 2 million single nucleotide polymorphisms (SNPs) in a discovery analysis among 15,375 European Ancestry participants (3,527 CVD cases) with targeted follow-up in a case-only study of 1,751 European Ancestry GenHAT participants as well as among 4,141 African-Americans (1,267 CVD cases).
Results
Although drug-SNP interactions were biologically plausible, exposures and outcomes were well measured, and power was sufficient to detect modest interactions, we did not identify any statistically significant interactions from the four antihypertensive therapy meta-analyses (P-interaction > 5.0×10⁻⁸). Similarly, findings were null for meta-analyses restricted to 66 SNPs with significant main effects on coronary artery disease or blood pressure from large published genome-wide association studies (P-interaction ≥ 0.01). Our results suggest that there are no major pharmacogenetic influences of common SNPs on the relationship between blood pressure medications and the risk of incident CVD
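The meta-analysis of study-specific interaction estimates described in the Methods is conventionally done by fixed-effect inverse-variance weighting. A minimal sketch with illustrative numbers (not CHARGE data; the betas here stand in for hypothetical per-study drug-SNP interaction log hazard ratios):

```python
import math

def inverse_variance_meta(betas, ses):
    """Fixed-effect meta-analysis: weight each study-specific estimate
    by the inverse of its squared standard error, then pool."""
    weights = [1.0 / se**2 for se in ses]
    pooled_beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    z = pooled_beta / pooled_se  # z-score for the pooled interaction estimate
    return pooled_beta, pooled_se, z

# Illustrative per-study interaction estimates and standard errors:
beta, se, z = inverse_variance_meta([0.10, 0.05, 0.12], [0.04, 0.06, 0.05])
print(round(beta, 4), round(se, 4), round(z, 2))
```

The pooled estimate is pulled toward the most precise (smallest-SE) studies, which is why large consortia like CHARGE can detect effects no single cohort could.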
Comparison of four methods to measure haemoglobin concentrations in whole blood donors (COMPARE): A diagnostic accuracy study.
OBJECTIVE: To compare four haemoglobin measurement methods in whole blood donors. BACKGROUND: To safeguard donors, blood services measure haemoglobin concentration in advance of each donation. NHS Blood and Transplant's (NHSBT) customary method has been capillary gravimetry (copper sulphate), followed by venous spectrophotometry (HemoCue) for donors failing gravimetry. However, NHSBT's customary method results in 10% of donors being inappropriately bled (ie, with haemoglobin values below the regulatory threshold). METHODS: We compared the following four methods in 21 840 blood donors (aged ≥18 years) recruited from 10 NHSBT centres in England, with the Sysmex XN-2000 haematology analyser as the reference standard: (1) NHSBT's customary method; (2) a "post-donation" approach, that is, estimating current haemoglobin concentration from that measured by a haematology analyser at a donor's most recent prior donation; (3) "portable haemoglobinometry" (using capillary HemoCue); (4) non-invasive spectrometry (using MBR Haemospect or Orsense NMB200). We assessed sensitivity; specificity; the proportion of donors who would have been inappropriately bled, or rejected from donation ("deferred") incorrectly; and test preference. RESULTS: Compared with the reference standard, the methods ranged in sensitivity from 17.0% (MBR Haemospect) to 79.0% (portable haemoglobinometry) in men, and from 19.0% (MBR Haemospect) to 82.8% (portable haemoglobinometry) in women. For specificity, the methods ranged from 87.2% (MBR Haemospect) to 99.9% (NHSBT's customary method) in men, and from 74.1% (Orsense NMB200) to 99.8% (NHSBT's customary method) in women. The proportion of donors who would have been inappropriately bled ranged from 2.2% in men for portable haemoglobinometry to 18.9% in women for MBR Haemospect.
The proportion of donors who would have been deferred incorrectly with haemoglobin concentration above the minimum threshold ranged from 0.1% in men for NHSBT's customary method to 20.3% in women for OrSense. Most donors preferred non-invasive spectrometry. CONCLUSION: In the largest study reporting head-to-head comparisons of four methods to measure haemoglobin prior to blood donation, our results support replacement of NHSBT's customary method with portable haemoglobinometry
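The accuracy measures reported above all derive from a 2×2 table of each test against the reference standard. A minimal sketch with hypothetical counts (not COMPARE data); here a "positive" means the test correctly flags a below-threshold haemoglobin, so false negatives are the donors who would be inappropriately bled:

```python
# Sketch of diagnostic-accuracy arithmetic; counts below are hypothetical.
def diagnostic_accuracy(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity, specificity, and the inappropriately-bled proportion
    from a 2x2 table versus the reference standard."""
    sensitivity = tp / (tp + fn)  # below-threshold donors correctly deferred
    specificity = tn / (tn + fp)  # eligible donors correctly passed
    # Inappropriately bled: below threshold, but passed by the screening test
    inappropriately_bled = fn / (tp + fn + tn + fp)
    return sensitivity, specificity, inappropriately_bled

sens, spec, bled = diagnostic_accuracy(tp=790, fn=210, tn=8700, fp=300)
print(round(sens, 3), round(spec, 3), round(bled, 3))
```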
Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors
Background:
Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries.
Methods:
In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants.
Findings:
45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups.
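The unit-to-volume conversions in the findings above are mutually consistent with a whole-blood unit of roughly 470 mL, an assumption inferred from the reported figures rather than stated in the abstract. A minimal sketch:

```python
# Assumption (not stated in the abstract): one whole-blood unit is ~470 mL,
# the value that reproduces the abstract's approximate mL conversions.
ML_PER_UNIT = 470

def extra_blood_ml(extra_units: float) -> float:
    """Approximate extra volume collected per donor over 2 years."""
    return extra_units * ML_PER_UNIT

for group, units in [("men, 8-week", 1.69), ("men, 10-week", 0.79),
                     ("women, 12-week", 0.84), ("women, 14-week", 0.46)]:
    # Prints ~794, 371, 395, and 216 mL, close to the cited ~795/370/395/215
    print(group, round(extra_blood_ml(units)), "mL")
```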
Interpretation:
Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency.
Funding:
NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation
Structure of the germline genome of Tetrahymena thermophila and relationship to the massively rearranged somatic genome
The germline genome of the binucleated ciliate Tetrahymena thermophila undergoes programmed chromosome breakage and massive DNA elimination to generate the somatic genome. Here, we present a complete sequence assembly of the germline genome and analyze multiple features of its structure and its relationship to the somatic genome, shedding light on the mechanisms of genome rearrangement as well as the evolutionary history of this remarkable germline/soma differentiation. Our results strengthen the notion that a complex, dynamic, and ongoing interplay between mobile DNA elements and the host genome has shaped Tetrahymena chromosome structure, locally and globally. Non-standard outcomes of rearrangement events, including the generation of short-lived somatic chromosomes and excision of DNA interrupting protein-coding regions, may represent novel forms of developmental gene regulation. We also compare Tetrahymena's germline/soma differentiation to that of other characterized ciliates, illustrating the wide diversity of adaptations that have occurred within this phylum.