93 research outputs found

    Impact of nonoptimal intakes of saturated, polyunsaturated, and trans fat on global burdens of coronary heart disease

    Background: Saturated fat (SFA), ω‐6 (n‐6) polyunsaturated fat (PUFA), and trans fat (TFA) influence risk of coronary heart disease (CHD), but attributable CHD mortalities by country, age, sex, and time are unclear. Methods and Results: National intakes of SFA, n‐6 PUFA, and TFA were estimated using a Bayesian hierarchical model based on country‐specific dietary surveys; food availability data; and, for TFA, industry reports on fats/oils and packaged foods. Etiologic effects of dietary fats on CHD mortality were derived from meta‐analyses of prospective cohorts, and CHD mortality rates from the 2010 Global Burden of Diseases study. Absolute and proportional attributable CHD mortality were computed using a comparative risk assessment framework. In 2010, nonoptimal intakes of n‐6 PUFA, SFA, and TFA were estimated to result in 711 800 (95% uncertainty interval [UI] 680 700–745 000), 250 900 (95% UI 236 900–265 800), and 537 200 (95% UI 517 600–557 000) CHD deaths per year worldwide, accounting for 10.3% (95% UI 9.9%–10.6%), 3.6% (95% UI 3.5%–3.6%), and 7.7% (95% UI 7.6%–7.9%) of global CHD mortality. Tropical oil–consuming countries were estimated to have the highest proportional n‐6 PUFA– and SFA‐attributable CHD mortality, whereas Egypt, Pakistan, and Canada were estimated to have the highest proportional TFA‐attributable CHD mortality. From 1990 to 2010 globally, the estimated proportional CHD mortality decreased by 9% for insufficient n‐6 PUFA and by 21% for higher SFA, whereas it increased by 4% for higher TFA, with the latter driven by increases in low‐ and middle‐income countries. Conclusions: Nonoptimal intakes of n‐6 PUFA, TFA, and SFA each contribute to significant estimated CHD mortality, with important heterogeneity across countries that informs nation‐specific clinical, public health, and policy priorities.
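    The comparative risk assessment framework combines exposure prevalence with relative risks from the cohort meta-analyses to yield attributable deaths. A minimal sketch of the core idea, using Levin's population attributable fraction for a binary exposure with hypothetical inputs (the actual analysis integrates over continuous intake distributions and propagates uncertainty through the Bayesian model):

    ```python
    def attributable_fraction(prevalence, relative_risk):
        """Levin's population attributable fraction for a binary exposure.

        prevalence: proportion of the population exposed (0-1)
        relative_risk: risk of the outcome in exposed vs. unexposed
        """
        excess = prevalence * (relative_risk - 1.0)
        return excess / (excess + 1.0)

    # Hypothetical illustration (numbers are NOT from the study):
    # 40% of a population with high TFA intake, relative risk 1.3.
    paf = attributable_fraction(0.40, 1.3)
    attributable_deaths = paf * 100_000  # per 100,000 CHD deaths
    ```

    Scaling the fraction by total CHD deaths in a stratum gives the absolute attributable mortality for that country/age/sex group.
    
    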

    Electrodiagnostic subtyping in Guillain–Barré syndrome patients in the International Guillain–Barré Outcome Study

    Background and purpose: Various electrodiagnostic criteria have been developed in Guillain–Barré syndrome (GBS). Their performance in a broad representation of GBS patients has not been evaluated. Motor conduction data from the International GBS Outcome Study (IGOS) cohort were used to compare two widely used criterion sets and relate these to diagnostic amyotrophic lateral sclerosis criteria. Methods: From the first 1500 patients in IGOS, nerve conduction studies from 1137 (75.8%) were available for the current study. These patients were classified according to nerve conduction study criteria proposed by Hadden and by Rajabally. Results: Of the 1137 studies, 68.3% (N = 777) were classified identically according to the criteria of Hadden and Rajabally: 111 (9.8%) axonal, 366 (32.2%) demyelinating, 195 (17.2%) equivocal, 35 (3.1%) inexcitable and 70 (6.2%) normal. Thus, 360 studies (31.7%) were classified differently. The areas of difference were as follows: 155 studies (13.6%) classified as demyelinating by Hadden and axonal by Rajabally; 122 studies (10.7%) classified as demyelinating by Hadden and equivocal by Rajabally; and 75 studies (6.6%) classified as equivocal by Hadden and axonal by Rajabally. Due to more strictly defined cutoffs, fewer patients fulfilled demyelinating criteria by Rajabally than by Hadden, making more patients eligible for axonal or equivocal classification by Rajabally. In 234 (68.6%) axonal studies by Rajabally the revised El Escorial (amyotrophic lateral sclerosis) criteria were fulfilled; in axonal cases by Hadden this was 1.8%. Conclusions and discussion: This study shows that electrodiagnostic subtyping in GBS depends on the criterion set utilized; both sets are based on expert opinion. Reappraisal of electrodiagnostic subtyping in GBS is warranted.
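    The agreement figures reported above follow directly from the stated counts. A small sketch that reproduces the percentages (all counts taken from the abstract):

    ```python
    total = 1137  # studies available for classification

    # Studies classified identically by the Hadden and Rajabally criteria.
    identical = {
        "axonal": 111,
        "demyelinating": 366,
        "equivocal": 195,
        "inexcitable": 35,
        "normal": 70,
    }

    agreed = sum(identical.values())  # studies with concordant subtype
    discordant = total - agreed


    def pct(n):
        """Percentage of the total cohort, to one decimal place."""
        return round(100 * n / total, 1)
    ```

    Running `pct(agreed)` and `pct(discordant)` recovers the 68.3% concordance and 31.7% discordance reported in the Results.
    
    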

    Accelerated surgery versus standard care in hip fracture (HIP ATTACK): an international, randomised, controlled trial


    Incident type 2 diabetes attributable to suboptimal diet in 184 countries

    The global burden of diet-attributable type 2 diabetes (T2D) is not well established. This risk assessment model estimated T2D incidence among adults attributable to direct and body weight-mediated effects of 11 dietary factors in 184 countries in 1990 and 2018. In 2018, an estimated 14.1 million (95% uncertainty interval (UI), 13.8–14.4 million) incident T2D cases were attributable to suboptimal intake of these dietary factors, representing 70.3% (68.8–71.8%) of new cases globally. The largest T2D burdens were attributable to insufficient whole-grain intake (26.1% (25.0–27.1%)), excess refined rice and wheat intake (24.6% (22.3–27.2%)) and excess processed meat intake (20.3% (18.3–23.5%)). Across regions, the highest proportional burdens were in central and eastern Europe and central Asia (85.6% (83.4–87.7%)) and Latin America and the Caribbean (81.8% (80.1–83.4%)); the lowest proportional burdens were in South Asia (55.4% (52.1–60.7%)). Proportions of diet-attributable T2D were generally larger in men than in women and were inversely correlated with age. Diet-attributable T2D was generally larger among urban versus rural residents and higher versus lower educated individuals, except in high-income countries, central and eastern Europe and central Asia, where burdens were larger in rural residents and in lower educated individuals. Compared with 1990, global diet-attributable T2D increased by 2.6 absolute percentage points (8.6 million more cases) in 2018, with variation in these trends by world region and dietary factor. These findings inform nutritional priorities and clinical and public health planning to improve dietary quality and reduce T2D globally.

    Switch from intravenous to subcutaneous immunoglobulin in CIDP and MMN: improved tolerability and patient satisfaction

    Objectives: To assess clinical outcomes and patient satisfaction in patients with chronic inflammatory demyelinating polyradiculoneuropathy (CIDP) or multifocal motor neuropathy (MMN) who were switched from intravenous immunoglobulin (IVIG) to subcutaneous immunoglobulin (SCIG). Methods: Eight consecutive patients, four with MMN and four with CIDP, already on long-term, hospital-based IVIG were switched to home-based SCIG. These patients were selected on the basis of their requirement for relatively low treatment doses, problems experienced with IVIG, and their willingness to switch to SCIG. Results: After a mean 33 [standard deviation (SD) 19] months receiving SCIG, 7 patients remained neurologically stable and 6 remained on a similar mean weekly immunoglobulin dose relative to their original intravenous dose. A good outcome was reported by 7 of the 8 patients: there were improvements in nausea and headache (n = 4), need to travel to hospital (n = 4), venous access problems (n = 3), immunoglobulin-induced neutropenia (n = 3), treatment wearing-off fluctuations (n = 2), IVIG-induced allergy requiring antihistamine/hydrocortisone (n = 1) and time taken off work (n = 1). The eighth patient required increasing doses of immunoglobulin to maintain strength but still wanted to continue SCIG. Seven patients completed a questionnaire: there was a very high overall satisfaction level with immunoglobulin treatment [mean 96 (SD 5), visual analogue scale (VAS) where 0 = very unsatisfied, 100 = very satisfied] and a very strong preference for subcutaneous over intravenous immunoglobulin [VAS mean 93 (SD 12), where 0 = prefer IVIG, 100 = prefer SCIG]. Conclusions: In seven of the eight patients, SCIG gave improved tolerability and patient satisfaction with similar efficacy compared with IVIG.

    Effect of microfibril twisting on theoretical powder diffraction patterns of cellulose Iβ

    Previous studies of calculated diffraction patterns for cellulose crystallites suggest that distortions that arise once models have been subjected to molecular dynamics (MD) simulation are the result of both microfibril twisting and changes in unit cell dimensions induced by the empirical force field; to date, it has not been possible to separate the individual contributions of these effects. To provide a better understanding of how twisting manifests in diffraction data, the present study demonstrates a method for generating twisted and linear cellulose structures that can be compared without the bias of dimensional changes, allowing assessment of the impact of twisting alone. Analysis of unit cell dimensions, microfibril volume, hydrogen bond patterns, glycosidic torsion angles, and hydroxymethyl group orientations confirmed that the twisted and linear structures collected with this method were internally consistent, and theoretical powder diffraction patterns for the two were shown to be effectively indistinguishable. These results indicate that differences between calculated patterns for the crystal coordinates and twisted structures from MD simulation can result entirely from changes in unit cell dimensions, and not from microfibril twisting. Although powder diffraction patterns for models in the 81-chain size regime were shown to be unaffected by twisting, suggesting that a modest degree of twist is not inconsistent with available crystallographic data, it may be that other diffraction techniques are capable of detecting this structural difference. Until definitive experimental evidence comes to light, the results of this study suggest that both twisted and linear microfibrils may represent an appropriate model for cellulose Iβ.
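    Theoretical powder patterns of the kind compared above are commonly computed from atomic coordinates via the Debye scattering equation, I(q) = Σᵢⱼ fᵢfⱼ sin(q·rᵢⱼ)/(q·rᵢⱼ). Whether this was the exact route used by the authors is not stated in the abstract, so the sketch below is purely illustrative, with a uniform scattering factor assumed for all atoms:

    ```python
    import numpy as np


    def debye_intensity(coords, q, f=1.0):
        """Powder diffraction intensity via the Debye scattering equation.

        coords: (N, 3) atomic coordinates
        q: (M,) scattering-vector magnitudes
        f: uniform atomic scattering factor (a simplifying assumption)
        """
        # Pairwise interatomic distances, shape (N, N).
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        qr = q[:, None, None] * d[None, :, :]
        # sin(x)/x, with the x = 0 diagonal (self-pairs) handled explicitly.
        sinc = np.where(qr == 0.0, 1.0,
                        np.sin(qr) / np.where(qr == 0.0, 1.0, qr))
        return (f * f) * sinc.sum(axis=(1, 2))


    # Illustration: a two-atom "crystal" one distance unit apart.
    coords = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
    q = np.linspace(0.1, 10.0, 200)
    intensity = debye_intensity(coords, q)
    ```

    Because the equation depends only on interatomic distances, twisted and linear models with matched unit cell dimensions can yield nearly identical powder patterns, which is consistent with the study's finding.
    
    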
