
    Long term survival after evidence based treatment of acute myocardial infarction and revascularisation: follow-up of population based Perth MONICA cohort, 1984-2005

    Objective: To examine trends in long term survival in patients alive 28 days after myocardial infarction, and the impact of evidence based medical treatments and coronary revascularisation during or near the event.

    Relationships between Molecular Characteristics of Novel Organic Selenium Compounds and the Formation of Sulfur Compounds in Selenium Biofortified Kale Sprouts

    Because selenium deficiency is a problem in human nutrition, there is a strong need to identify new organic molecules containing this element for use in plant biofortification. The selenium organic esters evaluated in this study (E-NS-4, E-NS-17, E-NS-71, EDA-11, and EDA-117) are based mostly on benzoselenoate scaffolds, with additional halogen atoms and various functional groups in aliphatic side chains of different lengths, while one compound contains a phenylpiperazine moiety (WA-4b). In our previous study, biofortification of kale sprouts with organoselenium compounds (at a concentration of 15 mg/L in the culture fluid) strongly enhanced the synthesis of glucosinolates and isothiocyanates. The present study therefore aimed to uncover the relationships between the molecular characteristics of the organoselenium compounds used and the amount of sulfur phytochemicals in kale sprouts. A partial least squares (PLS) model was applied to reveal the correlation structure between molecular descriptors of the selenium compounds (predictive parameters) and biochemical features of the studied sprouts (response parameters); its eigenvalues were 3.98 and 1.03 for the first and second latent components, respectively, and it explained 83.5% of the variance in the predictive parameters and 78.6% of the variance in the response parameters (correlation coefficients for parameters in the PLS model ranged from −0.521 to 1.000). This study supports the conclusion that future biofortifiers composed of organic compounds should simultaneously contain nitryl groups, which may facilitate the production of plant-based sulfur compounds, and organoselenium moieties, which may influence the production of low molecular weight selenium metabolites. In the case of new chemical compounds, environmental aspects should also be evaluated. This research received no external funding.
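
    The abstract above reports a PLS model linking molecular descriptors to phytochemical responses and quotes the variance each block explains. A minimal sketch of that kind of calculation with scikit-learn, using invented placeholder data rather than the authors' descriptors or measurements, might look like this:

```python
# Illustrative PLS sketch only: placeholder data, not the study's dataset or code.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 6))                                      # hypothetical molecular descriptors
Y = X @ rng.normal(size=(6, 4)) + 0.5 * rng.normal(size=(30, 4))  # hypothetical phytochemical responses

pls = PLSRegression(n_components=2, scale=False)  # two latent components, as in the abstract
pls.fit(X, Y)

# Fraction of X-variance captured by the two latent components
Xc = X - X.mean(axis=0)
T = pls.transform(X)                  # X scores of the training data
X_hat = T @ pls.x_loadings_.T         # reconstruction of centred X from scores and loadings
var_explained_X = 1 - np.sum((Xc - X_hat) ** 2) / np.sum(Xc ** 2)

# Fraction of Y-variance explained by the fitted model (R^2 of the prediction)
var_explained_Y = pls.score(X, Y)

print(f"X variance explained: {var_explained_X:.1%}")
print(f"Y variance explained: {var_explained_Y:.1%}")
```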

    Light smoking at base-line predicts a higher mortality risk to women than to men; evidence from a cohort with long follow-up

    BACKGROUND: There is conflicting evidence as to whether smoking is more harmful to women than to men. The UK Cotton Workers’ Cohort was recruited in the 1960s and contained a high proportion of men and women smokers who were well matched in terms of age, job and length of time in job. The cohort has been followed up for 42 years. METHODS: Mortality in the cohort was analysed using an individual relative survival method and Cox regression. Whether smoking, ascertained at baseline in the 1960s, was more hazardous to women than to men was examined by estimating the women-to-men relative risk ratio (smokers versus never smokers) for light (1–14), medium (15–24) and heavy (25+ cigarettes per day) smoking, and for former smoking. RESULTS: For all-cause mortality, the relative risk ratios were 1.35 for light smoking at baseline (95% CI 1.07-1.70), 1.15 for medium smoking (95% CI 0.89-1.49) and 1.00 for heavy smoking (95% CI 0.63-1.61). The relative risk ratio for light smoking at baseline was 1.42 (95% CI 1.01 to 1.98) for circulatory system disease and 1.89 (95% CI 0.99 to 3.63) for respiratory disease. Heights of participants provided no explanation for the gender difference. CONCLUSIONS: Light smoking at baseline was shown to be significantly more hazardous to women than to men, but the effect decreased as consumption increased, indicating a dose-response relationship. Heavy smoking was equally hazardous to both genders. This result may help explain the conflicting evidence seen elsewhere. However, gender differences in smoking cessation may provide an alternative explanation.
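
    The women-to-men comparison above is a ratio of two relative risks. A small, hypothetical sketch of how such a ratio and an approximate confidence interval can be formed on the log scale (the inputs are placeholders, not the cohort's estimates):

```python
# Hypothetical relative-risk-ratio sketch: illustrative numbers only, not cohort values.
import math

def rrr_with_ci(rr_women, se_log_women, rr_men, se_log_men, z=1.96):
    """Ratio of two relative risks with an approximate 95% CI (log-normal assumption)."""
    rrr = rr_women / rr_men
    se_log_rrr = math.sqrt(se_log_women ** 2 + se_log_men ** 2)
    lo = math.exp(math.log(rrr) - z * se_log_rrr)
    hi = math.exp(math.log(rrr) + z * se_log_rrr)
    return rrr, lo, hi

# Example with made-up inputs
print(rrr_with_ci(rr_women=1.9, se_log_women=0.10, rr_men=1.4, se_log_men=0.09))
```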

    Is brief advice in primary care a cost-effective way to promote physical activity?

    Aim: This study models the cost-effectiveness of brief advice (BA) in primary care for physical activity (PA), addressing the limitations of the existing, limited economic literature through the use of a time-based modelling approach. Methods: A Markov model was used to compare the lifetime costs and outcomes of a cohort of 100 000 people exposed to BA versus usual care. Health outcomes were expressed in terms of quality-adjusted life years (QALYs). Costs were assessed from a health provider perspective (£, 2010/11 prices). Data to populate the model were derived from systematic literature reviews and from the literature searches of economic evaluations conducted for national guidelines. Deterministic and probabilistic sensitivity analyses explored the uncertainty in parameter estimates, including short-term mental health gains associated with PA. Results: Compared with usual care, BA is more expensive, incurring additional costs of £806 809, but it is more effective, leading to 466 QALYs gained in the total cohort, a QALY gain of 0.0047/person. The incremental cost per QALY of BA is £1730 (including mental health gains), so BA can be considered cost-effective at a threshold of £20 000/QALY. Most changes in assumptions resulted in the incremental cost-effectiveness ratio (ICER) falling at or below £12 000/QALY gained. However, when short-term mental health gains were excluded, the ICER was £27 000/QALY gained. The probabilistic sensitivity analysis showed that, at a threshold of £20 000/QALY, there was a 99.9% chance that BA would be cost-effective. Conclusions: BA is a cost-effective way to improve PA among adults, provided short-term mental health gains are considered. Further research is required to provide more accurate evidence on factors contributing to the cost-effectiveness of BA. NICE Centre for Public Health Excellence.
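
    The reported ICER follows directly from the incremental cost and QALY figures quoted in the abstract; a quick arithmetic check:

```python
# Worked check of the ICER using the figures quoted in the abstract.
incremental_cost = 806_809      # additional lifetime cost of brief advice, £
incremental_qalys = 466         # QALYs gained across the 100 000-person cohort

icer = incremental_cost / incremental_qalys
print(f"ICER ≈ £{icer:,.0f} per QALY gained")   # ≈ £1,731/QALY, consistent with the reported £1730

threshold = 20_000              # willingness-to-pay threshold, £/QALY
print("cost-effective at threshold" if icer <= threshold else "not cost-effective")
```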

    Research ethics committees: agents of research policy?

    The purpose of this commentary is to describe the unintended effects ethics committees may have on research and to analyse the regulatory and administrative problems of clinical trials. DISCUSSION: Finnish law makes an arbitrary distinction between medical research and other health research, and the European Union's directive on good clinical trials further differentiates drug trials. The starting point of the current rules is that clinical trials serve the interests of patients and society less than routine health care does. However, commercial interests are not considered unethical. The contrasting procedures in research and normal health care may tempt physicians to continue introducing innovations into practice by relying on unsystematic and uncontrolled observations. Tedious and bureaucratic rules may lead to the disappearance of trials initiated by researchers. Trying to accommodate the special legislative requirements for new drug trials within more complex interventions may result in poor designs with unreliable results and increased costs. Meanwhile, current legal requirements may undermine the morale of ethics committee members. CONCLUSION: The aims and the quality of the work of ethics committees should be evaluated, and a reformulation of the EU directive on good clinical trials is needed. Ethical judgement should consider the specific circumstances of each trial, and ethics committees should not foster poor research for legal reasons.

    Studies on changes of estimated breeding values of U.S. Holstein bulls for final score from the first to second crop of daughters

    The purpose of this study was to find ways of reducing changes in sire predicted transmitting abilities for final score (PTATs) from the first to the second crop of daughters. The PTATs were estimated from two datasets: D01 (scores recorded up to 2001) and D05 (scores recorded up to 2005). The PTAT changes were calculated as the difference between the evaluations based on D01 and D05. The PTATs were adjusted to a common genetic base of all evaluated cows born in 1995. The single-trait (ST) animal model included the fixed effects of herd–year–season–classifier, age by year group at classification, stage of lactation at classification, and registry status of animals, as well as additive genetic and permanent environment random effects. Unknown parent groups (UPGs) were defined based on every other birth year starting from 1972. Modifications to the ST model included the use of a single record per cow, separate UPGs for first and second crop daughters, separate UPGs for sires and dams, and deepened pedigrees for dams with missing phenotypic records. In addition, a multiple-trait (MT) model treated records of registered and grade cows as correlated traits. The mean PTAT change, for all of the sires, was close to zero in all of the models analyzed. The estimated mean PTAT change for 145 sires with 40 to 100 first crop and ≥200 second crop daughters was −0.33, −0.20, −0.13, −0.28, and −0.12 with ST, only first records, only last records, updated pedigrees, and separate parent groups (PGs) for sires and dams after updating the pedigrees, respectively. The percentage of sires showing a PTAT decline was reduced from 74.5 (with ST) to 57.3 by using only the last records of cows, and to 56.4 by allowing separate UPGs for sires and dams after updating the pedigrees. Although updating the pedigrees alone was not effective, separate UPGs for sires and dams together with the additional pedigree information were helpful in reducing the bias.
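
    As a simple illustration of the change metric described above (difference between the later and earlier evaluations, and the share of sires whose PTAT declined), here is a minimal sketch on invented data; the direction of the subtraction is an assumption for illustration, not taken from the study:

```python
# Minimal PTAT-change illustration: invented evaluations, not the authors' data.
import numpy as np

rng = np.random.default_rng(1)
ptat_d01 = rng.normal(loc=1.0, scale=0.8, size=145)                  # hypothetical first-crop evaluations
ptat_d05 = ptat_d01 + rng.normal(loc=-0.2, scale=0.3, size=145)      # hypothetical second-crop evaluations

change = ptat_d05 - ptat_d01                                         # assumed direction: later minus earlier
print(f"mean PTAT change: {change.mean():.2f}")
print(f"sires with a decline: {(change < 0).mean():.1%}")
```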

    Estimation of genetic parameters for feed efficiency traits using random regression models in dairy cattle.

    Feed efficiency has become an increasingly important research topic in recent years. As feed costs rise and the environmental impacts of agriculture become more apparent, improving the efficiency with which dairy cows convert feed into milk is a growing priority. However, feed intake is expensive to measure accurately in large populations, making the inclusion of this trait in breeding programs difficult. Understanding how the genetic parameters of feed efficiency, and of traits related to it, vary throughout the lactation period provides insight into the genetic nature of feed efficiency. This study used 121,226 dry matter intake (DMI) records, 120,500 energy corrected milk (ECM) records, and 98,975 metabolic body weight (MBW) records, collected on 7,440 first lactation Holstein cows from 6 countries (Canada, Denmark, Germany, Spain, Switzerland, and the United States of America) from January 2003 to February 2022. Genetic parameters were estimated using a multiple-trait random regression model with a fourth order Legendre polynomial for all traits. Weekly phenotypes for DMI were re-parameterized using linear regressions of DMI on ECM and MBW, creating a measure of feed efficiency that was genetically corrected for ECM and MBW, referred to as genomic residual feed intake (gRFI). Heritability (SE) estimates varied from 0.15 (0.03) to 0.29 (0.02) for DMI, 0.24 (0.01) to 0.29 (0.03) for ECM, 0.55 (0.03) to 0.83 (0.05) for MBW, and 0.12 (0.03) to 0.22 (0.06) for gRFI. In general, heritability estimates were lower in the first stage of lactation than in later stages. Additive genetic correlations between weeks of lactation varied, with stronger correlations between weeks that were close together. The results of this study contribute to a better understanding of how genetic parameters change across the first lactation, providing insight into potential selection strategies for including feed efficiency in breeding programs.
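
    The residual-feed-intake idea described above can be illustrated phenotypically: regress DMI on ECM and MBW and treat the residual as the efficiency measure. The sketch below uses invented data and a plain least-squares fit; the study's gRFI applies this correction genetically within the random regression model:

```python
# Phenotypic residual-feed-intake sketch on invented data, for illustration only.
import numpy as np

rng = np.random.default_rng(2)
n = 500
ecm = rng.normal(30, 5, n)             # hypothetical energy corrected milk, kg/d
mbw = rng.normal(130, 10, n)           # hypothetical metabolic body weight, kg^0.75
dmi = 0.35 * ecm + 0.10 * mbw + rng.normal(0, 1.0, n)   # hypothetical dry matter intake, kg/d

X = np.column_stack([np.ones(n), ecm, mbw])
coef, *_ = np.linalg.lstsq(X, dmi, rcond=None)
rfi = dmi - X @ coef                   # residual feed intake: intake not explained by ECM and MBW
print(f"mean RFI: {rfi.mean():.3f} (centred near zero by construction)")
```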

    Ethics of controlled human infection to study COVID-19

    Development of an effective vaccine is the clearest path to controlling the coronavirus disease 2019 (COVID-19) pandemic. To accelerate vaccine development, some researchers are pursuing, and thousands of people have expressed interest in participating in, controlled human infection studies (CHIs) with severe acute respiratory syndrome–coronavirus 2 (SARS-CoV-2) (1, 2). In CHIs, a small number of participants are deliberately exposed to a pathogen to study infection and gather preliminary efficacy data on experimental vaccines or treatments. We have been developing a comprehensive, state-of-the-art ethical framework for CHIs that emphasizes their social value as fundamental to justifying these studies. The ethics of CHIs in general are underexplored (3, 4), and ethical examinations of SARS-CoV-2 CHIs have largely focused on whether the risks are acceptable and participants could give valid informed consent (1). The high social value of such CHIs has generally been assumed. Based on our framework, we agree on the ethical conditions for conducting SARS-CoV-2 CHIs (see the table). We differ on whether the social value of such CHIs is sufficient to justify the risks at present, given uncertainty about both in a rapidly evolving situation; yet we see none of our disagreements as insurmountable. We provide ethical guidance for research sponsors, communities, participants, and the essential independent reviewers considering SARS-CoV-2 CHIs.

    Impact of the "Tobacco control law" on exposure to environmental tobacco smoke in Spain

    Background: Initial evaluations of legislation regulating smoking in enclosed public places in European countries describe an important effect on the control of exposure to environmental tobacco smoke. However, the evidence is still limited. The objective of this study is to estimate the short-term effects of the comprehensive "Tobacco control law" introduced in Spain in January 2006, which includes a total ban on smoking in workplaces and a partial limitation of smoking in bars and restaurants. Methods: Cross-sectional, population-based study. Self-reported exposure to environmental tobacco smoke at home, at work, and in bars and restaurants among the population aged 18 to 64 years in the Madrid Region during a period prior to the law (October and November 2005; n = 1750) was compared with that of the period immediately after the law came into force (January-July 2006; n = 1252). Adjusted odds ratios (OR) were calculated using logistic regression models. Results: Passive exposure to tobacco smoke at home hardly changed. However, at indoor workplaces there was a considerable reduction: after the law came into force, the OR for daily exposure of >0 to 3 hours versus non-exposure was 0.11 (95% CI: 0.07 to 0.17), and for more than 3 hours it was 0.12 (95% CI: 0.09 to 0.18). For fairly high exposure versus non-exposure, the OR was 0.30 (95% CI: 0.20 to 0.44) in bars and 0.24 (95% CI: 0.18 to 0.32) in restaurants; for very high exposure versus non-exposure, the ORs were 0.16 (95% CI: 0.10 to 0.24) and 0.11 (95% CI: 0.07 to 0.19), respectively. These results were similar for the smoking and non-smoking populations. Conclusion: A considerable reduction in exposure to environmental tobacco smoke in the workplace and, to a lesser extent, in bars and restaurants, is related to the implementation of the "Tobacco control law". Although these are only initial figures, the results already demonstrate the effectiveness of strategies that establish control measures to guarantee smoke-free places.
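
    The adjusted odds ratios above come from logistic regression models comparing exposure before and after the law. A minimal, hypothetical sketch of that kind of estimate (invented data and covariates, not the study's survey):

```python
# Hypothetical adjusted-OR sketch: invented survey data, illustrative covariates only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 3000
post_law = rng.integers(0, 2, n)          # 0 = pre-law survey, 1 = post-law survey
age = rng.uniform(18, 64, n)
smoker = rng.integers(0, 2, n)

# Simulate workplace exposure with a strong reduction after the law (log-odds scale)
logit = 0.5 - 2.1 * post_law + 0.01 * (age - 40) + 0.3 * smoker
exposed = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([post_law, age, smoker]))
fit = sm.Logit(exposed, X).fit(disp=0)
or_post = np.exp(fit.params[1])           # adjusted OR for post-law vs. pre-law exposure
print(f"adjusted OR (post vs. pre law): {or_post:.2f}")
```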