
    Low density lipoprotein cholesterol control status among Canadians at risk for cardiovascular disease: findings from the Canadian Primary Care Sentinel Surveillance Network Database

    Background: To determine the prevalence of uncontrolled LDL-C in patients at high cardiovascular disease (CVD) risk across Canada and to examine its related factors. Methods: Non-pregnant adults >20 years old who had a lipid test completed between January 1, 2009 and December 31, 2011 and were included in the Canadian Primary Care Sentinel Surveillance Network (CPCSSN) database were studied. The Framingham Risk Score was calculated to determine risk levels, and a serum LDL-C level >2.0 mmol/L was considered poorly controlled. Patients with a previous record of a cerebrovascular accident, peripheral artery disease, or ischemic heart disease were regarded as under secondary prevention. Logistic regression modeling was performed to examine the factors associated with LDL-C control. Results: A total of 6,405 high-risk patients were included in the study; of this population, 68% had suboptimal LDL-C, which was significantly associated with female sex (OR: 3.26; 95% CI: 2.63–4.05, p < 0.0001) and no medication therapy (OR: 6.31; 95% CI: 5.21–7.65, p < 0.0001). Patients with comorbid diabetes, hypertension, or obesity, and smokers, had better LDL-C control. Rural residents (OR: 0.64; 95% CI: 0.52–0.78, p < 0.0001) and those under secondary prevention (OR: 0.42; 95% CI: 0.35–0.51, p < 0.0001) were also more likely to have better LDL-C control. Conclusion: A high proportion of high-cardiac-risk patients in Canadian primary care settings have suboptimal LDL-C control; a lack of medication therapy appears to be the major contributing factor.
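
    The factor analysis described above rests on logistic regression over patient-level indicators. The following is a rough, hedged sketch of that modeling step only: the column names, simulated effect directions, and data are illustrative placeholders, not the CPCSSN extract or the authors' exact model.

    # Hedged sketch only: simulated data stand in for the CPCSSN extract.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 2000

    # Hypothetical patient-level indicators (1 = yes, 0 = no).
    female = rng.integers(0, 2, n)
    on_lipid_rx = rng.integers(0, 2, n)        # any lipid-lowering medication
    rural = rng.integers(0, 2, n)
    secondary_prev = rng.integers(0, 2, n)     # prior CVA, PAD, or IHD

    # Simulate "uncontrolled" status (LDL-C > 2.0 mmol/L) with illustrative
    # effects in the same directions as the reported associations.
    logit_p = 0.3 + 1.2 * female - 1.8 * on_lipid_rx - 0.4 * rural - 0.9 * secondary_prev
    uncontrolled = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    df = pd.DataFrame(dict(uncontrolled=uncontrolled, female=female,
                           on_lipid_rx=on_lipid_rx, rural=rural,
                           secondary_prev=secondary_prev))

    fit = smf.logit("uncontrolled ~ female + on_lipid_rx + rural + secondary_prev",
                    data=df).fit(disp=0)

    # Odds ratios with 95% confidence intervals, the form in which the study reports results.
    print(np.exp(pd.concat([fit.params, fit.conf_int()], axis=1)))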

    An Investigation of Cancer Rates in the Argentia Region, Newfoundland and Labrador: An Ecological Study

    Background. The Argentia region of Newfoundland and Labrador, Canada, was home to a US naval base during a 40-year period between the 1940s and the 1990s. Activities on the base contaminated the soil and groundwater in the region with chemicals such as heavy metals and dioxins, and residents have expressed concern about higher rates of cancer in their community. This study investigated whether the rate of cancer diagnosis is disproportionately high in the Argentia region. Methods. Cases of cancer diagnosed between 1985 and 2011 were obtained for the Argentia region, two comparison communities, and the province of Newfoundland and Labrador. Crude incidence rates were calculated and then adjusted for differences in age structure using census data, and the resulting age-standardized incidence rates were compared. Results. Although the Argentia region had a higher crude rate of cancer diagnosis, its age-standardized incidence rate did not differ significantly from those of the comparison communities or the provincial average. Argentia has an aging population, which may have influenced the perception of increased cancer diagnosis in the community. Conclusions. We did not detect an increased burden of cancer in the Argentia region.
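
    The comparison above turns on direct age standardization: each community's age-specific rates are applied to a common standard population so that an older age structure does not inflate the comparison. A hedged numerical sketch follows; every figure is invented for illustration and none comes from the study.

    import numpy as np

    # Invented data for an older community (not the Argentia figures):
    # cancer cases and person-years at risk by age band, oldest bands last.
    cases = np.array([1, 3, 18, 135])
    person_years = np.array([1000.0, 2000.0, 3500.0, 4500.0])   # most person-time is old
    # Shares of a younger standard (reference) population in the same age bands.
    std_weights = np.array([0.40, 0.30, 0.20, 0.10])

    crude = cases.sum() / person_years.sum() * 100_000           # per 100,000
    age_specific = cases / person_years * 100_000
    standardized = (age_specific * std_weights).sum()            # direct standardization

    # The crude rate looks alarming, the age-standardized rate much less so.
    print(f"crude: {crude:.0f} per 100,000, age-standardized: {standardized:.0f} per 100,000")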

    Six-year time-trend analysis of dyslipidemia among adults in Newfoundland and Labrador: findings from the laboratory information system between 2009 and 2014

    Background: Dyslipidemia, defined as an increased level of total cholesterol (TC), triglycerides (TG), or low-density-lipoprotein cholesterol (LDL-C), or a decreased level of high-density-lipoprotein cholesterol (HDL-C), is one of the most important risk factors for cardiovascular disease. We examined the six-year trend of dyslipidemia in Newfoundland and Labrador (NL), a Canadian province with a historically high prevalence of dyslipidemia. Methods: A serial cross-sectional study of all laboratory lipid tests available from 2009 to 2014 was performed. Dyslipidemia for each lipid component was defined using the Canadian Guidelines for the Diagnosis and Treatment of Dyslipidemia, and annual dyslipidemia rates for each component of the serum lipid profile were examined. A fixed- and random-effects model was applied to adjust for confounding variables (sex and age) and random effects (residual variation in dyslipidemia over the years and redundancy from individuals being tested multiple times during the study period). Results: Between 2009 and 2014, a total of 875,208 records (mean age: 56.9 ± 14.1 years; 47.6% male) containing a lipid profile were identified. The prevalence of HDL-C and LDL-C dyslipidemia decreased significantly during this period (HDL-C: 35.8% in 2009 [95% CI: 35.5-36.1] to 29.0% in 2014 [95% CI: 28.8-29.2], P = 0.03; LDL-C: 35.2% in 2009 [95% CI: 34.9-35.4] to 32.1% in 2014 [95% CI: 31.9-32.3], P = 0.02). Stratification by sex revealed no significant trend for any lipid component in females; in men, however, the observed trends were stronger and a new decreasing trend in TC dyslipidemia appeared (TC: 34.1% [95% CI: 33.7-34.5] to 32.3% [95% CI: 32.0-32.6], P < 0.02; HDL-C: 33.8% [95% CI: 33.3-34.2] to 24.0% [95% CI: 23.7-24.3], P < 0.01; LDL-C: 32.9% [95% CI: 32.5-33.3] to 28.6% [95% CI: 28.3-28.9], P < 0.001). Adjusting for confounding factors and removing residual noise by modeling the random effects did not change these findings. Conclusion: This study demonstrates a significant downward trend in the prevalence of LDL-C, HDL-C, and TC dyslipidemia, exclusively in men. These trends could be the result of males being the primary target for cardiovascular risk management.
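
    As a hedged, simplified sketch of the fixed- and random-effects adjustment described above: the snippet below simulates repeat lipid tests and fits a linear mixed model on the measured LDL-C with a random intercept per patient. This is not the authors' exact specification (they modeled dyslipidemia status); the variable names and effect sizes are illustrative placeholders.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n_patients, tests_per_patient = 500, 3
    patient = np.repeat(np.arange(n_patients), tests_per_patient)
    year = rng.integers(2009, 2015, patient.size)               # test year, 2009-2014
    age = np.repeat(rng.normal(57, 14, n_patients), tests_per_patient)
    male = np.repeat(rng.integers(0, 2, n_patients), tests_per_patient)

    # Simulated LDL-C (mmol/L): a small downward secular trend in men, a
    # patient-level random intercept (repeat testing), and measurement noise.
    patient_effect = np.repeat(rng.normal(0, 0.4, n_patients), tests_per_patient)
    ldl = (3.4 - 0.03 * (year - 2009) * male + 0.005 * (age - 57)
           + patient_effect + rng.normal(0, 0.5, patient.size))

    df = pd.DataFrame(dict(ldl=ldl, year=year - 2009, age=age, male=male,
                           patient=patient))

    # Fixed effects for year (the trend of interest), sex, and age; a random
    # intercept per patient absorbs redundancy from repeated testing.
    fit = smf.mixedlm("ldl ~ year * male + age", data=df, groups=df["patient"]).fit()
    print(fit.summary())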

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made possible the numerical evaluation of sophisticated statistical models, these models are still designed by humans because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to think as humans are able to do when faced with uncertainty is challenging in several major ways: (1) finding optimal statistical models remains to be formulated as a well-posed problem when information on the system of interest is incomplete and comes in the form of a complex combination of sample data, partial knowledge of constitutive relations, and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite-dimensional, whereas calculus on a computer is necessarily discrete and finite. With this purpose, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification, and Information-Based Complexity.
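
    The "optimal statistical estimators" referred to here are, in the classical Wald sense, minimax solutions over the set of scenarios consistent with the available information. A generic, hedged statement of that formulation follows; the notation is chosen for illustration and is not taken from the paper.

    \[
      \theta^{\dagger} \in \operatorname*{arg\,min}_{\theta \in \Theta}
      \; \sup_{\mu \in \mathcal{A}} \;
      \mathbb{E}_{X \sim \mu}\!\left[\, \mathcal{L}\big(\theta(X),\, \Phi(\mu)\big) \right],
    \]

    where \(\Theta\) is the class of candidate estimators (functions of the observed data \(X\)), \(\mathcal{A}\) is the admissible set of probability models compatible with the sample data, partial constitutive knowledge, and input-distribution information, \(\Phi(\mu)\) is the quantity of interest under scenario \(\mu\), and \(\mathcal{L}\) is a loss function. The two challenges listed above correspond to making this problem well-posed and to computing with the (typically infinite-dimensional) set \(\mathcal{A}\).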

    Evaluating Gene Drive Approaches for Public Benefit

    Gene drive approaches—those which bias inheritance of a genetic element in a population of sexually reproducing organisms—have the potential to provide important public benefits. The spread of selected genetic elements in wild populations of organisms may help address certain challenges, such as transmission of vector-borne human and animal diseases and biodiversity loss due to invasive animals. Adapting various naturally occurring gene drive mechanisms to these aims is a long-standing research area, and recent advances in genetics have made engineering gene drive systems significantly more technically feasible. Gene drive approaches would act through changes in natural environments, thus robust methods to evaluate potential research and use are important. Despite the fact that gene drive approaches build on existing paradigms, such as genetic modification of organisms and conventional biological control, there are material challenges to their evaluation. One challenge is the inherent complexity of ecosystems, which makes precise prediction of changes to the environment difficult. For gene drive approaches that are expected to spread spatially and/or persist temporally, responding to this difficulty with the typical stepwise increases in the scale of studies may not be straightforward after studies begin in the natural environment. A related challenge is that study or use of a gene drive approach may have implications for communities beyond the location of introduction, depending on the spatial spread and persistence of the approach and the population biology of the target organism. This poses a particular governance challenge when spread across national borders is plausible. Finally, community engagement is an important element of responsible research and governance, but effective community engagement for gene drive approaches requires addressing complexity and uncertainty and supporting representative participation in decision making. These challenges are not confronted in a void. Existing frameworks, processes, and institutions provide a basis for effective evaluation of gene drive approaches for public benefit. Although engineered gene drive approaches are relatively new, the necessities of making decisions despite uncertainty and governing actions with potential implications for shared environments are well established. There are methodologies to identify potential harms and assess risks when there is limited experience to draw upon, and these methodologies have been applied in similar contexts. There are also laws, policies, treaties, agreements, and institutions in place across many jurisdictions that support national and international decision making regarding genetically modified organisms and the potential applications of gene drive approaches, such as public health and biodiversity conservation. Community engagement is an established component of many decision-making processes, and related experience and conceptual frameworks can inform engagement by researchers. The existence of frameworks, processes, and institutions provides an important foundation for evaluating gene drive approaches, but it is not sufficient by itself. They must be rigorously applied, which requires resources for risk assessment, research, and community engagement and diligent implementation by governance institutions. The continued evolution of the frameworks, processes, and institutions is important to adapt to the growing understanding of gene drive approaches. 
With appropriate resources and diligence, it will be possible to responsibly evaluate and make decisions on gene drive approaches for public benefit.
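
    The "biased inheritance" at the core of a gene drive can be illustrated with a toy deterministic recursion for the drive-allele frequency under random mating. All parameter values below are invented for illustration and do not describe any particular drive system or the analyses discussed above.

    import numpy as np

    def drive_allele_frequency(p0=0.05, transmission=0.9, fitness_cost=0.1, generations=30):
        """Toy deterministic model of a gene drive allele in a random-mating population.

        `transmission` is the probability that a heterozygote passes on the drive
        allele (0.5 would be ordinary Mendelian inheritance); `fitness_cost` is the
        relative fitness reduction of drive carriers. All parameters are illustrative.
        """
        p = p0
        history = [p]
        for _ in range(generations):
            q = 1 - p
            # Relative fitness of drive homozygotes, heterozygotes, and wild type.
            w_dd, w_dw, w_ww = 1 - fitness_cost, 1 - fitness_cost, 1.0
            mean_w = p**2 * w_dd + 2 * p * q * w_dw + q**2 * w_ww
            # Drive-allele frequency in the next generation: homozygotes always
            # transmit it; heterozygotes transmit it with the biased probability.
            p = (p**2 * w_dd + 2 * p * q * w_dw * transmission) / mean_w
            history.append(p)
        return np.array(history)

    freqs = drive_allele_frequency()
    print(freqs.round(3))  # the frequency rises toward fixation despite a fitness cost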

    Hypertension and type 2 diabetes: What family physicians can do to improve control of blood pressure - an observational study

    Background: The prevalence of type 2 diabetes is rising, and most of these patients also have hypertension, substantially increasing the risk of cardiovascular morbidity and mortality. The majority of these patients do not reach target blood pressure levels, for a wide variety of reasons. Because a literature review provided no clear focus for action when patients are not at target, we initiated a study to identify characteristics of patients and providers associated with achieving target blood pressure (BP) levels in community-based practice. Methods: We conducted a practice-based, cross-sectional observational and mailed-survey study. The setting was the practices of 27 family physicians and nurse practitioners in 3 eastern provinces in Canada. The participants were all patients with type 2 diabetes who could understand English, were able to give consent, and would be available for follow-up for more than one year. Data were collected from each patient’s medical record and from each patient and physician/nurse practitioner by mailed survey. Our main outcome measures were overall blood pressure at target (< 130/80), systolic blood pressure at target, and diastolic blood pressure at target. Analysis included initial descriptive statistics, logistic regression models, and multivariate regression using hierarchical nonlinear modeling (HNLM). Results: Fifty-four percent of patients were at target for both systolic and diastolic pressures; 62% were at the systolic target and 79% were at the diastolic target. Patients who reported eating food low in salt had higher odds of reaching target blood pressure, and patients reporting low adherence to their medication regimen had lower odds of reaching target blood pressure. Conclusions: When primary care health professionals are dealing with blood pressures above target in a patient with type 2 diabetes, they should pay particular attention to two factors. They should ask about dietary salt intake, strongly emphasize the importance of reduction, and refer for detailed counseling if necessary. Similarly, they should ask about adherence to the medication regimen and employ a variety of patient-oriented strategies to improve adherence.
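
    A hedged sketch of the outcome definition and the regression step is given below, using simulated single-level data with illustrative effects. The study itself used hierarchical nonlinear modeling to account for patients clustered within practices, which is omitted here for brevity.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 800
    low_salt = rng.integers(0, 2, n)        # self-reported low-salt diet
    low_adherence = rng.integers(0, 2, n)   # self-reported low medication adherence

    # Simulated office blood pressures (mmHg); the effect sizes are illustrative only.
    sbp = rng.normal(132, 12, n) - 4 * low_salt + 5 * low_adherence
    dbp = rng.normal(78, 8, n) - 2 * low_salt + 3 * low_adherence

    df = pd.DataFrame(dict(
        at_target=((sbp < 130) & (dbp < 80)).astype(int),   # the < 130/80 criterion
        low_salt=low_salt,
        low_adherence=low_adherence,
    ))

    fit = smf.logit("at_target ~ low_salt + low_adherence", data=df).fit(disp=0)
    print(np.exp(fit.params))   # odds ratios for reaching the blood pressure target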

    First measurement of the Hubble Constant from a Dark Standard Siren using the Dark Energy Survey Galaxies and the LIGO/Virgo Binary–Black-hole Merger GW170814

    We present a multi-messenger measurement of the Hubble constant H0 using the binary–black-hole merger GW170814 as a standard siren, combined with a photometric redshift catalog from the Dark Energy Survey (DES). The luminosity distance is obtained from the gravitational-wave signal detected by the Laser Interferometer Gravitational-Wave Observatory (LIGO)/Virgo Collaboration (LVC) on 2017 August 14, and the redshift information is provided by the DES Year 3 data. Black hole mergers such as GW170814 are expected to lack bright electromagnetic emission that would uniquely identify their host galaxies and allow an object-by-object Hubble diagram; they are, however, suitable for a statistical measurement, provided that a galaxy catalog of adequate depth and redshift completeness is available. Here we present the first Hubble parameter measurement using a black hole merger. Our analysis yields an H0 estimate that is consistent with both SN Ia and cosmic microwave background measurements of the Hubble constant. The quoted 68% credible region comprises 60% of the uniform prior range [20, 140] km s−1 Mpc−1 and depends on the assumed prior range; taking a broader prior of [10, 220] km s−1 Mpc−1 yields a correspondingly weaker constraint (57% of the prior range). Although a weak constraint on the Hubble constant from a single event is expected with the dark siren method, a multifold increase in the LVC event rate is anticipated in the coming years, and combinations of many sirens will lead to improved constraints on H0.
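
    The statistical ("dark") siren measurement works by comparing the gravitational-wave luminosity distance with the distances implied by the redshifts of all candidate host galaxies in the catalog, marginalizing over the unknown host. The following is a deliberately simplified, hedged sketch: low-redshift Hubble-law distances, a Gaussian distance likelihood, no selection-effect or catalog-incompleteness corrections, and invented numbers rather than the GW170814/DES inputs.

    import numpy as np

    C_KM_S = 299_792.458  # speed of light, km/s

    def h0_posterior(h0_grid, gal_z, gal_weight, dl_obs, dl_sigma):
        """Toy dark-siren posterior on H0.

        For each trial H0, compare the GW luminosity distance with the distance
        implied by each candidate host galaxy's redshift (low-z approximation
        d_L ~ c*z/H0) and sum the Gaussian likelihood over galaxies.
        """
        post = np.zeros_like(h0_grid, dtype=float)
        for i, h0 in enumerate(h0_grid):
            dl_gal = C_KM_S * gal_z / h0    # Mpc, low-redshift approximation
            like = np.exp(-0.5 * ((dl_obs - dl_gal) / dl_sigma) ** 2)
            post[i] = np.sum(gal_weight * like)
        return post / post.sum()            # normalize under a flat prior on the grid

    # Invented inputs: a ~540 +/- 130 Mpc event and five candidate host galaxies.
    h0_grid = np.linspace(20.0, 140.0, 241)
    gal_z = np.array([0.10, 0.12, 0.13, 0.15, 0.18])
    gal_weight = np.full(gal_z.size, 1.0 / gal_z.size)
    posterior = h0_posterior(h0_grid, gal_z, gal_weight, dl_obs=540.0, dl_sigma=130.0)
    print(h0_grid[np.argmax(posterior)])    # maximum a posteriori H0 on the grid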

    20-Year Risks of Breast-Cancer Recurrence after Stopping Endocrine Therapy at 5 Years

    The administration of endocrine therapy for 5 years substantially reduces recurrence rates during and after treatment in women with early-stage, estrogen-receptor (ER)-positive breast cancer. Extending such therapy beyond 5 years offers further protection but has additional side effects. Obtaining data on the absolute risk of subsequent distant recurrence if therapy stops at 5 years could help determine whether to extend treatment.