
    Aboriginal and Torres Strait Islander absolute cardiovascular risk assessment and management: systematic review of evidence to inform national guidelines

    Australia’s absolute cardiovascular disease (CVD) risk assessment algorithm first examines whether individuals meet criteria for clinically determined high CVD risk and, in those not meeting these criteria, applies the Framingham Risk Equation to estimate an individual’s risk of having a CVD event in the next 5 years. The same risk equation is used for Aboriginal and Torres Strait Islander people and non-Indigenous Australians, although underlying risk varies across the two populations, with the former experiencing a greater burden of cardiovascular risk factors. This research was funded by a grant from the Australian Government Department of Health.
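The two-step triage described in this abstract can be sketched as follows. This is a minimal illustration, not the published algorithm: the clinical high-risk flags, the simplified risk function (a placeholder standing in for the actual Framingham equation), and the category cut-offs are all assumptions for demonstration.

```python
# Sketch of the two-step absolute CVD risk triage: step 1 checks for
# clinically determined high risk; step 2 applies a 5-year risk equation.
# The flags, coefficients, and cut-offs below are illustrative only.

def clinically_high_risk(patient: dict) -> bool:
    """Step 1: conditions conferring automatic high risk (illustrative list)."""
    return any(patient.get(flag, False)
               for flag in ("prior_cvd", "severe_ckd", "familial_hypercholesterolaemia"))

def five_year_risk(patient: dict) -> float:
    """Step 2: placeholder stand-in for the Framingham 5-year risk equation.
    The real equation uses age, sex, blood pressure, cholesterol ratio,
    smoking, and diabetes status."""
    score = 0.02 * (patient["age"] - 40) + 0.05 * patient["smoker"]
    return max(0.0, min(score, 1.0))

def risk_category(patient: dict) -> str:
    """Combine the two steps into a single risk category."""
    if clinically_high_risk(patient):
        return "high (clinically determined)"
    risk = five_year_risk(patient)
    if risk >= 0.15:          # illustrative high-risk threshold
        return "high (calculated)"
    return "moderate" if risk >= 0.10 else "low"
```

A calculated score is consulted only when no clinical criterion applies, mirroring the order of the two steps in the abstract.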

    Review of evidence for the alignment of guidelines on Aboriginal and Torres Strait Islander absolute cardiovascular disease risk: A report prepared for the Australian Government Department of Health

    Policy context: Cardiovascular disease (CVD) is highly preventable, yet it remains the largest contributor to mortality among Aboriginal and Torres Strait Islander people, and rates of CVD are disproportionately higher in this population than among non-Indigenous Australians. Improving uptake of current evidence-based approaches, such as absolute CVD risk assessment, within the Aboriginal and Torres Strait Islander population is important to address this disparity. Although several tools support an absolute CVD risk approach, clinical uptake is limited by a number of factors, including continued reliance on the outdated ‘single risk factor’ approach to the prevention, diagnosis and treatment of CVD. A major barrier to uptake is inconsistent messaging across current clinical practice guidelines. Key messages: There are three main guidelines on the absolute CVD risk approach for Aboriginal and Torres Strait Islander peoples in Australia: the NVDPA Guidelines for the Management of Absolute Cardiovascular Disease Risk; the Central Australian Rural Practitioners Association Standard Treatment Manual; and the RACGP National Guide to a Preventive Health Assessment for Aboriginal and Torres Strait Islander People. There is considerable alignment between these guidelines, including the need for an absolute risk approach, the conditions conferring automatic high risk, use of the Framingham risk equation as the basis for calculating absolute risk, and the need to treat people at greater than 15% risk of a primary CVD event over the next five years. The guidelines diverge materially on four recommendations: 1) the age at which to commence absolute CVD risk assessment; 2) whether calculated risk scores should be adjusted upward by 5%; 3) how often CVD risk should be assessed; and 4) treatment targets for blood pressure.
    Available evidence indicates that CVD events and high absolute CVD risk occur earlier in Aboriginal and Torres Strait Islander peoples, and that prevention of CVD should therefore also start earlier. The proportion of Aboriginal and Torres Strait Islander peoples at high absolute CVD risk at ages 18-34 years broadly corresponds to the proportion at high risk in the general population aged 45-54 years. Limited evidence suggests that current risk scores are likely to underestimate risk in Aboriginal and Torres Strait Islander peoples; specific data on the extent of underestimation, and alternative validated risk scores for this population, are lacking. There are no primary data on adjusting risk scores upward by 5% in Aboriginal and Torres Strait Islander peoples. The frequency of CVD risk assessment should be based on the initial level of risk, but the optimal interval for reassessment at each level of risk is not clear. The guidelines generally agree on lowering blood pressure as tolerated, but they are inconsistent on the exact blood pressure target. Evidence suggests that reductions in systolic blood pressure result in proportional reductions in CVD events and all-cause mortality. CVD guidelines could be kept up to date by adopting a ‘living’ guidelines model, but consideration needs to be given to how to identify relevant new evidence and how to integrate updates into electronic decision support tools. This research was supported by a grant from the Australian Government Department of Health.

    BHPR research (qualitative): Complex reasoning determines patients' perception of outcome following foot surgery in rheumatoid arthritis

    Background: Foot surgery is common in patients with RA, but research into surgical outcomes is limited and conceptually flawed, as current outcome measures lack face validity: to date, no one has asked patients what is important to them. This study aimed to determine which factors are important to patients when evaluating the success of foot surgery in RA. Methods: Semi-structured interviews of RA patients who had undergone foot surgery were conducted and transcribed verbatim. Thematic analysis of the interviews was conducted to explore issues that were important to patients. Results: 11 RA patients (9 male; mean age 59 years; mean disease duration 22 years; mean 3 years post-surgery) with mixed experiences of foot surgery were interviewed. Patients interpreted outcome with respect to a multitude of factors; frequently, positive change in one aspect contrasted with negative opinions about another. Overall, four major themes emerged. Function: Functional ability and participation in valued activities were very important to patients. Walking ability was a key concern, but patients interpreted levels of activity in light of other aspects of their disease, reflecting on change in functional ability more than overall level. Positive feelings of improved mobility were often moderated by negative self-perception (“I mean, I still walk like a waddling duck”). Appearance: Appearance was important to almost all patients, but was perhaps the most complex theme of all. Physical appearance, foot shape, and footwear were closely interlinked, yet patients saw these as distinct concepts. Patients' need to legitimize these feelings was clear, and they frequently entered into a defensive repertoire (“it's not cosmetic surgery; it's something that's more important than that, you know?”). Clinician opinion: Surgeons' postoperative evaluation of the procedure was very influential.
    The impact of this appraisal continued to affect patients' lasting impression, irrespective of how the outcome compared with their initial goals (“when he'd done it ... he said that hasn't worked as good as he'd wanted to ... but the pain has gone”). Pain: Whilst pain was important to almost all patients, it appeared to be less important than the other themes. Pain was predominantly raised when it influenced other themes, such as function; many patients still felt the need to legitimize their foot pain in order for health professionals to take it seriously (“in the end I went to my GP because it had happened a few times, and I went to an orthopaedic surgeon who was quite dismissive of it; it was like, what are you complaining about”). Conclusions: Patients interpret the outcome of foot surgery using a multitude of interrelated factors, particularly functional ability, appearance, and surgeons' appraisal of the procedure. While pain was often noted, it appeared less important than other factors in the overall outcome of surgery. Future research into foot surgery should incorporate the complexity of how patients determine their outcome. Disclosure statement: All authors have declared no conflicts of interest.

    The spatial dynamics of microbiota-immune interactions

    In recent years, the modulating role of the microbiota on host health has become increasingly apparent. The composition of this diverse community is intrinsic to the host-associated benefits of a microbiota and is regulated by the host’s immune system. Understanding the dynamics of microbiota-immune interactions is therefore necessary before modulating microbiota community composition for therapeutic purposes is viable. Ecological theory holds that the scale at which a community is considered changes the forces by which it is shaped. However, to date, microbiota-immune interactions have been assessed predominantly at large scales. Within this thesis, the selective pressures that control microbiota community composition are explored. An individual-based model of biofilm growth is initially used to interrogate the role of adhesion in maintaining microbiota diversity. A fine-scale method is then developed to investigate the model’s predictions of spatial heterogeneity within the microbiota, allowing consideration of whether the scale at which the microbiota and adaptive immune populations are sampled changes the apparent dynamics of these communities. At small scales, species interactions are important in community dynamics. To investigate species interactions within the microbiota, a method of estimating species interactions across space is developed. By grouping species according to their relative abundance, the role of rare individuals in community stability is investigated. Finally, an avian B-cell knockout is used to assess the role of immunoglobulin A (IgA) in modulating microbiota composition. By comparing the effect of IgA absence upon the microbiota at large and fine scales, the importance of sampling scale in elucidating immune and microbiota interactions is highlighted.

    Aromaticity and Antiaromaticity in the Excited States of Porphyrin Nanorings

    Aromaticity can be a useful concept for predicting the behavior of excited states. Here we show that π-conjugated porphyrin nanorings exhibit size-dependent excited-state global aromaticity and antiaromaticity, for rings containing up to eight porphyrin subunits, although they have no significant global aromaticity in their neutral singlet ground states. According to Baird’s law, odd rings ([4n] π-electrons) are aromatic in their excited states, whereas the excited states of even rings ([4n+2] π-electrons) are antiaromatic. These predictions are borne out by density functional theory (DFT) studies of the nucleus-independent chemical shift in the T1 triplet state of each ring, which reveal the critical importance of triplet delocalization to the emergence of excited-state aromaticity. The singlet excited states (S1) are explored by measurements of the radiative rate and fluorescence peak wavelength, revealing a subtle odd-even alternation as a function of ring size, consistent with symmetry-breaking in antiaromatic excited states.
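The odd-even alternation reported in this abstract reduces to a parity rule: odd rings ([4n] π-electrons) are predicted aromatic in the excited state, even rings ([4n+2]) antiaromatic. The sketch below (function name and string labels are our own) merely encodes that classification by subunit parity:

```python
# Baird's-rule parity classification for the excited states of porphyrin
# nanorings, as stated in the abstract: odd subunit counts -> aromatic,
# even subunit counts -> antiaromatic. Illustrative encoding only.

def excited_state_character(n_porphyrins: int) -> str:
    """Classify the excited-state global (anti)aromaticity of an
    n-porphyrin nanoring by the parity of its subunit count."""
    return "aromatic" if n_porphyrins % 2 == 1 else "antiaromatic"
```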

    The predicted effect and cost-effectiveness of tailoring colonoscopic surveillance according to mismatch repair gene in patients with Lynch syndrome

    Purpose: Lynch syndrome-related colorectal cancer (CRC) risk varies substantially by mismatch repair (MMR) gene. We evaluated the health impact and cost-effectiveness of MMR gene-tailored colonoscopic surveillance. Methods: We first estimated sex- and MMR gene-specific cumulative lifetime risk of first CRC without colonoscopic surveillance using an optimization algorithm. Next, we used these risk estimates in a microsimulation model, “Policy1-Lynch,” and compared 126 colonoscopic surveillance strategies against no surveillance. Results: The most cost-effective strategy was 3-yearly surveillance from age 25 to 70 years for heterozygotes with pathogenic variants (path_) in MLH1 (path_MLH1) or MSH2 (path_MSH2), with delayed surveillance for path_MSH6 (age 30-70 years) and path_PMS2 (age 35-70 years) heterozygotes (incremental cost-effectiveness ratio = A$8,833 per life-year saved). This strategy averted 60 CRC deaths (153 colonoscopies per death averted) over the lifetime of 1000 confirmed patients with Lynch syndrome (vs no surveillance). It also reduced colonoscopies by 5% without substantial change in health outcomes (vs nontailored 3-yearly surveillance from 25-70 years). Generally, starting surveillance at age 25 (vs 20) years was more cost-effective with minimal effect on life-years saved, and starting 5 to 10 years later for path_MSH6 and path_PMS2 heterozygotes (vs path_MLH1 and path_MSH2) further improved cost-effectiveness. Surveillance end age (70/75/80 years) had a minor effect. Three-yearly surveillance strategies were more cost-effective (vs 1- or 2-yearly) but prevented 3 fewer CRC deaths. Conclusion: MMR gene-specific colonoscopic surveillance would be effective and cost-effective.
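The gene-tailored schedule reported as most cost-effective can be written down directly from the abstract (3-yearly to age 70, with gene-specific start ages). The dict layout and function names below are our own, and this is a sketch of the reported strategy, not clinical guidance:

```python
# Start ages per MMR gene, as reported for the most cost-effective
# strategy in the abstract; interval and end age are the reported
# defaults. Names and structure are illustrative.

SURVEILLANCE_START_AGE = {
    "path_MLH1": 25,
    "path_MSH2": 25,
    "path_MSH6": 30,  # delayed start
    "path_PMS2": 35,  # delayed start
}

def surveillance_ages(mmr_gene: str, end_age: int = 70, interval: int = 3) -> list[int]:
    """Return the ages at which surveillance colonoscopy would occur
    under the gene-tailored 3-yearly strategy."""
    start = SURVEILLANCE_START_AGE[mmr_gene]
    return list(range(start, end_age + 1, interval))
```

For example, a path_MSH6 heterozygote would begin at 30 rather than 25, trimming several colonoscopies per patient, which is the mechanism behind the reported 5% reduction versus the nontailored strategy.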

    Association between Melanoma Detected during Routine Skin Checks and Mortality

    IMPORTANCE: Early melanoma diagnosis is associated with better health outcomes, but there is insufficient evidence that screening, such as having routine skin checks, reduces mortality. OBJECTIVE: To assess melanoma-specific and all-cause mortality associated with melanomas detected through routine skin checks, incidentally, or by the patient. A secondary aim was to examine patient, sociodemographic, and clinicopathologic factors associated with different modes of melanoma detection. DESIGN, SETTING, AND PARTICIPANTS: This prospective, population-based, cohort study included patients in New South Wales, Australia, who were diagnosed with melanoma over 1 year from October 23, 2006, to October 22, 2007, in the Melanoma Patterns of Care Study and followed up until 2018 (mean [SD] length of follow-up, 11.9 [0.3] years) by using linked mortality and cancer registry data. All patients who had invasive melanomas recorded at the cancer registry were eligible for the study, but the number of in situ melanomas was capped. The treating doctors recorded details of melanoma detection and patient and clinical characteristics in a baseline questionnaire. Histopathologic variables were obtained from pathology reports. Of 3932 recorded melanomas, data were available and analyzed for 2452 (62%; 1 per patient) with primary in situ (n = 291) or invasive (n = 2161) cutaneous melanoma. Data were analyzed from March 2020 to January 2021. MAIN OUTCOMES AND MEASURES: Melanoma-specific mortality and all-cause mortality. RESULTS: A total of 2452 patients were included in the analyses. The median age at diagnosis was 65 years (range, 16-98 years), and 1502 patients (61%) were men. A total of 858 patients (35%) had their melanoma detected during a routine skin check, 1148 (47%) self-detected their melanoma, 293 (12%) had their melanoma discovered incidentally when checking another skin lesion, and 153 (6%) reported “other” presentation.
    Routine skin-check detection of invasive melanomas was associated with 59% lower melanoma-specific mortality (subhazard ratio, 0.41; 95% CI, 0.28-0.60; P < .001) and 36% lower all-cause mortality (hazard ratio, 0.64; 95% CI, 0.54-0.76; P < .001), adjusted for age and sex, compared with patient-detected melanomas. After adjusting for prognostic factors including ulceration and mitotic rate, the associations were 0.68 (95% CI, 0.44-1.03; P = .13) and 0.75 (95% CI, 0.63-0.90; P = .006), respectively. Factors associated with higher odds of routine skin-check melanoma detection included being male (female vs male, OR, 0.73; 95% CI, 0.60-0.89; P = .003), having a previous melanoma (vs none, OR, 2.36; 95% CI, 1.77-3.15; P < .001), having many moles (vs not, OR, 1.39; 95% CI, 1.10-1.77; P = .02), being 50 years or older (eg, 50-59 years vs <40 years, OR, 2.89; 95% CI, 1.92-4.34; P < .001), and living in nonremote areas (eg, remote or very remote vs major cities, OR, 0.23; 95% CI, 0.05-1.04; P = .003). CONCLUSIONS AND RELEVANCE: In this cohort study, melanomas diagnosed through routine skin checks were associated with significantly lower all-cause mortality, but not melanoma-specific mortality, after adjustment for patient, sociodemographic, and clinicopathologic factors.
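The “59% lower” and “36% lower” phrasings in this abstract are the standard conversion of a (sub)hazard ratio into a percentage reduction, (1 − HR) × 100. A one-line helper (our own name) makes the arithmetic explicit:

```python
def pct_lower(hazard_ratio: float) -> int:
    """Convert a (sub)hazard ratio to 'x% lower' phrasing: (1 - HR) * 100,
    rounded to the nearest whole percent."""
    return round((1 - hazard_ratio) * 100)
```

Applied to the reported ratios, pct_lower(0.41) and pct_lower(0.64) reproduce the 59% and 36% figures in the text.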