
    Diet and Economic Modelling to Improve the Quality and Affordability of the Australian Diet for Low and Medium Socioeconomic Households

    Food costs are a barrier to healthier diet selections, particularly for low socioeconomic households, who regularly choose processed foods containing refined grains, added sugars, and added fats. In this study, the objectives were to: (i) identify the nutrient density-to-cost ratio of Australian foods; (ii) model the impact of substituting foods with a lower nutrient density-to-cost ratio with those with the highest nutrient density-to-cost ratio on diet quality and affordability in low and medium socioeconomic households; and (iii) evaluate food processing levels. Foods were categorized, coded for processing level, analysed for nutrient density and cost, and ranked by nutrient density-to-cost ratio. The top quartile of nutrient dense, low-cost foods included 54% unprocessed (vegetables and reduced-fat dairy), 33% ultra-processed (fortified wholegrain bread and breakfast cereals <20 g sugars/100 g), and 13% processed (fruit juice and canned legumes). Using substitution modelling, diet quality improved by 52% for adults and 71% for children across all households, while diet affordability improved by 25% and 27% for low and medium socioeconomic households, respectively. The results indicate that the quality and affordability of the Australian diet can be improved when nutritious, low-cost foods are selected. Processing levels in the healthier modelled diets suggest that some ultra-processed foods may provide a beneficial source of nutrition when consumed within national food group recommendations.
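
    The ranking and substitution steps described above can be sketched in a few lines of Python. This is a minimal illustration under assumed inputs, not the study's actual protocol: the nutrient-density score, the cost field, the food-group matching rule, and all names are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Food:
        name: str
        nutrient_density: float   # assumed NRF-style nutrient score per 100 g
        cost_per_100g: float      # assumed cost in AUD per 100 g
        food_group: str           # substitutions stay within the same food group

    def density_to_cost(food: Food) -> float:
        """Nutrient density divided by cost: higher means more nutrition per dollar."""
        return food.nutrient_density / food.cost_per_100g

    def top_quartile(foods):
        """Foods in the top 25% when ranked by nutrient density-to-cost ratio."""
        ranked = sorted(foods, key=density_to_cost, reverse=True)
        return ranked[: max(1, len(ranked) // 4)]

    def substitute(diet, foods):
        """Replace each diet item with the best-ranked food from the same group."""
        best = {}
        for f in sorted(foods, key=density_to_cost):
            best[f.food_group] = f        # highest-ratio food per group wins
        return [best.get(item.food_group, item) for item in diet]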

    Nutrient dense, low-cost foods can improve the affordability and quality of the New Zealand diet—a substitution modeling study

    The high prevalence of non-communicable disease in New Zealand (NZ) is driven in part by unhealthy diet selections, with food costs contributing to an increased risk for vulnerable population groups. This study aimed to: (i) identify the nutrient density-to-cost ratio of NZ foods; (ii) model the impact of substituting foods with a lower nutrient density-to-cost ratio with those with a higher nutrient density-to-cost ratio on diet quality and affordability in representative NZ population samples for low and medium socioeconomic status (SES) households by ethnicity; and (iii) evaluate food processing level. Foods were categorized, coded for processing level and discretionary status, analyzed for nutrient density and cost, and ranked by nutrient density-to-cost ratio. The top quartile of nutrient dense, low-cost foods was 56% unprocessed (vegetables, fruit, porridge, pasta, rice, nuts/seeds), 31% ultra-processed (vegetable dishes, fortified bread, breakfast cereals unfortified <15 g sugars/100 g and fortified 15–30 g sugars/100 g), 6% processed (fruit juice), and 6% culinary processed (oils). Using substitution modeling, diet quality improved by 59% and 71% for adults and children, respectively, and affordability increased by 20–24%, depending on ethnicity and SES. The NZ diet can be made healthier and more affordable when nutritious, low-cost foods are selected. Processing levels in the healthier, modeled diet suggest that some non-discretionary ultra-processed foods may provide a valuable source of low-cost nutrition for food-insecure populations.

    Efficacy and safety of endoscopic sleeve gastroplasty and laparoscopic sleeve gastrectomy with 12+ months of adjuvant multidisciplinary support

    BACKGROUND: The laparoscopic sleeve gastrectomy (LSG) and the incisionless endoscopic sleeve gastroplasty (ESG) weight loss procedures require further investigation of their efficacy, safety and patient-centered outcomes in the Australian setting. METHODS: The aim was to examine the 6- and 12-month weight loss efficacy, safety, and weight-related quality of life (QoL) of adults with obesity who received the ESG or LSG bariatric procedure with 12+ months of adjuvant multidisciplinary pre- and postprocedural support. Data were from a two-arm prospective cohort study that followed patients from baseline to 12 months postprocedure at a medical center in Queensland. Percent excess weight loss (%EWL) was the primary outcome. Secondary outcomes were body composition (fat mass, fat-free mass, android:gynoid ratio, bone mineral content) via dual energy X-ray absorptiometry, weight-related QoL, lipid, glycemic, and hepatic biochemistry, and adverse events. RESULTS: 16 ESG (19% attrition; 81.2% female; aged: 41.4 (SD: 10.4) years; BMI: 35.5 (SD: 5.2) kg/m²) and 45 LSG (9% attrition; 84.4% female; aged: 40.4 (SD: 9.0) years; BMI: 40.7 (SD: 5.6) kg/m²) participants were recruited. At 12 months postprocedure, ESG %EWL was 57% (SD: 32%; p  0.05]; 48.1% in LSG [p  0.05]; − 0.4 mmol/L in LSG [P < 0.05]) at 12 months. Both cohorts reduced fat mass (p < 0.05). The ESG maintained but the LSG decreased fat-free mass at 6 months (p < 0.05); and both cohorts lost fat-free mass at 12 months (p < 0.05). There were no adverse events directly related to the procedure. The ESG reported 25% mild-moderate adverse events possibly related to the procedure, and the LSG reported 27% mild-severe adverse events possibly related to the procedure. CONCLUSIONS: In this setting, the ESG and LSG were safe and effective weight loss treatments for adults with obesity alongside multidisciplinary support. Patients who elected the ESG maintained fat-free mass at 6 months, but both cohorts lost fat-free mass at 12 months postprocedure. Patients who elected the LSG had large and significant improvements to weight-related quality of life. Further well-powered studies are required to confirm these findings. TRIAL REGISTRATION: This study was registered prospectively at the Australia New Zealand Clinical Trials Registry on 06/03/2018, Registration Number ACTRN12618000337279. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s12875-022-01629-7.
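
    Percent excess weight loss, the primary outcome above, is conventionally computed against an "ideal" weight. The sketch below assumes the common convention of taking ideal weight at a BMI of 25 kg/m²; the abstract does not state which convention the study used, so treat this purely as an illustration.

    def percent_ewl(baseline_kg, current_kg, height_m, ideal_bmi=25.0):
        """%EWL: weight lost as a percentage of the excess above ideal weight."""
        ideal_kg = ideal_bmi * height_m ** 2      # assumed ideal-weight convention
        return 100.0 * (baseline_kg - current_kg) / (baseline_kg - ideal_kg)

    # Illustrative numbers only: 110 kg at baseline, 95 kg at follow-up, 1.70 m tall
    print(round(percent_ewl(110, 95, 1.70), 1))   # ~39.7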

    Early warning CUSUM plans for surveillance of negative binomial daily disease counts

    Automated public health surveillance of disease counts for rapid outbreak, epidemic or bioterrorism detection using conventional control chart methods can be hampered by over-dispersion and background ('in-control') mean counts that vary over time. An adaptive cumulative sum (CUSUM) plan is developed for signalling unusually high incidence in prospectively monitored time series of over-dispersed daily disease counts with a non-homogeneous mean. Negative binomial transitional regression is used to prospectively model background counts and provide 'one-step-ahead' forecasts of the next day's count. A CUSUM plan then accumulates departures of observed counts from an offset (reference value) that is dynamically updated using the modelled forecasts. The CUSUM signals whenever the accumulated departures exceed a threshold. The amount of memory of past observations retained by the CUSUM plan is determined by the offset value; a smaller offset retains more memory and is efficient at detecting smaller shifts. Our approach optimises early outbreak detection by dynamically adjusting the offset value. We demonstrate the practical application of the 'optimal' CUSUM plans to daily counts of laboratory-notified influenza and Ross River virus diagnoses, with particular emphasis on the steady-state situation (i.e. changes that occur after the CUSUM statistic has run through several in-control counts).
    Keywords: average run length, cumulative sum, monitoring, outbreak detection, surveillance
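
    The adaptive CUSUM recursion itself is compact. The sketch below shows the accumulation and dynamically updated offset, with the negative binomial transitional regression forecast replaced by a placeholder series; the offset rule (a multiple of the forecast), k_factor and the threshold h are illustrative choices, not the paper's optimised plan.

    def adaptive_cusum(counts, forecasts, k_factor=1.5, h=5.0):
        """Return the days on which the CUSUM of upward departures exceeds h.

        counts    -- observed daily counts
        forecasts -- one-step-ahead forecasts of the in-control mean
        k_factor  -- offset = k_factor * forecast; a larger offset retains less memory
        """
        s, signals = 0.0, []
        for day, (y, mu_hat) in enumerate(zip(counts, forecasts)):
            offset = k_factor * mu_hat          # dynamically updated reference value
            s = max(0.0, s + (y - offset))      # accumulate only upward departures
            if s > h:
                signals.append(day)
                s = 0.0                         # restart after signalling
        return signals

    # Flat in-control mean of 3 with a spike at the end -> signals on days 6 and 7
    counts = [2, 3, 4, 2, 3, 9, 11, 12]
    print(adaptive_cusum(counts, forecasts=[3.0] * len(counts)))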

    Quantifying Catastrophic and Climate Impacted Hazards Based on Local Expert Opinions

    The analysis of catastrophic and climate impacted hazards is a challenging but important exercise, as the occurrence of such events is usually associated with high damage and uncertainty. Often, at the local level, there is a lack of information on rare extreme events, such that the available data are not sufficient to fit a distribution and derive parameter values for the frequency and severity distributions. This paper discusses local assessments of extreme events and examines the potential of using expert opinions in order to obtain values for the distribution parameters. In particular, we illustrate a simple approach, where a local expert is required to only specify two percentiles of the loss distribution in order to provide an estimate for the severity distribution of climate impacted hazards. In our approach, we focus on so-called heavy-tailed distributions for the severity, such as the Lognormal, Weibull and Burr XII distributions. These distributions are widely used to fit data from catastrophic events and can also represent extreme losses, or the so-called tail of the distribution. An illustration of the method is provided utilising an example that quantifies the risk of bushfires in a local area in Northern Sydney.
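
    For the Lognormal case, the two expert-supplied percentiles translate directly into the distribution's parameters by solving two quantile equations. The sketch below uses made-up percentile levels and loss values purely for illustration.

    from math import log
    from statistics import NormalDist

    def lognormal_from_percentiles(p1, x1, p2, x2):
        """Solve ln(x_i) = mu + sigma * z_i, where z_i is the standard normal quantile."""
        z1, z2 = NormalDist().inv_cdf(p1), NormalDist().inv_cdf(p2)
        sigma = (log(x2) - log(x1)) / (z2 - z1)
        mu = log(x1) - sigma * z1
        return mu, sigma

    # Hypothetical expert view: median loss 2 million, 95th percentile loss 20 million
    mu, sigma = lognormal_from_percentiles(0.50, 2e6, 0.95, 20e6)
    print(mu, sigma)   # exp(mu) reproduces the stated median by construction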

    PloGO: plotting gene ontology annotation and abundance in multi-condition proteomics experiments

    We describe the PloGO R package, a simple open-source tool for plotting gene ontology (GO) annotation and abundance information, which was developed to aid the bioinformatics analysis of multi-condition label-free proteomics experiments using quantitation based on spectral counting. PloGO can incorporate abundance (raw spectral counts) or normalized spectral abundance factor (NSAF) data in addition to the GO annotation, as well as handle multiple files and allow for a targeted collection of GO categories of interest. Our main aims were to help identify interesting subsets of proteins for further analysis, such as those arising from a protein data set partition based on presence/absence or multiple pair-wise comparisons, as well as to provide GO summaries that can be easily used in subsequent analyses. Though developed with label-free proteomics experiments in mind, it is not specific to that approach and can be used for any multi-condition experiment for which GO information has been generated.
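
    The kind of summary PloGO automates, totalling abundance per GO category for each condition, can be sketched generically. The sketch below is not the PloGO interface; the input layout (dicts of spectral counts per condition and of GO terms per protein) is an assumption for illustration.

    from collections import defaultdict

    def go_abundance(counts, protein_go):
        """counts: {protein: {condition: spectral count}}; protein_go: {protein: [GO terms]}."""
        summary = defaultdict(lambda: defaultdict(int))
        for protein, by_condition in counts.items():
            for go_term in protein_go.get(protein, []):
                for condition, n in by_condition.items():
                    summary[go_term][condition] += n
        return {go: dict(cond) for go, cond in summary.items()}

    counts = {"P1": {"ctrl": 12, "treat": 30}, "P2": {"ctrl": 5, "treat": 4}}
    protein_go = {"P1": ["GO:0006950"], "P2": ["GO:0006950", "GO:0008152"]}
    print(go_abundance(counts, protein_go))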

    Label-free quantitative shotgun proteomics using normalized spectral abundance factors

    In this chapter we describe the workflow used in our laboratory for label-free quantitative shotgun proteomics based on spectral counting. The main tools used are a series of R modules known collectively as the Scrappy program. We describe how to go from peptide-to-spectrum matching in a shotgun proteomics experiment using the X!Tandem algorithm, to simultaneous quantification of up to thousands of proteins using normalized spectral abundance factors. The outputs of the software are described in detail, with illustrative examples provided for some of the graphical images generated. While it is not strictly within the scope of this chapter, some consideration is given to how best to extract meaningful biological information from quantitative shotgun proteomics data outputs.
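
    The normalized spectral abundance factor itself has a standard definition: each protein's spectral count is divided by its length, and these length-adjusted counts are rescaled to sum to one within a sample. A minimal sketch, with hypothetical protein identifiers and counts:

    def nsaf(spectral_counts, lengths):
        """spectral_counts and lengths are dicts keyed by protein identifier."""
        saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
        total = sum(saf.values())
        return {p: v / total for p, v in saf.items()}

    # A short, heavily counted protein versus a longer, lightly counted one
    print(nsaf({"A": 40, "B": 10}, {"A": 200, "B": 400}))   # A ~0.89, B ~0.11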

    Climate adaptation decision support tool for local governments: CATLoG

    The Intergovernmental Panel on Climate Change (IPCC), the globally recognised reference body for climate-related research, describes warming of the climate system as 'unequivocal'. The changing climate is likely to result in more frequent and intense extreme weather events. This demands preventative and preparatory actions (mitigation and adaptation) from all levels of government, including local governments. No matter how robust the mitigation responses are, adaptation actions will still be required to prepare for the changes already committed to the climate system. The study of climate extremes is particularly important because of their high-impact nature. Analysis of extreme events is challenging because their rare occurrence leaves very few past observations to support statistical analysis or conclusions. Currently available climate projections, especially for extreme events at local scales, are associated with a wide range of uncertainties. In addition, analysis and damage assessment of extremes over time involve further uncertainties related to economic assumptions (e.g. discount rate, growth rate) and the unknown future. Unfortunately, end users often do not understand the range of uncertainties surrounding the research outputs they use for extreme events. This research project was designed to develop a pilot tool that enables end users to analyse and prepare for extreme events in a less predictable, complex world. Due to the lack of historical data, the tool relies on expert judgements of the frequency and severity of such events. The results of the analysis are highly dependent on the quality of these judgements, so their reliability depends on finding appropriate experts in the field who can provide sound estimates of the frequency and impact of the events considered. The Tool combines quantitative (cost-benefit analysis) and qualitative (multi-criteria analysis) methods to frame the decision support. The current version of the Tool allows users to conduct sensitivity tests, examining the impact of uncertain parameters ranging from climate impacts to discount rates. The final product is a user-friendly decision tool in the form of an Excel add-in, together with a user manual booklet that demonstrates sample worked projects. The Tool is flexible so that stakeholders can adopt, refine, or upgrade it for their context-specific applications.
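
    The cost-benefit side of such a tool reduces to discounting expert-estimated avoided losses and comparing them with an option's cost, then repeating the calculation across the uncertain parameters. The sketch below is a generic illustration, not CATLoG's implementation; the event probability, loss, risk-reduction factor, horizon and costs are all made-up numbers.

    def npv_of_adaptation(annual_prob, loss_if_event, risk_reduction,
                          option_cost, years, discount_rate):
        """Present value of avoided expected losses minus the upfront option cost."""
        avoided_per_year = annual_prob * loss_if_event * risk_reduction
        pv_benefit = sum(avoided_per_year / (1 + discount_rate) ** t
                         for t in range(1, years + 1))
        return pv_benefit - option_cost

    # Sensitivity to the discount rate, one of the uncertain economic parameters
    for r in (0.02, 0.05, 0.08):
        print(r, round(npv_of_adaptation(0.05, 10e6, 0.4, 1.5e6, 30, r)))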