
    Food security, risk management and climate change

    This report concerns food security, climate change and risk management, and identifies major constraints on the adaptive capacity of food organisations operating in Australia. Australia has enjoyed an unprecedented level of food security for more than half a century, but new uncertainties are emerging, and it would be unrealistic, if not complacent, to assume the same level of food security will persist simply because of recent history. The project collected data from more than 36 case study organisations (both foreign and local) operating in the Australian food-supply chain, and found that for many businesses, risk management practices require substantial improvement to cope with and exploit the uncertainties that lie ahead. Three risks were identified as major constraints on the adaptive capacity of food organisations operating in Australia: inadequate risk management practices; an uncertain regulatory environment, itself a result of gaps in risk management; and uncertainty about climate change and its projected impacts, also related to risk management.

    Integrating remote sensing datasets into ecological modelling: a Bayesian approach

    Process-based models have been used to simulate the three-dimensional complexity of forest ecosystems and their temporal changes, but their extensive data requirements and complex parameterisation have often limited their use in practical management applications. Increasingly, information retrieved using remote sensing techniques can help in model parameterisation and data collection by providing spatially and temporally resolved forest information. In this paper, we illustrate the potential of Bayesian calibration for integrating such data sources to simulate forest production. As an example, we use the 3-PG model combined with hyperspectral, LiDAR, SAR and field-based data to simulate the growth of UK Corsican pine stands. Hyperspectral, LiDAR and SAR data are used to estimate LAI dynamics, tree height and above-ground biomass, respectively, while the Bayesian calibration provides uncertainty estimates for model parameters and outputs. This contrasts with goodness-of-fit approaches, which provide no such uncertainties. Parameters and the data used in the calibration are expressed as probability distributions, reflecting our degree of certainty about them; after calibration, these distributions are updated. Posterior distributions of outputs and parameters are approximated using Markov chain Monte Carlo sampling (25 000 steps). A sensitivity analysis between parameters and outputs is also conducted. Overall, the results illustrate the potential of a Bayesian framework for truly integrative work, both in combining the available field-based and remotely sensed datasets and in estimating parameter and model output uncertainties.
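
    To make the calibration idea concrete, here is a minimal sketch in Python, not the 3-PG model itself: a toy saturating growth curve stands in for the forest model, synthetic noisy observations stand in for the SAR-derived biomass, and a Metropolis sampler with 25 000 steps (matching the abstract) recovers a posterior distribution, and hence an uncertainty, for one growth parameter. The model form, noise level and prior are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-growth model standing in for 3-PG: biomass as a saturating
# function of stand age, with one uncertain parameter (growth rate g).
def biomass(age, g, b_max=300.0):
    return b_max * (1.0 - np.exp(-g * age))

# Synthetic "remote sensing" observations (e.g. SAR-derived biomass),
# generated from an assumed true g with observation noise.
ages = np.arange(5, 55, 5)
obs = biomass(ages, g=0.05) + rng.normal(0.0, 10.0, ages.size)

def log_prior(g):
    # Prior belief about g expressed as a probability distribution:
    # here simply uniform on (0, 0.5).
    return 0.0 if 0.0 < g < 0.5 else -np.inf

def log_likelihood(g, sigma=10.0):
    resid = obs - biomass(ages, g)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Metropolis MCMC: 25 000 steps, as in the paper.
n_steps, step = 25_000, 0.005
g_cur = 0.1
lp_cur = log_prior(g_cur) + log_likelihood(g_cur)
samples = np.empty(n_steps)
for i in range(n_steps):
    g_prop = g_cur + rng.normal(0.0, step)
    lp_prop = log_prior(g_prop)
    if np.isfinite(lp_prop):
        lp_prop += log_likelihood(g_prop)
    if np.log(rng.random()) < lp_prop - lp_cur:
        g_cur, lp_cur = g_prop, lp_prop
    samples[i] = g_cur

post = samples[5_000:]  # discard burn-in
print(f"posterior g: {post.mean():.4f} +/- {post.std():.4f}")
```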

    Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations in which subject-matter experts on pesticide risks to surface water organisms validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); its nature (whether the uncertainty is epistemic or aleatory); and its location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. This comprehensive description of uncertainty will enable risk analysts to prioritise phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during ERAs. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action.
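
    The three-dimensional typology lends itself to a simple data structure. The Python sketch below encodes the dimensions named in the abstract and ranks elicited uncertainties by level so the most severe are addressed first; the intermediate points on the level scale and the example records are assumptions for illustration, not values from the study.

```python
from dataclasses import dataclass
from enum import Enum, IntEnum

class Level(IntEnum):
    # Severity scale from the abstract: determinism -> ignorance.
    # The two intermediate points are assumed for illustration.
    DETERMINISM = 0
    STATISTICAL = 1
    SCENARIO = 2
    IGNORANCE = 3

class Nature(Enum):
    EPISTEMIC = "epistemic"  # reducible by gathering more knowledge
    ALEATORY = "aleatory"    # inherent, irreducible variability

class Location(Enum):
    # Phases of an ERA in which uncertainty can arise.
    PROBLEM_FORMULATION = "problem formulation"
    EXPOSURE_ASSESSMENT = "exposure assessment"
    EFFECTS_ASSESSMENT = "effects assessment"
    RISK_CHARACTERISATION = "risk characterisation"

@dataclass
class Uncertainty:
    task: str
    level: Level
    nature: Nature
    location: Location

# Hypothetical elicited records, ranked so the highest levels come first.
elicited = [
    Uncertainty("aggregate risk magnitude", Level.IGNORANCE,
                Nature.EPISTEMIC, Location.RISK_CHARACTERISATION),
    Uncertainty("species sensitivity data", Level.STATISTICAL,
                Nature.ALEATORY, Location.EFFECTS_ASSESSMENT),
]
for u in sorted(elicited, key=lambda u: u.level, reverse=True):
    print(u.task, "|", u.level.name, "|", u.nature.value, "|", u.location.value)
```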

    Big data and the SP theory of intelligence

    This article is about how the "SP theory of intelligence" and its realisation in the "SP machine" may, with advantage, be applied to the management and analysis of big data. The SP system -- introduced in the article and fully described elsewhere -- may help to overcome the problem of variety in big data: it has potential as "a universal framework for the representation and processing of diverse kinds of knowledge" (UFK), helping to reduce the diversity of formalisms and formats for knowledge and the different ways in which they are processed. It has strengths in the unsupervised learning or discovery of structure in data, in pattern recognition, in the parsing and production of natural language, in several kinds of reasoning, and more. It lends itself to the analysis of streaming data, helping to overcome the problem of velocity in big data. Central to the workings of the system is lossless compression of information: making big data smaller and reducing problems of storage and management. There is potential for substantial economies in the transmission of data, for big cuts in the use of energy in computing, for faster processing, and for smaller and lighter computers. The system provides a handle on the problem of veracity in big data, with potential to assist in the management of errors and uncertainties in data. It lends itself to the visualisation of knowledge structures and inferential processes. A high-parallel, open-source version of the SP machine would provide a means for researchers everywhere to explore what can be done with the system and to create new versions of it. Comment: Accepted for publication in IEEE Access.
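
    The abstract's central claim, that lossless compression both shrinks data and preserves it exactly, is easy to demonstrate in miniature. The Python sketch below uses generic zlib compression on a deliberately redundant byte string; it illustrates only the general principle, not the SP system's own compression mechanism (multiple alignment).

```python
import zlib

# Highly redundant sample data standing in for "big data" with much
# repeated structure.
data = b"status=OK;" * 10_000

# Generic lossless compression: redundancy is removed, so the data
# shrinks, yet the original is exactly recoverable.
packed = zlib.compress(data, 9)
print(len(data), "->", len(packed), "bytes")
assert zlib.decompress(packed) == data  # lossless round trip
```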

    Probabilistic Numerics and Uncertainty in Computations

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numerical algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimisers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. Comment: Author-generated postprint. 17 pages, 4 figures, 1 table.
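
    The simplest instance of a numerical routine that reports its own uncertainty is Monte Carlo integration, sketched below in Python. The estimate's standard error plays the role of the returned uncertainty; the methods surveyed in the paper (e.g. Bayesian quadrature) are far richer, so this is only a toy illustration of the theme, with the integrand chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(0)

# A numerical task (integration of f over [0, 1]) that returns an
# uncertainty alongside its answer: the Monte Carlo estimate comes
# with a standard error quantifying the error induced by using a
# finite sample rather than exact computation.
def integrate_with_uncertainty(f, n=10_000):
    x = rng.random(n)
    y = f(x)
    est = y.mean()
    se = y.std(ddof=1) / np.sqrt(n)  # uncertainty in the estimate
    return est, se

est, se = integrate_with_uncertainty(lambda x: np.exp(-x * x))
print(f"integral ~= {est:.5f} +/- {se:.5f}")  # true value ~0.74682
```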

    An update on pharmacotherapy for type 2 diabetes

    Glucose-lowering drugs have been available for clinical use for over 60 years, and the last two decades have seen a significant number of new agents developed, making treatment increasingly complex and also somewhat controversial. This stems from the fact that, while patients with diabetes are known to have an increased risk of cardiovascular disease and mortality, there are mounting concerns about the cardiovascular effects of certain antihyperglycemic agents, leading to uncertainties in drug prescription. This has left many clinicians perplexed about optimal strategies for the management of such patients, prompting many regulatory bodies to issue recommendations for antihyperglycemic therapy in adults with type 2 diabetes. These recommendations uniformly advocate an individualised approach, keeping in mind each patient's unique health profile (such as age and weight) and cardiovascular risk factors vis-à-vis the specific attributes, side effects and adverse effects of each antihyperglycemic agent. This article focuses on the ten major categories of diabetic therapies, looking specifically at their mode of action and safety profile, as well as key trial data and, where possible, long-term outcome studies for each class.