69 research outputs found

    Effects of high-pressure processing (HPP) on the microbiological, physico-chemical and sensory properties of fresh cheeses: A review

    High pressure processing (HPP) is an increasingly popular food processing method that offers great potential within the food industry. The drive to use HPP is to provide minimally processed foods that are safe and have an extended shelf-life rivalling that achieved by traditional methods of food processing. HPP is currently being applied to a wide variety of food products, although to date the dairy industry has received little attention. The present paper reviews the effects of HPP on fresh rennet- and acid-coagulated cheeses. In addition to modifying physicochemical and sensory characteristics, HPP is reported to inactivate certain microorganisms typically found in cheeses. Pathogenic microorganisms such as Listeria monocytogenes and Escherichia coli, which contaminate, spoil and limit the shelf-life of cheese, can be controlled by HPP. HPP can also cause changes in milk rennet coagulation properties, produce a more continuous or homogeneous protein matrix in cheese, improve cheese structure, texture and yield, as well as reduce moisture content variations within fresh cheese blocks. Provided HPP can be operated economically, the use of pressure may be an attractive new method for the processing of cheese.

    Spud 1.0: generalising and automating the user interfaces of scientific computer models

    The interfaces by which users specify the scenarios to be simulated by scientific computer models are frequently primitive, under-documented and ad hoc text files, which make using the model in question difficult and error-prone and significantly increase the development cost of the model. In this paper, we present a model-independent system, Spud, which formalises the specification of model input formats in terms of formal grammars. This is combined with an automated graphical user interface which guides users to create valid model inputs based on the grammar provided, and a generic options-reading module, libspud, which minimises the development cost of adding model options. Together, this provides a user-friendly, well-documented, self-validating user interface which is applicable to a wide range of scientific models and which minimises the developer input required to maintain and extend the model interface.
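    The core idea above — declare the valid input format once, then check every input against that declaration — can be illustrated with a small sketch. This is a hypothetical miniature, not Spud's actual schema language or the libspud API; the schema keys and option names are invented for illustration.

```python
# Hypothetical miniature of schema-driven option reading: options are
# declared once in a schema, and every input is validated against it,
# so invalid inputs fail loudly at read time rather than deep inside
# a model run. (Invented names; not the real Spud/libspud interface.)
SCHEMA = {
    "timestep": float,
    "output_directory": str,
    "max_iterations": int,
}

def read_options(options, schema=SCHEMA):
    # Reject unknown keys and wrongly typed values up front.
    for key, value in options.items():
        if key not in schema:
            raise KeyError(f"unknown option: {key!r}")
        if not isinstance(value, schema[key]):
            raise TypeError(f"{key!r} must be {schema[key].__name__}")
    return options

valid = read_options({"timestep": 0.1, "max_iterations": 100})
```

    In the real system the grammar also drives the graphical interface, so users can only construct inputs that the reader will accept.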

    Assessing erosion and flood risk in the coastal zone through the application of multilevel Monte Carlo methods

    Coastal zones are vulnerable to both erosion and flood risk, which can be assessed using coupled hydro-morphodynamic models. However, the use of such models as decision support tools suffers from a high degree of uncertainty, due to both incomplete knowledge and natural variability in the system. In this work, we show for the first time how the multilevel Monte Carlo method (MLMC) can be applied in hydro-morphodynamic coastal ocean modelling, here using the popular model XBeach, to quantify uncertainty by computing statistics of key output variables given uncertain input parameters. MLMC accelerates the Monte Carlo approach through the use of a hierarchy of models with different levels of resolution. Several theoretical and real-world coastal zone case studies are considered here, for which output variables that are key to the assessment of flood and erosion risk, such as wave run-up height and total eroded volume, are estimated. We show that MLMC can significantly reduce computational cost, resulting in speed-up factors of 40 or greater compared to a standard Monte Carlo approach, whilst keeping the same level of accuracy. Furthermore, a sophisticated ensemble-generating technique is used to estimate the cumulative distribution of output variables from the MLMC output. This allows the probability of a variable exceeding a certain value to be estimated, such as the probability of a wave run-up height exceeding the height of a seawall. This is a valuable capability that can be used to inform decision-making under uncertainty.
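    The MLMC estimator described above rests on a telescoping sum: the expectation at the finest level equals the coarse-level expectation plus a series of level-to-level corrections, each computed with the same random input driving both levels. The following is a minimal sketch of that structure with a toy stand-in model (the quadratic "model" and its error decay are invented for illustration, not an XBeach run).

```python
import random

def model(theta, level):
    # Toy stand-in for a coastal model run: theta is an uncertain input
    # and the discretisation error shrinks geometrically with level.
    return theta ** 2 + 1.0 / 4 ** level

def mlmc_estimate(levels, samples_per_level, rng):
    # Telescoping sum E[Q_L] = E[Q_0] + sum_l E[Q_l - Q_{l-1}]:
    # many cheap coarse samples, few expensive fine-level corrections,
    # with the SAME random input driving both levels of each difference.
    total = 0.0
    for level, n in zip(levels, samples_per_level):
        acc = 0.0
        for _ in range(n):
            theta = rng.uniform(0.0, 1.0)  # uncertain input parameter
            fine = model(theta, level)
            coarse = model(theta, level - 1) if level > 0 else 0.0
            acc += fine - coarse
        total += acc / n
    return total

rng = random.Random(0)
estimate = mlmc_estimate(levels=[0, 1, 2],
                         samples_per_level=[4000, 400, 40], rng=rng)
```

    In a real application the level differences are stochastic but small, which is what lets the fine levels get away with so few samples; here they are deterministic purely to keep the toy model short.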

    Calibration, inversion and sensitivity analysis for hydro-morphodynamic models through the application of adjoint methods

    The development of reliable, sophisticated hydro-morphodynamic models is essential for protecting the coastal environment against hazards such as flooding and erosion. There exists a high degree of uncertainty associated with the application of these models, in part due to incomplete knowledge of various physical, empirical and numerical closure-related parameters in both the hydrodynamic and morphodynamic solvers. This uncertainty can be addressed through the application of adjoint methods. These have the notable advantage that the number and/or dimension of the uncertain parameters has almost no effect on the computational cost associated with calculating the model sensitivities. Here, we develop the first freely available and fully flexible adjoint hydro-morphodynamic model framework. This flexibility is achieved through using the pyadjoint library, which allows us to assess the uncertainty of any parameter with respect to any model functional, without further code implementation. The model is developed within the coastal ocean model Thetis, constructed using the finite element code-generation library Firedrake. We present examples of how this framework can perform sensitivity analysis, inversion and calibration for a range of uncertain parameters based on the final bed level. These results are verified using so-called dual-twin experiments, where the ‘correct’ parameter value is used in the generation of synthetic model test data, but is unknown to the model in subsequent testing. Moreover, we show that inversion and calibration with experimental data using our framework produces physically sensible optimum parameters and that these parameters always lead to more accurate results. In particular, we demonstrate how our adjoint framework can be applied to a tsunami-like event to invert for the tsunami wave from sediment deposits.
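    The key property claimed above — gradient cost essentially independent of the number of parameters — can be seen in a minimal adjoint example. This is a generic sketch on a scalar recurrence, not the Thetis/pyadjoint framework: one forward solve plus one backward (adjoint) solve yields the sensitivity of the output to all parameters at once, verified here against finite differences.

```python
def forward(a, b, x0, N):
    # Forward model: the simple recurrence x_{n+1} = a*x_n + b.
    xs = [x0]
    for _ in range(N):
        xs.append(a * xs[-1] + b)
    return xs

def adjoint_gradient(a, b, x0, N):
    # One forward solve and one backward (adjoint) solve give the
    # gradient of J = x_N with respect to ALL parameters (a and b)
    # simultaneously; adding more parameters would not add solves.
    xs = forward(a, b, x0, N)
    lam = 1.0  # adjoint variable: dJ/dx_N = 1
    dJda = dJdb = 0.0
    for n in reversed(range(N)):
        dJda += lam * xs[n]  # a enters via x_{n+1} = a*x_n + b
        dJdb += lam          # b enters additively at every step
        lam *= a             # back-propagate: lambda_n = a * lambda_{n+1}
    return dJda, dJdb

dJda, dJdb = adjoint_gradient(0.9, 0.5, 1.0, 5)
```

    A finite-difference check (perturbing one parameter per extra forward solve) reproduces the same gradients, which is exactly the comparison the dual-twin verification relies on at much larger scale.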

    Multilevel multifidelity Monte Carlo methods for assessing uncertainty in coastal flooding

    When choosing an appropriate hydrodynamic model, there is always a compromise between accuracy and computational cost, with high-fidelity models being more expensive than low-fidelity ones. However, when assessing uncertainty, we can use a multifidelity approach to take advantage of the accuracy of high-fidelity models and the computational efficiency of low-fidelity models. Here, we apply the multilevel multifidelity Monte Carlo method (MLMF) to quantify uncertainty by computing statistical estimators of key output variables with respect to uncertain input data, using the high-fidelity hydrodynamic model XBeach and the lower-fidelity coastal flooding model SFINCS (Super-Fast INundation of CoastS). The multilevel aspect opens up the further advantageous possibility of applying each of these models at multiple resolutions. This work represents the first application of MLMF in the coastal zone and one of its first applications in any field. For both idealised and real-world test cases, MLMF can significantly reduce computational cost for the same accuracy compared to both the standard Monte Carlo method and to a multilevel approach utilising only a single model (the multilevel Monte Carlo method). In particular, here we demonstrate using the case of Myrtle Beach, South Carolina, USA, that this improvement in computational efficiency allows for in-depth uncertainty analysis to be conducted in the case of real-world coastal environments – a task that would previously have been practically unfeasible. Moreover, for the first time, we show how an inverse transform sampling technique can be used to accurately estimate the cumulative distribution function (CDF) of variables from the MLMF outputs. MLMF-based estimates of the expectations and the CDFs of the variables of interest are of significant value to decision makers when assessing uncertainty in predictions.
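    The decision-making quantity mentioned above — the probability of an output variable exceeding a threshold — falls straight out of an estimated CDF. The following is a plain empirical-CDF sketch, not the paper's coupled MLMF/inverse-transform estimator; the run-up values are invented for illustration.

```python
import bisect

def empirical_cdf(samples):
    # Build an empirical CDF from Monte Carlo output samples:
    # CDF(x) = fraction of samples <= x.
    xs = sorted(samples)
    n = len(xs)
    return lambda x: bisect.bisect_right(xs, x) / n

def exceedance_probability(samples, threshold):
    # P(X > threshold), e.g. the probability that a wave run-up
    # height exceeds a given seawall height.
    return 1.0 - empirical_cdf(samples)(threshold)

# Illustrative (made-up) run-up heights in metres:
runups = [0.6, 0.8, 1.1, 1.3, 1.7, 2.1]
p_overtop = exceedance_probability(runups, 1.2)  # seawall at 1.2 m
```

    In the MLMF setting the samples come from several model fidelities and resolutions, so the CDF must be assembled from the combined estimator rather than from one raw sample set, but the exceedance-probability readout is the same.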

    Evaluating the use of the Child and Adolescent Intellectual Disability Screening Questionnaire (CAIDS-Q) to estimate IQ in children with low intellectual ability

    In situations where completing a full intellectual assessment is not possible or desirable, the clinician or researcher may require an alternative means of accurately estimating intellectual functioning. There has been limited research on the use of proxy IQ measures in children with an intellectual disability or low IQ. The present study aimed to provide a means of converting total scores from a screening tool (the Child and Adolescent Intellectual Disability Screening Questionnaire: CAIDS-Q) to an estimated IQ. A series of linear regression analyses were conducted on data from 428 children and young people referred to clinical services, where full-scale IQ (FSIQ) was predicted from CAIDS-Q total scores. Analyses were conducted for three age groups between ages 6 and 18 years. The study presents a conversion table for converting CAIDS-Q total scores to estimates of FSIQ, with corresponding 95% prediction intervals to allow the clinician or researcher to estimate FSIQ scores from CAIDS-Q total scores. It is emphasised that, while this conversion may offer a quick means of estimating intellectual functioning in children with a below-average IQ, it should be used with caution, especially in children aged between 6 and 8 years old.
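    The statistical machinery described — simple linear regression with a 95% prediction interval for a new individual — can be sketched as follows. The scores and IQ values are invented for illustration (not the study's data or its published conversion table), and a normal quantile is used in place of the t quantile, a reasonable simplification at the study's sample size.

```python
import math

def fit_predict_interval(x, y, x_new, z=1.96):
    # Ordinary least squares y = b0 + b1*x, plus an approximate 95%
    # prediction interval for a NEW observation at x_new (wider than a
    # confidence interval for the mean, since it includes residual noise).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    b0 = my - b1 * mx
    s2 = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y)) / (n - 2)
    pred = b0 + b1 * x_new
    se = math.sqrt(s2 * (1 + 1 / n + (x_new - mx) ** 2 / sxx))
    return pred, (pred - z * se, pred + z * se)

# Illustrative (made-up) screening totals and measured IQs:
scores = [10, 15, 20, 25, 30, 35, 40]
iqs    = [45, 50, 56, 60, 66, 71, 75]
pred, (low, high) = fit_predict_interval(scores, iqs, 22)
```

    The width of the interval is what the abstract's caution rests on: a point estimate of IQ from a screening total can be quick, but the prediction interval makes the uncertainty around any individual child's score explicit.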

    Simulating tidal turbines with multi-scale mesh optimisation techniques

    Embedding tidal turbines within simulations of realistic large-scale tidal flows is a highly multi-scale problem that poses significant computational challenges. Here, this problem is tackled using actuator disc momentum (ADM) theory and Reynolds-averaged Navier-Stokes (RANS) modelling with, for the first time, dynamically adaptive mesh optimisation techniques. Both k-ω and k-ω SST RANS models have been developed within the Fluidity framework, an adaptive mesh CFD solver, and the model is validated against two sets of experimental flume test results. A brief comparison against a similar OpenFOAM model is presented to portray the benefits of the finite element discretisation scheme employed in the Fluidity ADM model. This model has been developed with the aim that it will be seamlessly combined with larger numerical models simulating tidal flows in realistic domains. This is where the mesh optimisation capability is a major advantage, as it enables the mesh to be refined dynamically in time and only in the locations required, thus making optimal use of limited computational resources.
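    The actuator disc momentum theory invoked above has a compact closed form worth recalling: for an axial induction factor a, classical one-dimensional momentum theory gives the thrust and power coefficients below, with the Betz limit as the textbook sanity check. This is the standard textbook result, not code from the Fluidity implementation.

```python
def actuator_disc(a):
    # Classical 1-D actuator disc momentum theory: for axial
    # induction factor a,
    #   thrust coefficient  C_T = 4 a (1 - a)
    #   power coefficient   C_P = 4 a (1 - a)^2
    return 4 * a * (1 - a), 4 * a * (1 - a) ** 2

# Betz limit: C_P is maximised at a = 1/3, giving C_P = 16/27 ~ 0.593.
ct, cp = actuator_disc(1 / 3)
```

    In an embedded-turbine simulation these coefficients parameterise the momentum sink that the disc imposes on the resolved flow, which is what makes the approach cheap enough to couple with large-scale tidal models.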