
    On the Properties of Energy Stable Flux Reconstruction Schemes for Implicit Large Eddy Simulation

    We begin by investigating the stability, order of accuracy, and dispersion and dissipation characteristics of the extended range of energy stable flux reconstruction (E-ESFR) schemes in the context of implicit large eddy simulation (ILES). We proceed to demonstrate that subsets of the E-ESFR schemes are more stable than collocation nodal discontinuous Galerkin methods recovered with the flux reconstruction approach (FRDG) for marginally-resolved ILES of the Taylor-Green vortex. These schemes are shown to have reduced dissipation and dispersion errors relative to FRDG schemes of the same polynomial degree and, simultaneously, to have increased Courant-Friedrichs-Lewy (CFL) limits. Finally, we simulate turbulent flow over an SD7003 aerofoil using two of the most stable E-ESFR schemes identified by the aforementioned Taylor-Green vortex experiments. Results demonstrate that subsets of E-ESFR schemes appear more stable than the commonly used FRDG method, have increased CFL limits, and are suitable for ILES of complex turbulent flows on unstructured grids.

    Pediatric cochlear implantation: an update

    Deafness in pediatric age can adversely impact language acquisition as well as educational and social-emotional development. Once diagnosed, hearing loss should be rehabilitated early; the goal is to provide the child with maximum access to the acoustic features of speech within a listening range that is safe and comfortable. In the presence of severe to profound deafness, the benefit from auditory amplification may not be sufficient to allow proper language development. Cochlear implants are partially implantable electronic devices designed to provide profoundly deafened patients with hearing sensitivity within the speech range. Since their introduction more than 30 years ago, cochlear implants have improved their performance to the extent that they are now considered the standard of care in the treatment of children with severe to profound deafness. Over the years patient candidacy has been expanded, and the criteria for implantation continue to evolve within the paediatric population. The minimum age for implantation has progressively reduced; it has been recognized that implantation at a very early age (12–18 months) provides children with the best outcomes, taking advantage of sensitive periods of auditory development. Bilateral implantation offers better sound localization, as well as a superior ability to understand speech in noisy environments, compared with unilateral cochlear implantation. Deafened children with special clinical situations, including inner ear malformation, cochlear nerve deficiency, cochlear ossification, and additional disabilities, can be successfully treated, even though they require an individualized candidacy evaluation and a complex post-implantation rehabilitation. Benefits from cochlear implantation include not only better abilities to hear and to develop speech and language skills, but also improved academic attainment, improved quality of life, and better employment status.
Cochlear implants permit deaf people to hear, but they have a long way to go before their performance is comparable to that of the intact human ear; researchers are looking for more sophisticated speech processing strategies, as well as more efficient coupling between the electrodes and the cochlear nerve, with the goal of dramatically improving the quality of sound of the next generation of implants.

    Mixed Climatology, Non-synoptic Phenomena and Downburst Wind Loading of Structures

    Modern wind engineering was born in 1961, when Davenport published a paper in which meteorology, micrometeorology, climatology, bluff-body aerodynamics and structural dynamics were embedded within a homogeneous framework for the wind loading of structures, known today as the “Davenport chain”. Idealizing the wind as a synoptic extra-tropical cyclone, this model was so simple and elegant as to become a sort of axiom. Between 1976 and 1977 Gomes and Vickery separated thunderstorm from non-thunderstorm winds, determined their disjoint extreme distributions and derived a mixed model later extended to other Aeolian phenomena; this study, which represents a milestone in mixed climatology, proved the impossibility of labelling a heterogeneous range of events with the generic term “wind”. This paper provides an overview of this matter, with particular regard to the studies conducted at the University of Genova on thunderstorm downbursts.

    On the utility of GPU accelerated high-order methods for unsteady flow simulations: a comparison with industry-standard tools

    First- and second-order accurate numerical methods, implemented for CPUs, underpin the majority of industrial CFD solvers. Whilst this technology has proven very successful at solving steady-state problems via a Reynolds Averaged Navier-Stokes approach, its utility for undertaking scale-resolving simulations of unsteady flows is less clear. High-order methods for unstructured grids and GPU accelerators have been proposed as an enabling technology for unsteady scale-resolving simulations of flow over complex geometries. In this study we systematically compare the accuracy and cost of the high-order Flux Reconstruction solver PyFR running on GPUs and the industry-standard solver STAR-CCM+ running on CPUs when applied to a range of unsteady flow problems. Specifically, we perform comparisons of accuracy and cost for isentropic vortex advection (EV), decay of the Taylor-Green vortex (TGV), turbulent flow over a circular cylinder, and turbulent flow over an SD7003 aerofoil. We consider two configurations of STAR-CCM+: a second-order configuration, and a third-order configuration, where the latter was recommended by CD-Adapco for more effective computation of unsteady flow problems. Results from both PyFR and STAR-CCM+ demonstrate that third-order schemes can be more accurate than second-order schemes for a given cost: e.g., going from second- to third-order, the PyFR simulations of the EV and TGV achieve 75x and 3x error reductions respectively for the same or reduced cost, and STAR-CCM+ simulations of the cylinder recovered wake statistics significantly more accurately for only twice the cost. Moreover, advancing to higher-order schemes on GPUs with PyFR was found to offer even further accuracy vs. cost benefits relative to industry-standard tools.
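The accuracy-versus-cost argument above rests on the error of an order-p scheme shrinking like h^p under grid refinement. The following is a minimal, self-contained sketch of that scaling using simple finite-difference stencils, not the flux reconstruction or finite-volume discretisations of PyFR and STAR-CCM+; all stencils and values here are illustrative assumptions.

```python
import numpy as np

# Error of finite-difference approximations to d/dx sin(x) at x = 1,
# under grid refinement: an order-p stencil's error shrinks like dx**p.
def deriv_error(dx, order):
    x = 1.0
    if order == 2:
        # 2nd-order central difference
        approx = (np.sin(x + dx) - np.sin(x - dx)) / (2 * dx)
    else:
        # 3rd-order upwind-biased difference
        approx = (2 * np.sin(x + dx) + 3 * np.sin(x)
                  - 6 * np.sin(x - dx) + np.sin(x - 2 * dx)) / (6 * dx)
    return abs(approx - np.cos(x))

for order in (2, 3):
    e_coarse = deriv_error(0.10, order)
    e_fine = deriv_error(0.05, order)
    print(order, e_coarse / e_fine)   # ratio ~ 2**order when dx is halved
```

Halving dx cuts the second-order error by roughly 4x but the third-order error by roughly 8x, which is the mechanism behind higher-order schemes delivering more accuracy per unit cost once the flow is reasonably resolved.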

    On the behaviour of fully-discrete flux reconstruction schemes

    In this study we employ von Neumann analyses to investigate the dispersion, dissipation, group velocity, and error properties of several fully-discrete flux reconstruction (FR) schemes. We consider three FR schemes paired with two explicit Runge-Kutta (ERK) schemes and two singly diagonally implicit Runge-Kutta (SDIRK) schemes. Key insights include the dependence of high-wavenumber numerical dissipation, relied upon for implicit large eddy simulation (ILES), on the choice of temporal scheme and time-step size. Also investigated are the wavespeed characteristics of fully-discrete schemes and the relative dominance of temporal and spatial errors as a function of wavenumber and time-step size. Salient properties from the aforementioned theoretical analysis are then demonstrated in practice using linear advection test cases. Finally, a Burgers turbulence test case is used to demonstrate the importance of the temporal discretisation when using FR schemes for ILES.
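A fully-discrete von Neumann analysis of this kind can be sketched compactly: apply the temporal scheme's stability polynomial to the Fourier eigenvalue of the spatial operator, then read dissipation off the modulus and dispersion off the phase of the resulting amplification factor. The sketch below uses second-order central differences with classical RK4 for linear advection, rather than the FR/ERK/SDIRK pairings analysed in the paper; all parameters are illustrative assumptions.

```python
import numpy as np

# Von Neumann analysis of a fully-discrete scheme for u_t + a u_x = 0:
# 2nd-order central differences in space, classical RK4 in time.

def semi_discrete_eig(k, a=1.0, dx=1.0):
    # Eigenvalue of the central-difference operator for the mode exp(i k x):
    # du/dx -> (u_{j+1} - u_{j-1}) / (2 dx) = (i sin(k dx) / dx) u_j
    return -1j * a * np.sin(k * dx) / dx

def rk4_amplification(z):
    # Stability polynomial of classical RK4 applied to u' = lambda*u, z = lambda*dt
    return 1 + z + z**2 / 2 + z**3 / 6 + z**4 / 24

dt, dx = 0.5, 1.0
k = np.linspace(1e-6, np.pi, 200)       # resolved wavenumbers up to Nyquist
g = rk4_amplification(semi_discrete_eig(k, dx=dx) * dt)

dissipation = np.abs(g)                  # |g| < 1 => numerical dissipation
dispersion = -np.angle(g) / (k * dt)     # effective wavespeed (exact value: 1.0)
print(dissipation[-1], dispersion[-1])   # behaviour at the highest wavenumber
```

Sweeping dt in this sketch shows directly how the high-wavenumber dissipation and the effective wavespeed depend on the temporal scheme and time-step size, which is the effect the abstract highlights for ILES.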

    Assessment factoren in de humane risicobeoordeling: een discussiestuk

    TNO rapport V97.880. The general goal of this discussion paper is to contribute towards further harmonisation of human health risk assessment. It discusses the development of a formal, harmonised set of default assessment factors. The status quo with regard to assessment factors is reviewed. Options are presented for a set of default values, or probabilistic distributions, for assessment factors based on the state of the art. Methods of combining these default values or probabilistic distributions of assessment factors are described. The benchmark dose concept is proposed as a better way to characterise the true human no-effect level in a probabilistic manner. It is shown how the probabilistic benchmark dose distribution can be combined with distributions of assessment factors to arrive at the probability distribution of a Human Limit Value.
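The probabilistic combination described above can be illustrated with a small Monte Carlo sketch: sample a benchmark dose distribution and lognormal assessment-factor distributions, divide, and read off a percentile of the resulting Human Limit Value distribution. All distributions and parameter values below are hypothetical illustrations, not values from the report.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Benchmark dose (mg/kg bw/day): uncertainty in the animal point of departure
bmd = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=n)

# Assessment factors as lognormal distributions (e.g. interspecies and
# intraspecies extrapolation); medians and spreads are assumed
af_inter = rng.lognormal(mean=np.log(4.0), sigma=0.4, size=n)
af_intra = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=n)

# Human Limit Value distribution: benchmark dose over the combined factors
hlv = bmd / (af_inter * af_intra)

# A conservative limit could then be taken as, e.g., the 5th percentile
print(round(np.percentile(hlv, 5), 3))
```

Because the samples are divided pointwise, the spread of every factor propagates into the result, so the final limit reflects the combined uncertainty rather than a product of worst-case defaults.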


    High-order accurate direct numerical simulation of flow over a MTU-T161 low pressure turbine blade

    Reynolds Averaged Navier-Stokes (RANS) simulations and wind tunnel testing have become the go-to tools for industrial design of Low-Pressure Turbine (LPT) blades. However, there is also an emerging interest in the use of scale-resolving simulations, including Direct Numerical Simulations (DNS). These could generate insight and data to underpin development of improved RANS models for LPT design. Additionally, they could underpin a virtual LPT wind tunnel capability that is cheaper, quicker, and more data-rich than experiments. The current study applies PyFR, a Python-based Computational Fluid Dynamics (CFD) solver, to fifth-order accurate petascale DNS of compressible flow over a three-dimensional MTU-T161 LPT blade with diverging end walls at a Reynolds number of 200,000, on an unstructured mesh with over 11 billion degrees-of-freedom per equation. Various flow metrics, including the isentropic Mach number distribution at mid-span, surface shear, and wake pressure losses, are compared with available experimental data and found to be in agreement. Subsequently, a more detailed analysis of various flow features is presented. These include the separation/transition processes on both the suction and pressure sides of the blade, end-wall vortices, and wake evolution at various span-wise locations. The results, which constitute one of the largest and highest-fidelity CFD simulations ever conducted, demonstrate the potential of high-order accurate GPU-accelerated CFD as a tool for delivering industrial DNS of LPT blades.
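The isentropic Mach number mentioned above is conventionally recovered from the ratio of inlet total pressure to local static pressure via the isentropic relation p0/p = (1 + (gamma-1)/2 * M_is^2)^(gamma/(gamma-1)). A minimal sketch of that inversion follows; the pressure values are illustrative assumptions, not data from the MTU-T161 study.

```python
import numpy as np

def isentropic_mach(p, p0, gamma=1.4):
    # Invert the isentropic relation p0/p = (1 + (g-1)/2 * M^2)^(g/(g-1))
    # for the Mach number given static pressure p and total pressure p0
    return np.sqrt(2.0 / (gamma - 1.0)
                   * ((p0 / p) ** ((gamma - 1.0) / gamma) - 1.0))

p0 = 101325.0                        # inlet total pressure, Pa (assumed)
p = np.array([95000.0, 80000.0])     # local static pressures on the blade surface
print(isentropic_mach(p, p0))
```

Evaluating this along the blade surface from measured or computed static pressure gives the mid-span isentropic Mach number distribution used for comparison with experiment.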