Development of a Robust and Efficient Parallel Solver for Unsteady Turbomachinery Flows
The traditional design and analysis practice for advanced propulsion systems relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. In the design of advanced propulsion systems, CFD plays a major role in defining the required performance over the entire flight regime, as well as in testing the sensitivity of the design to the different modes of operation. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next-generation computational tools which can be used effectively and reliably in a design environment. The turbomachinery simulation capability presented here is being developed in a computational tool called Loci-STREAM [1]. It integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci [2], which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective is to be able to routinely simulate problems involving complex geometries requiring large unstructured grids and complex multidisciplinary physics. An immediate application of interest is simulation of unsteady flows in rocket turbopumps, particularly in cryogenic liquid rocket engines. The key components of the overall methodology presented in this paper are the following: (a) high-fidelity unsteady simulation capability based on Detached Eddy Simulation (DES) in conjunction with second-order temporal discretization, (b) compliance with the Geometric Conservation Law (GCL) in order to maintain the conservative property on moving meshes for the second-order time-stepping scheme, (c) a novel cloud-of-points interpolation method (based on a fast parallel kd-tree search algorithm) for interfaces between turbomachinery components in relative motion, which is demonstrated to be highly scalable, and (d) demonstrated accuracy and parallel scalability on large grids (approximately 250 million cells) in full turbomachinery geometries.
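The cloud-of-points interface treatment mentioned in item (c) can be illustrated with a small, generic sketch: a kd-tree nearest-neighbour lookup with inverse-distance weights. This is a minimal serial illustration using scipy.spatial.cKDTree; the Loci-STREAM implementation, its parallel search, and its actual interpolation weights are not described in the abstract, so the weighting scheme and function names below are assumptions.

```python
# Minimal, serial sketch of a cloud-of-points interface interpolation:
# for each receiver point on one side of a sliding turbomachinery
# interface, find the k nearest donor points on the other side with a
# kd-tree and blend their values with inverse-distance weights.
# NOTE: illustrative only; the paper's parallel search and weighting
# scheme are not reproduced here.
import numpy as np
from scipy.spatial import cKDTree

def interpolate_interface(donor_xyz, donor_vals, receiver_xyz, k=8, eps=1e-12):
    """Inverse-distance interpolation from a donor point cloud."""
    tree = cKDTree(donor_xyz)                  # build the search structure once
    dist, idx = tree.query(receiver_xyz, k=k)  # k nearest donors per receiver
    w = 1.0 / (dist + eps)                     # inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)          # normalize so weights sum to 1
    return np.einsum('ij,ij->i', w, donor_vals[idx])

# Example: transfer a scalar field across a rotor/stator interface.
rng = np.random.default_rng(0)
donors = rng.random((1000, 3))
values = np.sin(donors[:, 0] * np.pi)
receivers = rng.random((200, 3))
print(interpolate_interface(donors, values, receivers)[:5])
```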
Poorest countries experience earlier anthropogenic emergence of daily temperature extremes
Understanding how the emergence of the anthropogenic warming signal from the noise of internal variability translates into changes in extreme event occurrence is of crucial societal importance. By utilising simulations of cumulative carbon dioxide (CO2) emissions and temperature changes from eleven earth system models, we demonstrate that the inherently lower internal variability found at tropical latitudes results in large increases in the frequency of extreme daily temperatures (exceedances of the 99.9th percentile derived from pre-industrial climate simulations) occurring much earlier than for mid-to-high latitude regions. Measured by 2010 GDP-PPP per capita, most of the world's poorest people live at low latitudes; conversely, the wealthiest population quintile disproportionately inhabits more variable mid-latitude climates. Consequently, the fraction of the global population in the lowest socio-economic quintile is exposed to substantially more frequent daily temperature extremes after much lower increases in both mean global warming and cumulative CO2 emissions.
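The exceedance statistic used above can be made concrete with a short sketch: derive the 99.9th percentile of daily temperature from a pre-industrial baseline, then count how often a warmed series exceeds it. This is a minimal illustration assuming simple NumPy arrays of daily temperatures; it is not the study's actual analysis pipeline.

```python
# Sketch: frequency of daily temperature extremes, defined as exceedances
# of the 99.9th percentile of a pre-industrial baseline (as in the abstract).
# Assumes 1-D arrays of daily temperatures; not the study's actual workflow.
import numpy as np

def exceedance_frequency(preindustrial_days, scenario_days, pct=99.9):
    threshold = np.percentile(preindustrial_days, pct)   # baseline threshold
    n_exceed = np.sum(scenario_days > threshold)          # count of extreme days
    return n_exceed / scenario_days.size                  # fraction of days

# Toy example: a low-variability "tropical" series shifted by +1 K exceeds
# its baseline threshold far more often than a high-variability series.
rng = np.random.default_rng(1)
tropics_base = rng.normal(300.0, 0.5, 365 * 30)
midlat_base = rng.normal(285.0, 5.0, 365 * 30)
print(exceedance_frequency(tropics_base, tropics_base + 1.0))
print(exceedance_frequency(midlat_base, midlat_base + 1.0))
```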
The Maunder minimum and the Little Ice Age: an update from recent reconstructions and climate simulations
The Maunder minimum (MM) was a period of extremely low solar activity from approximately AD 1650 to 1715. In the solar physics literature, the MM is sometimes associated with a period of cooler global temperatures, referred to as the Little Ice Age (LIA), and thus taken as compelling evidence of a large, direct solar influence on climate. In this study, we bring together existing simulation and observational studies, particularly the most recent solar activity and paleoclimate reconstructions, to examine this relation. Using northern hemisphere surface air temperature reconstructions, the LIA can be most readily defined as an approximately 480-year period spanning AD 1440–1920, although not all of this period was notably cold. While the MM occurred within the much longer LIA period, the timing of these features is not suggestive of causation and should not, in isolation, be used as evidence of significant solar forcing of climate. Climate model simulations suggest multiple factors, particularly volcanic activity, were crucial for causing the cooler temperatures in the northern hemisphere during the LIA. A reduction in total solar irradiance likely contributed to the LIA at a level comparable to changing land use.
Large-Mass Ultra-Low Noise Germanium Detectors: Performance and Applications in Neutrino and Astroparticle Physics
A new type of radiation detector, a p-type modified electrode germanium diode, is presented. The prototype displays, for the first time, a combination of features (mass, energy threshold and background expectation) required for a measurement of coherent neutrino-nucleus scattering in a nuclear reactor experiment. The device hybridizes the mass and energy resolution of a conventional HPGe coaxial gamma spectrometer with the low electronic noise and threshold of a small x-ray semiconductor detector, also displaying an intrinsic ability to distinguish multiple from single-site particle interactions. The present performance of the prototype and possible further improvements are discussed, as well as other applications for this new type of device in neutrino and astroparticle physics (double-beta decay, neutrino magnetic moment and WIMP searches).
Recovery from Addiction on a University Campus – a UK Perspective
Between 30 and 40% of 18-year-olds in England, Wales and Northern Ireland enter tertiary education (university) each year. Young adulthood (ages 15 to 25) is the usual period in which problems with alcohol, drugs or other behaviors begin to emerge, and yet these issues have received limited study in the UK. Government policy dictates that a full continuum of treatment and recovery services should be available in each area of the country, but uptake of these services by university students appears to be limited. In this discussion paper we describe the background to, and components of, the Collegiate Recovery Program (CRP), an initiative that has grown rapidly in the USA in the past decade. We then describe how the first UK university-led CRP was set up, before outlining what has been learnt so far and the potential challenges facing this approach.
Wild state secrets: ultra-sensitive measurement of micro-movement can reveal internal processes in animals
Assessment of animal internal "state" - which includes hormonal, disease, nutritional, and emotional states - is normally considered the province of laboratory work, since its determination in animals in the wild is considered more difficult. However, we show that accelerometers attached externally to animals as diverse as elephants, cockroaches, and humans display consistent signal differences in micro-movement that are indicative of internal state. Originally used to elucidate the behavior of wild animals, accelerometers also have great potential for highlighting animal actions, which are considered responses stemming from the interplay between internal state and external environment. Advances in accelerometry may help wildlife managers understand how internal state is linked to behavior and movement, and thus clarify issues ranging from how animals cope with the presence of newly constructed roads to how diseased animals might change movement patterns and therefore modulate disease spread.
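As a rough illustration of the kind of micro-movement signal involved, one common accelerometry summary removes a smoothed (static, gravity-dominated) component and summarizes the magnitude of what remains. The VeDBA-style index below is an assumption for illustration only; the authors' own signal-processing choices are not specified in the abstract.

```python
# Sketch of a micro-movement index from a tri-axial accelerometer trace:
# remove the slowly varying (static/gravity) component with a running mean,
# then summarize the residual "dynamic" acceleration (a VeDBA-style metric).
# Illustrative assumption only; not the authors' actual processing.
import numpy as np

def micro_movement_index(acc_xyz, window=25):
    """acc_xyz: (n_samples, 3) raw acceleration in g; window in samples."""
    kernel = np.ones(window) / window
    static = np.column_stack([np.convolve(acc_xyz[:, i], kernel, mode='same')
                              for i in range(3)])      # smoothed gravity part
    dynamic = acc_xyz - static                          # micro-movement residual
    vedba = np.linalg.norm(dynamic, axis=1)             # vector magnitude per sample
    return vedba.mean()                                 # mean dynamic acceleration

# Toy comparison: a "calm" trace vs. one with slightly larger tremor-like noise.
rng = np.random.default_rng(2)
calm = rng.normal([0, 0, 1], 0.005, (2000, 3))
agitated = rng.normal([0, 0, 1], 0.02, (2000, 3))
print(micro_movement_index(calm), micro_movement_index(agitated))
```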
A Feynman integral in Lifshitz-point and Lorentz-violating theories in R^D ⊕ R^m
We evaluate a 1-loop, 2-point, massless Feynman integral I_{D,m}(p,q) relevant for perturbative field theoretic calculations in strongly anisotropic d = D + m dimensional spaces given by the direct sum R^D ⊕ R^m. Our results are valid in the whole convergence region of the integral for generic (noninteger) codimensions D and m. We obtain series expansions of I_{D,m}(p,q) in terms of powers of the variable X := 4p^2/q^4, where p = |p|, q = |q|, p ∈ R^D, q ∈ R^m, and in terms of generalised hypergeometric functions _3F_2(−X), when X < 1. These are subsequently analytically continued to the complementary region X ≥ 1. The asymptotic expansion in inverse powers of X^{1/2} is derived. The correctness of the results is supported by agreement with previously known special cases and extensive numerical calculations.
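For readers unfamiliar with the notation, the expansion variable quoted in the abstract and the standard series defining the generalised hypergeometric function are written out below. Only the definitions given above and the textbook series for _3F_2 are shown; the paper's actual expansion parameters and coefficients are not reproduced.

```latex
% Expansion variable quoted in the abstract
X := \frac{4 p^{2}}{q^{4}}, \qquad
p = |\mathbf{p}|,\; q = |\mathbf{q}|, \qquad
\mathbf{p} \in \mathbb{R}^{D},\; \mathbf{q} \in \mathbb{R}^{m}.

% Standard series defining the generalised hypergeometric function used for
% X < 1, with (a)_n the Pochhammer symbol; the parameters a_i, b_j depend on
% D and m and are not reproduced here.
{}_{3}F_{2}\!\left(a_{1},a_{2},a_{3};\, b_{1},b_{2};\, -X\right)
  = \sum_{n=0}^{\infty}
    \frac{(a_{1})_{n}(a_{2})_{n}(a_{3})_{n}}{(b_{1})_{n}(b_{2})_{n}}
    \frac{(-X)^{n}}{n!}.
```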
Clouds, aerosols, and precipitation in the marine boundary layer: an ARM Mobile Facility Deployment
The Clouds, Aerosol, and Precipitation in the Marine Boundary Layer (CAP-MBL) deployment at Graciosa Island in the Azores generated a 21-month (April 2009–December 2010) comprehensive dataset documenting clouds, aerosols, and precipitation using the Atmospheric Radiation Measurement Program (ARM) Mobile Facility (AMF). The scientific aim of the deployment is to gain improved understanding of the interactions of clouds, aerosols, and precipitation in the marine boundary layer.
Graciosa Island straddles the boundary between the subtropics and midlatitudes in the northeast Atlantic Ocean and consequently experiences a great diversity of meteorological and cloudiness conditions. Low clouds are the dominant cloud type, with stratocumulus and cumulus occurring regularly. Approximately half of all clouds contained precipitation detectable as radar echoes below the cloud base. Radar and satellite observations show that clouds with tops from 1 to 11 km contribute more or less equally to surface-measured precipitation at Graciosa. A wide range of aerosol conditions was sampled during the deployment consistent with the diversity of sources as indicated by back-trajectory analysis. Preliminary findings suggest important two-way interactions between aerosols and clouds at Graciosa, with aerosols affecting light precipitation and cloud radiative properties while being controlled in part by precipitation scavenging.
The data from Graciosa are being compared with short-range forecasts made with a variety of models. A pilot analysis with two climate and two weather forecast models shows that they reproduce the observed time-varying vertical structure of lower-tropospheric cloud fairly well but the cloud-nucleating aerosol concentrations less well. The Graciosa site has been chosen to be a permanent fixed ARM site, which became operational in October 2013.
Simulation-Enhanced Bayesian Optimization of System Designs using Hybrid Physical and Computer Experiments
We consider the problem of learning and optimizing the performance of a system by conducting a limited number of physical and digital experiments within a design space. Physical experiments are assumed to be unbiased but costly, while digital experiments (e.g., simulations) are less expensive but may introduce bias due to the limitations of the simulation model. This problem is relevant in many fields, such as optimizing engineered systems where performance (e.g., mechanical properties and reliability) depends on various design variables and external/internal factors. Without digital experiments, optimizing the system's performance amounts to evaluating a noisy and expensive-to-assess black-box function, a task commonly handled using Bayesian Optimization (BO). Our research extends BO by incorporating digital experiments between subsequent physical experiments, aiming to (i) improve simulation model calibration and (ii) identify solutions that are likely to generate desirable physical experiment results. We introduce "Simulation-Enhanced Bayesian Optimization" (SEBO), a methodology that integrates these steps, and evaluate it using various one- and two-dimensional benchmark functions. A bias function is used to model the simulation model's bias across the design space and its parameters. We compare SEBO to traditional BO, with preliminary results demonstrating SEBO's advantages in optimizing experimental efforts; SEBO outperforms traditional BO for well-behaved functions, requiring fewer physical and digital experiments to achieve a desired objective function value. By effectively combining physical and digital experiments, SEBO offers significant potential for improving the design and optimization of engineered systems, reducing costs, speeding up design processes, and overall providing more efficient solutions in engineering and manufacturing.
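A minimal sketch of the kind of loop described above, alternating cheap biased simulations with expensive physical evaluations and fitting a Gaussian process surrogate to the physical data, is given below. Everything here (the toy objective, the smooth-bias simulator, the use of scikit-learn's GaussianProcessRegressor, and the lower-confidence-bound screening step) is an assumption for illustration; the actual SEBO acquisition rule and bias model are not reproduced.

```python
# Toy sketch: Bayesian optimization that interleaves cheap, biased "digital"
# experiments with costly "physical" ones. Assumptions: a 1-D toy objective,
# a simulator with a smooth bias, a GP surrogate on physical data, and a
# lower-confidence-bound screening step. Not the SEBO method itself.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def physical_experiment(x):              # expensive, unbiased but noisy
    return (x - 0.3) ** 2 + 0.01 * np.random.randn()

def digital_experiment(x):               # cheap but biased simulation
    return (x - 0.3) ** 2 + 0.2 * np.sin(5 * x)

rng = np.random.default_rng(3)
X_phys = rng.uniform(0, 1, 3).reshape(-1, 1)          # initial physical designs
y_phys = np.array([physical_experiment(x[0]) for x in X_phys])

grid = np.linspace(0, 1, 201).reshape(-1, 1)
for it in range(5):
    gp = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(1e-3),
                                  normalize_y=True).fit(X_phys, y_phys)
    mu, sd = gp.predict(grid, return_std=True)
    # Use cheap simulations to screen the most promising candidates before
    # spending a physical experiment (the "simulation-enhanced" idea).
    candidates = grid[np.argsort(mu - 1.0 * sd)[:10]]
    sims = np.array([digital_experiment(c[0]) for c in candidates])
    x_next = candidates[np.argmin(sims)]
    X_phys = np.vstack([X_phys, x_next])
    y_phys = np.append(y_phys, physical_experiment(x_next[0]))

print("best physical result:", X_phys[np.argmin(y_phys)][0], y_phys.min())
```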
