Investigation of the influence of a step change in surface roughness on turbulent heat transfer
The use of smooth heat flux gages on the otherwise very rough SSME fuel pump turbine blades is studied. To gain insight into the behavior of such installations, fluid mechanics and heat transfer data were collected and are reported for a turbulent boundary layer over a surface with a step change from a rough surface to a smooth surface. The first 0.9 m of the flat-plate test surface was roughened with 1.27 mm hemispheres in a staggered, uniform array spaced two base diameters apart; the remaining 1.5 m was smooth. The effect of the alignment of the smooth surface with respect to the rough surface was also studied by conducting experiments with the smooth surface aligned either with the bases or with the crests of the roughness elements. Stanton number distributions, skin friction distributions, and boundary layer profiles of temperature and velocity are reported and compared with previous data for both all-rough and all-smooth wall cases. The experiments show that the step change from rough to smooth has a dramatic effect on the convective heat transfer. It is concluded that the use of smooth heat flux gages on otherwise rough surfaces could cause large errors.
A perturbative analysis of tachyon condensation
Tachyon condensation in the open bosonic string is analyzed using a
perturbative expansion of the tachyon potential around the unstable D25-brane
vacuum. Using the leading terms in the tachyon potential, Pad\'e approximants
can apparently give the energy of the stable vacuum to arbitrarily good
accuracy. Level-truncation approximations up to level 10 for the coefficients
in the tachyon potential are extrapolated to higher levels and used to find
approximants for the full potential. At level 14 and above, the resulting
approximants give an energy less than -1 in units of the D25-brane tension, in
agreement with recent level-truncation results by Gaiotto and Rastelli. The
extrapolated energy continues to decrease below -1 until reaching a minimum
near level 26, after which the energy turns around and begins to approach -1
from below. Within the accuracy of this method, these results are completely
consistent with an energy which approaches -1 as the level of truncation is
taken to be arbitrarily large.
Comment: 8 pages, 3 eps figures, LaTeX; v2: typo corrected
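The core numerical tool in this abstract, building a rational (Padé) approximant from the leading Taylor coefficients of a function and evaluating it beyond the range where the raw series is useful, can be sketched for a toy function. The example below uses log(1 + x), not the tachyon potential itself, and `scipy.interpolate.pade`; the coefficients and evaluation point are chosen purely for illustration:

```python
import math

import numpy as np
from scipy.interpolate import pade

# Taylor coefficients of log(1 + x) about x = 0, up to order 10:
# a_0 = 0, a_k = (-1)**(k + 1) / k.
an = [0.0] + [(-1.0) ** (k + 1) / k for k in range(1, 11)]

# Build the [5/5] Pade approximant p(x)/q(x) from those coefficients.
p, q = pade(an, 5)

x = 3.0  # well outside the series' radius of convergence (|x| < 1)
exact = math.log(1.0 + x)
truncated = np.polyval(list(reversed(an)), x)  # raw degree-10 partial sum
rational = p(x) / q(x)

print(exact, truncated, rational)
# The partial sum is useless at x = 3, while the Pade approximant stays
# accurate -- the same phenomenon the abstract exploits for the potential.
```

The design point is that a rational function can mimic behavior (poles, limits) that no finite polynomial can, which is why the extrapolated level-truncation coefficients feed approximants rather than partial sums.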
Using conditional kernel density estimation for wind power density forecasting
Of the various renewable energy resources, wind power is widely recognized as one of the most promising. The management of wind farms and electricity systems can benefit greatly from the availability of estimates of the probability distribution of wind power generation. However, most research has focused on point forecasting of wind power. In this paper, we develop an approach to producing density forecasts for the wind power generated at individual wind farms. Our interest is in intraday data and prediction from 1 to 72 hours ahead. We model wind power in terms of wind speed and wind direction. In this framework, there are two key uncertainties. First, there is the inherent uncertainty in wind speed and direction, and we model this using a bivariate VARMA-GARCH (vector autoregressive moving average-generalized autoregressive conditional heteroscedastic) model, with a Student t distribution, in the Cartesian space of wind speed and direction. Second, there is the stochastic nature of the relationship of wind power to wind speed (described by the power curve), and to wind direction. We model this using conditional kernel density (CKD) estimation, which enables nonparametric modeling of the conditional density of wind power. Using Monte Carlo simulation of the VARMA-GARCH model and CKD estimation, density forecasts of wind speed and direction are converted to wind power density forecasts. Our work is novel in several respects: previous wind power studies have not modeled a stochastic power curve; to accommodate time evolution in the power curve, we incorporate a time decay factor within the CKD method; and the CKD method is conditional on a density, rather than a single value. The new approach is evaluated using datasets from four Greek wind farms.
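The central estimator here, a conditional kernel density of power given speed, can be sketched in a few lines. The synthetic data, Gaussian kernels, bandwidths, and logistic power curve below are invented for illustration; the paper's full method additionally conditions on wind direction and on a forecast density rather than a single observed speed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic observations: a noisy logistic "power curve" linking speed to power.
speed = rng.uniform(0.0, 25.0, 2000)
power = 1.0 / (1.0 + np.exp(-(speed - 12.0))) + rng.normal(0.0, 0.05, speed.size)

def gauss(u):
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def ckd(p_grid, s0, s, p, h_s=1.0, h_p=0.05, weights=None):
    """Kernel estimate of the conditional density f(power | speed = s0).

    `weights` can carry the time-decay factor the abstract mentions
    (e.g. lam ** age, so recent observations count for more).
    """
    w = gauss((s - s0) / h_s)
    if weights is not None:
        w = w * weights
    w = w / w.sum()
    # Weighted kernel density over power, evaluated on p_grid.
    return np.array([(w * gauss((g - p) / h_p)).sum() / h_p for g in p_grid])

grid = np.linspace(-0.2, 1.2, 281)
dens = ckd(grid, s0=12.0, s=speed, p=power)

dx = grid[1] - grid[0]
mass = dens.sum() * dx          # should be close to 1
mode = grid[dens.argmax()]      # should sit near the curve value at 12 m/s
print(mass, mode)
```

Converting a forecast *density* of speed into a power density then amounts to averaging such conditional densities over Monte Carlo draws of speed, which is the role the VARMA-GARCH simulation plays in the paper.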
Hopane biomarkers traced from bedrock to recent sediments and ice at the Haughton Impact Structure, Devon Island: Implications for the search for biomarkers on Mars
Hopanoid biomarkers have been traced from bedrock to ice in the Haughton Impact Structure, suggesting that they represent a promising strategy in the search for life in ice deposits on Mars and other icy bodies
A computer-aided teleoperator system: final report
A computer-aided teleoperator system for remote handling tasks.
Experimental assessment of presumed filtered density function models
Measured filtered density functions (FDFs), as well as an assumed beta-distribution model of the mixture fraction and of the "subgrid"-scale (SGS) scalar variance, as typically used in large eddy simulations, were studied by analysing experimental data obtained from two-dimensional planar laser-induced fluorescence measurements in isothermal swirling turbulent flows at a constant Reynolds number of 29 000 for three swirl numbers (0.3, 0.58, and 1.07).
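The presumed-FDF model being tested is easy to state: the subgrid distribution of mixture fraction z in [0, 1] is taken to be a beta distribution whose two shape parameters are fixed by the filtered mean and the SGS variance. A minimal sketch of that moment-matching step (function name and test values invented):

```python
from scipy.stats import beta

def presumed_beta_fdf(z_mean, z_var):
    """Beta-distribution presumed FDF for mixture fraction z in [0, 1].

    Matching the first two moments gives a = z_mean * f and
    b = (1 - z_mean) * f with f = z_mean * (1 - z_mean) / z_var - 1;
    requires 0 < z_var < z_mean * (1 - z_mean).
    """
    f = z_mean * (1.0 - z_mean) / z_var - 1.0
    a = z_mean * f
    b = (1.0 - z_mean) * f
    return beta(a, b)

fdf = presumed_beta_fdf(0.3, 0.01)
print(fdf.mean(), fdf.var())  # recovers the imposed mean 0.3 and variance 0.01
```

Comparing this two-parameter shape against the measured FDFs at each filter location is the substance of the assessment the abstract describes.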
Refinement and preliminary evaluation of two tablet-based tests of real-world visual function
PURPOSE: To describe, refine, evaluate, and provide normative control data for two freely available tablet-based tests of real-world visual function, using a cohort of young, normally-sighted adults.
METHODS: Fifty young (18-40 years), normally-sighted adults completed tablet-based assessments of (1) face discrimination and (2) visual search. Each test was performed twice, to assess test-retest repeatability. Post-hoc analyses were performed to determine the number of trials required to obtain stable estimates of performance. Distributions were fitted to the normative data to determine the 99% population-boundary for normally sighted observers. Participants were also asked to rate their comprehension of each test.
RESULTS: Both tests provided stable estimates in around 20 trials (~1-4 min), with only a further reduction of 14%-17% in the 95% Coefficient of Repeatability (CoR95) when an additional 40 trials were included. When using only ~20 trials: median durations for the first run of each test were 191 s (Faces) and 51 s (Search); test-retest CoR95 were 0.27 d (Faces) and 0.84 s (Search); and normative 99% population-limits were 3.50 d (Faces) and 3.1 s (Search). No participants exhibited any difficulties completing either test (100% completion rate), and ratings of task-understanding were high (Faces: 9.6 out of 10; Search: 9.7 out of 10).
CONCLUSIONS: This preliminary assessment indicated that both tablet-based tests are able to provide simple, quick, and easy-to-administer measures of real-world visual function in normally-sighted young adults. Further work is required to assess their accuracy and utility in older people and in individuals with visual impairment. Potential applications are discussed, including their use in clinic waiting rooms and as an objective complement to Patient Reported Outcome Measures (PROMs).
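The repeatability statistic quoted above, the 95% coefficient of repeatability, is conventionally the Bland-Altman quantity: 1.96 times the standard deviation of the paired test-retest differences. Whether the paper uses exactly this formula is not stated in the abstract, so the sketch below is illustrative and the paired scores are invented:

```python
import numpy as np

def cor95(test, retest):
    """95% coefficient of repeatability (Bland-Altman convention):
    1.96 * sample SD of the paired test-retest differences."""
    d = np.asarray(test, dtype=float) - np.asarray(retest, dtype=float)
    return 1.96 * d.std(ddof=1)

# Invented paired scores from two runs of the same test.
run1 = [1.0, 2.0, 3.0, 4.0]
run2 = [1.1, 1.9, 3.2, 3.8]
print(cor95(run1, run2))
```

A smaller CoR95 means two runs of the test agree more closely, which is why the paper reports how it shrinks as trials are added.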
A comparison of CMB- and HLA-based approaches to type I interoperability reference model problems for COTS-based distributed simulation
Commercial-off-the-shelf (COTS) simulation packages (CSPs) are software used by many simulation modellers to build and experiment with models of various systems in domains such as manufacturing, health, logistics and commerce. COTS distributed simulation deals with the interoperation of CSPs and their models. Such interoperability has been classified into six interoperability reference models. As part of an on-going standardisation effort, this paper introduces the COTS Simulation Package Emulator, a proposed benchmark that can be used to investigate Type I interoperability problems in COTS distributed simulation. To demonstrate its use, two approaches to this form of interoperability are discussed: an implementation of the CMB conservative algorithm, an example of a so-called "light" approach, and an implementation of the HLA TAR algorithm, an example of a so-called "heavy" approach. Results from experimentation over four federation topologies are presented, and it is shown that the HLA approach outperforms the CMB approach in almost all cases. The paper concludes that the CSPE benchmark is a valid basis from which the most efficient approach to Type I interoperability problems for COTS distributed simulation can be discovered.
Predicting narrow states in the spectrum of a nucleus beyond the proton drip line
Properties of particle-unstable nuclei lying beyond the proton drip line can
be ascertained by considering the (usually known) properties of their mirror
neutron-rich systems. We have used a multi-channel algebraic scattering theory
to map the known properties of the neutron-C system to those of the
proton-O one, from which we deduce that the particle-unstable
F will have a spectrum of two low-lying broad resonances of positive
parity and, at higher excitation, three narrow negative-parity ones. A key
feature is the use of coupling to Pauli-hindered states in the target.
Comment: 5 pages, 3 figures