Determination of the Joint Confidence Region of Optimal Operating Conditions in Robust Design by Bootstrap Technique
Robust design has been widely recognized as a leading method in reducing
variability and improving quality. Most of the engineering statistics
literature focuses on finding "point estimates" of the optimum operating
conditions for robust design. Various procedures for calculating point
estimates of the optimum operating conditions are considered. Although these
point estimation procedures are important for continuous quality improvement, the
immediate question is "how accurate are these optimum operating conditions?"
The answer to this question is to consider interval estimation for a single variable or
joint confidence regions for multiple variables.
In this paper, with the help of the bootstrap technique, we develop
procedures for obtaining joint "confidence regions" for the optimum operating
conditions. Two different procedures using Bonferroni and multivariate normal
approximation are introduced. The proposed methods are illustrated and
substantiated using a numerical example. Comment: Two tables, three figures.
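As an illustrative aside, the bootstrap interval idea this abstract describes can be sketched in a few lines. The quadratic response model, the synthetic data, and the single-variable setting below are assumptions for illustration, not the paper's example; with k control variables, the Bonferroni approach would use level 1 - 0.05/k intervals per coordinate to form the joint region.

```python
# A minimal sketch of bootstrapping the optimum operating condition of a
# fitted response surface. Everything here is a toy stand-in for the method.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic experiment: response y with a quadratic optimum at x = 2.0
x = np.linspace(0, 4, 25)
y = -(x - 2.0) ** 2 + rng.normal(scale=0.3, size=x.size)

def optimum(xs, ys):
    """Fit y = b0 + b1*x + b2*x^2 and return the stationary point -b1/(2*b2)."""
    b2, b1, b0 = np.polyfit(xs, ys, 2)
    return -b1 / (2.0 * b2)

# Nonparametric bootstrap of the point estimate
B = 2000
boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, x.size, x.size)
    boot[b] = optimum(x[idx], y[idx])

# 95% percentile interval for the optimum setting; a Bonferroni joint region
# would repeat this per control variable at the adjusted level.
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"point estimate: {optimum(x, y):.3f}, 95% CI: ({lo:.3f}, {hi:.3f})")
```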
A Bayesian spatio-temporal model of panel design data: airborne particle number concentration in Brisbane, Australia
This paper outlines a methodology for semi-parametric spatio-temporal
modelling of data which is dense in time but sparse in space, obtained from a
split panel design, the most feasible approach to covering space and time with
limited equipment. The data are hourly averaged particle number concentration
(PNC) and were collected as part of the Ultrafine Particles from Transport
Emissions and Child Health (UPTECH) project. Two weeks of continuous
measurements were taken at each of a number of government primary schools in
the Brisbane Metropolitan Area. The monitoring equipment was taken to each
school sequentially. The school data are augmented by data from long term
monitoring stations at three locations in Brisbane, Australia.
Fitting the model helps describe the spatial and temporal variability at a
subset of the UPTECH schools and the long-term monitoring sites. The temporal
variation is modelled hierarchically with penalised random walk terms, one
common to all sites and a term accounting for the remaining temporal trend at
each site. Parameter estimates and their uncertainty are computed in a
computationally efficient approximate Bayesian inference environment, R-INLA.
The temporal part of the model explains daily and weekly cycles in PNC at the
schools, which can be used to estimate the exposure of school children to
ultrafine particles (UFPs) emitted by vehicles. At each school and long-term
monitoring site, peaks in PNC can be attributed to the morning and afternoon
rush hour traffic and new particle formation events. The spatial component of
the model describes the variation in mean PNC from school to school
and within each school ground. It is shown how the spatial model can be
expanded to identify spatial patterns at the city scale with the inclusion of
more spatial locations. Comment: Draft of this paper presented at ISBA 2012 as a poster, part of the UPTECH project.
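The penalised random-walk smoothing behind the model's temporal terms can be illustrated with a simple least-squares analogue. R-INLA performs full approximate Bayesian inference over such terms; the RW2-style second-difference penalty, the toy hourly series, and the smoothing weight below are assumptions for illustration only.

```python
# Penalised second-order random-walk (RW2) smoothing of a noisy hourly
# series, as a deterministic analogue of the model's temporal terms.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(168)                      # one week of hourly data
daily = np.sin(2 * np.pi * t / 24)      # a daily cycle, as seen at the schools
y = daily + rng.normal(scale=0.4, size=t.size)

# Second-difference matrix D: the RW2 penalty sum (f[i] - 2f[i+1] + f[i+2])^2
D = np.diff(np.eye(t.size), n=2, axis=0)
lam = 20.0                              # smoothing weight (hyperparameter)

# Penalised least squares: minimise ||y - f||^2 + lam * ||D f||^2,
# solved in closed form as (I + lam * D'D) f = y
f = np.linalg.solve(np.eye(t.size) + lam * D.T @ D, y)

rmse = np.sqrt(np.mean((f - daily) ** 2))
print(f"RMSE of smoothed trend vs true daily cycle: {rmse:.3f}")
```

In the paper's hierarchical setting, one such term is shared across sites and a second absorbs each site's remaining trend; here a single term suffices to show the smoothing.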
Synthetic LISA: Simulating Time Delay Interferometry in a Model LISA
We report on three numerical experiments on the implementation of Time-Delay
Interferometry (TDI) for LISA, performed with Synthetic LISA, a C++/Python
package that we developed to simulate the LISA science process at the level of
scientific and technical requirements. Specifically, we study the laser-noise
residuals left by first-generation TDI when the LISA armlengths have a
realistic time dependence; we characterize the armlength-measurements
accuracies that are needed to have effective laser-noise cancellation in both
first- and second-generation TDI; and we estimate the quantization and
telemetry bitdepth needed for the phase measurements. Synthetic LISA generates
synthetic time series of the LISA fundamental noises, as filtered through all
the TDI observables; it also provides a streamlined module to compute the TDI
responses to gravitational waves according to a full model of TDI, including
the motion of the LISA array and the temporal and directional dependence of the
armlengths. We discuss the theoretical model that underlies the simulation, its
implementation, and its use in future investigations on system characterization
and data-analysis prototyping for LISA. Comment: 18 pages, 14 EPS figures, REVTeX 4. Accepted PRD version. See http://www.vallis.org/syntheticlisa for information on the Synthetic LISA software package.
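The laser-noise cancellation that the paper studies can be illustrated with a toy first-generation Michelson-style combination for a single laser and two static, unequal arms. The delays and sample counts below are arbitrary choices, and none of this reflects the actual Synthetic LISA API; it only shows the cancellation principle.

```python
# Toy TDI illustration: round-trip phase measurements dominated by a common
# laser noise C(t) cancel exactly in the Michelson-like X combination.
import numpy as np

rng = np.random.default_rng(4)
L1, L2 = 8, 13                       # one-way armlength delays, in samples
C = rng.normal(size=4000)            # laser phase noise time series

def delay(x, d):
    """Delay a series by d > 0 samples, zero-padding the start."""
    return np.concatenate([np.zeros(d), x[:-d]])

# Round-trip measurements along each arm: delayed laser noise minus current
y1 = delay(C, 2 * L1) - C
y2 = delay(C, 2 * L2) - C

# First-generation X: each arm's signal plus the other, delayed by its own
# round trip; all laser-noise terms appear twice with opposite signs.
X = (y1 + delay(y2, 2 * L1)) - (y2 + delay(y1, 2 * L2))

resid = np.max(np.abs(X[2 * (L1 + L2):]))   # skip the zero-padded start-up
print(f"max laser-noise residual in X: {resid:.2e}")
```

With time-dependent armlengths this cancellation is only approximate for first-generation TDI, which is exactly the residual the paper characterizes.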
Application of Bayesian model averaging to measurements of the primordial power spectrum
Cosmological parameter uncertainties are often stated assuming a particular
model, neglecting the model uncertainty, even when Bayesian model selection is
unable to identify a conclusive best model. Bayesian model averaging is a
method for assessing parameter uncertainties in situations where there is also
uncertainty in the underlying model. We apply model averaging to the estimation
of the parameters associated with the primordial power spectra of curvature and
tensor perturbations. We use CosmoNest and MultiNest to compute the model
Evidences and posteriors, using cosmic microwave background data from WMAP, ACBAR,
BOOMERanG and CBI, plus large-scale structure data from the SDSS DR7. We find
that the model-averaged 95% credible interval for the spectral index using all
of the data is 0.940 < n_s < 1.000, where n_s is specified at a pivot scale
0.015 Mpc^{-1}. For the tensors model averaging can tighten the credible upper
limit, depending on prior assumptions. Comment: 7 pages with 7 figures included.
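The model-averaging step can be sketched numerically: weight each model's posterior samples of n_s by its evidence and draw from the resulting mixture. The log-evidences and Gaussian posteriors below are toy numbers, not the CosmoNest/MultiNest outputs used in the paper.

```python
# Minimal Bayesian model averaging sketch for a shared parameter n_s.
import numpy as np

rng = np.random.default_rng(2)

# log-evidences for two hypothetical models (equal prior model probabilities)
logZ = np.array([-10.0, -11.5])
w = np.exp(logZ - logZ.max())
w /= w.sum()                          # posterior model probabilities

# posterior samples of n_s under each model (illustrative Gaussians)
samples = [rng.normal(0.965, 0.012, 50_000),
           rng.normal(0.975, 0.020, 50_000)]

# model-averaged posterior: draw from each model in proportion to its weight
n = 100_000
counts = rng.multinomial(n, w)
mixed = np.concatenate([rng.choice(s, size=c) for s, c in zip(samples, counts)])

lo, hi = np.percentile(mixed, [2.5, 97.5])
print(f"model weights: {w.round(3)}, averaged 95% interval: ({lo:.3f}, {hi:.3f})")
```

The averaged interval is wider than the best single model's whenever the evidences do not decisively favour one model, which is the situation the abstract describes.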
Evaluation of Macroscopic Properties in the Direct Simulation Monte Carlo Method
Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/77021/1/AIAA-12542-230.pd
Retrodiction with two-level atoms: atomic previvals
In the Jaynes-Cummings model a two-level atom interacts with a single-mode
electromagnetic field. Quantum mechanics predicts collapses and revivals in the
probability that a measurement will show the atom to be excited at various
times after the initial preparation of the atom and field. In retrodictive
quantum mechanics we seek the probability that the atom was prepared in a
particular state given the initial state of the field and the outcome of a
later measurement on the atom. Although this is not simply the time reverse of
the usual predictive problem, we demonstrate in this paper that retrodictive
collapses and revivals also exist. We highlight the differences between
predictive and retrodictive evolutions and describe an interesting situation
where the prepared state is essentially unretrodictable. Comment: 15 pages, 3 (5) figures.
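For reference, the predictive collapse-and-revival signal mentioned here follows from summing Rabi oscillations over the coherent field's photon distribution; the coupling g and mean photon number below are illustrative choices, not values from the paper.

```python
# Collapse and revival of the atomic excitation probability in the resonant
# Jaynes-Cummings model: atom initially excited, field in a coherent state.
import numpy as np

g, nbar = 1.0, 25.0
n = np.arange(200)
# Poisson photon-number distribution of the coherent state (via log-factorial)
log_p = n * np.log(nbar) - nbar - np.cumsum(np.log(np.maximum(n, 1)))
p = np.exp(log_p)

def p_excited(t):
    """Probability of finding the initially excited atom excited at time t."""
    return float(np.sum(p * np.cos(g * np.sqrt(n + 1) * t) ** 2))

# Collapse sets in on a timescale ~1/g; the first revival occurs near
# t ~ 2*pi*sqrt(nbar)/g (about 31 with these values).
times = np.linspace(0, 60, 1200)
pe = np.array([p_excited(t) for t in times])
print(f"P_e at t=0: {p_excited(0.0):.3f}")
```

The retrodictive problem the paper treats runs the inference the other way, from a later measurement back to the preparation, but the same photon-number sum is the starting point.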
Optical, physical and chemical characteristics of Australian continental aerosols: results from a field experiment
Mineral dust is one of the major components of the world's aerosol mix, having a number of impacts within the Earth system. However, the climate forcing impact of mineral dust is currently poorly constrained, with even its sign uncertain. As Australian deserts are more reddish than those in the Northern Hemisphere, it is important to better understand the physical, chemical and optical properties of this important aerosol. We have investigated the properties of Australian desert dust at a site in SW Queensland, which is strongly influenced by both dust and biomass burning aerosol.

Three years of ground-based monitoring of spectral optical thickness has provided a statistical picture of gross aerosol properties. The aerosol optical depth data showed a clear though moderate seasonal cycle with an annual mean of 0.06 ± 0.03. The Angstrom coefficient showed a stronger cycle, indicating the influence of the winter-spring burning season in Australia's north. AERONET size distributions showed a generally bimodal character, with the coarse mode assumed to be mineral dust, and the fine mode a mixture of fine dust, biomass burning and marine biogenic material.

In November 2006 we undertook a field campaign which collected 4 sets of size-resolved aerosol samples for laboratory analysis: ion beam analysis and ion chromatography. Ion beam analysis was used to determine the elemental composition of all filter samples, although elemental ratios were considered the most reliable output. Scatter plots showed that Fe, Al and Ti were well correlated with Si, and Co reasonably well correlated with Si, with the Fe/Al ratio somewhat higher than values reported from Northern Hemisphere sites (as expected). Scatter plots for Ca, Mn and K against Si showed clear evidence of a second population, which in some cases could be identified with a particular sample day or size fraction. These data may be used to attempt to build a signature of soil in this region of the Australian interior.

Ion chromatography was used to quantify water-soluble ions for 2 of our sample sets, complementing the picture provided by ion beam analysis. The strong similarities between the MSA and SO₄²⁻ size distributions argue strongly for a marine origin of much of the SO₄²⁻. The similarity of the Na⁺, Cl⁻ and Mg²⁺ size distributions also argues for a marine contribution. Further, we believe that both NO₃⁻ and NH₄⁺ are the result of surface reactions with appropriate gases.
Role of loop entropy in the force induced melting of DNA hairpin
The dynamics of a single-stranded DNA that can form a hairpin have been studied
in the constant-force ensemble. Using Langevin dynamics simulations, we
obtained the force-temperature diagram, which differs from the theoretical
prediction based on the lattice model. Probability analysis of the extreme
bases of the stem revealed that at high temperature, the hairpin to coil
transition is entropy dominated and the loop contributes significantly to its
opening. However, at low temperature, the transition is force driven and the
hairpin opens from the stem side. It is shown that the elastic energy plays a
crucial role at high force. As a result, the phase diagram differs
significantly from the theoretical prediction. Comment: 9 pages, 8 figures; J. Chem. Phys. (2011).
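A constant-force Langevin simulation of the kind used here can be sketched in its simplest overdamped form; the one-dimensional double-well potential and all parameters below are illustrative stand-ins, not the paper's DNA hairpin model.

```python
# Overdamped Langevin dynamics in a tilted double well: the constant applied
# force F biases the "open" state, loosely mimicking force-induced unfolding.
import numpy as np

rng = np.random.default_rng(3)

kT, gamma, dt, F = 1.0, 1.0, 1e-3, 0.5

def grad_U(x):
    """Gradient of U(x) = (x^2 - 1)^2 - F*x (double well tilted by force F)."""
    return 4 * x * (x * x - 1) - F

x = -1.0                             # start in the left ("closed") well
traj = np.empty(20_000)
for i in range(traj.size):
    noise = np.sqrt(2 * kT * dt / gamma) * rng.normal()
    x += -grad_U(x) * dt / gamma + noise   # Euler-Maruyama step
    traj[i] = x

print(f"mean position under force F={F}: {traj.mean():.3f}")
```

Sweeping F and kT and recording which well dominates is the toy analogue of mapping the force-temperature diagram described in the abstract.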
Economies of collaboration in build-to-model operations
This is the final version. Available from the publisher via the DOI in this record. The direct-from-model and tool-less manufacturing process of 3D printing (3DP) embodies a general-purpose technology, facilitating capacity sharing and outsourcing. Starting from a case study of a 3DP company (Shapeways) and a new market entrant (Panalpina), we develop dynamic practices for partial outsourcing in build-to-model manufacturing. We propose a new outsourcing scheme, bidirectional partial outsourcing (BPO), in which 3D printers share capacity by alternating between the roles of outsourcer and subcontractor based on need. Coupled with order book smoothing (OBS), where orders are released gradually to production, this provides 3D printers with two distinct ways to manage demand variability. By combining demand and cost field data with an analytical model, we find that BPO improves 3DP cost efficiency and delivery performance as the number of 3DP firms in the network increases. OBS is sufficient for an established 3D printer when alternatives to in-house manufacturing are few, or of limited capacity. Nevertheless, OBS comes at the cost of reduced responsiveness, whereas BPO shifts the cost and delivery performance frontier. Our analysis shows how BPO combined with OBS makes 3DP companies more resilient to downward movements in both demand and price levels. Funding: Innovate UK; Engineering and Physical Sciences Research Council (EPSRC).