Case study in Six Sigma methodology: manufacturing quality improvement and guidance for managers
This article discusses the successful implementation of Six Sigma methodology in a high-precision and critical process in the manufacture of automotive products. The Six Sigma define–measure–analyse–improve–control approach resulted in a reduction of tolerance-related problems and improved the first-pass yield from 85% to 99.4%. Data were collected on all possible causes, and regression analysis, hypothesis testing, Taguchi methods, classification and regression trees, etc. were used to analyse the data and draw conclusions. Implementation of Six Sigma methodology had a significant financial impact on the profitability of the company: an approximate saving of US$70,000 per annum was reported, in addition to the customer-facing benefits of improved quality on returns and sales. The project also allowed the company to learn useful lessons that will guide future Six Sigma activities.
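For context on the quoted yields, first-pass yield can be converted to the DPMO and sigma-level figures conventional in Six Sigma reporting. This conversion is illustrative and not taken from the article; the 1.5-sigma shift is the standard short-term/long-term convention.

```python
from statistics import NormalDist

def sigma_level(first_pass_yield, shift=1.5):
    """Short-term sigma level implied by a yield, using the
    conventional 1.5-sigma long-term shift."""
    return NormalDist().inv_cdf(first_pass_yield) + shift

def dpmo(first_pass_yield):
    """Defects per million opportunities implied by the yield."""
    return (1.0 - first_pass_yield) * 1_000_000

for y in (0.85, 0.994):
    print(f"yield {y:.1%}: {dpmo(y):>9.0f} DPMO, sigma ~ {sigma_level(y):.2f}")
```

Under this convention the reported improvement from 85% to 99.4% corresponds to moving from roughly 2.5 to roughly 4.0 sigma.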
A rainfall model for drought risk analysis in south-east UK
Drought risk assessment ideally requires long-term rainfall records, especially where inter-annual droughts are of potential concern, and spatially consistent estimates of rainfall to support regional and inter-regional scale assessments. This paper addresses these challenges by developing a spatially consistent stochastic model of monthly rainfall for south-east UK. Conditioned on 50 gauged sites, the model infills the historic record from 1855 to 2011 in both space and time, and extends the record by synthesising droughts which are consistent with the observed rainfall statistics. The long record length allows more insight into the variability of rainfall and potentially a stronger basis for risk assessment than is generally possible. It is shown that, although localised biases exist in both space and time, the model results are generally consistent with the observed record, including for a range of inter-annual droughts and spatial statistics. Simulations show that some of the most severe inter-annual droughts in the record may recur, despite a trend towards generally wetter winters.
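As an illustrative sketch only (not the paper's model, which is conditioned on 50 gauged sites and works in space as well as time), a minimal single-site stochastic monthly rainfall generator can combine month-specific statistics with lag-1 autocorrelated anomalies:

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_monthly_stats(rain, months):
    """Month-by-month mean and std of observed rainfall (mm)."""
    mu = np.array([rain[months == m].mean() for m in range(12)])
    sd = np.array([rain[months == m].std(ddof=1) for m in range(12)])
    return mu, sd

def simulate(mu, sd, n_years, rho=0.3):
    """Synthesise monthly rainfall: AR(1) standardised anomalies
    rescaled by seasonal statistics; negatives clipped to zero."""
    n = 12 * n_years
    z = np.empty(n)
    z[0] = rng.standard_normal()
    eps = rng.standard_normal(n) * np.sqrt(1 - rho ** 2)
    for t in range(1, n):
        z[t] = rho * z[t - 1] + eps[t]
    months = np.tile(np.arange(12), n_years)
    return np.clip(mu[months] + sd[months] * z, 0.0, None), months
```

Long synthetic records generated this way preserve the fitted seasonal means and lag-1 persistence, which is the basic mechanism that lets a stochastic model extend a gauged record with plausible unobserved droughts.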
Correlation between nucleotide composition and folding energy of coding sequences with special attention to wobble bases
Background: The secondary structure and complexity of mRNA influence its accessibility to regulatory molecules (proteins, micro-RNAs), its stability and its level of expression. The mobile elements of the RNA sequence, the wobble bases, are expected to regulate the formation of structures encompassing coding sequences.
Results: The sequence/folding energy (FE) relationship was studied by statistical, bioinformatic methods in 90 CDS containing 26,370 codons. I found that the FE (dG) associated with coding sequences is significant and negative (−407 kcal/1000 bases, mean ± S.E.M.), indicating that these sequences are able to form structures. However, the FE has only a small free component, less than 10% of the total. The contribution of the 1st and 3rd codon bases to the FE is larger than the contribution of the 2nd (central) bases. It is possible to achieve a ~4-fold change in FE by altering the wobble bases in synonymous codons. The sequence/FE relationship can be described with a simple algorithm, and the total FE can be predicted solely from the sequence composition of the nucleic acid. The contributions of different synonymous codons to the FE are additive, and one codon cannot replace another. The accumulated contribution of the synonymous codons of an amino acid to the total folding energy of an mRNA is strongly correlated with the relative amount of that amino acid in the translated protein.
Conclusion: Synonymous codons are not interchangeable with regard to their role in determining the mRNA FE and the relative amounts of amino acids in the translated protein, even if they are indistinguishable in respect of amino acid coding.
Comment: 14 pages including 6 figures and 1 table
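The claim that total FE can be predicted solely from sequence composition can be illustrated with a toy linear model. Everything below is made up for illustration: the sequences are random, and the composition weights are invented (real folding energies would come from an RNA-folding program such as RNAfold). The sketch only shows the shape of a composition-based predictor, not the paper's actual algorithm.

```python
import numpy as np

def base_composition(seq):
    """Fractions of A, C, G and U in a coding sequence."""
    return np.array([seq.count(b) / len(seq) for b in "ACGU"])

rng = np.random.default_rng(0)
bases = list("ACGU")

# 90 random 300-base stand-ins for CDS (hypothetical, not real sequences)
seqs = ["".join(rng.choice(bases, size=300)) for _ in range(90)]
X = np.array([base_composition(s) for s in seqs])

# Assumed ground-truth weights (kcal/1000 bases per unit fraction):
# GC-rich sequences fold more stably.  Made-up numbers.
true_w = np.array([-150.0, -550.0, -600.0, -200.0])
y = X @ true_w + rng.normal(0.0, 3.0, len(seqs))  # "measured" FE with noise

# Least-squares fit: predict FE from base composition alone
w, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With composition carrying essentially all of the signal, the fitted weights predict FE to within the added noise, mirroring the abstract's finding that FE is determined by sequence composition.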
Seasonal and spatial variations of saltmarsh benthic foraminiferal communities from North Norfolk, England
Time series foraminiferal data were obtained from samples collected from three sites at Brancaster Overy Staithe, Burnham Overy Staithe and Thornham on the North Norfolk coast over a 1-year period. At each collection point, six environmental variables—temperature, chlorophyll, sand, mud, pH and salinity—were also measured. The principal aim of this study was to examine the benthic foraminiferal fauna in regard to the temporal variability of foraminiferal abundance, seasonal trend, dominant species, species diversity and the impact of environmental variables on the foraminiferal communities in the top 1 cm of sediment over a 1-year time series. The foraminiferal assemblages at the three sites were dominated by three species: Haynesina germanica, Ammonia sp. and Elphidium williamsoni. Foraminiferal species showed considerable seasonal and temporal fluctuation throughout the year at the three investigated sites. The foraminiferal assemblage at the three low marsh zones showed a maximum abundance in autumn, between September and November, and a minimum between July and August. There were two separate peaks in the abundance of Ammonia sp. and E. williamsoni, one in spring and another in autumn. In contrast, H. germanica showed a single peak in its abundance in autumn. A generalized additive modelling approach was used to explain the variation in the observed foraminiferal abundance and to estimate the significant impact of each of the environmental variables on living foraminiferal assemblages, with taxa abundance as the dependent variable. When included in the model as predictors, most of the environmental variables contributed little in explaining the observed variation in foraminiferal species abundance. However, the effects of site, salinity and pH were significant and explained most of the variability in species relative abundance.
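The generalized additive modelling step can be sketched with the classic backfitting algorithm and a crude binned smoother. This is a from-scratch illustration on synthetic data, not the study's model (which would typically use penalized splines via a package such as mgcv); all variable names and data are invented.

```python
import numpy as np

def bin_smooth(x, r, n_bins=10):
    """Crude smoother: average partial residuals within quantile bins of x."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    means = np.array([r[idx == b].mean() if np.any(idx == b) else 0.0
                      for b in range(n_bins)])
    return means[idx]

def backfit_gam(X, y, n_iter=20):
    """Additive model y ~ alpha + sum_j f_j(x_j), fitted by backfitting:
    each smooth term is refitted in turn to the partial residuals."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((p, n))
    for _ in range(n_iter):
        for j in range(p):
            partial = y - alpha - f.sum(axis=0) + f[j]
            f[j] = bin_smooth(X[:, j], partial)
            f[j] -= f[j].mean()          # identifiability: centre each f_j
    fitted = alpha + f.sum(axis=0)
    return alpha, f, fitted
```

Each `f[j]` plays the role of one environmental predictor's smooth effect on taxon abundance; in the study, terms whose smooths explain little variance correspond to the predictors found to contribute little.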
Do serum biomarkers really measure breast cancer?
Background
Because screening mammography for breast cancer is less effective for premenopausal women, we investigated the feasibility of a diagnostic blood test using serum proteins.
Methods
This study used a set of 98 serum proteins and chose diagnostically relevant subsets via various feature-selection techniques. Because of significant noise in the data set, we applied iterated Bayesian model averaging to account for model selection uncertainty and to improve generalization performance. We assessed generalization performance using leave-one-out cross-validation (LOOCV) and receiver operating characteristic (ROC) curve analysis.
Results
The classifiers were able to distinguish normal tissue from breast cancer with a classification performance of AUC = 0.82 ± 0.04 with the proteins MIF, MMP-9, and MPO. The classifiers distinguished normal tissue from benign lesions similarly at AUC = 0.80 ± 0.05. However, the serum proteins of benign and malignant lesions were indistinguishable (AUC = 0.55 ± 0.06). The classification tasks of normal vs. cancer and normal vs. benign selected the same top feature: MIF, which suggests that the biomarkers indicated inflammatory response rather than cancer.
Conclusion
Overall, the selected serum proteins showed moderate ability for detecting lesions. However, they are probably more indicative of secondary effects, such as inflammation, than specific to malignancy.
Funding: United States Department of Defense Breast Cancer Research Program (Grant No. W81XWH-05-1-0292); National Institutes of Health (R01 CA-112437-01); National Institutes of Health (NIH CA 84955).
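The assessment pipeline named in the Methods (leave-one-out cross-validation scored by ROC AUC) can be sketched with a deliberately simple stand-in classifier. Nearest-centroid here replaces the paper's iterated Bayesian model averaging, and all data in the test are synthetic.

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    # fraction of (positive, negative) pairs ranked correctly; ties count half
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

def loocv_scores(X, y):
    """Leave-one-out scores from a nearest-centroid classifier:
    higher score = closer to the positive-class centroid."""
    scores = np.empty(len(y))
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        Xtr, ytr = X[mask], y[mask]
        c0 = Xtr[ytr == 0].mean(axis=0)
        c1 = Xtr[ytr == 1].mean(axis=0)
        scores[i] = np.linalg.norm(X[i] - c0) - np.linalg.norm(X[i] - c1)
    return scores
```

Because each score is computed with the scored sample held out, the resulting AUC is an out-of-sample estimate, which is why overlapping classes (like the benign vs. malignant task) come out near 0.5 rather than being optimistically inflated.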
Statistical decadal predictions for sea surface temperatures: a benchmark for dynamical GCM predictions
Accurate decadal climate predictions could be used to inform adaptation actions to a changing climate. The skill of such predictions from initialised dynamical global climate models (GCMs) may be assessed by comparing with predictions from statistical models which are based solely on historical observations. This paper presents two benchmark statistical models for predicting both the radiatively forced trend and internal variability of annual mean sea surface temperatures (SSTs) on a decadal timescale, based on the gridded observation data set HadISST. For both statistical models, the trend related to radiative forcing is modelled using a linear regression of the SST time series at each grid box on the time series of equivalent global mean atmospheric CO2 concentration. The residual internal variability is then modelled by (1) a first-order autoregressive model (AR1) and (2) a constructed analogue model (CA). From the verification of 46 retrospective forecasts with start years from 1960 to 2005, the correlation coefficient for anomaly forecasts using trend with AR1 is greater than 0.7 over parts of the extra-tropical North Atlantic, the Indian Ocean and the western Pacific. This is primarily related to the prediction of the forced trend. More importantly, both CA and AR1 give skillful predictions of the internal variability of SSTs in the subpolar gyre region over the far North Atlantic for lead times of 2 to 5 years, with correlation coefficients greater than 0.5. For the subpolar gyre and parts of the South Atlantic, CA is superior to AR1 for lead times of 6 to 9 years. These statistical forecasts are also compared with ensemble mean retrospective forecasts by DePreSys, an initialised GCM.
DePreSys is found to outperform the statistical models over large parts of the North Atlantic at lead times of 2 to 5 and 6 to 9 years. However, trend with AR1 is generally superior to DePreSys in the North Atlantic Current region, while trend with CA is superior to DePreSys in parts of the South Atlantic at lead times of 6 to 9 years. These findings encourage further development of benchmark statistical decadal prediction models, and of methods to combine different predictions.
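The first benchmark described above (linear trend on equivalent CO2, plus an AR1 model for the residual internal variability) can be sketched for a single grid-box series. Variable names and the test data are illustrative; the paper applies this at every HadISST grid box.

```python
import numpy as np

def fit_trend_ar1(sst, co2):
    """Linear regression of SST on equivalent CO2 (the forced trend),
    plus an AR(1) fit to the residual internal variability."""
    A = np.column_stack([np.ones_like(co2), co2])
    beta, *_ = np.linalg.lstsq(A, sst, rcond=None)
    resid = sst - A @ beta
    rho = np.corrcoef(resid[:-1], resid[1:])[0, 1]   # lag-1 autocorrelation
    return beta, rho, resid

def forecast(beta, rho, last_resid, co2_future, leads):
    """Forecast = forced trend at the future CO2 level + damped
    persistence of the last observed anomaly (rho**lead)."""
    trend = beta[0] + beta[1] * co2_future
    return trend + last_resid * rho ** leads
```

The AR1 term decays toward the forced trend with lead time, which is why this benchmark's skill at long leads comes mainly from the trend, as the abstract notes.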
The Box-Cox power transformation on nursing sensitive indicators: Does it matter if structural effects are omitted during the estimation of the transformation parameter?
Measurement of the W boson mass using electrons at large rapidities
We report a measurement of the W boson mass based on an integrated luminosity of 82/pb from p-pbar collisions at sqrt(s) = 1.8 TeV recorded in 1994-1995 by the D0 detector at the Fermilab Tevatron. We identify W bosons by their decays to e-nu, where the electron is detected in the forward calorimeters. We extract the mass by fitting the transverse mass and the electron and neutrino transverse momentum spectra of 11,089 W boson candidates. We measure Mw = 80.691 ± 0.227 GeV. By combining this measurement with our previously published central calorimeter results from data taken in 1992-1993 and 1994-1995, we obtain Mw = 80.482 ± 0.091 GeV.
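The combination of the forward-electron result with earlier measurements can be illustrated by inverse-variance weighting. Note this naive sketch ignores the correlated systematic uncertainties that the published combination accounts for, so it will not reproduce the quoted 80.482 ± 0.091 GeV exactly, and the earlier central-calorimeter value used below is a placeholder, not the published number.

```python
def combine(measurements):
    """Inverse-variance weighted average of independent (value, sigma)
    measurements; a real combination also treats correlated systematics."""
    weights = [1.0 / s ** 2 for _, s in measurements]
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return value, sigma

# Forward-calorimeter result from the abstract plus a HYPOTHETICAL
# earlier central-calorimeter value (placeholder for illustration only)
value, sigma = combine([(80.691, 0.227), (80.45, 0.10)])
```

Because weights scale as 1/sigma^2, the more precise central-calorimeter measurement dominates, and the combined uncertainty is always smaller than the best individual one.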
Search for New Physics Using Quaero: A General Interface to D0 Event Data
We describe Quaero, a method that (i) enables the automatic optimization of searches for physics beyond the standard model and (ii) provides a mechanism for making high energy collider data generally available. We apply Quaero to searches for standard model WW, ZZ and ttbar production, and to searches for these objects produced through a new heavy resonance. Through this interface, we make three data sets collected by the D0 experiment at sqrt(s) = 1.8 TeV publicly available.
Precalibrating an intermediate complexity climate model
Credible climate predictions require a rational quantification of uncertainty, but full Bayesian calibration requires detailed estimates of prior probability distributions and covariances, which are difficult to obtain in practice. We describe a simplified procedure, termed precalibration, which provides an approximate quantification of uncertainty in climate prediction, and requires only that uncontroversially implausible values of certain inputs and outputs are identified. The method is applied to intermediate-complexity model simulations of the Atlantic meridional overturning circulation (AMOC) and confirms the existence of a cliff-edge catastrophe in freshwater-forcing input space. When uncertainty in 14 further parameters is taken into account, an implausible AMOC-off region remains as a robust feature of the model dynamics, but its location is found to depend strongly on the values of the other parameters.
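The precalibration idea (sample the input space, run the model, and discard only uncontroversially implausible input/output combinations) can be sketched with a toy stand-in model that exhibits a cliff-edge collapse. The model form, thresholds and parameter ranges below are invented for illustration and bear no relation to the actual intermediate-complexity model.

```python
import numpy as np

rng = np.random.default_rng(4)

def toy_model(freshwater, diffusivity):
    """Stand-in for the climate model: an AMOC strength (Sv) that
    collapses sharply once freshwater forcing crosses a threshold
    which itself depends on a second parameter."""
    threshold = 0.25 + 0.5 * diffusivity
    return np.where(freshwater < threshold, 20.0 - 10.0 * freshwater, 2.0)

# Precalibration: sample the inputs, run the model, and keep only runs
# whose output is not uncontroversially implausible (here: 5-30 Sv).
fw = rng.uniform(0.0, 1.0, 5000)    # freshwater forcing (invented range)
kd = rng.uniform(0.0, 0.5, 5000)    # diffusivity (invented range)
amoc = toy_model(fw, kd)
plausible = (amoc > 5.0) & (amoc < 30.0)
```

The retained (plausible) region maps out the ridge below the cliff edge, and the discarded AMOC-off region's boundary shifts with the second parameter, mirroring the dependence on "values of the other parameters" reported in the abstract.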
