8,901 research outputs found

    Development of the D-Optimality-Based Coordinate-Exchange Algorithm for an Irregular Design Space and the Mixed-Integer Nonlinear Robust Parameter Design Optimization

    Robust parameter design (RPD), originally conceptualized by Taguchi, is an effective statistical design method for continuous quality improvement that incorporates product quality into the design of processes. The primary goal of RPD is to identify optimal input variable level settings with minimum process bias and variation. Because of its practicality in reducing inherent uncertainties associated with system performance across key product and process dimensions, the widespread application of RPD techniques to many engineering and science fields has resulted in significant improvements in product quality and process performance. There is little disagreement among researchers about Taguchi's basic philosophy. In response to apparent mathematical flaws surrounding his original version of RPD, researchers have closely examined alternative approaches by incorporating well-established statistical methods, particularly the response surface methodology (RSM), while accepting the main philosophy of his RPD concepts. This RSM-based RPD method predominantly employs the central composite design technique under the assumption that input variables are quantitative on a continuous scale. In many practical situations, however, the input variables are a mix of real-valued quantitative variables on a continuous scale and qualitative variables such as integer- and binary-valued variables. Despite the prevalence of such cases in real-world engineering problems, little research has addressed them, perhaps because of mathematical hurdles arising from inconsistencies between the design space in the experimental phase and the solution space in the optimization phase. For instance, the design space associated with the central composite design, widely regarded as the most effective response surface design for a second-order prediction model, is typically a bounded convex feasible set of real numbers because of its real-valued axial design points; its solution space, however, may consist of both integer and real values. Along these lines, this dissertation proposes RPD optimization models for three different scenarios. First, given integer-valued constraints, the dissertation discusses why the Box-Behnken design is preferred over the central composite design and other three-level designs, while maintaining the constant or nearly constant prediction variance (design rotatability) associated with a second-order model. Mixed-integer nonlinear programming models embedding the Box-Behnken design are then proposed. As solution methods, the Karush-Kuhn-Tucker conditions are developed and a sequential quadratic integer programming technique is used. Second, given binary-valued constraints, the dissertation investigates why neither the central composite design nor the Box-Behnken design is effective. To remedy this problem, several 0-1 mixed-integer nonlinear programming models are proposed on the foundation of a three-level factorial design with pseudo center points. For these models, standard optimization methods are used, including the branch-and-bound technique, the outer approximation method, and a hybrid nonlinear branch-and-cut algorithm.
    Finally, there are special situations in the experimental phase that call for reducing the number of experimental runs or using a reduced regression model to fit the data. There are also situations in which the experimental design space is constrained and optimal design points must be generated; traditional experimental designs may not be appropriate in these cases. D-optimal experimental designs are therefore investigated and incorporated into nonlinear programming models, as the design region is typically irregular, which may result in a convex problem. The research in this dissertation is believed to be the first examination of these issues in the related literature and to make a considerable contribution to the existing body of knowledge by filling these research gaps.
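    The coordinate-exchange idea named in the title can be illustrated with a minimal sketch: repeatedly sweep over the design's rows and coordinates, trying candidate levels and keeping any exchange that increases the determinant of the information matrix for a second-order model. The design region constraint, the candidate levels, the 12-run size, and the integer restriction on the third factor below are assumptions for illustration only, not the dissertation's formulation.

```python
# Minimal sketch of a D-optimality-based coordinate-exchange search for a
# second-order model over an irregular design region (illustrative only).
import itertools
import numpy as np

rng = np.random.default_rng(0)

def model_matrix(D):
    """Expand an n x 3 design into a full second-order model matrix."""
    x1, x2, x3 = D[:, 0], D[:, 1], D[:, 2]
    return np.column_stack([
        np.ones(len(D)), x1, x2, x3,
        x1 * x2, x1 * x3, x2 * x3,
        x1**2, x2**2, x3**2,
    ])

def feasible(point):
    """Irregular design region: a box cut by a linear constraint (assumed)."""
    return np.all(np.abs(point) <= 1.0) and point[0] + point[1] <= 1.2

def log_det(D):
    X = model_matrix(D)
    sign, val = np.linalg.slogdet(X.T @ X)
    return val if sign > 0 else -np.inf

def coordinate_exchange(n_runs=12, n_iters=20):
    # Random feasible start; x3 is restricted to integer levels (-1, 0, 1).
    D = np.array([[rng.uniform(-1, 1), rng.uniform(-1, 1), rng.integers(-1, 2)]
                  for _ in range(n_runs)], dtype=float)
    D = np.array([p if feasible(p) else np.array([0.0, 0.0, p[2]]) for p in D])
    levels = {0: np.linspace(-1, 1, 21),            # discrete candidate levels
              1: np.linspace(-1, 1, 21),            # for the continuous factors
              2: np.array([-1.0, 0.0, 1.0])}        # integer-valued factor
    best = log_det(D)
    for _ in range(n_iters):
        improved = False
        for i, j in itertools.product(range(n_runs), range(3)):
            for level in levels[j]:
                trial = D.copy()
                trial[i, j] = level
                if not feasible(trial[i]):
                    continue                        # skip points outside the region
                val = log_det(trial)
                if val > best + 1e-9:               # keep any improving exchange
                    D, best, improved = trial, val, True
        if not improved:
            break
    return D, best

design, logdet = coordinate_exchange()
print(f"log|X'X| = {logdet:.3f}")
print(np.round(design, 2))
```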

    Model-Based Analysis for Qualitative Data: An Application in Drosophila Germline Stem Cell Regulation.

    Discovery in developmental biology is often driven by intuition that relies on the integration of multiple types of data such as fluorescent images, phenotypes, and the outcomes of biochemical assays. Mathematical modeling helps elucidate the biological mechanisms at play as the networks become increasingly large and complex. However, the available data are frequently under-utilized due to incompatibility with quantitative model tuning techniques. This is the case for stem cell regulation mechanisms explored in the Drosophila germarium through fluorescent immunohistochemistry. To enable better integration of biological data with modeling in this and similar situations, we have developed a general parameter estimation process to quantitatively optimize models with qualitative data. The process employs a modified version of the Optimal Scaling method from the social and behavioral sciences, and multi-objective optimization to evaluate the trade-off between fitting different datasets (e.g., wild type vs. mutant). Using only published imaging data in the germarium, we first evaluated support for a published intracellular regulatory network by considering alternative connections of the same regulatory players. Simply screening networks against wild type data identified hundreds of feasible alternatives. Of these, five parsimonious variants were found and compared by multi-objective analysis including mutant data and dynamic constraints. With these data, the current model is supported over the alternatives, but support for a biochemically observed feedback element is weak (i.e., these data do not measure the feedback effect well). When also comparing new hypothetical models, the available data do not discriminate among them. To begin addressing the limitations in the data, we performed a model-based experiment design and provide recommendations for experiments to refine model parameters and discriminate between increasingly complex hypotheses.
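    The core of the optimal-scaling objective described above can be sketched briefly: for a given parameter vector, the model outputs are compared against the best monotone re-scoring of the ordinal observations, and the normalized residual is minimized over the parameters. The Hill-type toy model, the hypothetical ordinal staining data, and the single combined objective below are illustrative assumptions; the published approach uses a modified Optimal Scaling method together with multi-objective optimization over wild-type and mutant datasets.

```python
# Minimal sketch of fitting a model to ordinal (qualitative) readouts with an
# optimal-scaling objective (illustrative only, not the paper's pipeline).
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.isotonic import IsotonicRegression

# Hypothetical conditions (e.g. a signal dose) and ordinal readouts:
# 0 = absent, 1 = weak, 2 = strong staining.
dose = np.array([0.1, 0.3, 0.5, 1.0, 2.0, 4.0, 8.0])
ordinal = np.array([0, 0, 1, 1, 2, 2, 2])

def model(theta, dose):
    """Toy Hill-function response; stands in for the regulatory-network model."""
    vmax, k = theta
    return vmax * dose / (k + dose)

def optimal_scaling_cost(theta):
    y = model(theta, dose)
    # Best monotone mapping of the ordinal categories onto the model output scale.
    y_fit = IsotonicRegression(increasing=True).fit_transform(ordinal, y)
    resid = np.sum((y - y_fit) ** 2)
    spread = np.sum((y - y.mean()) ** 2) + 1e-12    # guard against a flat model
    return resid / spread                           # Kruskal-style normalized stress

result = differential_evolution(optimal_scaling_cost,
                                bounds=[(0.1, 10), (0.01, 10)], seed=1)
print("estimated (vmax, k):", np.round(result.x, 3), "stress:", round(result.fun, 4))
```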

    Psychometrics in Practice at RCEC

    This volume deals with a broad range of topics: from combining generalizability theory and item response theory to ideas for an integrated formative use of data-driven decision making, assessment for learning, and diagnostic testing. A number of chapters address computerized (adaptive) and classification testing. Other chapters treat the quality of testing in a general sense, while for topics such as maintaining standards or the testing of writing ability, the quality of testing is dealt with more specifically. All authors are connected to RCEC as researchers. Each presents one of their current research topics, providing some insight into the focus of RCEC. The topics were selected and edited so that the book should be of special interest to educational researchers, psychometricians, and practitioners in educational assessment.

    Radiostereometric analysis in total hip arthroplasty and hip fracture patients

    Complications related to primary total hip arthroplasty (THA) are relatively rare but still impose a significant burden on the recovery of individual patients and incur significant costs to the healthcare system. Research aimed at improving the results of THA is challenging, as complications can take up to decades to manifest clinically. However, radiostereometric analysis (RSA) can, in some cases, be used to predict the long-term revision rates of THA with only a two-year follow-up. The purpose of this doctoral thesis was to examine the causes of RSA-measured micromotion and to further develop the methodology for research on THA and hip fracture patients. The first study examined whether preoperative systemic bone mineral density (BMD) had an effect on the early RSA-measured micromotion of a cementless acetabular cup in female patients with osteoarthritis. The second study considered the suitability of model-based RSA (MBRSA) for the analysis of a cementless femoral stem using both a phantom model and a clinical cohort. The third study validated differentially-loaded RSA (DLRSA) for the study of internally fixated femoral neck fractures in a clinical cohort of 16 patients. The final study examined whether RSA data analysis would benefit from the use of a multivariate three-dimensional analytical method. Low systemic BMD was associated with increased proximal migration of the cementless acetabular cups. MBRSA proved to have accuracy and precision comparable to conventional RSA, thereby validating the method for future clinical studies using the examined femoral stem. The DLRSA methodology could be used to detect inducible micromotion of femoral neck fractures. A multivariate linear mixed-effects model could provide a more robust and sensitive method for the analysis of three-dimensional RSA data.
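    The closing point about a multivariate mixed-effects analysis of three-dimensional migration data can be illustrated with a minimal sketch. The simulated migration values, migration rates, and random-intercept structure below are assumptions for illustration, not the thesis data or code; stacking the x/y/z translations in long format approximates a joint analysis of the three components.

```python
# Minimal sketch: three-dimensional RSA migration data in long format analysed
# with a linear mixed-effects model (illustrative assumptions only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
patients, axes, months = range(20), ["x", "y", "z"], [3, 12, 24]

rows = []
for p in patients:
    subject_effect = rng.normal(0, 0.1)               # between-patient variation (mm)
    for ax in axes:
        drift = {"x": 0.02, "y": 0.05, "z": 0.01}[ax]  # assumed mean migration rate
        for m in months:
            rows.append({"patient": p, "axis": ax, "months": m,
                         "migration": subject_effect + drift * m + rng.normal(0, 0.05)})
df = pd.DataFrame(rows)

# Random intercept per patient; axis-by-time fixed effects approximate a
# multivariate analysis of the three translation components.
model = smf.mixedlm("migration ~ C(axis) * months", df, groups=df["patient"])
print(model.fit(method="lbfgs").summary())
```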

    Prime diagnosticity in short-term repetition priming: Is primed evidence discounted, even when it reliably indicates the correct answer?

    The authors conducted four repetition priming experiments that manipulated prime duration and prime diagnosticity in a visual forced-choice perceptual identification task. The strength and direction of prime diagnosticity produced marked effects on identification accuracy, but those effects were resistant to subsequent changes of diagnosticity. Participants learned to associate different diagnosticities with primes of different durations but not with primes presented in different colors. Regardless of prime diagnosticity, preference for a primed alternative covaried negatively with prime duration, suggesting that even for diagnostic primes, evidence discounting remains an important factor. A computational model, built on the assumption that adaptation to the statistics of the experiment modulates the level of evidence discounting, accounted for these results.
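    A toy simulation can make the discounting-plus-adaptation idea concrete. This is not the authors' computational model; the evidence terms, the single discounting parameter, and the delta-rule update below are illustrative assumptions only.

```python
# Toy illustration of evidence discounting whose level adapts to the
# statistics (prime diagnosticity) of the experiment.
import numpy as np

rng = np.random.default_rng(3)

def run_block(prime_validity, n_trials=2000, lr=0.01):
    """Forced choice between a primed and an unprimed alternative."""
    discount = 0.5                      # current discounting of prime-driven evidence
    choose_primed = []
    for _ in range(n_trials):
        target_is_primed = rng.random() < prime_validity
        perceptual = 1.0 if target_is_primed else 0.0    # evidence from the target flash
        prime_boost = 1.0                                # extra activation from the prime
        evidence_primed = perceptual + (1 - discount) * prime_boost
        evidence_unprimed = 0.0 if target_is_primed else 1.0
        chose_primed = (evidence_primed + rng.normal(0, 1)
                        > evidence_unprimed + rng.normal(0, 1))
        choose_primed.append(chose_primed)
        # adapt discounting toward the experienced diagnosticity of the prime
        discount += lr * ((0.0 if target_is_primed else 1.0) - discount)
    return np.mean(choose_primed), discount

for validity in (0.25, 0.5, 0.75):
    rate, d = run_block(validity)
    print(f"prime validity {validity:.2f}: P(choose primed) = {rate:.2f}, "
          f"learned discount = {d:.2f}")
```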

    Historical forest biomass dynamics modelled with Landsat spectral trajectories

    Acknowledgements: National Forest Inventory data are available online, provided by the Ministerio de Agricultura, Alimentación y Medio Ambiente (España). Landsat images are available online, provided by the USGS. Peer reviewed. Postprint.

    Shot noise-mitigated secondary electron imaging with ion count-aided microscopy

    Modern science is dependent on imaging at the nanoscale, often achieved through processes that detect secondary electrons created by a highly focused incident charged particle beam. Scanning electron microscopy is employed in applications such as critical-dimension metrology and inspection for semiconductor devices, materials characterization in geology, and examination of biological samples. With its applicability to non-conducting materials (not requiring sample coating before imaging), helium ion microscopy (HIM) is especially useful for high-resolution imaging of biological samples such as animal organs, tumor cells, and viruses. However, multiple types of measurement noise limit the ultimate trade-off between image quality and the incident particle dose, which can preclude useful imaging of dose-sensitive samples. Existing methods to improve image quality do not fundamentally mitigate the noise sources. Furthermore, barriers to assigning a physically meaningful scale make these modalities qualitative. Here we introduce ion count-aided microscopy (ICAM), a quantitative imaging technique that uses statistically principled estimation of the secondary electron yield. With a readily implemented change in data collection, ICAM nearly eliminates the influence of source shot noise, the random variation in the number of incident ions in a fixed time duration. In HIM, we demonstrate a threefold dose reduction; based on a good match between these empirical results and theoretical performance predictions, the dose reduction factor is larger when the secondary electron yield is higher. ICAM thus facilitates imaging of fragile samples and may make imaging with heavier particles more attractive.
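    The benefit of counting incident ions can be seen in a small Monte Carlo sketch. The Poisson beam and secondary-electron models, the dose of 20 ions per pixel, and the yield of 3 below are illustrative assumptions, not the paper's detector model or estimator.

```python
# Monte Carlo sketch of why counting incident ions suppresses source shot
# noise in secondary electron (SE) imaging (illustrative assumptions only).
# Per pixel: M ~ Poisson(dose) incident ions, each producing Poisson(eta) SEs.
import numpy as np

rng = np.random.default_rng(4)
dose, eta, n_pixels = 20.0, 3.0, 200_000             # mean ions/pixel, true SE yield

ions = rng.poisson(dose, n_pixels)                    # source shot noise
secondaries = rng.poisson(eta * ions)                 # total SE counts per pixel

conventional = secondaries / dose                     # divide by the *intended* dose
ion_counted = secondaries / np.maximum(ions, 1)       # divide by the *measured* ion count

for name, est in [("conventional", conventional), ("ion-count-aided", ion_counted)]:
    print(f"{name:16s} mean = {est.mean():.3f}, std = {est.std():.3f}")

# Per-pixel theory: var_conventional ~ (eta**2 + eta) / dose, while dividing by
# the measured ion count drops the eta**2 source-shot-noise term, ~ eta / dose.
print("predicted std:", np.sqrt((eta**2 + eta) / dose), np.sqrt(eta / dose))
```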

    Modeling and Optimization of Stochastic Process Parameters in Complex Engineering Systems

    For quality engineering researchers and practitioners, a wide range of statistical tools and techniques is available for use in the manufacturing industry. The goal in applying these tools has always been to improve or optimize a product or process in terms of efficiency, production cost, or product quality. While tremendous progress has been made in the design of quality optimization models, there remains a significant gap between existing research and the needs of the industrial community. Contemporary manufacturing processes are inherently more complex: they may involve multiple stages of production or require the assessment of multiple quality characteristics. New and emerging fields, such as nanoelectronics and molecular biometrics, demand degrees of precision and estimation accuracy that are not attainable with current tools and measures. And because most researchers focus on a specific type of characteristic or a given set of conditions, there are many critical industrial processes for which existing models are not applicable. Thus, the objective of this research is to improve existing techniques by expanding not only their range of applicability but also their ability to model a given process more realistically. Several quality models are proposed that seek greater precision in the estimation of process parameters and the removal of assumptions that limit their breadth and scope. These models are extended to examine their effectiveness both under non-standard conditions and in areas that have not previously been investigated. Following an in-depth literature review, the quality models are proposed, and numerical examples are used to validate the proposed methodologies.

    VI Workshop on Computational Data Analysis and Numerical Methods: Book of Abstracts

    The VI Workshop on Computational Data Analysis and Numerical Methods (WCDANM) will be held on June 27-29, 2019, in the Department of Mathematics of the University of Beira Interior (UBI), Covilhã, Portugal. It is a unique opportunity to disseminate scientific research related to Mathematics in general, with particular relevance to Computational Data Analysis and Numerical Methods in theoretical and/or practical fields, using new techniques, with special emphasis on applications in Medicine, Biology, Biotechnology, Engineering, Industry, Environmental Sciences, Finance, Insurance, Management, and Administration. The meeting will provide a forum for the discussion and debate of ideas of interest to the scientific community in general. The meeting is also expected to foster new scientific collaborations among colleagues, in particular collaborations on Master's and PhD projects. The event is open to the entire scientific community (with or without a communication/poster).