
    Performance analysis of Web services-based systems with sensitivity analysis

    By the time the architecture of a software system is decided in the software development life-cycle, performance problems have become very costly, if not impossible, to fix. It is therefore necessary to push performance analysis back to the architectural design stage as an effective means to improve the performance of software systems. This is especially true for web services-based systems, where system performance is of paramount importance. There are typically three steps in the performance evaluation of software architectures. The first step is to transform the architecture of a software system, in the form of annotated UML models, into a performance model such as the layered queueing network (LQN) model. Experiments on the performance model are then conducted in the second step with a performance analysis tool, such as the LQN solver. Experiment results are finally fed back to the architecture design in the last step for refinement of the UML models according to the quantitative analysis of software performance. Nevertheless, accurate analysis results require performance analysis to take sensitivity analysis into consideration between the second and third steps. Unfortunately, little research has been done in this regard. This thesis carries out a study in performance analysis with sensitivity analysis. It develops a new method that uses design of experiments (DoE) techniques to quantitatively analyze the sensitivity of a system's performance output to the effects of the system's input factors and the interactions between those factors. The goal of this research is to provide more accurate feedback to software designers on the development of service-oriented software systems. Paper copy at Leddy Library: Theses & Major Papers - Basement, West Bldg. / Call Number: Thesis2004 .H8245. Source: Masters Abstracts International, Volume: 44-01, page: 0393. Thesis (M.Sc.)--University of Windsor (Canada), 2005
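    As a concrete illustration of the DoE idea described above, the sketch below runs a two-level full factorial experiment on a toy performance function and estimates the main effects of each factor and their interaction via contrast averages. The response formula, factor names, and level values are hypothetical stand-ins for the LQN-solver outputs the abstract describes, not the thesis's actual model.

```python
from itertools import product

def response_time(s, lam):
    """Mock performance output (stand-in for one LQN solver run):
    an M/M/1-style mean response time, R = s / (1 - s * lam)."""
    return s / (1.0 - s * lam)

# Two factors at two levels each: mean service demand s and arrival rate lam.
LEVELS = {"s": (0.02, 0.04), "lam": (5.0, 10.0)}  # (low, high)

# Full 2^2 factorial: coded levels -1/+1 map to the low/high values.
runs = []
for c1, c2 in product((-1, 1), repeat=2):
    s = LEVELS["s"][(c1 + 1) // 2]
    lam = LEVELS["lam"][(c2 + 1) // 2]
    runs.append((c1, c2, response_time(s, lam)))

# Main and interaction effects as contrast averages over the design.
n = len(runs)
effect_s = sum(c1 * y for c1, _, y in runs) / (n / 2)
effect_lam = sum(c2 * y for _, c2, y in runs) / (n / 2)
interaction = sum(c1 * c2 * y for c1, c2, y in runs) / (n / 2)
```

    Comparing the magnitudes of the estimated effects tells the designer which input factor (or factor interaction) the performance output is most sensitive to, which is the feedback this kind of analysis would return to the architecture-refinement step.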

    Delivery of consistent and high-quality antibody therapeutics by actively monitoring and controlling critical quality attributes

    Therapeutic recombinant monoclonal antibodies (mAbs) display a wide variety of critical quality attributes (CQAs) that are essential for achieving their safety and efficacy endpoints in patients. Traditionally, to ensure consistent product quality, manufacturing processes are designed to control CQAs by operating process parameters within defined ranges. This “process defines product” approach has produced many successes within the biopharmaceutical industry, albeit with limited understanding of the underlying mechanisms linking process parameters and CQAs. Recently, with the inclusion of biosimilars and novel modalities in Amgen’s pipeline, meeting tightly specified CQAs using this traditional approach has sometimes proven challenging. To better meet such challenges moving forward, we need to develop processes that are adaptable and yet offer robust CQA control. One strategy for accomplishing this is to develop a product attribute control (PAC) platform that integrates process science with process model control to modulate CQAs throughout the production process. PAC is an attribute-focused method that starts by defining the desired CQAs and then elucidating the process and attribute relationship (PAR). PAR provides a mechanistic understanding of how process parameters (levers) impact CQAs and identifies effective levers that can modulate CQAs of interest within pre-determined ranges. One of the key elements of a PAC process is the integration of process analytical technology (PAT) elements to enable real-time sampling and analytics. Based on real-time process inputs and CQA responses generated by PAT, a mechanistic predictive control model (MPC) or an empirical multivariate statistical process control model (MSPC) for one or more CQAs can be created and integrated into PAC. In addition, such an approach begins with initial clone selection, with the goal of identifying production cell lines that are responsive to process levers over a dynamic range that will enable adaptive control. This PAC strategy, by combining PAR, PAT, and MPC/MSPC, enables CQAs to be monitored, predicted, and controlled throughout the production process. A study demonstrating control of glycan CQAs using the aforementioned PAC strategy will be presented. This newly proposed strategy enables robust CQA control for challenging molecules and ensures the delivery of high-quality mAb therapeutics to patients.
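    To illustrate the statistical-process-control side of such a scheme, the sketch below implements a minimal Shewhart-style individuals chart: control limits are derived from historical in-control CQA measurements, and incoming real-time samples are flagged when they breach those limits. The glycan values and the 3-sigma rule are illustrative assumptions; a production MSPC model would typically be multivariate rather than a univariate chart like this.

```python
import statistics

def control_limits(reference, k=3.0):
    """Shewhart-style limits (mean +/- k*sigma) from historical
    in-control CQA measurements."""
    mu = statistics.mean(reference)
    sigma = statistics.stdev(reference)
    return mu - k * sigma, mu + k * sigma

def out_of_control(samples, lcl, ucl):
    """Indices of real-time samples that breach the control limits,
    i.e., candidates for triggering a process-lever adjustment."""
    return [i for i, x in enumerate(samples) if not (lcl <= x <= ucl)]

# Hypothetical historical glycan CQA values (e.g., % high-mannose).
history = [5.1, 4.9, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9]
lcl, ucl = control_limits(history)

# Hypothetical real-time PAT measurements; the last one drifts far out.
alerts = out_of_control([5.0, 5.3, 7.5], lcl, ucl)
```

    In a PAC loop, such an alert would feed the MPC/MSPC layer, which then chooses which process lever to move to bring the CQA back inside its pre-determined range.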

    Study of the Heat Transfer Effect in Moxibustion Practice


    Support Vector Machines for Credit Scoring and discovery of significant features

    The assessment of the risk of default on credit is important for financial institutions. Logistic regression and discriminant analysis are techniques traditionally used in credit scoring to determine the likelihood of default based on consumer application and credit reference agency data. We test support vector machines against these traditional methods on a large credit card database. We find that they are competitive and can be used as the basis of a feature selection method to discover the features that are most significant in determining risk of default.
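    A minimal sketch of the weight-based feature-ranking idea behind such a feature selection method: train a linear SVM with a Pegasos-style subgradient method on a tiny synthetic applicant dataset, then rank features by the magnitude of their learned weights. The feature names, data, and training scheme are illustrative assumptions, not the paper's actual method or dataset.

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Pegasos-style stochastic subgradient training of a linear SVM
    (hinge loss + L2 regularization). Labels y must be +1 / -1.
    Returns the learned weight vector."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        idx = list(range(len(X)))
        rng.shuffle(idx)
        for i in idx:
            t += 1
            eta = 1.0 / (lam * t)  # decaying learning rate
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            w = [(1 - eta * lam) * wj for wj in w]  # shrink (regularize)
            if margin < 1:  # hinge-loss subgradient step
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

# Hypothetical applicant features: [utilization, late_payments, income_norm];
# labels: -1 = repaid, +1 = defaulted. Data is a made-up separable toy set.
X = [[0.2, 0.0, 0.9], [0.3, 0.1, 0.8], [0.8, 1.0, 0.2],
     [0.9, 0.9, 0.1], [0.4, 0.2, 0.7], [0.7, 0.8, 0.3]]
y = [-1, -1, 1, 1, -1, 1]

w = train_linear_svm(X, y)
# Rank features by |weight| as a crude significance score for selection.
ranking = sorted(range(len(w)), key=lambda j: abs(w[j]), reverse=True)
```

    For a linear SVM, a feature whose weight is near zero contributes little to the decision boundary, so dropping the lowest-ranked features and retraining is one simple way to realize the feature selection the abstract describes.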

    A Microparticle Swarm Optimizer for the Reconstruction of Microwave Images


    PhEDEx Data Service

    The PhEDEx Data Service provides access to information from the central PhEDEx database, as well as certificate-authenticated managerial operations such as requesting the transfer or deletion of data. The Data Service is integrated with the SiteDB service for fine-grained access control, providing a safe and secure environment for operations. A plug-in architecture allows server-side modules to be developed rapidly and easily by anyone familiar with the schema, and can automatically return the data in a variety of formats for use by different client technologies. Using HTTP access via the Data Service instead of direct database connections makes it possible to build monitoring web pages with complex drill-down operations, suitable for debugging or presentation from many aspects. This will form the basis of the new PhEDEx website in the near future, as well as providing access to PhEDEx information and certificate-authenticated services for other CMS dataflow and workflow management tools such as CRAB, WMCore, DBS and the dashboard. A PhEDEx command-line client tool provides one-stop interactive access to all the functions of the PhEDEx Data Service, for use in simple scripts that do not access the service directly. The client tool provides certificate-authenticated access to managerial functions, so all the functions of the PhEDEx Data Service are available to it. The tool can be extended by plug-ins that combine or extend the client-side manipulation of data from the Data Service, providing a powerful environment for manipulating data within PhEDEx.
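    A sketch of how a client might compose a query URL for such an HTTP data service, where the path selects the output format and the server-side plug-in (API call) to invoke. The base URL, the {format}/{instance}/{call} path layout, and the 'blockreplicas' call name are assumptions made for illustration based on the description above; the actual service documentation defines the real API.

```python
from urllib.parse import urlencode

# Assumed base URL of the data service (hypothetical for this sketch).
BASE = "https://cmsweb.cern.ch/phedex/datasvc"

def datasvc_url(call, fmt="json", instance="prod", **params):
    """Compose a Data Service URL of the assumed shape
    {BASE}/{format}/{instance}/{call}?{params}. 'fmt' selects the
    serialization returned to the client (e.g. json or xml), and
    'call' names the server-side plug-in module to invoke."""
    url = f"{BASE}/{fmt}/{instance}/{call}"
    if params:
        url += "?" + urlencode(sorted(params.items()))
    return url

# Example: ask the (assumed) 'blockreplicas' call about one site's node.
url = datasvc_url("blockreplicas", node="T1_US_FNAL_Buffer")
```

    Because the format lives in the URL rather than in headers, the same call can back both a JSON-consuming monitoring page and an XML-consuming script, which is the multi-client flexibility the abstract attributes to the plug-in architecture.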