146,709 research outputs found

    Data on the optimisation and validation of a liquid chromatography-high-resolution mass spectrometry (LC-HRMS) to establish the presence of phosphodiesterase 5 (PDE5) inhibitors in instant coffee premixes

    This paper presents data on the optimisation and validation of a liquid chromatography-high-resolution mass spectrometry (LC-HRMS) method to establish the presence of phosphodiesterase 5 (PDE5) inhibitors and their analogues as adulterants in instant coffee premixes. The method development data cover chromatographic optimisation for better analyte separation and isomeric resolution, mass spectrometry optimisation for high sensitivity, and sample preparation optimisation for high extraction recovery (RE) and low matrix effect (ME). The validation data cover specificity, linearity, range, accuracy, limit of detection, limit of quantification, precision, ME and RE. The optimisation and validation data presented here are related to the article “Determination of phosphodiesterase 5 (PDE5) inhibitors in instant coffee premixes using liquid chromatography-high-resolution mass spectrometry (LC-HRMS)”, Mohd Yusop et al., 2019
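    For illustration only, the short sketch below shows one common way matrix effect (ME) and extraction recovery (RE) are quantified from peak areas in this kind of validation; the definitions follow the widely used post-extraction/pre-extraction spiking comparison, and the numbers are invented placeholders, not values or formulas taken from Mohd Yusop et al.

        # Hedged sketch: ME and RE from peak areas of a neat-solvent standard,
        # a post-extraction spiked sample and a pre-extraction spiked sample.
        # The definitions and numbers are assumptions for illustration, not
        # data taken from the article.

        def matrix_effect(area_post_spike: float, area_neat_standard: float) -> float:
            """ME (%): post-extraction spiked peak area relative to the neat standard."""
            return 100.0 * area_post_spike / area_neat_standard

        def extraction_recovery(area_pre_spike: float, area_post_spike: float) -> float:
            """RE (%): pre-extraction spiked peak area relative to the post-extraction spike."""
            return 100.0 * area_pre_spike / area_post_spike

        neat, post, pre = 1.00e6, 0.92e6, 0.85e6               # hypothetical peak areas
        print(f"ME = {matrix_effect(post, neat):.1f} %")       # ~92 %: mild ion suppression
        print(f"RE = {extraction_recovery(pre, post):.1f} %")  # ~92 %: acceptable recovery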

    Design of general-purpose sampling strategies for geometric shape measurement

    Quality inspection is a preliminary step for several further analyses (process monitoring, control and optimisation) and requires one to select a measuring strategy, i.e., the number and location of measurement points. This data-gathering phase usually impacts inspection times and costs (via the sample size), but it also affects the performance of the subsequent tasks. While most approaches for sampling design are presented with reference to a specific target application (namely monitoring, control or optimisation), this paper presents a general-purpose procedure in which the number and location of measurement points are selected so as to retain most of the information related to the feature under study. The procedure is based on principal component analysis (PCA) and its application is shown with reference to a real case study concerning the left front window of a car. A different approach based on multidimensional scaling is further applied as a validation tool, in order to show the effectiveness of the PCA solution
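    As a rough, hypothetical illustration of the general idea (not the authors' exact procedure), the sketch below applies PCA to deviation data from many parts measured at a dense set of candidate points, retains the components that explain most of the shape variation, and keeps the candidate points with the largest loadings as the reduced measurement strategy; every name and number is a placeholder.

        # Hedged sketch of PCA-based selection of measurement points.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        n_parts, n_candidates = 200, 500                       # hypothetical sizes
        deviations = rng.normal(size=(n_parts, n_candidates))  # placeholder deviation data

        pca = PCA(n_components=0.95)                           # retain 95 % of the variance
        pca.fit(deviations)

        # Score each candidate point by its total loading over the retained components
        importance = np.abs(pca.components_).sum(axis=0)
        budget = 20                                            # hypothetical measurement budget
        selected = np.sort(np.argsort(importance)[-budget:])
        print(f"{pca.n_components_} components retained; measure points {selected.tolist()}")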

    Solving optimisation problems in metal forming using Finite Element simulation and metamodelling techniques

    During the last decades, Finite Element (FEM) simulations of metal forming processes have become important tools for designing feasible production processes. In more recent years, several authors recognised the potential of coupling FEM simulations to mathematical optimisation algorithms to design optimal metal forming processes instead of only feasible ones.
    Within the current project, an optimisation strategy is being developed, which is capable of optimising metal forming processes in general using time consuming nonlinear FEM simulations. The expression “optimisation strategy” is used to emphasise that the focus is not solely on solving optimisation problems by an optimisation algorithm, but the way these optimisation problems in metal forming are modelled is also investigated. This modelling comprises the quantification of objective functions and constraints and the selection of design variables.
    This paper, however, is concerned with the choice for and the implementation of an optimisation algorithm for solving optimisation problems in metal forming. Several groups of optimisation algorithms can be encountered in metal forming literature: classical iterative, genetic and approximate optimisation algorithms are already applied in the field. We propose a metamodel based optimisation algorithm belonging to the latter group, since approximate algorithms are relatively efficient in case of time consuming function evaluations such as the nonlinear FEM calculations we are considering. Additionally, approximate optimisation algorithms strive for a global optimum and do not need sensitivities, which are quite difficult to obtain for FEM simulations. A final advantage of approximate optimisation algorithms is the process knowledge, which can be gained by visualising metamodels.
    In this paper, we propose a sequential approximate optimisation algorithm, which incorporates both Response Surface Methodology (RSM) and Design and Analysis of Computer Experiments (DACE) metamodelling techniques. RSM is based on fitting lower order polynomials by least squares regression, whereas DACE uses Kriging interpolation functions as metamodels. Most authors in the field of metal forming use RSM, although this metamodelling technique was originally developed for physical experiments that are known to have a stochastic nature due to measurement noise present. This measurement noise is absent in case of deterministic computer experiments such as FEM simulations. Hence, an interpolation model fitted by DACE is thought to be more applicable in combination with metal forming simulations. Nevertheless, the proposed algorithm utilises both RSM and DACE metamodelling techniques.
    As a Design Of Experiments (DOE) strategy, a combination of a maximin spacefilling Latin Hypercubes Design and a full factorial design was implemented, which takes into account explicit constraints. Additionally, the algorithm incorporates cross validation as a metamodel validation technique and uses a Sequential Quadratic Programming (SQP) algorithm for metamodel optimisation. To overcome the problem of ending up in a local optimum, the SQP algorithm is initialised from every DOE point, which is very time efficient since evaluating the metamodels can be done within a fraction of a second. The proposed algorithm allows for sequential improvement of the metamodels to obtain a more accurate optimum.
    As an example case, the optimisation algorithm was applied to obtain the optimised internal pressure and axial feeding load paths to minimise wall thickness variations in a simple hydroformed product. The results are satisfactory, which shows the good applicability of metamodelling techniques to optimise metal forming processes using time consuming FEM simulations
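    The sketch below conveys the flavour of such a sequential metamodel-based loop: a cheap analytic function stands in for the time-consuming FEM simulation, a Gaussian process (Kriging) surrogate stands in for the combined RSM/DACE metamodels, SLSQP is restarted from every DOE point, and the best predicted optimum is evaluated on the "true" model as an infill point. The test function, kernel and all settings are assumptions, not the authors' implementation.

        # Hedged sketch of sequential surrogate-based optimisation with multistart SQP.
        import numpy as np
        from scipy.stats import qmc
        from scipy.optimize import minimize
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def expensive_simulation(x):                 # placeholder for a nonlinear FEM run
            return (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2 + 0.1 * np.sin(8 * x[0])

        bounds = [(0.0, 1.0), (0.0, 1.0)]
        X = qmc.LatinHypercube(d=2, seed=1).random(12)          # space-filling DOE
        y = np.array([expensive_simulation(x) for x in X])

        for _ in range(5):                                      # sequential improvement loop
            gp = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(X, y)
            predict = lambda x: float(gp.predict(x.reshape(1, -1))[0])
            # Restart the gradient-based search from every DOE point (the metamodel is cheap)
            runs = [minimize(predict, x0, method="SLSQP", bounds=bounds) for x0 in X]
            best = min(runs, key=lambda r: r.fun)
            X = np.vstack([X, best.x])                          # infill: run the true model there
            y = np.append(y, expensive_simulation(best.x))

        print("approximate optimum:", X[np.argmin(y)], "value:", float(y.min()))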

    Human activity recognition making use of long short-term memory techniques

    The optimisation and validation of a classifier's performance when applied to real-world problems is not always effectively shown. In much of the literature describing the application of artificial neural network architectures to Human Activity Recognition (HAR) problems, postural transitions are grouped together and treated as a single class. This paper proposes, investigates and validates the development of an optimised artificial neural network based on Long Short-Term Memory (LSTM) techniques, with repeated cross validation used to validate the performance of the classifier. The results of the optimised LSTM classifier are comparable to or better than those of previous research using the same dataset, achieving 95% accuracy under repeated 10-fold cross validation with grouped postural transitions. The work in this paper also achieves 94% accuracy under repeated 10-fold cross validation whilst treating each common postural transition as a separate class (thus providing more context for each activity)
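    A minimal, hypothetical sketch of the kind of evaluation described (an LSTM classifier under repeated stratified 10-fold cross validation) is shown below; the synthetic data, window shape, layer sizes and training settings are placeholders, not the configuration used in the paper.

        # Hedged sketch: LSTM classifier evaluated with repeated 10-fold cross validation.
        import numpy as np
        import tensorflow as tf
        from sklearn.model_selection import RepeatedStratifiedKFold

        n_windows, timesteps, n_features, n_classes = 600, 128, 9, 12   # placeholder sizes
        rng = np.random.default_rng(0)
        X = rng.normal(size=(n_windows, timesteps, n_features)).astype("float32")
        y = rng.integers(0, n_classes, size=n_windows)                  # synthetic labels

        def build_model():
            return tf.keras.Sequential([
                tf.keras.layers.Input(shape=(timesteps, n_features)),
                tf.keras.layers.LSTM(64),
                tf.keras.layers.Dropout(0.5),
                tf.keras.layers.Dense(n_classes, activation="softmax"),
            ])

        scores = []
        cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=2, random_state=0)
        for train_idx, test_idx in cv.split(np.zeros(n_windows), y):
            model = build_model()
            model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                          metrics=["accuracy"])
            model.fit(X[train_idx], y[train_idx], epochs=5, batch_size=64, verbose=0)
            scores.append(model.evaluate(X[test_idx], y[test_idx], verbose=0)[1])

        print(f"repeated 10-fold accuracy: {np.mean(scores):.3f} +/- {np.std(scores):.3f}")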

    Kriging based robust optimisation algorithm for minimax problems in electromagnetics

    The paper discusses some of the recent progress in Kriging-based worst-case design optimisation and proposes a new two-stage approach to solving practical problems. The efficiency of infill point allocation is greatly improved by adding an extra layer of optimisation enhanced by a validation process
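    For illustration only, the sketch below shows a surrogate-assisted worst-case (minimax) search: a Gaussian process (Kriging) model replaces the expensive electromagnetic simulation, the worst case over an assumed uncertainty range is taken on the surrogate, and the design minimising that worst case is selected. This conveys the general idea only, not the two-stage algorithm of the paper.

        # Hedged sketch of Kriging-assisted worst-case (minimax) design selection.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def field_simulation(x, u):                     # placeholder for an EM solver
            return (x - 0.4) ** 2 + 0.3 * np.sin(5 * (x + u)) * u

        rng = np.random.default_rng(2)
        samples = rng.uniform([0.0, -0.1], [1.0, 0.1], size=(60, 2))   # (design, uncertainty)
        values = np.array([field_simulation(x, u) for x, u in samples])
        gp = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(samples, values)

        designs = np.linspace(0.0, 1.0, 201)
        uncertainties = np.linspace(-0.1, 0.1, 21)
        XX, UU = np.meshgrid(designs, uncertainties, indexing="ij")
        grid = gp.predict(np.column_stack([XX.ravel(), UU.ravel()])).reshape(XX.shape)

        worst_case = grid.max(axis=1)                   # predicted worst case per design
        best = designs[np.argmin(worst_case)]
        print(f"robust (minimax) design ~ {best:.3f}, worst-case value ~ {worst_case.min():.3f}")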

    A Review of System Development Systems

    The requirements for a system development system are defined and used as guidelines to review six such systems: SAMM, SREM, SADT, ADS/SODA, PSL/PSA and Systematics. It is found that current system development systems emphasise only validation and user verification; they provide relatively little support for automatic file optimisation, process optimisation and maintenance

    Automated model based engine calibration procedure using co-simulation

    The final validation and sign-off of a production powertrain control module (PCM) calibration is a time-consuming and expensive task that requires a high degree of expertise. There are two main reasons for this. Firstly, the validation test is an iterative process, because calibration changes may affect the true operating point of the engine at the desired test point. Secondly, modifications to the calibration require expert knowledge of the complete control strategy, so as to improve the correlation to validation data without negatively impacting the already correlated mapping points. This paper describes the implementation of an optimisation routine on a virtual platform, both to reduce the requirement for experimental testing during the validation procedure and to allow the optimisation routine itself to be developed prior to execution on the engine dynamometer. It is shown that, in simulation, the optimisation routine is capable of producing an acceptable calibration within just 5 iterations, reducing the 11-week process to just a few days. It is also concluded that a number of further improvements could be made to the efficiency of this process
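    The sketch below is a purely illustrative skeleton of such an iterative calibration loop against a virtual engine model; the toy plant, the two calibration parameters, the test points and the targets are invented placeholders, not the co-simulation or control strategy used in the paper.

        # Hedged sketch: iterative calibration of a virtual engine model towards targets.
        import numpy as np

        def virtual_engine(cal, point):
            """Placeholder plant model: one simulated output per operating point."""
            spark, fuel = cal
            speed, load = point
            return 2.0 * spark * speed - 1.5 * fuel ** 2 * load + 0.3 * speed * load

        points = [(1.0, 0.2), (2.0, 0.5), (3.0, 0.8)]    # hypothetical speed/load test points
        targets = np.array([1.61, 3.38, 5.33])           # hypothetical validation targets
        cal = np.array([0.5, 0.5])                       # initial calibration values

        def residuals(c):
            return np.array([virtual_engine(c, p) for p in points]) - targets

        for it in range(5):                              # the paper reports ~5 iterations sufficed
            r = residuals(cal)
            if np.max(np.abs(r)) < 0.05:                 # acceptance criterion (assumed)
                break
            # Gauss-Newton update using a finite-difference Jacobian of the virtual model
            eps = 1e-4
            J = np.column_stack([(residuals(cal + eps * e) - r) / eps for e in np.eye(2)])
            cal = cal - np.linalg.lstsq(J, r, rcond=None)[0]

        print(f"iterations: {it + 1}, calibration: {np.round(cal, 3)}, "
              f"max error: {np.max(np.abs(residuals(cal))):.3f}")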