
    The Douglas–Rachford algorithm for convex and nonconvex feasibility problems

    The Douglas–Rachford algorithm is an optimization method that can be used for solving feasibility problems. To apply the method, the problem at hand must be prescribed in terms of constraint sets having efficiently computable nearest points. Although the convergence of the algorithm is only guaranteed in the convex setting, the scheme has proved to be a successful heuristic for solving combinatorial problems of various types. In this self-contained tutorial, we develop the convergence theory of projection algorithms within the framework of fixed point iterations, explain how to devise useful feasibility problem formulations, and demonstrate the application of the Douglas–Rachford method to said formulations. The paradigm is then illustrated on two concrete problems: a generalization of the “eight queens puzzle” known as the “(m, n)-queens problem”, and the problem of constructing a probability distribution with prescribed moments.

    FJAA and RC were partially supported by Ministerio de Economía, Industria y Competitividad (MINECO) and European Regional Development Fund (ERDF), grant MTM2014-59179-C2-1-P. FJAA was supported by the Ramón y Cajal program by MINECO and ERDF (RYC-2013-13327) and by the Ministerio de Ciencia, Innovación y Universidades and ERDF, grant PGC2018-097960-B-C22. RC was supported by MINECO and European Social Fund (BES-2015-073360) under the program “Ayudas para contratos predoctorales para la formación de doctores 2015”.
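    The fixed point iteration described in the abstract can be sketched in a few lines. The toy example below, with two lines in the plane standing in for the constraint sets, is our own illustrative choice, not taken from the paper; it shows the scheme built from reflections through the nearest-point (projection) maps:

```python
import numpy as np

# Douglas–Rachford for the feasibility problem: find x in A ∩ B,
# given only the projectors P_A and P_B onto the constraint sets.

def reflect(P, x):
    """Reflection through a set via its projector: R(x) = 2 P(x) - x."""
    return 2.0 * P(x) - x

def douglas_rachford(P_A, P_B, x0, iters=100):
    """Iterate x_{k+1} = (x_k + R_B(R_A(x_k))) / 2, then return the
    'shadow' P_A(x_k), which converges to a point of A ∩ B in the
    convex setting."""
    x = x0
    for _ in range(iters):
        x = 0.5 * (x + reflect(P_B, reflect(P_A, x)))
    return P_A(x)

# Toy convex sets: A = the x-axis, B = the line y = x (intersection: origin).
P_A = lambda p: np.array([p[0], 0.0])
P_B = lambda p: np.full(2, p.mean())

sol = douglas_rachford(P_A, P_B, np.array([3.0, -2.0]))
```

    For these two lines the operator is a contraction, so the shadow sequence converges to the intersection point; for the nonconvex combinatorial formulations discussed in the tutorial the same iteration is applied as a heuristic.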

    Non-linear projection to latent structures

    PhD Thesis. This thesis focuses on the study of multivariate statistical regression techniques which have been used to produce non-linear empirical models of chemical processes, and on the development of a novel approach to non-linear Projection to Latent Structures regression. Empirical modelling relies on the availability of process data and sound empirical regression techniques which can handle variable collinearities, measurement noise, unknown variable and noise distributions, and high data set dimensionality. Projection based techniques, such as Principal Component Analysis (PCA) and Projection to Latent Structures (PLS), have been shown to be appropriate for handling such data sets. The multivariate statistical projection based techniques of PCA and linear PLS are described in detail, highlighting the benefits which can be gained by using these approaches. However, many chemical processes exhibit severely non-linear behaviour, and non-linear regression techniques are required to develop empirical models. The derivation of an existing quadratic PLS algorithm is described in detail. The procedure for updating the model parameters which is required by the quadratic PLS algorithms is explored and modified. A new procedure for updating the model parameters is presented and is shown to perform better than the existing algorithm. The two procedures have been evaluated on the basis of the performance of the corresponding quadratic PLS algorithms in modelling data generated with a strongly non-linear mathematical function and data generated with a mechanistic model of a benchmark pH neutralisation system. Finally, a novel approach to non-linear PLS modelling is presented, combining the general approximation properties of sigmoid neural networks and radial basis function networks with the new weight updating procedure within the PLS framework. These algorithms are shown to outperform existing neural network PLS algorithms and the quadratic PLS approaches. The new neural network PLS algorithms have been evaluated on the basis of their performance in modelling the same data used to compare the quadratic PLS approaches.

    Strang Studentship; European project ESPRIT PROJECT 22281 (PROGNOSIS); Centre for Process Analysis, Chemometrics and Control.
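    The linear PLS backbone that the quadratic and neural-network variants modify can be sketched as a NIPALS-style loop with a linear inner relation between the score vectors. The sketch below is textbook linear PLS for a single response, not the thesis's modified weight-update procedure; it marks the inner-model step that the non-linear variants replace:

```python
import numpy as np

def nipals_pls(X, y, n_comp=2):
    """NIPALS PLS for one response: extract latent scores t = X w that
    covary with y, regress y on t (the linear inner model), deflate,
    and repeat.  The quadratic / neural-network PLS variants replace
    the linear inner model y ≈ q·t with a non-linear map of t."""
    X, y = X.copy(), y.copy()
    W, T, P, Q = [], [], [], []
    for _ in range(n_comp):
        w = X.T @ y                  # weight vector (covariance direction)
        w /= np.linalg.norm(w)
        t = X @ w                    # score vector
        p = X.T @ t / (t @ t)        # X loadings
        q = (y @ t) / (t @ t)        # linear inner-model coefficient
        X -= np.outer(t, p)          # deflate X
        y -= q * t                   # deflate y
        W.append(w); T.append(t); P.append(p); Q.append(q)
    return np.array(W), np.array(T), np.array(P), np.array(Q)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = X @ rng.normal(size=4)           # y lies in the column space of X
W, T, P, Q = nipals_pls(X, y, n_comp=4)
```

    With as many components as the rank of the relevant Krylov space, the fitted values `Q @ T` reproduce a linear response exactly; a quadratic or neural-network inner relation changes only the `q` step and its update.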

    Kernel PCA and the Nyström method

    This thesis treats kernel PCA and the Nyström method. We present a novel incremental algorithm for calculation of kernel PCA, which we extend to incremental calculation of the Nyström approximation. We suggest a new data-dependent method to select the number of data points to include in the Nyström subset, and create a statistical hypothesis test for the same purpose. We further present a cross-validation procedure for kernel PCA to select the number of principal components to retain. Finally, we derive kernel PCA with the Nyström method in line with linear PCA and study its statistical accuracy through a confidence bound.

    CHEMOMETRICS, SPECTROMETRY, AND SENSORS FOR INTEGRATED SENSING AND PROCESSING: ADVANCING PROCESS ANALYTICAL TECHNOLOGY

    The research contained in the following dissertation spans a diverse range of scientific scholarship, including: chemometrics for integrated sensing and processing (ISP); near infrared and acoustic resonance spectrometry for analyte quantification and classification; and an ISP acoustic sensor as an alternative to conventional acoustic spectrometry. These topics may at first seem disjointed; however, closer inspection reveals that chemometrics, spectrometry, and sensors taken together form the umbrella under which applied spectrometry and analytical chemistry fall. The inclusion of each of these three serves to paint the complete portrait of the role of applied spectrometry for the advancement of process analytical technology. To illustrate the totality of this portrait, this research seeks to introduce and substantiate three key claims. (1) When applicable, optical spectrometry and acoustic spectrometry are preferred alternatives to slower and more invasive methods of analysis. (2) Chemometrics can be implemented directly into the physical design of spectrometers, thus sparing the need for computationally demanding post-collection multivariate analyses. (3) Using this principle, ISP sensors can be developed specifically for use in highly applied situations, making possible automatic analyte quantification or classification without the computational burden and extensive data analysis typically associated with conventional spectrometry. More concisely, these three claims can be stated as follows: spectrometry has a broad range of uses, chemometrics for ISP makes spectrometry more efficient, and for every analytical problem with a spectrometric solution, an ISP sensor, specifically tailored to the needs of the experiment, can more effectively solve the same analytical problem.

    Multispace & Multistructure. Neutrosophic Transdisciplinarity (100 Collected Papers of Sciences), Vol. IV

    The fourth volume in my book series of “Collected Papers” includes 100 published and unpublished articles, notes, (preliminary) drafts containing just ideas to be further investigated, scientific souvenirs, scientific blogs, project proposals, small experiments, solved and unsolved problems and conjectures, updated or alternative versions of previous papers, short or long humanistic essays, and letters to the editors, all collected over the previous three decades (1980–2010), though most of them are from the last decade (2000–2010); some were lost and found, while others are extended, diversified, and improved versions. This is an eclectic tome of 800 pages with papers in various fields of science, alphabetically listed, such as: astronomy, biology, calculus, chemistry, computer programming codification, economics and business and politics, education and administration, game theory, geometry, graph theory, information fusion, neutrosophic logic and set, non-Euclidean geometry, number theory, paradoxes, philosophy of science, psychology, quantum physics, scientific research methods, and statistics. It reflects my preoccupation and collaboration, as author, co-author, translator, or co-translator, and editor, with many scientists from around the world over a long time. Many topics in this book are incipient and need to be expanded in future explorations.