    Duality, Derivative-Based Training Methods and Hyperparameter Optimization for Support Vector Machines

    In this thesis we consider the application of Fenchel's duality theory and gradient-based methods to the training and hyperparameter optimization of Support Vector Machines. We show that the dualization of convex training problems is theoretically possible in a rather general formulation. For training problems with a special structure (for instance, standard training problems), the resulting optimality conditions admit a concrete interpretation. This approach leads immediately to the well-known notion of support vectors and to a formulation of the Representer Theorem. The proposed theory is applied to several examples, so that dual formulations of training problems and the associated optimality conditions can be derived in a straightforward way. Furthermore, we consider different formulations of the primal training problem that are equivalent under certain conditions, and we argue that the relation of the corresponding solutions to the solution of the dual training problem is not always intuitive. Based on these findings, we consider the application of customized optimization methods to the primal and dual training problems. A particular realization of Newton's method is derived that can be used to solve the primal training problem accurately. Moreover, we introduce a general convergence framework covering different types of decomposition methods for the solution of the dual training problem; in doing so, we are able to generalize well-known convergence results for the SMO method. Additionally, we discuss the complexity of the SMO method and motivate a shrinking strategy that reduces the computational effort. In a final theoretical part, we consider the problem of hyperparameter optimization and argue that it can be handled efficiently by gradient-based methods if the training problems are formulated appropriately. Finally, we evaluate the theoretical results on the training and hyperparameter optimization approaches in practice on several example training problems.
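    The abstract mentions a realization of Newton's method for solving the primal training problem accurately. As an illustration only, here is a minimal sketch of such a Newton iteration applied to the primal L2-SVM (squared hinge loss), one standard differentiable primal formulation; the specific formulation and method in the thesis may differ, and the function name, parameters, and synthetic data below are assumptions for the sketch.

    ```python
    import numpy as np

    def train_l2svm_newton(X, y, C=1.0, iters=20, tol=1e-8):
        """Newton-type method on the primal L2-SVM:

            min_w  0.5*||w||^2 + C * sum_i max(0, 1 - y_i * (w @ x_i))^2

        The squared hinge loss is once differentiable, so a Newton iteration
        with a generalized Hessian applies directly to the primal problem.
        """
        n, d = X.shape
        w = np.zeros(d)
        for _ in range(iters):
            margins = 1.0 - y * (X @ w)
            active = margins > 0  # points with positive margin violation (support vectors)
            # gradient: w - 2C * sum over active points of y_i * m_i * x_i
            grad = w - 2.0 * C * X[active].T @ (y[active] * margins[active])
            if np.linalg.norm(grad) < tol:
                break
            # generalized Hessian: I + 2C * X_A^T X_A over the active set
            H = np.eye(d) + 2.0 * C * X[active].T @ X[active]
            w -= np.linalg.solve(H, grad)  # Newton step
        return w

    # tiny synthetic, (almost surely) linearly separable problem
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(2.0, 1.0, (50, 2)), rng.normal(-2.0, 1.0, (50, 2))])
    y = np.concatenate([np.ones(50), -np.ones(50)])
    w = train_l2svm_newton(X, y, C=1.0)
    acc = np.mean(np.sign(X @ w) == y)
    ```

    Because the active set is re-identified at every iteration, each Newton step only involves the current support vectors, which hints at why a shrinking strategy (as motivated in the thesis for SMO) can reduce the per-iteration cost.
    
    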

    Computational Approaches For Designing Protein/inhibitor Complexes And Membrane Protein Variants

    Drug discovery of small-molecule protein inhibitors is a vast enterprise that involves several scientific disciplines (i.e., genomics, cell biology, x-ray crystallography, chemistry, computer science, statistics), with each discipline focusing on a particular aspect of the process. In this thesis, I use computational and experimental approaches to explore the most fundamental aspect of drug discovery: the molecular interactions of small-molecule inhibitors with proteins. In Part I (Chapters I and II), I describe how computational docking approaches can be used to identify structurally diverse molecules that can inhibit multiple protein targets in the brain. I illustrate this approach using the examples of microtubule-stabilizing agents and inhibitors of cyclooxygenase (COX)-I and 5-lipoxygenase (5-LOX). In Part II (Chapters III and IV), I focus on membrane proteins, which are notoriously difficult to work with due to their low natural abundance, low yields in heterologous overexpression, and propensity toward aggregation. I describe a general approach for designing water-soluble variants of membrane proteins, for the purpose of developing cell-free, label-free, detergent-free, solution-phase studies of protein structure and small-molecule binding. I illustrate this approach through the design of a water-soluble variant of the membrane protein Smoothened, wsSMO. This wsSMO stands to serve as a first step toward developing membrane protein analogs of this important signaling protein and drug target.