    Homology modeling and molecular dynamics simulations of MUC1-9/H-2Kb complex suggest novel binding interactions

    Human MUC1 is over-expressed in human adenocarcinomas and has been used as a target for immunotherapy studies. The 9-mer MUC1-9 peptide has been identified as one of the peptides that bind to murine MHC class I H-2Kb. The structure of MUC1-9 in complex with H-2Kb was modeled and simulated with classical molecular dynamics, based on the X-ray structure of the SEV9 peptide/H-2Kb complex. Two independent 10 ns trajectories of the solvated complex were produced. Approximately 12 hydrogen bonds, as well as 1-2 water-mediated hydrogen bonds, were found to contribute to the peptide/MHC complex in both trajectories. Stability of the complex was also confirmed by buried surface area analysis, although the corresponding values were about 20% lower than those of the original X-ray structure. Interestingly, a bulged conformation of the peptide's central region, partially characterized as a β-turn, was found exposed from the binding groove. In addition, the P1 and P9 residues remained bound in the A and F binding pockets, although P9 appeared more flexible. The complex lacked numerous water-mediated hydrogen bonds that were present in the reference peptide X-ray structure. Moreover, local displacements of residues Asp4, Thr5 and Pro9 resulted in the loss of some key interactions with the MHC molecule. This might explain the reduced affinity of the MUC1-9 peptide, relative to SEV9, for the MHC class I H-2Kb molecule.
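
    The hydrogen-bond counting described above can be illustrated with a minimal geometric sketch. The coordinate arrays, the 3.5 Å cutoff, and the distance-only criterion below are illustrative assumptions, not the study's actual analysis protocol.

```python
import numpy as np

def count_contacts(donors, acceptors, cutoff=3.5):
    """Count donor-acceptor heavy-atom pairs within `cutoff` angstroms
    in one trajectory frame (a distance-only simplification; full
    hydrogen-bond criteria also test the donor-H-acceptor angle)."""
    diff = donors[:, None, :] - acceptors[None, :, :]   # all pairs
    dist = np.linalg.norm(diff, axis=-1)
    return int((dist < cutoff).sum())

# Averaging such per-frame counts over both 10 ns trajectories is the
# kind of census behind the ~12 hydrogen bonds reported above.
frame_donors = np.random.rand(8, 3) * 30      # placeholder coordinates
frame_acceptors = np.random.rand(12, 3) * 30
print(count_contacts(frame_donors, frame_acceptors))
```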

    Learning Functions and Classes Using Rules

    In the current work, a novel method is presented for generating rules for data classification as well as for regression problems. The proposed method generates simple rules in a high-level programming language with the help of grammatical evolution. The method does not depend on any prior knowledge of the dataset; the memory it requires for its execution is constant regardless of the objective problem; and it can also be used to detect hidden dependencies between the features of the input problem. The proposed method was tested on an extensive range of problems from the relevant literature, and comparative results against other machine learning techniques are presented in this manuscript.
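
    As a sketch of how grammatical evolution produces such rules, the toy decoder below maps an integer genome onto an expression through a small grammar. The grammar, genome, and wrapping limit are illustrative assumptions, not the method's actual rule language.

```python
# Toy grammatical-evolution decoder: successive codons of an integer
# genome choose productions from a grammar until no non-terminals
# remain. The paper's actual grammar targets a high-level programming
# language; this one only builds arithmetic expressions.
GRAMMAR = {
    "<expr>": ["( <expr> <op> <expr> )", "x", "1.0"],
    "<op>":   ["+", "-", "*"],
}

def decode(genome, start="<expr>", max_wraps=2):
    out, idx, wraps = [start], 0, 0
    while any(s in GRAMMAR for s in out):
        if idx >= len(genome):            # wrap the genome if exhausted
            idx, wraps = 0, wraps + 1
            if wraps > max_wraps:
                return None               # invalid individual
        i = next(k for k, s in enumerate(out) if s in GRAMMAR)
        rules = GRAMMAR[out[i]]
        out[i:i + 1] = rules[genome[idx] % len(rules)].split()
        idx += 1
    return " ".join(out)

print(decode([0, 1, 2, 1]))   # -> ( x * x )
```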

    Constructing Features Using a Hybrid Genetic Algorithm

    A hybrid procedure that incorporates grammatical evolution and a weight-decaying technique is proposed here for various classification and regression problems. The proposed method has two main phases: the creation of features and the evaluation of those features. During the first phase, grammatical evolution creates new features as non-linear combinations of the original features of the dataset. In the second phase, the original dataset is modified according to the features of the first phase, and a neural network trained with a genetic algorithm is applied to the modified dataset. The proposed method was applied to a wide set of datasets from the relevant literature, and the experimental results were compared with four other techniques.
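
    A minimal sketch of the two-phase flow, assuming phase 1 has already produced feature expressions: the two hand-written lambdas stand in for expressions grammatical evolution would discover, and the weight-decaying, GA-trained network of phase 2 is elided.

```python
import numpy as np

# Phase 1 output (illustrative stand-ins): constructed features as
# non-linear combinations of the original columns x[0], x[1], ...
constructed = [
    lambda x: np.sin(x[0]) * x[1],
    lambda x: x[0] ** 2 - x[1],
]

def transform(X, features):
    """Phase 2 entry point: map the original dataset onto the
    constructed feature space before the network is trained."""
    return np.array([[f(row) for f in features] for row in X])

X = np.random.rand(100, 2)       # placeholder dataset
Z = transform(X, constructed)    # the GA-trained network then sees Z
```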

    QFC: A Parallel Software Tool for Feature Construction, Based on Grammatical Evolution

    This paper presents and analyzes a programming tool that implements a method for classification and function regression problems. This method builds new features from existing ones with the assistance of a hybrid algorithm that makes use of artificial neural networks and grammatical evolution. The implemented software exploits modern multi-core computing units for faster execution. The method has been applied to a variety of classification and function regression problems, and an extensive comparison with other methods of computational intelligence is made
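
    The multi-core execution idea can be sketched with Python's standard multiprocessing module. The dummy fitness function below stands in for the expensive feature-construction and network-training step; it is an illustrative assumption, not QFC's actual code.

```python
import math
from multiprocessing import Pool

def fitness(candidate):
    """Stand-in for the expensive step: in QFC this would construct
    the candidate's features and train a neural network on them."""
    return sum(math.sin(g) ** 2 for g in candidate)

def evaluate_population(population, workers=4):
    # Candidate evaluations are independent of one another, so they
    # map cleanly onto multiple cores.
    with Pool(processes=workers) as pool:
        return pool.map(fitness, population)

if __name__ == "__main__":
    pop = [[i, i + 1, i + 2] for i in range(16)]
    print(evaluate_population(pop))
```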

    Toward an Ideal Particle Swarm Optimizer for Multidimensional Functions

    The Particle Swarm Optimization (PSO) method is a global optimization technique based on the gradual evolution of a population of solutions called particles. The method evolves each particle based on both its own best position so far and the best position discovered by the whole swarm. Due to its simplicity, the method has found application in many scientific areas, and for this reason many modifications of it have been presented during the last few years. This paper introduces three modifications that aim to reduce the required number of function calls while maintaining the accuracy of the method in locating the global minimum. These modifications affect important components of the method, such as how fast the particles change or even how the method is terminated. The modifications were tested on a number of well-known global optimization problems from the relevant literature, and the results were compared with similar techniques.
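
    For reference, a compact baseline PSO is sketched below; the velocity update (with inertia w and coefficients c1, c2) and the fixed iteration budget are exactly the components such modifications target. All parameter values here are illustrative.

```python
import numpy as np

def pso(f, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Baseline PSO: inertia-weighted velocity update plus cognitive
    (personal best) and social (swarm best) attraction terms."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pbest_f = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):   # fixed budget; a smarter termination
                             # rule is one modification target
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

best, val = pso(lambda p: float(np.sum(p ** 2)), dim=5, bounds=(-10, 10))
```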

    Locating the Parameters of RBF Networks Using a Hybrid Particle Swarm Optimization Method

    In the present work, an innovative two-phase method is presented for parameter tuning in radial basis function (RBF) artificial neural networks. These machine learning models find application in many scientific fields, in classification problems as well as in function regression. In the first phase, a technique based on particle swarm optimization is used to locate a promising interval of values for the network parameters. Particle swarm optimization was chosen because it is a highly reliable method for global optimization problems and one of the fastest and most flexible techniques of its class. In the second phase, the network is trained within the optimal interval using a global optimization technique such as a genetic algorithm. Furthermore, to speed up training, and because the method has two stages, parallel programming techniques were utilized. The new method was applied to a number of well-known classification and regression datasets, and the results were more than promising.
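
    A sketch of the objective both phases act on, assuming the network's parameters are packed into a single flat vector. The packing order and the Gaussian unit are illustrative choices, not necessarily the paper's exact parameterization.

```python
import numpy as np

def rbf_forward(params, X, n_centers):
    """Evaluate an RBF network whose parameters sit in one flat
    vector (centers, then widths, then output weights): the kind of
    representation a PSO particle or GA chromosome would carry."""
    n, dim = X.shape
    k = n_centers
    c = params[:k * dim].reshape(k, dim)              # centers
    s = np.abs(params[k * dim:k * dim + k]) + 1e-6    # widths > 0
    w = params[k * dim + k:]                          # linear weights
    d = np.linalg.norm(X[:, None, :] - c[None, :, :], axis=-1)
    return np.exp(-(d / s) ** 2) @ w                  # Gaussian units

def mse(params, X, y, n_centers):
    return np.mean((rbf_forward(params, X, n_centers) - y) ** 2)

# Phase 1 (PSO) would bound each entry of `params`; phase 2 (a
# genetic algorithm) then minimizes `mse` inside those bounds.
```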

    Enhancing PSO Methods for Global Optimization

    The Particle Swarm Optimization (PSO) method is a well-established technique for global optimization. During the past years, several variations of the original PSO have been proposed in the relevant literature. Because of the increasing need for global optimization methods in almost all fields of science, there is great demand for efficient and fast implementations of the relevant algorithms. In this work we propose three modifications of the original PSO method, intended to increase its speed and efficiency, that can be applied independently in almost every PSO variant. These modifications are: (a) a new stopping rule, (b) a similarity check, and (c) a conditional application of some local search method. The proposed modifications were tested using three popular PSO variants and a variety of test functions. We found that applying these modifications resulted in significant gains in speed and efficiency.
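
    Two of the three modifications can be sketched independently of any particular PSO variant. The window, tolerance, and Euclidean test below are illustrative choices, not the paper's exact rules.

```python
import numpy as np

def should_stop(best_history, window=10, tol=1e-8):
    """Illustrative stopping rule in the spirit of (a): halt once the
    best value found has been practically unchanged for `window`
    consecutive iterations."""
    if len(best_history) < window:
        return False
    recent = best_history[-window:]
    return max(recent) - min(recent) < tol

def is_duplicate(x, evaluated, eps=1e-6):
    """Illustrative similarity check in the spirit of (b): skip the
    costly function call when a particle lies within `eps` of an
    already-evaluated point."""
    return any(np.linalg.norm(x - e) < eps for e in evaluated)
```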

    Training Artificial Neural Networks Using a Global Optimization Method That Utilizes Neural Networks

    Perhaps one of the best-known machine learning models is the artificial neural network, in which a number of parameters must be adjusted so that the network can learn a wide range of practical problems from areas such as physics, chemistry, and medicine. Such problems, whether classification or regression problems, can be reduced to pattern recognition problems and then modeled with artificial neural networks. For neural networks to achieve this, they must be trained by appropriately adjusting their parameters using global optimization methods. In this work, the application of a recent global minimization technique is suggested for the adjustment of neural network parameters. In this technique, an approximation of the objective function to be minimized is created using artificial neural networks, and sampling is then performed from the approximation rather than from the original function. Hence, in the present work, the parameters of artificial neural networks are learned using other neural networks. The new training method was tested on a series of well-known problems and compared against other neural network parameter tuning techniques, with more than promising results: the proposed technique outperformed the others by roughly 30% on classification datasets and by up to 50% on regression problems. However, because the proposed technique relies on global optimization carried out with the help of artificial neural networks, it may require significantly more execution time than other techniques.
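
    A hedged sketch of surrogate-assisted minimization in this spirit, using scikit-learn's MLPRegressor as the approximating network; the sampling scheme and all parameter values are illustrative, not the paper's actual algorithm.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def surrogate_minimize(f, dim, bounds, n_init=50, rounds=10,
                       n_candidates=500, seed=0):
    """Fit a small neural network to the points evaluated so far and
    do the cheap candidate search on that approximation; only the
    most promising candidate costs a true objective call per round."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (n_init, dim))
    y = np.array([f(x) for x in X])
    for _ in range(rounds):
        surrogate = MLPRegressor(hidden_layer_sizes=(32, 32),
                                 max_iter=2000).fit(X, y)
        cand = rng.uniform(lo, hi, (n_candidates, dim))
        best = cand[surrogate.predict(cand).argmin()]  # cheap search
        X = np.vstack([X, best])
        y = np.append(y, f(best))                      # one true call
    return X[y.argmin()], y.min()

best_x, best_f = surrogate_minimize(lambda p: float(np.sum(p ** 2)),
                                    dim=3, bounds=(-5.0, 5.0))
print(best_x, best_f)
```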