12 research outputs found

    Effectiveness of Unsupervised Training in Deep Learning Neural Networks

    Deep learning is a field of research currently attracting much attention, mainly because deep architectures achieve outstanding results on many vision, speech and natural language processing related tasks. To make deep learning effective, an unsupervised pretraining phase is very often applied. In this article, we present an experimental study evaluating the usefulness of such an approach, testing on several benchmarks and with different percentages of labeled data how Contrastive Divergence (CD), one of the most popular pretraining methods, influences network generalization.
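    As a rough illustration of the pretraining step mentioned above, the sketch below shows a single Contrastive Divergence (CD-1) update for a binary restricted Boltzmann machine, the building block typically pretrained layer by layer before supervised fine-tuning. All names (cd1_update, layer sizes, learning rate) are illustrative assumptions, not taken from the article.
```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b_vis, b_hid, lr=0.01):
    """One CD-1 step on a mini-batch v0 of shape (batch, n_visible)."""
    # Positive phase: hidden probabilities given the data.
    h0_prob = sigmoid(v0 @ W + b_hid)
    h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)

    # Negative phase: one Gibbs step (reconstruction of visible units).
    v1_prob = sigmoid(h0 @ W.T + b_vis)
    h1_prob = sigmoid(v1_prob @ W + b_hid)

    # Gradient approximation: <v h>_data - <v h>_model.
    batch = v0.shape[0]
    W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / batch
    b_vis += lr * (v0 - v1_prob).mean(axis=0)
    b_hid += lr * (h0_prob - h1_prob).mean(axis=0)
    return W, b_vis, b_hid

# Toy usage: one pretraining update of a 784-64 RBM layer on random binary "data".
W = rng.normal(0, 0.01, size=(784, 64))
b_vis, b_hid = np.zeros(784), np.zeros(64)
data = (rng.random((32, 784)) < 0.5).astype(float)
W, b_vis, b_hid = cd1_update(data, W, b_vis, b_hid)
```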

    Evolutionary prototype selection for multi-output regression

    A novel approach to prototype selection for multi-output regression data sets is presented. A multi-objective evolutionary algorithm is used to evaluate the selections using two criteria: training data set compression and prediction quality expressed in terms of root mean squared error. A multi-target regressor based on k-NN was used to evaluate the error during training, while the tests were performed using four different multi-target predictive models. The distance matrices used by the multi-target regressor were cached to accelerate operational performance. Multiple Pareto fronts were also used to prevent overfitting and to obtain a broader range of solutions, by using different probabilities in the initialization of populations and different evolutionary parameters in each one. The results obtained with the benchmark data sets showed that the proposed method greatly reduced data set size and, at the same time, improved the predictive capabilities of the multi-output regressors trained on the reduced data set.
    Funding: NCN (Polish National Science Center) grant “Evolutionary Methods in Data Selection” No. 2017/01/X/ST6/00202, project TIN2015-67534-P (MINECO/FEDER, UE) of the Ministerio de Economía y Competitividad of the Spanish Government, and project BU085P17 (JCyL/FEDER, UE) of the Junta de Castilla y León cofinanced with European Union FEDER funds.
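    To make the evaluation step concrete, the sketch below shows how a candidate prototype subset could be scored on the two criteria, compression and k-NN RMSE, reusing a distance matrix that is computed once and cached. It is a minimal approximation of the idea under assumed names (cached_distance_matrix, evaluate_subset), not the authors' implementation.
```python
import numpy as np

def cached_distance_matrix(X):
    # Euclidean distances between all training instances, computed once and reused.
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=2))

def evaluate_subset(mask, D, Y, k=3):
    """mask: boolean vector selecting prototypes; D: cached distances;
    Y: multi-output targets of shape (n, n_targets). Both objectives are minimized."""
    selected = np.where(mask)[0]
    if selected.size < k:
        return 1.0, np.inf  # penalize degenerate subsets
    preds = np.empty_like(Y, dtype=float)
    for i in range(D.shape[0]):
        # Exclude the instance itself when it happens to be a prototype.
        cand = selected[selected != i]
        nn = cand[np.argsort(D[i, cand])[:k]]
        preds[i] = Y[nn].mean(axis=0)          # k-NN multi-target prediction
    rmse = np.sqrt(((preds - Y) ** 2).mean())
    compression = selected.size / D.shape[0]   # fraction of the training set retained
    return compression, rmse

# Toy usage with random data and a random candidate selection.
rng = np.random.default_rng(1)
X, Y = rng.random((100, 5)), rng.random((100, 3))
D = cached_distance_matrix(X)
mask = rng.random(100) < 0.3
print(evaluate_subset(mask, D, Y))
```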

    A hybrid system with regression trees in steel-making process.

    Abstract. The paper presents a hybrid regression model with the main emphasis put on the regression tree unit. It discusses input and output variable transformation, determining the final decision of hybrid models, and node split optimization of regression trees. Because of its ability to generate logical rules, a regression tree may be the preferred module if it produces results comparable to those of the other modules; therefore the optimization of node splits in regression trees is discussed in more detail. A set of split criteria based on different forms of variance reduction is analyzed and guidelines for the choice of the criterion are discussed, including the trade-off between the accuracy of the tree, its size, and the balance between minimizing the node variance and keeping a symmetric structure of the tree. The presented approach has found practical applications in the metallurgical industry.
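    A minimal sketch of a variance-reduction split criterion of the kind analyzed above is given below: each candidate threshold on one feature is scored by how much it reduces the weighted target variance. The function name best_split and the toy data are illustrative and do not reproduce the exact criteria studied in the paper.
```python
import numpy as np

def best_split(x, y):
    """x: one feature column, y: regression targets. Returns (threshold, variance reduction)."""
    order = np.argsort(x)
    x_sorted, y_sorted = x[order], y[order]
    parent_var = y.var()
    best_thr, best_gain = None, 0.0
    for i in range(1, len(x_sorted)):
        if x_sorted[i] == x_sorted[i - 1]:
            continue                      # no threshold fits between equal values
        left, right = y_sorted[:i], y_sorted[i:]
        # Weighted child variance; the gain is the reduction relative to the parent node.
        w_var = (len(left) * left.var() + len(right) * right.var()) / len(y)
        gain = parent_var - w_var
        if gain > best_gain:
            best_gain = gain
            best_thr = (x_sorted[i] + x_sorted[i - 1]) / 2.0
    return best_thr, best_gain

# Toy usage: a feature with a clear step in the target around 0.5.
rng = np.random.default_rng(2)
x = rng.random(50)
y = np.where(x > 0.5, 1.0, 0.0) + 0.1 * rng.standard_normal(50)
print(best_split(x, y))
```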

    The latest advances in wireless communication in aviation, wind turbines and bridges

    Present-day implementations of SHM (Structural Health Monitoring) systems are in many cases based on wireless sensor networks (WSN). With the continuous development of these systems, the costs of the elements that form the monitoring system are decreasing. In this situation, the challenge is to select the optimal number of sensors and the network architecture, depending on the wireless system’s other parameters and requirements. It is a challenging task for a WSN to provide scalability to cover a large area, fault tolerance, transmission reliability, and energy efficiency when no events are detected. In this article, fundamental issues concerning wireless communication in structural health monitoring (SHM) systems in the context of non-destructive testing (NDT) sensors are presented. Wireless technology developments in several crucial areas are also presented; these include engineering facilities such as aviation and wind turbine systems as well as bridges and associated engineering facilities.

    Multi-Objective Evolutionary Instance Selection for Regression Tasks

    The purpose of instance selection is to reduce the data size while preserving as much of the useful information stored in the data as possible, and detecting and removing the erroneous and redundant information. In this work, we analyze instance selection in regression tasks and apply the NSGA-II multi-objective evolutionary algorithm to direct the search for the optimal subset of the training dataset, with the k-NN algorithm evaluating the solutions during the selection process. A key advantage of the method is obtaining a pool of solutions situated on the Pareto front, each of which is the best for a certain RMSE-compression balance. We discuss different parameters of the process and their influence on the results, and put special effort into reducing the computational complexity of our approach. The experimental evaluation proves that the proposed method achieves good performance in terms of minimization of prediction error and minimization of dataset size.
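    The sketch below illustrates only the Pareto-front idea referred to above: given candidate instance selections scored on the two minimized objectives (RMSE and retained dataset fraction), keep the solutions that no other solution dominates. It is a simplified illustration under assumed names (pareto_front), not the NSGA-II machinery used in the paper.
```python
import numpy as np

def pareto_front(objectives):
    """objectives: array of shape (n_solutions, 2), both columns minimized.
    Returns indices of non-dominated solutions."""
    n = objectives.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            # j dominates i if it is no worse in both objectives and better in at least one.
            if (np.all(objectives[j] <= objectives[i]) and
                    np.any(objectives[j] < objectives[i])):
                keep[i] = False
                break
    return np.where(keep)[0]

# Toy usage: each row is (RMSE, fraction of training set retained).
scores = np.array([
    [0.20, 0.90],
    [0.25, 0.40],
    [0.22, 0.35],
    [0.30, 0.10],
    [0.21, 0.95],
])
print(pareto_front(scores))   # indices of the non-dominated selections
```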

    Variable Step Search MLP Training Method.

    Abstract. The MLP training process is analyzed and a variable step search-based algorithm (VSS) that does not require gradient information is introduced. This algorithm finds the rough position of the minimum in each single-weight direction and successively updates the weights. Only a small fragment of the network is analyzed for each update, making the method computationally efficient. The VSS algorithm is simpler to program than backpropagation, yet the quality of results and the speed of convergence are at the level of the state-of-the-art Levenberg-Marquardt and scaled conjugate gradient algorithms. 1 Introduction. Multilayer perceptrons (MLP) are usually trained using analytical gradient-based algorithms with error backpropagation. Some of the most popular methods include standard backpropagation (BP), RPROP, Quickprop, Levenberg-Marquardt (LM) [1] [2], and the scaled conjugate gradient (SCG) algorithm [3] [4]. Also many global optimization…
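    The sketch below illustrates only the general idea of a gradient-free, weight-by-weight search with an adaptive per-weight step: perturb one weight, keep the change if the training error drops, otherwise undo it, reverse direction and shrink the step. It is a crude approximation of that idea (it re-evaluates the full network error after every change, which the published VSS algorithm avoids by analyzing only a small fragment of the network), and all names and sizes are illustrative assumptions.
```python
import numpy as np

rng = np.random.default_rng(3)

def mlp_error(weights, shapes, X, y):
    """Mean squared error of a one-hidden-layer tanh MLP given a flat weight vector."""
    (n_in, n_hid), (n_hid2, n_out) = shapes
    W1 = weights[: n_in * n_hid].reshape(n_in, n_hid)
    W2 = weights[n_in * n_hid:].reshape(n_hid2, n_out)
    hidden = np.tanh(X @ W1)
    return ((hidden @ W2 - y) ** 2).mean()

def coordinate_search(weights, shapes, X, y, epochs=20, step0=0.1):
    steps = np.full(weights.size, step0)      # one adaptive step per weight
    err = mlp_error(weights, shapes, X, y)
    for _ in range(epochs):
        for w in range(weights.size):          # update one weight at a time
            weights[w] += steps[w]
            new_err = mlp_error(weights, shapes, X, y)
            if new_err < err:
                err = new_err                  # keep the move, keep the step
            else:
                weights[w] -= steps[w]         # undo, reverse direction, shrink step
                steps[w] *= -0.5
    return weights, err

# Toy usage: fit y = sum(x) with a 4-5-1 network.
X = rng.random((64, 4))
y = X.sum(axis=1, keepdims=True)
shapes = ((4, 5), (5, 1))
weights = rng.normal(0, 0.5, size=4 * 5 + 5 * 1)
weights, err = coordinate_search(weights, shapes, X, y)
print(err)
```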