
    Magnetic Modelling of Synchronous Reluctance and Internal Permanent Magnet Motors Using Radial Basis Function Networks

    The general trend toward more intelligent, energy-aware AC drives is driving the development of new motor topologies and advanced model-based control techniques. Among the candidates, pure-reluctance and anisotropic permanent-magnet motors are gaining popularity despite their complex structure. Accurate mathematical models of these motors are essential to the design of any advanced model-based control. This paper focuses on the relations between currents and flux linkages, which are obtained through novel radial basis function neural networks. These drive-oriented neural networks take the motor voltages and currents as inputs and return the motor flux linkages as output, inclusive of any nonlinearity and cross-coupling effects. The theoretical foundations of radial basis function networks, design hints, and a commented series of experimental results on a laboratory prototype are included in this paper. The simple structure of the neural network is suitable for implementation on standard drives. Online training and tracking in field-programmable-gate-array-based control systems will be the next steps.
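The current-to-flux-linkage mapping described above can be sketched with a minimal Gaussian radial basis function network. All names, centres, widths, and the synthetic saturation curve below are illustrative assumptions, not the paper's actual model:

```python
import numpy as np

# Minimal Gaussian RBF network: maps motor currents (i_d, i_q) to a
# flux-linkage estimate. Centres, widths, and the synthetic saturation
# curve are illustrative placeholders, not taken from the paper.
rng = np.random.default_rng(0)

def rbf_features(X, centers, width):
    # Gaussian basis: exp(-||x - c||^2 / (2 * width^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# Synthetic training data: flux linkage with mild saturation and a
# cross-coupling term between the d- and q-axis currents
X = rng.uniform(-1.0, 1.0, size=(200, 2))             # (i_d, i_q) samples
y = np.tanh(1.5 * X[:, 0]) + 0.2 * X[:, 0] * X[:, 1]

centers = rng.uniform(-1.0, 1.0, size=(25, 2))        # fixed RBF centres
Phi = rbf_features(X, centers, width=0.5)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)           # linear output layer

y_hat = rbf_features(X, centers, 0.5) @ w
print(np.abs(y - y_hat).mean())                       # small training error
```

Fixing the centres and solving only the linear output layer is what keeps such networks simple enough for implementation on standard drive hardware.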

    Easy over Hard: A Case Study on Deep Learning

    While deep learning is an exciting new technique, its benefits need to be assessed against its computational cost. This is particularly important because deep learners need hours (to weeks) to train a model. Such long training times limit the ability of (a) a researcher to test the stability of their conclusions via repeated runs with different random seeds and (b) other researchers to repeat, improve, or even refute the original work. For example, deep learning was recently used to find which questions in the Stack Overflow programmer discussion forum can be linked together. That deep learning system took 14 hours to execute. We show here that applying a very simple optimizer, differential evolution (DE), to fine-tune an SVM achieves similar (and sometimes better) results. The DE approach terminated in 10 minutes, i.e., 84 times faster than the deep learning method. We offer these results as a cautionary tale to the software analytics community and suggest that no new innovation should be applied without critical analysis: if researchers deploy some new and expensive process, that work should be baselined against simpler and faster alternatives. Comment: 12 pages, 6 figures, accepted at FSE201
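The optimizer at the heart of the study, differential evolution, is only a few lines of code. The toy objective below stands in for "SVM cross-validation error over (log C, log gamma)"; the objective and all DE settings are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

# Bare-bones differential evolution (DE). The objective is a stand-in
# for SVM cross-validation error at hyper-parameters x; all settings
# (pop, gens, F, CR) are illustrative defaults.
rng = np.random.default_rng(1)

def objective(x):
    # Hypothetical stand-in for "SVM cross-validation error at x"
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def de(obj, bounds, pop=20, gens=100, F=0.8, CR=0.9):
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(pop, len(lo)))
    fit = np.array([obj(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            # mutate: combine three distinct members other than i
            idx = [j for j in range(pop) if j != i]
            a, b, c = X[rng.choice(idx, 3, replace=False)]
            # crossover: mix the mutant with the current member
            trial = np.where(rng.random(len(lo)) < CR, a + F * (b - c), X[i])
            trial = np.clip(trial, lo, hi)
            f = obj(trial)
            if f < fit[i]:                      # greedy selection
                X[i], fit[i] = trial, f
    return X[fit.argmin()], fit.min()

best, best_f = de(objective, (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
print(best, best_f)                             # converges near (1, -2)
```

The paper's point is exactly this contrast: a tuner this simple, wrapped around a fast learner such as an SVM, can be a strong baseline for far more expensive methods.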

    Study on multi-SVM systems and their applications to pattern recognition

    Degree system: new; report number: Kou 3136; degree: Doctor of Engineering; date conferred: 2010/7/12; Waseda University degree record number: Shin 541

    Kernel learning at the first level of inference

    Kernel learning methods, whether Bayesian or frequentist, typically involve multiple levels of inference, with the coefficients of the kernel expansion being determined at the first level and the kernel and regularisation parameters carefully tuned at the second level, a process known as model selection. Model selection for kernel machines is commonly performed via optimisation of a suitable model selection criterion, often based on cross-validation or theoretical performance bounds. However, if there are a large number of kernel parameters, as for instance in the case of automatic relevance determination (ARD), there is a substantial risk of over-fitting the model selection criterion, resulting in poor generalisation performance. In this paper we investigate the possibility of learning the kernel, for the Least-Squares Support Vector Machine (LS-SVM) classifier, at the first level of inference, i.e. parameter optimisation. The kernel parameters and the coefficients of the kernel expansion are jointly optimised at the first level of inference, minimising a training criterion with an additional regularisation term acting on the kernel parameters. The key advantage of this approach is that the values of only two regularisation parameters need be determined in model selection, substantially alleviating the problem of over-fitting the model selection criterion. The benefits of this approach are demonstrated using a suite of synthetic and real-world binary classification benchmark problems, where kernel learning at the first level of inference is shown to be statistically superior to the conventional approach, improves on our previous work (Cawley and Talbot, 2007) and is competitive with Multiple Kernel Learning approaches, but with reduced computational expense.
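The first-level idea can be sketched numerically: the kernel parameter and the expansion coefficients are both fit against one penalised training criterion, instead of tuning the kernel at a second cross-validation level. The data, penalties, and the grid search over the kernel width are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

# Sketch of first-level kernel learning for an LS-SVM-style model:
# for each kernel width theta, the coefficients alpha have a closed
# form, and theta itself is chosen by minimising the SAME penalised
# training criterion. Data, lam, and mu are illustrative placeholders.
rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sign(np.sin(X[:, 0]))                  # toy labels in {-1, +1}

def kernel(A, B, theta):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * theta ** 2))

lam, mu = 1e-2, 1e-1                          # ridge and kernel penalties

def criterion(theta):
    K = kernel(X, X, theta)
    # LS-SVM-style closed-form coefficients for this theta
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    fit = ((K @ alpha - y) ** 2).mean()
    # extra regulariser acting on the kernel parameter itself
    return fit + mu * np.log(theta) ** 2

# crude first-level search over theta (gradient-free, for clarity)
thetas = np.exp(np.linspace(-2, 2, 41))
vals = [criterion(t) for t in thetas]
best_theta = thetas[int(np.argmin(vals))]
print(best_theta)
```

Only the two penalty weights (`lam`, `mu` here) remain for second-level model selection, which is the over-fitting reduction the abstract describes.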

    Integrating Local and Global Error Statistics for Multi-Scale RBF Network Training: An Assessment on Remote Sensing Data

    Background: This study discusses the theoretical underpinnings of a novel multi-scale radial basis function (MSRBF) neural network along with its application to classification and regression tasks in remote sensing. The novelty of the proposed MSRBF network lies in the integration of both local and global error statistics in the node selection process. Methodology and Principal Findings: The method was tested on a binary classification task, detection of impervious surfaces using a Landsat satellite image, and a regression problem, simulation of waveform LiDAR data. In the classification scenario, results indicate that the MSRBF is superior to existing radial basis function and back-propagation neural networks in terms of classification accuracy and training-testing consistency, especially for smaller datasets. The latter is particularly important, as reference data acquisition is always an issue in remote sensing applications. In the regression case, the MSRBF provided improved accuracy and consistency when contrasted with a multi-kernel RBF network. Conclusion and Significance: Results highlight the potential of a novel training methodology that is not restricted to a specific algorithmic type, thereby advancing machine learning algorithms for classification and regression tasks. The MSRBF is expected to find numerous applications within and outside the remote sensing field.