
    Block oriented model order reduction of interconnected systems

    Unintended and parasitic coupling effects are becoming more relevant in currently designed small-scale, high-frequency RFICs. Electromagnetic (EM) based procedures must be used to generate accurate models for proper verification of system behaviour. These EM methodologies may take advantage of structural sub-system organization, as well as information inherent to the IC physical layout, to improve their efficiency. Model order reduction techniques, required for fast and accurate evaluation and simulation of such models, must address, and may benefit from, this hierarchical information. System-based interconnection techniques can handle some of these situations, but suffer from drawbacks when applied to complete EM models. We present an alternative methodology, based on similar principles, that overcomes the limitations of such approaches. The procedure, based on structure-preserving model order reduction techniques, is shown to be a generalization of the interconnected-system framework. Further improvements that allow a trade-off between global error and block size, and thus better control over the reduction, are also presented.
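    The structure-preserving idea can be sketched in a minimal numpy example: a one-sided moment-matching projection whose basis is kept block-diagonal, so the reduced model retains the sub-system partitioning of the full model. The system sizes, matrices, and expansion point below are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Full-order interconnected system  E x' = A x + B u,  y = C x  (E = I here),
# with two sub-blocks of sizes n1 and n2 (sizes and matrices are illustrative).
n1, n2 = 6, 4
n = n1 + n2
A = rng.standard_normal((n, n)) - 5 * np.eye(n)   # diagonal shift => invertible
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))

# One-sided moment matching at s0 = 0: the first Krylov vector is A^{-1} B.
x = np.linalg.solve(A, B)

# Structure preservation: split the Krylov vector per block and build a
# block-diagonal projection V = blkdiag(V1, V2), so the reduced matrices
# keep the 2x2 block partitioning of the original interconnection.
V1 = x[:n1] / np.linalg.norm(x[:n1])
V2 = x[n1:] / np.linalg.norm(x[n1:])
V = np.zeros((n, 2))
V[:n1, :1] = V1
V[n1:, 1:] = V2

Ar, Br, Cr = V.T @ A @ V, V.T @ B, C @ V   # reduced order-2 model

# Since A^{-1}B lies in range(V), the DC gain H(0) = -C A^{-1} B
# (the first moment at s0 = 0) is matched exactly by the reduced model.
H0_full = (-C @ np.linalg.solve(A, B)).item()
H0_red = (-Cr @ np.linalg.solve(Ar, Br)).item()
print(H0_full, H0_red)
```

    Splitting the basis block-wise costs some columns compared to an unstructured projection, but the reduced system inherits the interconnection topology, which is the trade-off between global error and block size that the abstract refers to.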

    Using boosting to prune bagging ensembles

    This is the author’s version of a work that was accepted for publication in Pattern Recognition Letters. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Pattern Recognition Letters 28.1 (2007): 156–165, DOI: 10.1016/j.patrec.2006.06.018

    Boosting is used to determine the order in which classifiers are aggregated in a bagging ensemble. Early stopping in the aggregation of the classifiers in the ordered bagging ensemble allows the identification of subensembles that require less memory for storage, classify faster, and can improve the generalization accuracy of the original bagging ensemble. In all the classification problems investigated, pruned ensembles with 20% of the original classifiers show statistically significant improvements over bagging. In problems where boosting is superior to bagging, these improvements are not sufficient to reach the accuracy of the corresponding boosting ensembles. However, ensemble pruning preserves the performance of bagging in noisy classification tasks, where boosting often has larger generalization errors. Therefore, pruned bagging should generally be preferred to complete bagging and, if no information about the level of noise is available, it is a robust alternative to AdaBoost.

    The authors acknowledge financial support from the Spanish Dirección General de Investigación, project TIN2004-07676-C02-02.
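    The order-then-prune scheme can be sketched with a toy numpy implementation: plain bagging of decision stumps, a greedy boosting-style ordering of the trained members, and early stopping at roughly 20% of the ensemble. The dataset, stump learner, ensemble size, and reweighting details are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: two well-separated Gaussian clouds (illustrative only).
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.hstack([-np.ones(100), np.ones(100)])

def fit_stump(X, y, w):
    """Weighted decision stump: best (feature, threshold, sign)."""
    best = None
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for s in (1, -1):
                pred = s * np.sign(X[:, f] - t)
                pred[pred == 0] = s
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, f, t, s)
    return best[1:]

def stump_predict(stump, X):
    f, t, s = stump
    pred = s * np.sign(X[:, f] - t)
    pred[pred == 0] = s
    return pred

# 1) Plain bagging: train each stump on a bootstrap sample.
n, T = len(y), 15
uniform = np.full(n, 1 / n)
ensemble = []
for _ in range(T):
    idx = rng.integers(0, n, n)
    ensemble.append(fit_stump(X[idx], y[idx], uniform))

# 2) Boosting-based ordering: greedily pick the trained member with the
#    lowest weighted error, then reweight the samples AdaBoost-style.
w = np.full(n, 1 / n)
order, remaining = [], list(range(T))
while remaining:
    errs = [w[stump_predict(ensemble[i], X) != y].sum() for i in remaining]
    j = remaining.pop(int(np.argmin(errs)))
    order.append(j)
    miss = stump_predict(ensemble[j], X) != y
    e = max(w[miss].sum(), 1e-10)
    if e < 0.5:
        w[~miss] *= e / (1 - e)   # down-weight correctly classified samples
    w /= w.sum()

# 3) Early stopping: keep only the first ~20% of the ordered ensemble.
k = max(1, T // 5)
pruned = [ensemble[i] for i in order[:k]]
votes = np.sign(sum(stump_predict(s, X) for s in pruned))
acc = (votes == y).mean()
print(f"pruned to {k}/{T} members, training accuracy = {acc:.3f}")
```

    The key point the sketch illustrates is that ordering happens after bagging has finished training, so pruning is essentially free; only the small ordered prefix needs to be stored and evaluated at prediction time.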