Block oriented model order reduction of interconnected systems
Unintended and parasitic coupling effects are becoming more relevant in currently designed, small-scale, high-frequency RFICs. Electromagnetic (EM) based procedures must be used to generate accurate models for proper verification of system behaviour. These EM methodologies may take advantage of structural sub-system organization, as well as information inherent to the IC physical layout, to improve their efficiency. Model order reduction techniques, required for fast and accurate evaluation and simulation of such models, must address and may benefit from the provided hierarchical information. System-based interconnection techniques can handle some of these situations, but suffer from drawbacks when applied to complete EM models. We present an alternative methodology, based on similar principles, that overcomes the limitations of such approaches. The procedure, based on structure-preserving model order reduction techniques, is shown to be a generalization of the interconnected-system-based framework. Further improvements that allow a trade-off between global error and block size, and thus better control of the reduction, are also presented.
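The core idea of structure-preserving reduction of interconnected blocks can be illustrated with a minimal NumPy sketch. This is not the paper's algorithm; the block sizes, coupling strengths, and the choice of per-block Krylov bases are invented for illustration. The key point is that a block-diagonal projection basis keeps each reduced block identified with one original subsystem, so the interconnection structure survives the reduction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two coupled subsystems: A has a 2x2 block structure; B drives both blocks.
n1, n2, r = 30, 30, 5
A11 = -np.eye(n1) + 0.1 * rng.standard_normal((n1, n1))
A22 = -np.eye(n2) + 0.1 * rng.standard_normal((n2, n2))
A12 = 0.05 * rng.standard_normal((n1, n2))   # coupling blocks
A21 = 0.05 * rng.standard_normal((n2, n1))
A = np.block([[A11, A12], [A21, A22]])
B = rng.standard_normal((n1 + n2, 2))

def block_basis(Aii, Bi, r):
    """Orthonormal basis for the block Krylov space [B_i, A_ii B_i, ...]."""
    K = [Bi]
    for _ in range(r):
        K.append(Aii @ K[-1])
    Q, _ = np.linalg.qr(np.hstack(K))
    return Q[:, :r]

V1 = block_basis(A11, B[:n1], r)
V2 = block_basis(A22, B[n1:], r)

# Block-diagonal projector: each reduced block maps to one original
# subsystem, so the interconnection (off-diagonal) structure is preserved.
V = np.block([[V1, np.zeros((n1, r))],
              [np.zeros((n2, r)), V2]])
Ar = V.T @ A @ V   # reduced system matrix, still 2x2 block-structured
Br = V.T @ B
```

Because `V` is block diagonal, the reduced off-diagonal blocks `V1.T @ A12 @ V2` and `V2.T @ A21 @ V1` remain separately identifiable, which is what a global (unstructured) projection would destroy.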
A novel improved model for building energy consumption prediction based on model integration
Building energy consumption prediction plays an irreplaceable role in energy planning, management, and conservation. Constantly improving the performance of prediction models is the key to ensuring the efficient operation of energy systems. Moreover, accuracy is no longer the only factor in revealing model performance; it is more important to evaluate the model from multiple perspectives, considering the characteristics of engineering applications. Based on the idea of model integration, this paper proposes a novel improved integration model (stacking model) that can be used to forecast building energy consumption. The stacking model combines the advantages of various base prediction algorithms and forms them into “meta-features” to ensure that the final model can observe datasets from different spatial and structural angles. Two cases are used to demonstrate practical engineering applications of the stacking model. A comparative analysis is performed to evaluate the prediction performance of the stacking model in contrast with existing well-known prediction models including Random Forest, Gradient Boosted Decision Tree, Extreme Gradient Boosting, Support Vector Machine, and K-Nearest Neighbor. The results indicate that the stacking method achieves better performance than the other models, regarding accuracy (improvement of 9.5%–31.6% for Case A and 16.2%–49.4% for Case B), generalization (improvement of 6.7%–29.5% for Case A and 7.1%–34.6% for Case B), and robustness (improvement of 1.5%–34.1% for Case A and 1.8%–19.3% for Case B). The proposed model enriches the diversity of algorithm libraries of empirical models.
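The stacking idea described above, where out-of-fold predictions of base learners become "meta-features" for a final combiner, can be sketched with scikit-learn. This is a generic illustration on synthetic data, not the paper's datasets or exact model configuration; the choice of base learners roughly mirrors the comparison models named in the abstract, and the Ridge meta-learner is an assumption.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import (StackingRegressor, RandomForestRegressor,
                              GradientBoostingRegressor)
from sklearn.neighbors import KNeighborsRegressor
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a building-energy dataset.
X, y = make_regression(n_samples=400, n_features=10, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Each base learner contributes out-of-fold predictions (cv=5) as
# meta-features; the Ridge meta-learner combines them.
stack = StackingRegressor(
    estimators=[
        ("rf", RandomForestRegressor(n_estimators=50, random_state=0)),
        ("gbdt", GradientBoostingRegressor(random_state=0)),
        ("knn", KNeighborsRegressor()),
    ],
    final_estimator=RidgeCV(),
    cv=5,
)
stack.fit(X_tr, y_tr)
score = stack.score(X_te, y_te)  # held-out R^2
```

Using cross-validated (out-of-fold) predictions as meta-features is what prevents the meta-learner from simply memorizing base-learner overfitting.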
Using boosting to prune bagging ensembles
This is the author’s version of a work accepted for publication in Pattern Recognition Letters; a definitive version was subsequently published in Pattern Recognition Letters 28.1 (2007): 156–165, DOI: 10.1016/j.patrec.2006.06.018.

Boosting is used to determine the order in which classifiers are aggregated in a bagging ensemble. Early stopping in the aggregation of the classifiers in the ordered bagging ensemble allows the identification of subensembles that require less memory for storage, classify faster, and can improve the generalization accuracy of the original bagging ensemble. In all the classification problems investigated, pruned ensembles with 20% of the original classifiers show statistically significant improvements over bagging. In problems where boosting is superior to bagging, these improvements are not sufficient to reach the accuracy of the corresponding boosting ensembles. However, ensemble pruning preserves the performance of bagging in noisy classification tasks, where boosting often has larger generalization errors. Therefore, pruned bagging should generally be preferred to complete bagging and, if no information about the level of noise is available, it is a robust alternative to AdaBoost.

The authors acknowledge financial support from the Spanish Dirección General de Investigación, project TIN2004-07676-C02-02.
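The boosting-based ordering described in this abstract can be sketched as follows. This is a simplified illustration, not the authors' exact procedure: a pool of bootstrap-trained trees is ordered by repeatedly selecting the member with the lowest weighted training error and reweighting AdaBoost-style, then the ordered ensemble is truncated at 20%. The dataset, tree depth, and pool size are invented for the example.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, random_state=0)

# Build a bagging pool of bootstrap-trained trees.
n_trees = 20
pool = []
for i in range(n_trees):
    idx = rng.integers(0, len(X), len(X))
    pool.append(DecisionTreeClassifier(max_depth=3, random_state=i)
                .fit(X[idx], y[idx]))

# Order the pool boosting-style: pick the tree with the lowest weighted
# error, then reweight the misclassified examples as in AdaBoost.
w = np.full(len(X), 1.0 / len(X))
remaining = list(range(n_trees))
order = []
for _ in range(n_trees):
    errs = [np.sum(w * (pool[i].predict(X) != y)) for i in remaining]
    best = remaining[int(np.argmin(errs))]
    remaining.remove(best)
    order.append(best)
    miss = pool[best].predict(X) != y
    eps = max(np.sum(w * miss), 1e-10)
    if eps < 0.5:
        w = w * np.where(miss, (1 - eps) / eps, 1.0)
        w /= w.sum()

# Early stopping: keep only the first 20% of the ordered ensemble.
sub = [pool[i] for i in order[: max(1, n_trees // 5)]]
votes = np.mean([t.predict(X) for t in sub], axis=0)
pred = (votes >= 0.5).astype(int)  # majority vote of the pruned subensemble
```

The pruned subensemble stores and evaluates only a fifth of the trees; the ordering ensures the retained trees are the complementary ones rather than an arbitrary subset.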