
    Scalable macromodelling methodology for the efficient design of microwave filters

    The complexity of designing microwave filters has increased steadily over the years. General design techniques available in the literature yield relatively good initial designs, but electromagnetic (EM) optimisation is often needed to meet the specifications. Although interesting optimisation strategies exist, they depend on computationally expensive EM simulations, which makes the optimisation process time consuming. Moreover, brute-force optimisation does not provide physical insight into the design and is only applicable to one set of specifications: if the specifications change, the design and optimisation process must be redone. The authors propose a scalable macromodel-based design approach to overcome this. Scalable macromodels can be generated in an automated way, yet so far their inclusion in the design cycle of microwave filters has not been studied. In this study, it is shown that scalable macromodels can be included in the design cycle of microwave filters and re-used in multiple design scenarios at low computational cost. Guidelines to properly generate and use scalable macromodels in a filter design context are given. The approach is illustrated on a state-of-the-art microstrip dual-band bandpass filter with closely spaced pass bands and a complex geometrical structure. The results confirm that scalable macromodels are proper design tools and a valuable alternative to a computationally expensive EM-simulator-based design flow.
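
    As a rough illustration of the scalable-macromodel idea, the sketch below interpolates frequency responses sampled at a few geometry points so that new geometries can be evaluated without rerunning the EM solver. The toy filter response, the gap-width parameter, and the simple spline interpolation are all illustrative assumptions; the paper's macromodels are rational models generated in an automated way from actual EM simulations.

```python
# Illustrative sketch: a "scalable macromodel" as interpolation of sampled
# frequency responses over a geometry parameter (hypothetical gap width).
import numpy as np
from scipy.interpolate import interp1d

freq = np.linspace(1e9, 5e9, 201)                 # frequency grid (Hz)
gap_samples = np.array([0.2e-3, 0.3e-3, 0.4e-3])  # sampled gap widths (m)

def fake_em_sim(gap):
    """Stand-in for an expensive EM simulation: |S21| of a single resonance
    whose centre frequency shifts with the gap width."""
    f0 = 3e9 * (1.0 + 50.0 * (gap - 0.3e-3))
    return 1.0 / (1.0 + ((freq - f0) / 2e8) ** 2)

# One EM simulation per geometry sample (the expensive step, done once)
responses = np.vstack([fake_em_sim(g) for g in gap_samples])

# The scalable macromodel: interpolate the responses over the geometry
macromodel = interp1d(gap_samples, responses, axis=0, kind="quadratic")

# Evaluate an unseen geometry at negligible cost, with no new EM simulation;
# the same model can be re-used across multiple design scenarios
s21_new = macromodel(0.25e-3)
print(f"peak |S21| at gap = 0.25 mm: {s21_new.max():.3f}")
```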

    Regularity scalable image coding based on wavelet singularity detection

    In this paper, we propose an adaptive algorithm for scalable wavelet image coding based on a general feature of images: regularity. In pattern recognition and computer vision, the regularity of images is estimated from the oriented wavelet coefficients and quantified by Lipschitz exponents. To estimate the Lipschitz exponents, evaluating the interscale evolution of the wavelet transform modulus sum (WTMS) over the directional cone of influence has been shown to be a better approach than tracing the wavelet transform modulus maxima (WTMM), because the irregular sampling nature of the WTMM complicates the reconstruction process. Moreover, examples exist showing that the WTMM representation cannot uniquely characterize a signal, which implies that reconstructing a signal from its WTMM may not be consistently stable; the WTMM approach also requires much more computational effort. Therefore, we use the WTMS approach to estimate the regularity of images from the separable wavelet-transformed coefficients. Since we are not concerned with localization, we allow decimation when evaluating the interscale evolution. The estimated regularity is then utilized in our proposed adaptive regularity scalable wavelet image coding algorithm. The algorithm can easily be embedded into any wavelet image coder, so it is compatible with existing scalable coding techniques, such as resolution scalable and signal-to-noise ratio (SNR) scalable coding, without changing the bitstream format, while providing more scalability levels with higher peak signal-to-noise ratios (PSNRs) at lower bit rates. The proposed algorithm outperforms other feature-based scalable wavelet coding algorithms in terms of visual perception, computational complexity and coding efficiency.
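
    A minimal sketch of the WTMS idea described above, assuming an orthonormal wavelet and the convention that coefficient magnitudes behave like 2^{j(alpha + 1/2)} near a Lipschitz-alpha singularity; the test signal, wavelet choice, and window size are illustrative, not the paper's implementation. As in the paper, decimation is allowed when evaluating the interscale evolution.

```python
# Estimate a Lipschitz exponent from the interscale evolution of wavelet
# transform modulus sums (WTMS) around a known singularity location.
import numpy as np
import pywt

# 1-D test signal with a step singularity (Lipschitz alpha = 0 at the jump)
n = 1024
x = np.zeros(n)
x[n // 2:] = 1.0

levels = 5
coeffs = pywt.wavedec(x, "db2", level=levels)
details = coeffs[1:]  # [cD_levels, ..., cD_1], coarsest band first

js, sums = [], []
for idx, d in enumerate(details):
    j = levels - idx          # dyadic level of this detail band
    k = (n // 2) >> j         # singularity index after decimation at level j
    lo, hi = max(k - 4, 0), k + 5
    sums.append(np.abs(d[lo:hi]).sum())  # WTMS over a small cone of influence
    js.append(j)

# Assumed convention: WTMS ~ 2^{j(alpha + 1/2)}, so the slope of
# log2(WTMS) versus level j estimates alpha + 1/2
slope = np.polyfit(js, np.log2(sums), 1)[0]
print(f"estimated Lipschitz exponent: {slope - 0.5:.2f}")  # ~0 for a step
```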

    Influence Maximization Meets Efficiency and Effectiveness: A Hop-Based Approach

    Influence Maximization is an extensively studied problem that aims to select a set of initial seed nodes in an Online Social Network (OSN) so as to spread influence as widely as possible. However, it remains an open challenge to design fast and accurate algorithms that find solutions in large-scale OSNs. Prior Monte-Carlo-simulation-based methods are slow and not scalable, while other heuristic algorithms offer no theoretical guarantee and have been shown to produce poor solutions in quite a few cases. In this paper, we propose hop-based algorithms that easily scale to millions of nodes and billions of edges. Unlike previous heuristics, our hop-based approaches provide certain theoretical guarantees. Experimental evaluations on real OSN datasets demonstrate the efficiency and effectiveness of our algorithms.
    Comment: Extended version of the conference paper at ASONAM 2017, 11 pages.
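
    To make the hop-based idea concrete, here is a minimal sketch that estimates influence under the independent cascade (IC) model by counting expected activations within two hops of the seeds, rather than by Monte-Carlo simulation. The toy graph, the uniform edge probability p, and the independence assumption between activation paths are all illustrative; the paper's algorithms and their guarantees are more involved than this sketch.

```python
# Two-hop influence estimate under the IC model: cheap, deterministic,
# and scalable, since it only touches the 2-hop neighbourhood of the seeds.
from collections import defaultdict

def two_hop_influence(adj, seeds, p=0.1):
    """adj: {node: [out-neighbours]}; seeds: iterable of seed nodes."""
    seeds = set(seeds)
    # Probability each 1-hop neighbour is activated by at least one seed
    hop1 = defaultdict(float)
    for s in seeds:
        for u in adj.get(s, []):
            if u not in seeds:
                hop1[u] = 1.0 - (1.0 - hop1[u]) * (1.0 - p)
    # Expected 2-hop activations, assuming independence between paths
    hop2 = defaultdict(float)
    for u, pu in hop1.items():
        for v in adj.get(u, []):
            if v not in seeds and v not in hop1:
                hop2[v] = 1.0 - (1.0 - hop2[v]) * (1.0 - pu * p)
    return len(seeds) + sum(hop1.values()) + sum(hop2.values())

adj = {0: [1, 2], 1: [3], 2: [3, 4], 3: [5], 4: [5]}
print(f"2-hop influence estimate: {two_hop_influence(adj, [0]):.3f}")
```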

    Quantifying the Influence of Component Failure Probability on Cascading Blackout Risk

    The risk of cascading blackouts depends strongly on the failure probabilities of individual components in power grids. To quantify how component failure probabilities (CFP) influence blackout risk (BR), this paper proposes a sample-induced semi-analytic approach to characterize the relationship between CFP and BR. To this end, we first give a generic component failure probability function (CoFPF) to describe CFP with varying parameters or forms. Then the exact relationship between BR and CoFPFs is built on the abstract Markov-sequence model of cascading outages. Leveraging a set of samples generated by blackout simulations, we further establish a sample-induced semi-analytic mapping between the unbiased estimation of BR and the CoFPFs. Finally, we derive an efficient algorithm that can directly calculate the unbiased estimation of BR when the CoFPFs change. Since no additional simulations are required, the algorithm is computationally scalable and efficient. Numerical experiments confirm the theory and the algorithm.
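
    The key computational idea, reusing one batch of cascade samples to re-estimate risk under a changed CoFPF without new simulations, can be sketched as likelihood-ratio reweighting, shown below. The sample format, the toy cascade simulator, and both CoFPFs are illustrative assumptions; the paper develops the exact mapping on a Markov-sequence model of cascading outages.

```python
# Reweight existing blackout samples to estimate risk under a new CoFPF.
import random

def simulate_cascade(p_old, n_components=20):
    """Toy cascade sample: each component's failure outcome plus a loss."""
    outcomes = [(c, random.random() < p_old(c)) for c in range(n_components)]
    loss = 100.0 * sum(failed for _, failed in outcomes)  # hypothetical load shed
    return outcomes, loss

def reweighted_risk(samples, p_old, p_new):
    """Unbiased risk estimate under p_new using samples drawn under p_old."""
    total = 0.0
    for outcomes, loss in samples:
        w = 1.0
        for c, failed in outcomes:
            po, pn = p_old(c), p_new(c)
            # Likelihood ratio of this outcome under the new vs. old CoFPF
            w *= (pn / po) if failed else ((1.0 - pn) / (1.0 - po))
        total += w * loss
    return total / len(samples)

p_old = lambda c: 0.05   # CoFPF used when the simulations were run
p_new = lambda c: 0.08   # changed CoFPF: no resimulation needed
samples = [simulate_cascade(p_old) for _ in range(10000)]
print(f"risk under old CoFPF: {sum(l for _, l in samples) / len(samples):.1f}")
print(f"risk under new CoFPF: {reweighted_risk(samples, p_old, p_new):.1f}")
```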

    Outward Influence and Cascade Size Estimation in Billion-scale Networks

    Estimating cascade size and nodes' influence is a fundamental task in social, technological, and biological networks, yet it is extremely challenging due to the sheer size and structural heterogeneity of these networks. We investigate a new influence measure, termed outward influence (OI), defined as the expected number of nodes that a subset of nodes S will activate, excluding the nodes in S. Thus, OI equals the influence spread of S (the de facto standard measure) minus |S|. OI is not only more informative for nodes with small influence, but also critical in designing new effective sampling and statistical estimation methods. Based on OI, we propose SIEA/SOIEA, novel methods to estimate influence spread/outward influence at scale and with rigorous theoretical guarantees. The proposed methods are built on two novel components: 1) IICP, an importance sampling method for outward influence, and 2) RSA, a robust mean estimation method that minimizes the number of samples by analyzing the variance and range of the random variables. Compared to the state of the art for influence estimation, SIEA is Ω(log^4 n) times faster in theory and up to several orders of magnitude faster in practice. For the first time, the influence of nodes in networks with billions of edges can be estimated with high accuracy within a few minutes. Our comprehensive experiments on real-world networks also give evidence against the popular practice of using a fixed number of samples, e.g. 10K or 20K, to compute the "ground truth" for influence spread.
    Comment: 16 pages, SIGMETRICS 2017.
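
    The sketch below illustrates the definition only: outward influence OI(S) is the influence spread of S minus |S|, estimated here by naive Monte-Carlo simulation of the independent cascade model on a toy graph. The paper's SIEA/SOIEA estimators (importance sampling via IICP plus RSA robust mean estimation) are far more sample-efficient than this sketch.

```python
# Naive Monte-Carlo estimate of influence spread and outward influence
# under the independent cascade (IC) model.
import random

def ic_spread(adj, seeds, p=0.1, runs=20000):
    """Average number of activated nodes over `runs` IC simulations."""
    total = 0
    for _ in range(runs):
        active, frontier = set(seeds), list(seeds)
        while frontier:
            u = frontier.pop()
            for v in adj.get(u, []):
                # Each live edge fires independently with probability p
                if v not in active and random.random() < p:
                    active.add(v)
                    frontier.append(v)
        total += len(active)
    return total / runs

adj = {0: [1, 2], 1: [2, 3], 2: [3], 3: [4]}
seeds = [0]
spread = ic_spread(adj, seeds)
# Outward influence: spread minus the seeds themselves
print(f"spread: {spread:.3f}, outward influence: {spread - len(seeds):.3f}")
```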