
    A Response-Function-Based Coordination Method for Transmission-Distribution-Coupled AC OPF

    With distributed generation highly integrated into the grid, the transmission-distribution-coupled AC OPF (TDOPF) becomes increasingly important. This paper proposes a response-function-based coordination method to solve the TDOPF. Unlike typical decomposition methods, this method employs approximate response functions of the power injections with respect to the bus voltage magnitude at the transmission-distribution (T-D) interface to capture the "reaction" of the distribution system to the transmission system's control. By using these response functions, only one or two iterations between the transmission system operator (TSO) and the distribution system operator(s) (DSO(s)) are required to attain a nearly optimal TDOPF solution. Numerical tests confirm that, relative to a typical decomposition method, the proposed method not only incurs a lower computational cost but also remains workable even when the objectives of the TSO and the DSO(s) are on distinctly different scales.
    Comment: This paper will appear at the 2018 IEEE PES Transmission and Distribution Conference and Exposition.
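    The coordination idea can be sketched numerically: the DSO fits an approximate (here linear) response function of its optimal interface injection to the interface voltage, and the TSO then optimizes against that function in a single pass. The quadratic TSO cost and the toy DSO model below are hypothetical stand-ins for full AC OPF problems, not the paper's formulation.

    ```python
    import numpy as np

    # Toy DSO: given the T-D interface voltage magnitude v, "solve" the
    # distribution OPF and return the optimal interface power injection.
    # (A hypothetical linear model standing in for a full distribution AC OPF.)
    def dso_optimal_injection(v):
        return 0.5 - 0.8 * (v - 1.0)

    # DSO side: build an approximate linear response function P(v) ~ a + b*v
    # by evaluating the distribution optimum at two voltage set-points.
    def fit_response_function(v1=0.98, v2=1.02):
        p1, p2 = dso_optimal_injection(v1), dso_optimal_injection(v2)
        b = (p2 - p1) / (v2 - v1)
        a = p1 - b * v1
        return a, b

    # TSO side: pick the interface voltage minimizing its own (hypothetical)
    # cost, seeing the distribution's "reaction" through the response function.
    def tso_opf(a, b):
        vs = np.linspace(0.95, 1.05, 201)
        cost = (a + b * vs - 0.4) ** 2 + 10 * (vs - 1.0) ** 2
        return vs[np.argmin(cost)]

    a, b = fit_response_function()
    v_star = tso_opf(a, b)                   # one TSO pass using the response function
    p_star = dso_optimal_injection(v_star)   # one DSO pass at the agreed voltage
    ```

    Because the toy DSO is exactly linear, a single exchange already agrees with the distribution optimum; in the paper's setting the response functions are approximate, hence the one or two refinement iterations.
    
    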

    Enhance Diffusion to Improve Robust Generalization

    Deep neural networks are susceptible to human-imperceptible adversarial perturbations. One of the strongest defense mechanisms is Adversarial Training (AT). In this paper, we aim to address two predominant problems in AT. First, there is still little consensus on how to set hyperparameters with a performance guarantee for AT research, and customized settings impede fair comparison between different model designs. Second, robustly trained neural networks struggle to generalize well and suffer from severe overfitting. This paper focuses on the primary AT framework, Projected Gradient Descent Adversarial Training (PGD-AT). We approximate the dynamics of PGD-AT by a continuous-time Stochastic Differential Equation (SDE) and show that the diffusion term of this SDE determines robust generalization. An immediate implication of this theoretical finding is that robust generalization is positively correlated with the ratio between the learning rate and the batch size. We further propose a novel approach, Diffusion Enhanced Adversarial Training (DEAT), which manipulates the diffusion term to improve robust generalization with virtually no extra computational burden. We theoretically show that DEAT obtains a tighter generalization bound than PGD-AT. Our extensive empirical investigation firmly attests that DEAT universally outperforms PGD-AT by a significant margin.
    Comment: Accepted at KDD 202
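    The PGD-AT inner/outer loop the abstract refers to can be sketched on a linear model: the inner loop ascends the loss in input space and projects back onto an L-infinity ball, and the outer step trains on the resulting adversarial example. This is a generic minimal sketch of PGD-AT (the model, gradients, and constants are illustrative), not the paper's DEAT method.

    ```python
    import numpy as np

    def loss_grad_x(w, x, y):
        # Gradient of the logistic loss log(1 + exp(-y * w.x)) w.r.t. the input x,
        # used to craft the attack.
        z = y * (x @ w)
        return -y * w / (1.0 + np.exp(z))

    def pgd_attack(w, x, y, eps=0.1, alpha=0.02, steps=10):
        # Projected Gradient Descent: ascend the loss in input space, then
        # project back onto the L-inf ball of radius eps around the clean input.
        x_adv = x.copy()
        for _ in range(steps):
            x_adv = x_adv + alpha * np.sign(loss_grad_x(w, x_adv, y))
            x_adv = x + np.clip(x_adv - x, -eps, eps)
        return x_adv

    def pgd_at_step(w, x, y, lr=0.1):
        # One PGD-AT update: train on the adversarial example, not the clean one.
        x_adv = pgd_attack(w, x, y)
        z = y * (x_adv @ w)
        grad_w = -y * x_adv / (1.0 + np.exp(z))
        return w - lr * grad_w

    w = np.array([1.0, -1.0])
    x = np.array([0.5, 0.5])
    y = 1.0
    x_adv = pgd_attack(w, x, y)
    w_new = pgd_at_step(w, x, y)
    ```

    In the SDE view, `lr` and the (implicit, here size-one) batch enter the diffusion term through their ratio, which is why the abstract ties robust generalization to the learning-rate-to-batch-size ratio.
    
    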

    Augmenting Knowledge Transfer across Graphs

    Given a resource-rich source graph and a resource-scarce target graph, how can we effectively transfer knowledge across graphs and ensure good generalization performance? In many high-impact domains (e.g., brain networks and molecular graphs), collecting and annotating data is prohibitively expensive and time-consuming, which makes domain adaptation an attractive option for alleviating the label scarcity issue. In light of this, state-of-the-art methods focus on deriving domain-invariant graph representations that minimize the domain discrepancy. However, it has recently been shown that a small domain discrepancy loss does not always guarantee good generalization performance, especially in the presence of disparate graph structures and label distribution shifts. In this paper, we present TRANSNET, a generic learning framework for augmenting knowledge transfer across graphs. In particular, we introduce a novel notion named trinity signal that can naturally formulate various graph signals at different granularities (e.g., node attributes, edges, and subgraphs). With that, we further propose a domain unification module together with a trinity-signal mixup scheme to jointly minimize the domain discrepancy and augment the knowledge transfer across graphs. Finally, comprehensive empirical results show that TRANSNET outperforms all existing approaches on seven benchmark datasets by a significant margin.
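    The mixup ingredient of such a scheme can be illustrated at the node-attribute level: a source-graph signal and a target-graph signal are interpolated with a Beta-distributed weight, for both inputs and labels. This is standard mixup applied across domains as a stand-in for the full trinity-signal scheme; the arrays and constants are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def signal_mixup(x_src, y_src, x_tgt, y_tgt, alpha=0.2):
        # Mix a source-graph signal with a target-graph signal (here node
        # attribute vectors and one-hot labels) using a Beta(alpha, alpha)
        # weight, as in standard mixup.
        lam = rng.beta(alpha, alpha)
        x_mix = lam * x_src + (1.0 - lam) * x_tgt
        y_mix = lam * y_src + (1.0 - lam) * y_tgt
        return x_mix, y_mix, lam

    x_s = np.array([1.0, 0.0, 2.0])   # source node attributes (hypothetical)
    x_t = np.array([0.0, 1.0, 1.0])   # target node attributes (hypothetical)
    y_s = np.array([1.0, 0.0])        # one-hot source label
    y_t = np.array([0.0, 1.0])        # one-hot target label
    x_mix, y_mix, lam = signal_mixup(x_s, y_s, x_t, y_t)
    ```

    Mixing labels alongside inputs keeps the training target consistent with the interpolated signal, which is what lets the scheme both reduce domain discrepancy and augment transfer.
    
    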