2,842 research outputs found

    Automatic Construction of Parallel Portfolios via Explicit Instance Grouping

    Simultaneously utilizing several complementary solvers is a simple yet effective strategy for solving computationally hard problems. However, manually building such solver portfolios typically requires considerable domain knowledge and plenty of human effort. As an alternative, automatic construction of parallel portfolios (ACPP) aims at automatically building effective parallel portfolios based on a given problem instance set and a given rich design space. One promising way to solve the ACPP problem is to explicitly group the instances into different subsets and promote a component solver to handle each of them. This paper investigates solving ACPP from this perspective, and especially studies how to obtain a good instance grouping. The experimental results showed that the parallel portfolios constructed by the proposed method could achieve consistently superior performance to the ones constructed by the state-of-the-art ACPP methods, and could even rival sophisticated hand-designed parallel solvers.
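    The grouping-then-assignment idea in this abstract can be sketched minimally: given per-instance runtimes for a set of candidate solvers and an instance grouping, pick the solver that performs best on each group and run the chosen solvers in parallel. The runtime matrix, solver names, and grouping below are illustrative assumptions, not data or details from the paper.

```python
def build_portfolio(runtimes, groups):
    """runtimes: {instance: {solver: seconds}}; groups: list of instance lists.
    Returns one solver per group -- together they form the parallel portfolio."""
    portfolio = []
    for group in groups:
        solvers = runtimes[group[0]].keys()
        # Promote the solver with the lowest total runtime on this group.
        best = min(solvers, key=lambda s: sum(runtimes[i][s] for i in group))
        portfolio.append(best)
    return portfolio

# Hypothetical runtimes (seconds) for two candidate solvers on four instances.
runtimes = {
    "i1": {"A": 1.0, "B": 9.0},
    "i2": {"A": 2.0, "B": 8.0},
    "i3": {"A": 9.0, "B": 1.5},
    "i4": {"A": 8.0, "B": 2.5},
}
groups = [["i1", "i2"], ["i3", "i4"]]
print(build_portfolio(runtimes, groups))  # ['A', 'B']
```

    The paper's contribution lies in how the grouping itself is obtained; this sketch only shows why a good grouping matters, since each group can then be served by the solver best suited to it.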

    Quantifying the Impact of Label Noise on Federated Learning

    Federated Learning (FL) is a distributed machine learning paradigm where clients collaboratively train a model using their local (human-generated) datasets. While existing studies focus on FL algorithm development to tackle data heterogeneity across clients, the important issue of data quality (e.g., label noise) in FL is overlooked. This paper aims to fill this gap by providing a quantitative study of the impact of label noise on FL. We derive an upper bound for the generalization error that is linear in the clients' label noise level. Then we conduct experiments on the MNIST and CIFAR-10 datasets using various FL algorithms. Our empirical results show that the global model accuracy linearly decreases as the noise level increases, which is consistent with our theoretical analysis. We further find that label noise slows down the convergence of FL training, and the global model tends to overfit when the noise level is high. Comment: Accepted by The AAAI 2023 Workshop on Representation Learning for Responsible Human-Centric A
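    A standard way to set up experiments like these is symmetric label noise: each label is replaced, with probability equal to the noise level, by a uniformly drawn different class. The snippet below is a minimal illustration of that injection step; the class count, sample size, and seed are assumptions for the example, not the paper's experimental settings.

```python
import random

def flip_labels(labels, noise_level, num_classes, rng):
    """Symmetric label noise: with probability noise_level, replace a
    label with a uniformly drawn *different* class."""
    noisy = []
    for y in labels:
        if rng.random() < noise_level:
            noisy.append(rng.choice([c for c in range(num_classes) if c != y]))
        else:
            noisy.append(y)
    return noisy

rng = random.Random(0)
clean = [i % 10 for i in range(1000)]          # 10 balanced classes
noisy = flip_labels(clean, 0.3, 10, rng)
frac_flipped = sum(a != b for a, b in zip(clean, noisy)) / len(clean)
print(round(frac_flipped, 2))                  # close to the 0.3 noise level
```

    Each client's dataset can be corrupted this way at its own noise level before training, which is what makes the observed accuracy drop directly attributable to label quality rather than data heterogeneity.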

    Adiponectin improves coronary no-reflow injury by protecting the endothelium in rats with type 2 diabetes mellitus.

    To determine the effect of adiponectin (APN) on coronary no-reflow (NR) injury in rats with Type 2 diabetes mellitus (T2DM), 80 male Sprague-Dawley rats were fed a high-sugar, high-fat diet to establish a T2DM model. Rats received vehicle or APN in the last week and then were subjected to myocardial ischemia/reperfusion (MI/R) injury. Endothelium-dependent vasorelaxation of the thoracic aorta was significantly decreased, and serum levels of endothelin-1 (ET-1), intercellular cell adhesion molecule-1 (ICAM-1) and vascular cell adhesion molecule-1 (VCAM-1) were noticeably increased, in T2DM rats compared with rats without T2DM. Serum APN was positively correlated with endothelium-dependent vasorelaxation, but negatively correlated with the serum level of ET-1. Treatment with APN improved T2DM-induced endothelium-dependent vasorelaxation, recovered cardiac function, and decreased both NR size and the levels of ET-1, ICAM-1 and VCAM-1. Hypoadiponectinemia was associated with the aggravation of coronary NR in T2DM rats. APN could alleviate coronary NR injury in T2DM rats by protecting the endothelium and improving microcirculation.