1,115 research outputs found

    Collaboration based Multi-Label Learning

    Full text link
    It is well known that exploiting label correlations is crucially important to multi-label learning. Most existing approaches take label correlations as prior knowledge, which may not correctly characterize the real relationships among labels. Moreover, label correlations are normally used to regularize the hypothesis space, while the final predictions are not explicitly correlated. In this paper, we suggest that for each individual label, the final prediction involves a collaboration between its own prediction and the predictions of the other labels. Based on this assumption, we first propose a novel method to learn the label correlations via sparse reconstruction in the label space. Then, by seamlessly integrating the learned label correlations into model training, we propose a novel multi-label learning approach that explicitly accounts for the correlated predictions of labels while simultaneously training the desired model. Extensive experimental results show that our approach outperforms state-of-the-art counterparts. Comment: Accepted by AAAI-1
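    The "sparse reconstruction in the label space" step can be illustrated with a minimal sketch: each label column of the label matrix is approximated as a sparse combination of the other label columns via an L1-penalized regression. This is only an interpretation of the abstract, not the paper's exact formulation; the `alpha` value and the toy label matrix are made up for illustration.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    def learn_label_correlations(Y, alpha=0.01):
        """Sparse reconstruction in the label space: approximate each label
        column as a sparse combination of the remaining label columns.
        Returns S, where S[i, j] is the contribution of label i to label j."""
        n, q = Y.shape
        S = np.zeros((q, q))
        for j in range(q):
            others = np.delete(np.arange(q), j)
            lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
            lasso.fit(Y[:, others], Y[:, j])
            S[others, j] = lasso.coef_
        return S  # zero diagonal: a label never reconstructs itself

    # Toy label matrix: 6 instances, 3 labels; labels 0 and 1 always co-occur.
    Y = np.array([[1, 1, 0],
                  [1, 1, 0],
                  [0, 0, 1],
                  [1, 1, 1],
                  [0, 0, 0],
                  [1, 1, 0]], dtype=float)
    S = learn_label_correlations(Y)
    ```

    On this toy data, the learned weight of label 0 toward label 1 is large because the two columns are identical, which is exactly the kind of correlation the paper then feeds into model training.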

    EnPAC: Petri Net Model Checking for Linear Temporal Logic

    Full text link
    State generation and state exploration (counterexample search) are the two cores of explicit-state Petri net model checking for linear temporal logic (LTL). Traditional state generation maintains an auxiliary structure to avoid recomputing all transitions, and frequently encodes/decodes in order to read each encoded state. We present an optimized calculation of enabled transitions, computed on demand via a dynamic fireset, which avoids such a structure. We also propose a direct read/write (DRW) operation on encoded markings, without decoding and re-encoding, to make state generation faster and reduce memory consumption. To find counterexamples more quickly in an on-the-fly framework, we add heuristic information to the Büchi automaton to guide exploration toward accepting states. These strategies can optimize existing methods for LTL model checking. We implement them in a Petri net model-checking tool called EnPAC (Enhanced Petri-net Analyser and Checker) for linear temporal logic, and evaluate it on the benchmarks of the MCC (Model Checking Contest), which shows a drastic improvement over existing methods. Comment: 11 pages, 5 figures
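    The two state-generation ideas (on-demand enabledness and direct read/write on encoded markings) can be sketched for the special case of a 1-safe net, where a marking encodes as an integer bitmask. This is a hypothetical illustration, not EnPAC's actual encoding; the transitions and place indices are invented.

    ```python
    # Sketch, assuming a 1-safe Petri net: a marking is a bitmask, and
    # place tokens are read/written by bit operations directly on the
    # encoding, never decoding the whole marking into a token list.

    def read_place(marking, place):           # direct read on the encoding
        return (marking >> place) & 1

    def write_place(marking, place, tokens):  # direct write on the encoding
        return (marking | (1 << place)) if tokens else (marking & ~(1 << place))

    def enabled(marking, transitions):
        """On-demand fireset: a transition is enabled iff every input place
        holds a token; only the bits actually needed are inspected."""
        return [t for t, (pre, post) in transitions.items()
                if all(read_place(marking, p) for p in pre)]

    def fire(marking, pre, post):
        for p in pre:
            marking = write_place(marking, p, 0)
        for p in post:
            marking = write_place(marking, p, 1)
        return marking

    # Two transitions: t1 moves a token p0 -> p1, t2 moves p1 -> p2.
    transitions = {"t1": ([0], [1]), "t2": ([1], [2])}
    m0 = 0b001                 # token on place 0 only
    m1 = fire(m0, *transitions["t1"])
    ```

    The point of the sketch is that neither `enabled` nor `fire` ever materializes a decoded marking, which is the memory and speed advantage the abstract attributes to DRW.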

    A Novel Neural Network-based Multi-objective Evolution Lower Upper Bound Estimation Method for Electricity Load Interval Forecast

    Get PDF
    Currently, an interval prediction model, lower and upper bound estimation (LUBE), which constructs prediction intervals (PIs) from the double outputs of a neural network (NN), is growing in popularity. However, existing LUBE research has two problems. One is that the applied NNs are flawed: a feedforward NN (FNN) cannot map the dynamic relationships in the data, and a recurrent NN (RNN) is computationally expensive. The other is that most LUBE models are built under a single-objective frame in which the uncertainty cannot be fully quantified. In this article, a novel wavelet NN (WNN) with direct input–output links (DLWNN) is proposed to obtain PIs in a multiobjective LUBE frame. Different from a WNN, the proposed DLWNN adds direct links from the input layer to the output layer, which makes full use of the information in the time series data. In addition, a niched differential evolution nondominated fast sort genetic algorithm (NDENSGA) is proposed to optimize the prediction model, so as to achieve a balance between estimation accuracy and the average width of the PIs. NDENSGA modifies the traditional population renewal mechanism to increase population diversity and adopts a new elite selection strategy to obtain more extensive and uniform solutions. The effectiveness of DLWNN and NDENSGA is evaluated through a series of experiments on real electricity load data sets. The results show that the proposed model outperforms others in terms of convergence and diversity of the obtained nondominated solutions.
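    The two competing objectives in a multiobjective LUBE frame are typically interval coverage and interval width. A minimal sketch of the standard metrics, PI coverage probability (PICP) and normalized average width, assuming toy load values and bounds (the data below is invented):

    ```python
    import numpy as np

    def picp(y, lower, upper):
        """Prediction interval coverage probability: the fraction of
        targets that fall inside [lower, upper]."""
        return float(np.mean((y >= lower) & (y <= upper)))

    def pinaw(lower, upper, y):
        """Average interval width, normalised by the target range, so
        narrower intervals score lower (better)."""
        return float(np.mean(upper - lower) / (y.max() - y.min()))

    # Toy electricity-load targets and one candidate model's bounds.
    y     = np.array([10.0, 12.0, 11.0, 15.0])
    lower = np.array([ 9.0, 11.5, 10.0, 13.0])
    upper = np.array([11.0, 13.0, 12.5, 14.0])
    coverage, width = picp(y, lower, upper), pinaw(lower, upper, y)
    ```

    Here the last target (15.0) falls outside its interval, so coverage is 0.75; a multiobjective optimizer such as the paper's NDENSGA searches for the Pareto front trading `coverage` against `width` rather than collapsing them into one score.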

    Large-scale Dataset Pruning with Dynamic Uncertainty

    Full text link
    The state of the art in many learning tasks, e.g., image classification, is advanced by collecting larger datasets and then training larger models on them. As a result, the increasing computational cost is becoming unaffordable. In this paper, we investigate how to prune large-scale datasets to produce an informative subset for training sophisticated deep models with a negligible performance drop. We propose a simple yet effective dataset pruning method that explores both prediction uncertainty and training dynamics. To our knowledge, this is the first work to study dataset pruning on large-scale datasets, i.e., ImageNet-1K and ImageNet-21K, and advanced models, i.e., Swin Transformer and ConvNeXt. Extensive experimental results indicate that our method outperforms the state of the art and achieves a 75% lossless compression ratio on both ImageNet-1K and ImageNet-21K. The code and pruned datasets are available at https://github.com/BAAI-DCAI/Dataset-Pruning
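    One common way to combine prediction uncertainty with training dynamics is to record, across epochs, the model's probability for each sample's true class and score samples by how much that probability fluctuates; easy, stable samples are pruned first. This is a generic sketch of that idea, not the paper's exact scoring rule, and the probabilities below are made up:

    ```python
    import numpy as np

    def dynamic_uncertainty_scores(probs_over_epochs):
        """probs_over_epochs: array (epochs, n_samples) holding the model's
        predicted probability for each sample's true class, recorded during
        training. Samples the model is consistently sure about fluctuate
        little across epochs, so their standard deviation (score) is low."""
        return probs_over_epochs.std(axis=0)

    def prune(scores, keep_fraction):
        """Keep the indices of the most uncertain fraction of the dataset."""
        k = int(len(scores) * keep_fraction)
        return np.argsort(scores)[::-1][:k]

    # 3 epochs, 4 samples: sample 0 is trivially easy, sample 3 fluctuates.
    probs = np.array([[0.99, 0.60, 0.80, 0.20],
                      [0.99, 0.70, 0.50, 0.90],
                      [0.99, 0.80, 0.65, 0.30]])
    keep = prune(dynamic_uncertainty_scores(probs), keep_fraction=0.5)
    ```

    With `keep_fraction=0.5`, the two fluctuating samples (2 and 3) survive and the always-correct sample 0 is pruned, matching the intuition that stable predictions carry little training signal.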

    Program Dependence Net and On-demand Slicing for Property Verification of Concurrent System and Software

    Full text link
    When checking concurrent software against a finite-state model, we face a formidable state-explosion problem. One solution is dependence-based program slicing, which can effectively reduce verification time and is orthogonal to other model-checking reduction techniques. However, when slicing concurrent programs for model checking, there are conversions between multiple irreplaceable models, and dependencies must be found even for variables irrelevant to the verified property, which results in redundant computation. To resolve this issue, we propose a Program Dependence Net (PDNet) based on Petri net theory: a unified model that combines a control-flow structure with dependencies, avoiding those conversions. For reduction, we present a PDNet slicing method that captures the relevant variables' dependencies when needed. PDNet and its on-demand slicing can significantly reduce the computation cost of verifying linear temporal logic. We implement a model-checking tool based on PDNet and its on-demand slicing, and validate the advantages of our proposed methods. Comment: 17 pages, 3 figures
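    The core slicing idea can be illustrated on a toy straight-line program: each statement lists the variable it defines and the variables it uses, and a backward slice keeps only the statements the criterion variable transitively depends on. PDNet itself is a Petri-net model handling concurrency and control flow; this hypothetical sketch shows only the data-dependence part, with an invented example program.

    ```python
    # A minimal backward slice over data dependencies in a single-assignment
    # straight-line program, walking statements in reverse program order.

    def backward_slice(stmts, criterion_var):
        """stmts: list of (defined_var, set_of_used_vars) in program order.
        Returns the indices of statements the criterion depends on."""
        needed = {criterion_var}
        kept = []
        for i in range(len(stmts) - 1, -1, -1):
            var, uses = stmts[i]
            if var in needed:          # this definition reaches the criterion
                kept.append(i)
                needed.discard(var)    # the definition kills the need for var
                needed |= uses         # ...and creates needs for its uses
        return sorted(kept)

    program = [("a", set()),        # 0: a = input()
               ("b", set()),        # 1: b = input()
               ("c", {"a"}),        # 2: c = a + 1
               ("d", {"b"}),        # 3: d = b * 2
               ("e", {"c"})]        # 4: e = c - 3
    slice_e = backward_slice(program, "e")
    ```

    Slicing on `e` keeps statements 0, 2, and 4 and drops the `b`/`d` chain entirely, which is exactly how slicing shrinks the state space a model checker must explore; computing such dependencies "when needed" for the property's variables is what the abstract calls on-demand slicing.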