890 research outputs found

    Fuzzy logic based intention recognition in STS processes

    This paper presents a fuzzy-logic-based classifier that recognises a human user's intention to stand up from their behaviour, in terms of the force they apply to the ground. The research reported focused on selecting meaningful input data for the classifier and on determining the fuzzy sets that best represent the intention information hidden in the force data. The classifier is a component of a robotic chair that assists users in standing up, based on the intention recognised by the classifier.
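    The core mechanism can be sketched with triangular membership functions over a normalized ground-reaction-force signal. The breakpoints, the single rule, and the `intention_degree` helper below are illustrative assumptions, not the paper's actual fuzzy sets or rule base:

```python
# Illustrative sketch of fuzzy-set intention recognition from ground force.
# Membership breakpoints and the rule are assumed, not taken from the paper.

def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def intention_degree(force):
    """Degree to which a normalized force reading suggests stand-up intention."""
    low    = tri(force, -0.2, 0.0, 0.5)   # resting load on the seat
    rising = tri(force,  0.3, 0.7, 1.0)   # weight shifting onto the feet
    high   = tri(force,  0.8, 1.2, 1.6)   # push-off
    # Assumed rule: "rising or high force implies intention", damped by "low".
    return max(rising, high) * (1.0 - 0.5 * low)

print(round(intention_degree(0.9), 3))
```

    In a real system the input would be a time series of force readings and the rule base would be tuned to the selected fuzzy sets, which is exactly the selection problem the abstract describes.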

    A Viscosity Solution Theory of Stochastic Hamilton-Jacobi-Bellman equations in the Wasserstein Space

    This paper is devoted to a viscosity solution theory of the stochastic Hamilton-Jacobi-Bellman equation in Wasserstein spaces for mean-field-type control problems that allow random coefficients and may thus be non-Markovian. The value function of the control problem is proven to be the unique viscosity solution. The major challenge lies in the combination of the lack of local compactness of Wasserstein spaces with the non-Markovian setting of random coefficients; various techniques are used, including Itô processes parameterized by random measures, the conditional law invariance of the value function, a novel tailor-made compact subset of measure-valued processes, and finite-dimensional approximations via stochastic n-player differential games with common noise.
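    For orientation, a Markovian, deterministic-coefficient mean-field HJB equation of this general type can be written with the Lions derivative $\partial_\mu$. The display below is a generic sketch, not the paper's exact statement; the paper's stochastic version with random coefficients replaces this PDE by a backward stochastic counterpart:

```latex
% Generic sketch of a mean-field HJB equation on the Wasserstein space
% (deterministic coefficients; not the paper's exact formulation).
\begin{equation*}
  \partial_t v(t,\mu)
  + \inf_{\alpha}\int_{\mathbb{R}^d}\Big[
      b\big(x,\mu,\alpha(x)\big)\cdot\partial_\mu v(t,\mu)(x)
      + \tfrac{1}{2}\,\mathrm{tr}\Big(\sigma\sigma^{\top}\big(x,\mu,\alpha(x)\big)\,
        \partial_x\partial_\mu v(t,\mu)(x)\Big)
      + f\big(x,\mu,\alpha(x)\big)
    \Big]\,\mu(\mathrm{d}x) = 0,
  \qquad v(T,\mu) = g(\mu),
\end{equation*}
```

    where the infimum runs over feedback controls $\alpha$ and $\partial_\mu v(t,\mu)(\cdot)$ denotes the Lions derivative of $v$ with respect to the measure argument.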

    Minimum Cost Active Labeling

    Labeling a data set completely is important for ground-truth generation. In this paper, we consider the problem of minimum-cost labeling: classifying all images in a large data set within a target accuracy bound at minimum dollar cost. Human labeling can be prohibitively expensive, so we train a classifier to accurately label part of the data set. However, training the classifier can be expensive too, particularly with active learning. Our min-cost labeling uses a variant of active learning to learn a model that predicts the training-set size minimizing overall cost, then uses active learning to train the classifier so that it correctly labels as many samples as possible. We validate our approach on well-known public data sets such as Fashion, CIFAR-10, and CIFAR-100. In some cases, our approach has 6x lower overall cost than human labeling, and it is always cheaper than the cheapest active learning strategy.
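    The cost trade-off the abstract describes can be sketched as a simple model: hand-label n training samples, let the trained classifier label the rest, and pay to fix its errors. The prices, the learning curve `acc`, and the candidate sizes below are assumptions for illustration, not the paper's measurements:

```python
# Illustrative cost model for minimum-cost labeling. All numbers (prices,
# learning curve, candidate training-set sizes) are assumptions.

def total_cost(n, dataset_size, price_per_label, accuracy_fn, price_per_fix):
    """Hand-label n samples, auto-label the rest, pay to fix auto-label errors."""
    auto = dataset_size - n
    acc = accuracy_fn(n)                 # classifier accuracy after n labels
    errors = auto * (1.0 - acc)
    return n * price_per_label + errors * price_per_fix

# Assumed learning curve: accuracy saturates toward 0.98 as n grows.
acc = lambda n: 0.98 * (1.0 - 1.0 / (1.0 + n / 500.0))

costs = {n: total_cost(n, 100_000, 0.05, acc, 0.05) for n in (500, 2_000, 10_000)}
best = min(costs, key=costs.get)
print(best, round(costs[best], 2))
```

    With these assumed numbers, the best of the three candidate sizes brings the total to $800, versus $5,000 for hand-labeling all 100,000 images; predicting the cost-minimizing training-set size up front is what the paper's active-learning variant aims to do.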

    SDCL: Self-Distillation Contrastive Learning for Chinese Spell Checking

    Due to the ambiguity introduced by homophones, Chinese Spell Checking (CSC) has widespread applications. Existing systems typically use BERT for text encoding; however, CSC requires the model to account for both phonetic and graphemic information. To adapt BERT to the CSC task, we propose a token-level self-distillation contrastive learning method. We employ BERT to encode both the corrupted sentence and its correct counterpart, then use a contrastive learning loss to regularize the hidden states of corrupted tokens toward their counterparts in the correct sentence. Experiments on three CSC datasets confirm that our method yields significant improvements over baselines.
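    The token-level contrastive objective can be sketched as an InfoNCE-style loss: for each corrupted token, the positive is the hidden state at the same position in the correct sentence's encoding, and the other positions act as negatives. This NumPy sketch is an assumed formulation; the temperature, dimensions, and loss details are not taken from the paper:

```python
import numpy as np

# Sketch of a token-level contrastive loss (InfoNCE-style). For each corrupted
# token, the positive is the same position in the correct sentence's encoding;
# other positions serve as in-batch negatives. Settings are assumptions.

def token_contrastive_loss(h_corrupt, h_correct, tau=0.1):
    """h_corrupt, h_correct: (seq_len, dim) hidden states from the encoder."""
    # Cosine similarity between every corrupted and every correct token state.
    a = h_corrupt / np.linalg.norm(h_corrupt, axis=1, keepdims=True)
    b = h_correct / np.linalg.norm(h_correct, axis=1, keepdims=True)
    sim = a @ b.T / tau                                  # (seq_len, seq_len)
    # Cross-entropy with the diagonal (same position) as the target class.
    logits = sim - sim.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
h_correct = rng.normal(size=(8, 16))
h_corrupt = h_correct + 0.1 * rng.normal(size=(8, 16))  # slightly perturbed
print(round(float(token_contrastive_loss(h_corrupt, h_correct)), 4))
```

    Minimizing this loss pulls each corrupted token's representation toward its counterpart in the correct sentence while pushing it away from the other tokens, which matches the regularization the abstract describes; in the actual system both encodings would come from BERT rather than random vectors.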