    A local Gaussian filter and adaptive morphology as tools for completing partially discontinuous curves

    This paper presents a method for the extraction and analysis of curve-type structures that consist of disconnected components. Such structures are found in electron-microscopy (EM) images of metal nanograins, which are widely used in nanosensor technology. The topography of metal nanograins in compound nanomaterials is crucial to nanosensor characteristics. The proposed method for completing such partially discontinuous curves consists of three steps. In the first step, a local Gaussian filter is applied with different weights for each neighborhood. In the second step, an adaptive morphology operation is used to detect the endpoints of curve segments and connect them. In the last step, pruning is employed to extract the curve that optimally fits the template.
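
    A minimal Python sketch of the three-step pipeline described above is given below: locally weighted Gaussian smoothing, endpoint detection on a morphological skeleton, and a toy image on which the detected endpoints would then be linked and pruned. This is not the authors' implementation; the variance-based weighting, the 5x5 neighborhood, and the endpoint rule are illustrative assumptions.

        # Illustrative sketch only: the weighting scheme and thresholds are placeholder choices.
        import numpy as np
        from scipy import ndimage as ndi
        from skimage.morphology import skeletonize

        def locally_weighted_gaussian(img, sigma=2.0):
            """Step 1: Gaussian smoothing whose contribution varies per pixel.
            The weight is the normalized local variance (an assumption)."""
            smooth = ndi.gaussian_filter(img.astype(float), sigma)
            local_var = ndi.generic_filter(img.astype(float), np.var, size=5)
            w = local_var / (local_var.max() + 1e-12)      # per-neighborhood weight
            return w * smooth + (1.0 - w) * img            # adaptive blend

        def curve_endpoints(binary):
            """Step 2 (detection part): endpoints of curve segments, i.e. skeleton
            pixels with exactly one 8-connected skeleton neighbor."""
            skel = skeletonize(binary)
            neighbors = ndi.convolve(skel.astype(int), np.ones((3, 3), int),
                                     mode="constant") - skel
            ys, xs = np.nonzero(skel & (neighbors == 1))
            return skel, list(zip(ys, xs))

        # Toy example: two collinear dashes whose gap should be bridged.
        img = np.zeros((20, 40))
        img[10, 5:15] = 1.0
        img[10, 20:35] = 1.0
        blended = locally_weighted_gaussian(img)
        skel, ends = curve_endpoints(img > 0.5)
        print("endpoints found:", ends)   # the dash tips; nearest pairs would be
                                          # linked and short spurs pruned (step 3)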

    HyperMAML: Few-Shot Adaptation of Deep Models with Hypernetworks

    The aim of Few-Shot learning methods is to train models which can easily adapt to previously unseen tasks based on small amounts of data. One of the most popular and elegant Few-Shot learning approaches is Model-Agnostic Meta-Learning (MAML). The main idea behind this method is to learn the general weights of the meta-model, which are then adapted to specific problems in a small number of gradient steps. However, the method's main limitation lies in the fact that the update procedure is realized by gradient-based optimization. In consequence, MAML cannot always modify the weights to the required extent in one or even a few gradient iterations. On the other hand, using many gradient steps results in a complex and time-consuming optimization procedure which is hard to train in practice and may lead to overfitting. In this paper, we propose HyperMAML, a novel generalization of MAML in which the training of the update procedure is also part of the model. Namely, in HyperMAML, instead of updating the weights with gradient descent, we use a trainable Hypernetwork for this purpose. Consequently, in this framework, the model can generate significant updates whose range is not limited to a fixed number of gradient steps. Experiments show that HyperMAML consistently outperforms MAML and performs comparably to other state-of-the-art techniques on a number of standard Few-Shot learning benchmarks.
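
    The central idea, replacing MAML's inner-loop gradient update with an update predicted by a trainable Hypernetwork, can be outlined in a short PyTorch sketch. This is illustrative only, not the architecture from the paper; the backbone, layer sizes, and the pooling of the support set into a task summary are assumptions.

        # Minimal sketch of the HyperMAML idea: the inner-loop gradient step is
        # replaced by an update predicted in one shot by a hypernetwork.
        import torch
        import torch.nn as nn

        feat_dim, n_way = 64, 5

        backbone = nn.Sequential(nn.Linear(784, feat_dim), nn.ReLU())   # shared meta-model
        head_weight = nn.Parameter(torch.zeros(n_way, feat_dim))        # general (meta) classifier weights

        # Hypernetwork: looks at pooled support embeddings plus labels and
        # emits a task-specific update for the classifier weights.
        hypernet = nn.Sequential(
            nn.Linear(feat_dim + n_way, 128), nn.ReLU(),
            nn.Linear(128, n_way * feat_dim),
        )

        def adapt(support_x, support_y):
            """Return task-adapted classifier weights without any gradient steps."""
            z = backbone(support_x)                                   # (shots * n_way, feat_dim)
            y = torch.nn.functional.one_hot(support_y, n_way).float()
            pooled = torch.cat([z, y], dim=1).mean(dim=0)             # task summary (an assumption)
            delta = hypernet(pooled).view(n_way, feat_dim)            # predicted update
            return head_weight + delta                                # adapted weights

        # Query-set predictions with the adapted head; the whole pipeline is
        # differentiable, so backbone, head and hypernetwork train jointly.
        support_x, support_y = torch.randn(5, 784), torch.arange(5)
        query_x = torch.randn(15, 784)
        logits = backbone(query_x) @ adapt(support_x, support_y).t()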

    Subspaces clustering approach to lossy image compression

    In this contribution, lossy image compression based on subspaces clustering is considered. Given a PCA factorization of each cluster into subspaces and a maximal compression error, we show that selecting the subspaces that provide the optimal lossy image compression is equivalent to the 0-1 Knapsack Problem. We present a theoretical and an experimental comparison between exact and approximate algorithms for solving the 0-1 Knapsack Problem in the case of lossy image compression.
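
    To make the Knapsack reduction concrete, the sketch below treats each cluster's candidate subspace truncation as an item whose weight is the error it introduces and whose value is the storage it saves, and compares the standard exact dynamic program with a greedy density heuristic. The per-cluster numbers are made up, and these are the textbook algorithms rather than the specific solvers evaluated in the paper.

        def knapsack_exact(errors, savings, max_error):
            """Exact DP over an integer error budget: maximize saved storage
            subject to total introduced error <= max_error."""
            best = [0] * (max_error + 1)
            for err, save in zip(errors, savings):
                for budget in range(max_error, err - 1, -1):   # reverse: each item used once
                    best[budget] = max(best[budget], best[budget - err] + save)
            return best[max_error]

        def knapsack_greedy(errors, savings, max_error):
            """Approximate solution: take items in order of saving/error density."""
            order = sorted(range(len(errors)), key=lambda i: savings[i] / errors[i], reverse=True)
            total_err, total_save = 0, 0
            for i in order:
                if total_err + errors[i] <= max_error:
                    total_err += errors[i]
                    total_save += savings[i]
            return total_save

        # Toy comparison on hypothetical per-cluster numbers (error units, bytes saved).
        errors, savings = [3, 4, 5, 8], [40, 50, 62, 95]
        print(knapsack_exact(errors, savings, 10))    # 112: optimal saving within the error budget
        print(knapsack_greedy(errors, savings, 10))   # 90: the heuristic can be suboptimal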