Strict Intuitionistic Fuzzy Distance/Similarity Measures Based on Jensen-Shannon Divergence
As a pair of dual concepts, normalized distance and similarity
measures are important tools for decision-making and pattern recognition
in the intuitionistic fuzzy set framework. To be effective in
decision-making and pattern recognition applications, a good normalized
distance measure should ensure that its dual similarity measure satisfies the
axiomatic definition. In this paper, we first construct some examples to
illustrate that the dual similarity measures of two nonlinear distance measures
introduced in [A distance measure for intuitionistic fuzzy sets and its
application to pattern classification problems, \emph{IEEE Trans. Syst., Man,
Cybern., Syst.}, vol.~51, no.~6, pp. 3980--3992, 2021] and [Intuitionistic
fuzzy sets: spherical representation and distances, \emph{Int. J. Intell.
Syst.}, vol.~24, no.~4, pp. 399--420, 2009] do not meet the axiomatic
definition of an intuitionistic fuzzy similarity measure. We show that (1) they
cannot effectively distinguish some intuitionistic fuzzy values (IFVs) with an
obvious order relationship; and (2) beyond the endpoints, there exist infinitely
many pairs of IFVs at which the maximum distance 1 is attained under these
two distances, leading to counter-intuitive results. To overcome these
drawbacks, we introduce the concepts of strict intuitionistic fuzzy distance
measure (SIFDisM) and strict intuitionistic fuzzy similarity measure (SIFSimM),
and propose an improved intuitionistic fuzzy distance measure based on
Jensen-Shannon divergence. We prove that (1) it is a SIFDisM; (2) its dual
similarity measure is a SIFSimM; (3) its induced entropy is an intuitionistic
fuzzy entropy. Comparative analysis and numerical examples demonstrate that our
proposed distance measure consistently outperforms the existing ones.
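To make the construction concrete, the sketch below shows one plausible way to build a Jensen-Shannon-based distance between IFVs: an IFV $(\mu, \nu)$ is read as the probability triple $(\mu, \nu, \pi)$ with hesitancy $\pi = 1 - \mu - \nu$, and the distance is the square root of the JS divergence (a metric, bounded by 1 with base-2 logarithms). This is an illustrative reconstruction, not necessarily the paper's exact measure; the function names are mine.

```python
import math

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions.
    Base-2 logs, so the value lies in [0, 1]."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    def kl(a, b):
        # Kullback-Leibler divergence; eps guards against log(0)
        return sum(ai * math.log2((ai + eps) / (bi + eps)) for ai, bi in zip(a, b) if ai > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def ifv_distance(a, b):
    """Distance between IFVs a = (mu, nu) and b, via their probability triples."""
    pa = (a[0], a[1], 1 - a[0] - a[1])
    pb = (b[0], b[1], 1 - b[0] - b[1])
    return math.sqrt(js_divergence(pa, pb))  # sqrt(JS) is a metric
```

Under this construction the maximum distance 1 is attained only when the two triples have disjoint support, e.g. at the endpoint pair $(1,0)$ and $(0,1)$, which is exactly the behavior the strictness axioms are meant to enforce.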
Convex Global 3D Registration with Lagrangian Duality
The registration of 3D models by a Euclidean transformation is a fundamental task at the core of many applications in computer vision. This problem is non-convex due to the presence of rotational constraints, making traditional local optimization methods prone to getting stuck in local minima. This paper addresses finding the globally optimal transformation in various 3D registration problems by a unified formulation that integrates common geometric registration modalities (namely point-to-point, point-to-line and point-to-plane). This formulation renders the optimization problem independent of both the number and nature of the correspondences.
The main novelty of our proposal is the introduction of a strengthened Lagrangian dual relaxation for this problem, which surpasses previous similar approaches [32] in effectiveness.
In fact, although we have no theoretical guarantees, exhaustive empirical evaluation in both synthetic and real experiments always resulted in a tight relaxation, which allowed us to recover a certified globally optimal solution by exploiting duality theory.
Thus, our approach allows for effectively solving 3D registration with global optimality guarantees while running in a fraction of the time of the state-of-the-art alternative [34], which is based on a more computationally intensive branch-and-bound method.
Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
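As background for the point-to-point modality: when correspondences are known, that special case already admits a globally optimal closed-form solution via the SVD-based Kabsch/Umeyama construction; the paper's relaxation matters when mixing modalities. A minimal sketch (my own illustrative implementation, not the paper's method):

```python
import numpy as np

def register_point_to_point(P, Q):
    """Globally optimal rigid transform (R, t) minimizing sum ||R @ p_i + t - q_i||^2.
    P, Q: (N, 3) arrays of corresponding points (Kabsch/Umeyama)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                # 3x3 cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # reflection guard: keep det(R) = +1
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

The rotational constraint that makes the general problem non-convex is handled here by the SVD projection onto SO(3); the difficulty the paper addresses is that no such closed form exists once point-to-line and point-to-plane terms are mixed in.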
An Efficient Dual Approach to Distance Metric Learning
Distance metric learning is of fundamental interest in machine learning
because the distance metric employed can significantly affect the performance
of many learning methods. Quadratic Mahalanobis metric learning is a popular
approach to the problem, but typically requires solving a semidefinite
programming (SDP) problem, which is computationally expensive. Standard
interior-point SDP solvers typically have a complexity of $O(D^{6.5})$ (with $D$
the dimension of input data), and can thus only practically solve problems
exhibiting less than a few thousand variables. Since the number of variables is
$D(D+1)/2$, this implies a limit upon the size of problem that can
practically be solved of around a few hundred dimensions. The complexity of the
popular quadratic Mahalanobis metric learning approach thus limits the size of
problem to which metric learning can be applied. Here we propose a
significantly more efficient approach to the metric learning problem based on
the Lagrange dual formulation of the problem. The proposed formulation is much
simpler to implement, and therefore allows much larger Mahalanobis metric
learning problems to be solved. The time complexity of the proposed method is
$O(D^3)$, which is significantly lower than that of the SDP approach.
Experiments on a variety of datasets demonstrate that the proposed method
achieves an accuracy comparable to the state-of-the-art, but is applicable to
significantly larger problems. We also show that the proposed method can be
applied to solve more general Frobenius-norm regularized SDP problems
approximately.
Constrained Deep Networks: Lagrangian Optimization via Log-Barrier Extensions
This study investigates the optimization aspects of imposing hard inequality
constraints on the outputs of CNNs. In the context of deep networks,
constraints are commonly handled with penalties, owing to their simplicity and
despite their well-known limitations. Lagrangian-dual optimization has been
largely avoided, except for a few recent works, mainly due to the computational
complexity and stability/convergence issues caused by alternating explicit dual
updates/projections and stochastic optimization. Several studies showed that,
surprisingly for deep CNNs, the theoretical and practical advantages of
Lagrangian optimization over penalties do not materialize in practice. We
propose log-barrier extensions, which approximate Lagrangian optimization of
constrained-CNN problems with a sequence of unconstrained losses. Unlike
standard interior-point and log-barrier methods, our formulation does not need
an initial feasible solution. Furthermore, we provide a new technical result,
which shows that the proposed extensions yield an upper bound on the duality
gap. This generalizes the duality-gap result of standard log-barriers, yielding
sub-optimality certificates for feasible solutions. While sub-optimality is not
guaranteed for non-convex problems, our result shows that log-barrier
extensions are a principled way to approximate Lagrangian optimization for
constrained CNNs via implicit dual variables. We report comprehensive weakly
supervised segmentation experiments, with various constraints, showing that our
formulation substantially outperforms existing constrained-CNN methods in
terms of accuracy, constraint satisfaction, and training stability, all the more
so when dealing with a large number of constraints.
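The log-barrier extension described above can be sketched as follows, assuming the common $C^1$ construction: the standard barrier $-\frac{1}{t}\log(-z)$ is used while $z \le -1/t^2$, and is continued linearly (matching value and slope at the switch point) so that infeasible outputs $z > 0$ still receive a finite, differentiable penalty. The exact switch point and constants here are my derivation, not necessarily the paper's.

```python
import math

def log_barrier_extension(z, t):
    """Penalty enforcing the constraint z <= 0 with barrier parameter t.
    Standard log-barrier on the feasible side, linear extension beyond
    z0 = -1/t**2 so no initial feasible solution is required."""
    z0 = -1.0 / t**2
    if z <= z0:
        return -math.log(-z) / t
    # Linear branch: slope t, chosen so value and derivative match at z0.
    return t * z - math.log(1.0 / t**2) / t + 1.0 / t
```

As $t$ grows, the feasible branch flattens toward zero while the infeasible branch steepens, so the sequence of unconstrained losses approaches the hard-constrained problem, with the implicit dual variable read off from the barrier's gradient.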
Playing with Duality: An Overview of Recent Primal-Dual Approaches for Solving Large-Scale Optimization Problems
Optimization methods are at the core of many problems in signal/image
processing, computer vision, and machine learning. For a long time, it has been
recognized that looking at the dual of an optimization problem may drastically
simplify its solution. Deriving efficient strategies that jointly bring into
play the primal and the dual problems is, however, a more recent idea that has
generated many important new contributions in recent years. These novel
developments are grounded on recent advances in convex analysis, discrete
optimization, parallel processing, and non-smooth optimization with emphasis on
sparsity issues. In this paper, we aim to present the principles of
primal-dual approaches, while giving an overview of numerical methods which
have been proposed in different contexts. We show the benefits that can be
drawn from primal-dual algorithms for solving both large-scale convex
optimization problems and discrete ones, and we provide various application
examples to illustrate their usefulness.
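A canonical example of the primal-dual algorithms surveyed is the Chambolle-Pock iteration; the sketch below applies it to 1D total-variation denoising, $\min_x \tfrac{1}{2}\|x-b\|^2 + \lambda\|Dx\|_1$ with $D$ the forward-difference operator. Step sizes and function names are my illustrative choices.

```python
import numpy as np

def tv_denoise_1d(b, lam, n_iter=500):
    """Chambolle-Pock primal-dual iteration for
    min_x 0.5*||x - b||^2 + lam*||Dx||_1  (1D total-variation denoising)."""
    n = len(b)
    D = lambda x: x[1:] - x[:-1]                                       # K = D
    Dt = lambda y: np.concatenate(([-y[0]], y[:-1] - y[1:], [y[-1]]))  # K^T
    tau = sigma = 0.25                       # tau*sigma*||D||^2 < 1 since ||D||^2 <= 4
    x, x_bar, y = b.copy(), b.copy(), np.zeros(n - 1)
    for _ in range(n_iter):
        # Dual ascent: prox of (lam*||.||_1)^* is projection onto [-lam, lam]
        y = np.clip(y + sigma * D(x_bar), -lam, lam)
        # Primal descent: prox of 0.5*||. - b||^2
        x_new = (x - tau * Dt(y) + tau * b) / (1 + tau)
        x_bar = 2 * x_new - x                # extrapolation step
        x = x_new
    return x
```

The algorithm touches the nonsmooth term only through a cheap projection on the dual variable, which is exactly the benefit of jointly playing the primal and dual problems against each other.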
Selection pressure and organizational cognition: implications for the social determinants of health
We model the effects of Schumpeterian 'selection pressures' -- in particular Apartheid and the neoliberal 'market economy' -- on organizational cognition in minority communities, given the special role of culture in human biology. Our focus is on the dual-function social networks by which culture is imposed on and maintained in individuals, and by which immediate patterns of opportunity and threat are recognized and responded to. A mathematical model based on recent advances in complexity theory displays a joint cross-scale linkage of social, individual central nervous system, and immune cognition with external selection pressure through mixed and synergistic punctuated 'learning plateaus.' This provides a natural mechanism for addressing the social determinants of health at the individual level. The implications of the model, particularly the predictions of synergistic punctuation, appear to be empirically testable.