Cluster Weighted Modeling as a Basis for Fuzzy Modeling
Cluster-Weighted Modeling (CWM) is emerging as a versatile tool for modeling dynamical systems. It is a mixture density estimator built around local models: both the input and output regions are modeled as Gaussians that serve as local models, and these local models are linked by a linear or non-linear function through the mixture of their densities. The present work establishes a connection between CWM and the Generalized Fuzzy Model (GFM), thus paving the way for applying concepts of probability theory in the fuzzy domain, which has already emerged as a versatile tool for solving problems in uncertain dynamic systems.
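Although the abstract stays at the conceptual level, the CWM prediction step it describes has a standard form: the output is a blend of local models weighted by each cluster's Gaussian density at the input. Below is a minimal Python sketch of that step, assuming the parameters (cluster weights, input-domain Gaussians, and local linear models) have already been fitted by EM; all names and the toy values are illustrative, not taken from the paper.

```python
# A minimal sketch of cluster-weighted prediction. Assumes fitted
# parameters; the local models here are linear, f_k(x) = a_k.x + b_k.
import numpy as np
from scipy.stats import multivariate_normal

def cwm_predict(x, weights, means, covs, coefs, intercepts):
    """E[y|x] = sum_k f_k(x) p(x|k) p(k) / sum_k p(x|k) p(k)."""
    # Responsibility of each cluster for the input point x.
    dens = np.array([w * multivariate_normal.pdf(x, m, c)
                     for w, m, c in zip(weights, means, covs)])
    resp = dens / dens.sum()
    # Local linear predictions, blended by responsibility.
    local = np.array([a @ x + b for a, b in zip(coefs, intercepts)])
    return resp @ local

# Toy usage: two clusters on a 2-D input space.
weights = [0.5, 0.5]
means = [np.zeros(2), np.ones(2) * 3]
covs = [np.eye(2), np.eye(2)]
coefs = [np.array([1.0, 0.0]), np.array([0.0, -1.0])]
intercepts = [0.0, 2.0]
print(cwm_predict(np.array([0.1, 0.2]), weights, means, covs, coefs, intercepts))
```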
Adaptive fuzzy Gaussian mixture models for shape approximation in Robot Grasping
Robotic grasping has always been a challenging task for both service and industrial robots. The ability to plan grasps for novel objects is necessary for a robot to perform grasps autonomously in unknown environments. In this work, we consider the task of grasp planning for a parallel gripper to grasp a novel object, given an RGB image and its corresponding depth image taken from a single view. We show that this problem can be simplified by modeling a novel object as a set of simple shape primitives, such as ellipses. We adopt fuzzy Gaussian mixture models (GMMs) for shape approximation of novel objects. With the obtained GMM, we decompose the object into several ellipses, where each ellipse corresponds to a grasping rectangle. After comparing the grasp quality among these rectangles, we obtain the most suitable part for the gripper to grasp. Extensive experiments on a real robotic platform demonstrate that our algorithm enables the robot to grasp a variety of novel objects with good grasp quality and computational efficiency.
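To make the shape-approximation step concrete, here is a minimal sketch of decomposing a binary object mask into ellipses with a Gaussian mixture. The paper uses fuzzy GMMs; scikit-learn's standard GaussianMixture is substituted here as an assumption, and the ellipse-to-grasp-rectangle mapping is only indicated by the returned (center, axes, angle) parameters.

```python
# A minimal sketch: fit a GMM to a mask's pixel coordinates and read one
# ellipse per component from the component mean and covariance.
import numpy as np
from sklearn.mixture import GaussianMixture

def object_to_ellipses(mask, n_components=3):
    """Return one ellipse (center, axis lengths, orientation in radians)
    per mixture component fitted to the (row, col) pixels of the mask."""
    pts = np.column_stack(np.nonzero(mask)).astype(float)
    gmm = GaussianMixture(n_components=n_components, covariance_type="full")
    gmm.fit(pts)
    ellipses = []
    for mean, cov in zip(gmm.means_, gmm.covariances_):
        eigvals, eigvecs = np.linalg.eigh(cov)
        angle = np.arctan2(eigvecs[1, -1], eigvecs[0, -1])  # major-axis direction
        axes = 2.0 * np.sqrt(eigvals)  # ~2-sigma extent along each axis
        ellipses.append((mean, axes, angle))
    return ellipses

# Toy usage: a synthetic L-shaped mask decomposed into two ellipses.
mask = np.zeros((60, 60), dtype=bool)
mask[10:50, 10:20] = True
mask[40:50, 10:50] = True
for center, axes, angle in object_to_ellipses(mask, n_components=2):
    print(center, axes, angle)
```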
Semantic Information G Theory and Logical Bayesian Inference for Machine Learning
An important problem in machine learning is that when the label number n > 2, it is very difficult to construct and optimize a group of learning functions, and we wish optimized learning functions to remain useful when the prior distribution P(x) (where x is an instance) changes. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists of a group of truth functions or membership functions. In comparison with the likelihood functions, Bayesian posteriors, and logistic functions used by popular methods, membership functions can be more conveniently used as learning functions without the above problem. In LBI, every label's learning is independent. For multilabel learning, we can directly obtain a group of optimized membership functions from a sufficiently large labeled sample, without preparing different samples for different labels. A group of CM algorithms are developed for machine learning. For the Maximum Mutual Information (MMI) classification of three classes with Gaussian distributions on a two-dimensional feature space, 2-3 iterations can make the mutual information between the three classes and three labels surpass 99% of the MMI for most initial partitions. For mixture models, the Expectation-Maximization (EM) algorithm is improved into the CM-EM algorithm, which can outperform the EM algorithm when mixture ratios are imbalanced or local convergence occurs. The CM iteration algorithm needs to be combined with neural networks for MMI classification on high-dimensional feature spaces. LBI needs further study for the unification of statistics and logic.
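For context on the mixture-model result, the sketch below implements the baseline EM algorithm for a one-dimensional Gaussian mixture, i.e., the starting point that the paper's CM-EM modifies; the channel-matching update itself is not reproduced here. The imbalanced toy mixture mirrors the regime in which the abstract reports CM-EM outperforming plain EM.

```python
# A minimal sketch of baseline EM for a 1-D Gaussian mixture.
import numpy as np
from scipy.stats import norm

def em_gmm_1d(x, k=2, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    w = np.full(k, 1.0 / k)                    # mixture ratios
    mu = rng.choice(x, size=k, replace=False)  # component means
    sigma = np.full(k, x.std())                # component std devs
    for _ in range(iters):
        # E-step: responsibilities p(k | x_i).
        dens = w * norm.pdf(x[:, None], mu, sigma)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from responsibility-weighted data.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sigma

# Toy usage: an imbalanced two-component mixture (ratios 0.95 / 0.05).
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 950), rng.normal(5, 1, 50)])
print(em_gmm_1d(x, k=2))
```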
An empirical learning-based validation procedure for simulation workflow
A simulation workflow is a top-level model for the design and control of a simulation process. It connects multiple simulation components with timing and interaction restrictions to form a complete simulation system. Before the construction and evaluation of the component models, the validation of the upper-layer simulation workflow is of the utmost importance in a simulation system. However, methods specifically for validating simulation workflows are very limited, and many of the existing validation techniques are domain-dependent, with cumbersome questionnaire design and expert scoring. Therefore, this paper presents an empirical learning-based validation procedure that implements a semi-automated evaluation of simulation workflows. First, representative features of general simulation workflows and their relations with validation indices are proposed. The calculation of workflow credibility based on the Analytic Hierarchy Process (AHP) is then introduced. To make full use of historical data and implement more efficient validation, four learning algorithms, namely back-propagation neural network (BPNN), extreme learning machine (ELM), evolving neo-fuzzy neuron (eNFN), and fast incremental Gaussian mixture model (FIGMN), are introduced for constructing the empirical relation between workflow credibility and its features. A case study on a landing-process simulation workflow is established to test the feasibility of the proposed procedure. The experimental results also provide a useful overview of state-of-the-art learning algorithms on the credibility evaluation of simulation models.
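The AHP step named in the abstract reduces to a standard computation: derive index weights from a pairwise comparison matrix via its principal eigenvector and check consistency. A minimal sketch follows; the 3x3 comparison matrix is an illustrative placeholder, not the paper's data.

```python
# A minimal sketch of AHP weight derivation with a consistency check.
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # random consistency indices

def ahp_weights(A):
    eigvals, eigvecs = np.linalg.eig(A)
    i = np.argmax(eigvals.real)                 # principal eigenvalue
    w = np.abs(eigvecs[:, i].real)
    w /= w.sum()                                # normalized priority vector
    n = A.shape[0]
    ci = (eigvals[i].real - n) / (n - 1)        # consistency index
    cr = ci / RI[n] if RI[n] else 0.0           # consistency ratio (< 0.1 is acceptable)
    return w, cr

# Toy usage: three validation indices compared pairwise.
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])
w, cr = ahp_weights(A)
print("weights:", w, "consistency ratio:", cr)
```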
Fuzzy Jets
Collimated streams of particles produced in high energy physics experiments are organized using clustering algorithms to form jets. To construct jets, the experimental collaborations based at the Large Hadron Collider (LHC) primarily use agglomerative hierarchical clustering schemes known as sequential recombination. We propose a new class of algorithms for clustering jets that use infrared- and collinear-safe mixture models. These new algorithms, known as fuzzy jets, are clustered using maximum likelihood techniques and can dynamically determine various properties of jets, such as their size. We show that the fuzzy jet size adds information beyond conventional jet tagging variables. Furthermore, we study the impact of pileup and show that, with some slight modifications to the algorithm, fuzzy jets can remain stable up to high pileup interaction multiplicities.
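As a rough illustration of the mixture-model idea behind fuzzy jets, the sketch below soft-assigns particles in the (eta, phi) plane to Gaussian clusters with a pT-weighted EM loop, so each cluster's learned width plays the role of a dynamically determined jet size. This is a simplified stand-in, not the paper's algorithm: the exact IRC-safe likelihood and phi periodicity are omitted, and all parameter values are illustrative.

```python
# A minimal sketch of pT-weighted EM clustering in the (eta, phi) plane.
import numpy as np

def fuzzy_jets(eta_phi, pt, k=2, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    mu = eta_phi[rng.choice(len(eta_phi), k, replace=False)]
    sigma2 = np.full(k, 0.16)  # initial squared width (~0.4 jet radius)
    for _ in range(iters):
        d2 = ((eta_phi[:, None, :] - mu[None]) ** 2).sum(-1)  # (n, k)
        dens = np.exp(-0.5 * d2 / sigma2) / sigma2            # 2-D isotropic Gaussian
        resp = dens / dens.sum(axis=1, keepdims=True)
        w = resp * pt[:, None]                                # pT weighting
        nk = w.sum(axis=0)
        mu = (w[..., None] * eta_phi[:, None, :]).sum(0) / nk[:, None]
        d2 = ((eta_phi[:, None, :] - mu[None]) ** 2).sum(-1)
        sigma2 = (w * d2).sum(0) / (2 * nk)                   # learned jet sizes
    return mu, np.sqrt(sigma2)

# Toy usage: two blobs of particles with exponentially distributed pT.
rng = np.random.default_rng(2)
pts = np.vstack([rng.normal([0, 0], 0.2, (50, 2)),
                 rng.normal([2, 1], 0.4, (50, 2))])
pt = rng.exponential(10.0, len(pts))
print(fuzzy_jets(pts, pt, k=2))
```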
A Survey on Soft Subspace Clustering
Subspace clustering (SC) is a promising clustering technique that identifies clusters based on their associations with subspaces of high-dimensional spaces. SC can be classified into hard subspace clustering (HSC) and soft subspace clustering (SSC). While HSC algorithms have been extensively studied and are well accepted by the scientific community, SSC algorithms are relatively new but have gained more attention in recent years due to their better adaptability. In this paper, a comprehensive survey of existing SSC algorithms and recent developments is presented. The SSC algorithms are classified systematically into three main categories, namely conventional SSC (CSSC), independent SSC (ISSC), and extended SSC (XSSC). The characteristics of these algorithms are highlighted and the potential future development of SSC is also discussed.
Comment: This paper has been published in Information Sciences Journal in 201
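As one concrete representative of the SSC family surveyed here, the sketch below implements entropy-weighted k-means (EWKM, Jing et al. 2007), in which each cluster learns its own feature weights and thus lives in its own soft subspace. The gamma parameter and toy data are illustrative.

```python
# A minimal sketch of entropy-weighted k-means: per-cluster feature weights
# are updated so that features with small within-cluster dispersion get
# larger weight, regularized by an entropy term controlled by gamma.
import numpy as np

def ewkm(X, k=2, gamma=1.0, iters=30, seed=0):
    n, d = X.shape
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(n, k, replace=False)]
    weights = np.full((k, d), 1.0 / d)          # per-cluster feature weights
    for _ in range(iters):
        # Assign each point by its weighted squared distance to each center.
        dist = ((X[:, None, :] - centers[None]) ** 2 * weights[None]).sum(-1)
        labels = dist.argmin(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members) == 0:
                continue
            centers[j] = members.mean(axis=0)
            # Entropy-regularized weight update (shifted for numerical stability).
            D = ((members - centers[j]) ** 2).sum(axis=0)
            w = np.exp(-(D - D.min()) / gamma)
            weights[j] = w / w.sum()
    return labels, centers, weights

# Toy usage: two clusters separated only in the first feature.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal([0, 0], [0.2, 2.0], (50, 2)),
               rng.normal([4, 0], [0.2, 2.0], (50, 2))])
labels, centers, weights = ewkm(X, k=2, gamma=0.5)
print(weights)  # weight should concentrate on the informative first feature
```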