Examining Granular Computing from a Modeling Perspective
In this paper, we use a set of unified components to conduct granular modeling for problem-solving paradigms in several fields of computing. Each identified component may represent a potential research direction in the field of granular computing. A granular computing model for information analysis is proposed; this model suggests that granular computing can serve as an instrument for implementing perception-based computing on top of numeric computing. In addition, a novel granular language modeling technique is proposed for information extraction from web pages. This paper also suggests that studying data mining in the framework of granular computing may address the issues of the interpretability and use of discovered patterns.
A Granular Computing-Based Model for Group Decision-Making in Multi-Criteria and Heterogeneous Environments
Granular computing is a growing paradigm of information processing that covers techniques, methodologies, and theories employing information granules in complex problem solving. In recent years it has been applied to group decision-making processes, and several granular computing-based models have been constructed, each focusing on particular aspects of these processes. This study presents a new granular computing-based model for group decision-making processes defined in multi-criteria and heterogeneous environments, one able to improve, with minimum adjustment, both the consistency associated with individual decision-makers and the consensus related to the group. Unlike existing granular computing-based approaches, this new one can take a larger number of features into account when dealing with this kind of decision-making process.
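A back-of-the-envelope sketch of the two quantities such a model balances may help. The snippet below is a simplification, not the paper's algorithm: it scores per-expert consistency and group consensus over reciprocal fuzzy preference matrices, then nudges each matrix toward the group view by a granularity-controlled step. The matrices, the additive-transitivity index, and the adjustment rule are all illustrative assumptions.
```python
# Illustrative sketch only: consistency/consensus for fuzzy preference
# matrices and a granularity-controlled adjustment toward the group view.
import numpy as np

def consistency(p):
    """1 minus the mean violation of additive transitivity
    p[i,k] = p[i,j] + p[j,k] - 0.5 (an assumed consistency index)."""
    n = p.shape[0]
    err = [abs(p[i, k] - (p[i, j] + p[j, k] - 0.5))
           for i in range(n) for j in range(n) for k in range(n)]
    return 1.0 - float(np.mean(err))

def consensus(prefs):
    """Mean pairwise agreement between the experts' matrices."""
    m = len(prefs)
    sims = [1.0 - float(np.mean(np.abs(prefs[a] - prefs[b])))
            for a in range(m) for b in range(a + 1, m)]
    return float(np.mean(sims))

# Three experts, three alternatives, reciprocal fuzzy preferences (assumed).
prefs = [np.array([[0.5, 0.7, 0.9], [0.3, 0.5, 0.6], [0.1, 0.4, 0.5]]),
         np.array([[0.5, 0.6, 0.8], [0.4, 0.5, 0.7], [0.2, 0.3, 0.5]]),
         np.array([[0.5, 0.9, 0.6], [0.1, 0.5, 0.4], [0.4, 0.6, 0.5]])]

group = np.mean(prefs, axis=0)
alpha = 0.3  # granularity: the fraction of the gap each value may move
adjusted = [(1 - alpha) * p + alpha * group for p in prefs]

print("consensus:", round(consensus(prefs), 3), "->",
      round(consensus(adjusted), 3))
print("mean consistency:",
      round(float(np.mean([consistency(p) for p in prefs])), 3), "->",
      round(float(np.mean([consistency(p) for p in adjusted])), 3))
```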
Steady state properties of a mean field model of driven inelastic mixtures
We investigate a Maxwell model of an inelastic granular mixture under the
influence of stochastic driving and obtain its steady-state properties in the
context of classical kinetic theory. The model is studied analytically by
computing the moments up to the eighth order and approximating the
distributions by means of a Sonine polynomial expansion method. The main
findings concern the existence of two different granular temperatures, one for
each species, and the characterization of the distribution functions, whose
tails are in general more populated than those of an elastic system. These
analytical results are tested against Monte Carlo numerical simulations of the
model and are in general in good agreement. The simulations, however, reveal
the presence of pronounced non-Gaussian tails in the case of an infinite
temperature bath, which are not well reproduced by the Sonine method.
Comment: 23 pages, 10 figures, submitted for publication
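The mechanism is straightforward to emulate. Below is a minimal DSMC-style toy in the same spirit, not the authors' code; the masses, restitution coefficient, and kick amplitude are assumed values. Collision partners are chosen at random at a velocity-independent rate, as in a Maxwell model, and uncorrelated Gaussian kicks play the role of the stochastic driving; the two species settle at distinct granular temperatures, as the abstract describes.
```python
# Toy driven inelastic Maxwell mixture in 1D (illustrative parameters).
import numpy as np

rng = np.random.default_rng(0)
N = 2000                     # particles per species
m = {0: 1.0, 1: 2.0}         # species masses (assumed)
alpha = 0.8                  # restitution coefficient (assumed)
xi = 0.05                    # stochastic-driving kick amplitude (assumed)

v = {s: rng.normal(size=N) for s in (0, 1)}   # 1D velocities

for sweep in range(100):
    for _ in range(N):       # Maxwell model: collision partners are random,
        s1, s2 = rng.integers(2), rng.integers(2)  # independent of velocity
        i, j = rng.integers(N), rng.integers(N)
        mu = m[s2] / (m[s1] + m[s2])               # mass factor
        dv = v[s1][i] - v[s2][j]
        # momentum-conserving inelastic collision: v1'-v2' = -alpha*(v1-v2)
        v[s1][i] -= mu * (1 + alpha) * dv
        v[s2][j] += (1 - mu) * (1 + alpha) * dv
    for s in (0, 1):         # heat bath: white-noise kicks on every particle
        v[s] += xi * rng.normal(size=N)

for s in (0, 1):
    T = m[s] * float(np.mean(v[s] ** 2))           # granular temperature
    print(f"species {s} (mass {m[s]}): T = {T:.3f}")
```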
A Study of Deep CNN Model with Labeling Noise Based on Granular-ball Computing
In supervised learning, the presence of label noise can have a significant
impact on decision making. Many classifiers, including logistic regression,
SVM, and AdaBoost, do not take label noise into account when deriving their
loss functions. The AdaBoost iterative algorithm is especially affected:
its core idea is to continuously increase the weights of misclassified
samples, so the weights of many noisily labelled samples grow, leading to a
decrease in model accuracy. The learning processes of BP neural networks
and decision trees are likewise affected by label noise. Solving the label
noise problem is therefore an important element of maintaining the
robustness of a network model and is of great practical significance.
Granular-ball computing, a modeling method developed in the field of
granular computing in recent years, is an efficient, robust, and scalable
learning method. In this paper, we propose a granular-ball neural network
algorithm that adopts a multi-granularity approach to filter label-noise
samples during model training. This addresses the model instability caused
by label noise in deep learning, greatly reduces the proportion of label
noise in the training samples, and improves the robustness of neural
network models.
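A minimal sketch of the granular-ball idea, assuming 2-means splitting, a purity threshold, and majority-label filtering rather than the paper's full GBNN pipeline, looks like this:
```python
# Illustrative sketch: recursively split data into granular balls until
# each ball is pure enough, then drop samples that disagree with their
# ball's majority label. Thresholds are assumed, not the paper's values.
import numpy as np
from sklearn.cluster import KMeans

def purity(y):
    _, counts = np.unique(y, return_counts=True)
    return counts.max() / len(y)

def granular_balls(X, y, pure=0.9, min_size=8):
    """Recursively 2-means split (X, y) into balls until purity >= pure."""
    if len(y) <= min_size or purity(y) >= pure:
        return [(X, y)]
    labels = KMeans(n_clusters=2, n_init=5, random_state=0).fit_predict(X)
    balls = []
    for c in (0, 1):
        mask = labels == c
        if mask.sum() == 0 or mask.sum() == len(y):
            return [(X, y)]          # degenerate split: stop here
        balls += granular_balls(X[mask], y[mask], pure, min_size)
    return balls

def filter_noise(X, y):
    """Keep only samples matching their ball's majority label."""
    keep_X, keep_y = [], []
    for bx, by in granular_balls(X, y):
        maj = np.bincount(by).argmax()
        keep_X.append(bx[by == maj])
        keep_y.append(by[by == maj])
    return np.vstack(keep_X), np.concatenate(keep_y)

# demo: two Gaussian blobs with 15% of the labels flipped
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(4, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)
flip = rng.choice(400, 60, replace=False)
y[flip] = 1 - y[flip]
Xc, yc = filter_noise(X, y)
print(f"kept {len(yc)} of {len(y)} samples after noise filtering")
```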
Iterative Information Granulation for Novelty Detection in Complex Datasets
Recognition memory in a number of mammals is usually utilised to identify novel objects that violate model predictions. In humans in particular, the recognition of novel objects is foremost associated with the ability to group objects that are highly compatible or similar. Granular computing not only mimics this human ability to draw objects together but also mimics the ability to capture associated properties by similarity, proximity or functionality. In this paper, an iterative information granulation approach is presented for the problem of novelty detection in complex data. Two granular compatibility measures are used, based on principles of Granular Computing: the multidimensional distance between granules, and the granular density and volume. A two-stage iterative information granulation process is proposed in this work. In the first stage, a predefined number of granular detectors is constructed; these detectors capture the relationships (rules) between the input-output data, and this information is then used in a second granulation stage to discriminate new samples as novel. The proposed approach is applied to three different benchmark problems in pattern recognition, demonstrating very good performance.
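To make the compatibility measures concrete, here is a minimal sketch with an assumed merging rule and novelty threshold, not the authors' exact formulation: inter-granule distance drives the iterative granulation, and the resulting granules, with their densities and volumes, act as detectors.
```python
# Illustrative sketch: iterative granulation by merging the closest
# granules, then distance-based novelty detection against the detectors.
import numpy as np

class Granule:
    def __init__(self, points):
        self.points = np.atleast_2d(points)

    @property
    def center(self):
        return self.points.mean(axis=0)

    @property
    def volume(self):
        """Hyperbox volume spanned by the granule's points."""
        span = self.points.max(axis=0) - self.points.min(axis=0)
        return float(np.prod(span + 1e-9))

    @property
    def density(self):
        return len(self.points) / self.volume

def granulate(X, n_granules=4):
    """Iteratively merge the two most compatible (closest) granules."""
    granules = [Granule(x) for x in X]
    while len(granules) > n_granules:
        d = [(np.linalg.norm(a.center - b.center), i, j)
             for i, a in enumerate(granules)
             for j, b in enumerate(granules) if i < j]
        _, i, j = min(d)
        merged = Granule(np.vstack([granules[i].points, granules[j].points]))
        granules = [g for k, g in enumerate(granules) if k not in (i, j)]
        granules.append(merged)
    return granules

def is_novel(x, granules, radius=2.0):
    """Flag x as novel if it is far from every detector's center."""
    return all(np.linalg.norm(x - g.center) > radius for g in granules)

rng = np.random.default_rng(2)
X = rng.normal(0, 1, (60, 2))               # "normal" operating data
detectors = granulate(X)
print("detector densities:", [round(g.density, 1) for g in detectors])
print(is_novel(np.array([8.0, 8.0]), detectors))    # True: far outlier
print(is_novel(np.array([0.2, -0.1]), detectors))   # False: in-distribution
```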
On mesogranulation, network formation and supergranulation
We present arguments which show that in all likelihood mesogranulation is not
a true scale of solar convection but rather the combined effect of two
phenomena: highly energetic granules, which give birth to strong positive
divergences (SPDs), among which we find exploders, and the averaging
effects of data processing.
The important role played by SPDs in horizontal velocity fields appears in the
spectra of these fields, in which the 4 Mm scale is the most energetic; we
illustrate the effect of averaging with a one-dimensional toy model which shows
how two independent non-moving (but evolving) structures can be transformed
into a single moving structure when time and space resolution are degraded.
The role of SPDs in the formation of the photospheric network is shown by
computing the advection of floating corks by the granular flow. The coincidence
of the network bright points distribution and that of the corks is remarkable.
We conclude with the possibility that supergranulation is not a proper scale of
convection but the result of a large-scale instability of the granular flow,
which manifests itself through a correlation of the flows generated by SPDs.
Comment: 10 pages, 11 figures, to appear in Astronomy and Astrophysics
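The averaging argument is easy to reproduce. The snippet below is an analogue of the paper's one-dimensional toy model, with positions, widths, and amplitudes assumed: two Gaussians at fixed positions evolve in antiphase, and once the spatial resolution is degraded enough to blend them, the single apparent peak drifts smoothly from one site to the other, mimicking a moving structure.
```python
# Two fixed, evolving structures look like one moving structure at low
# resolution (illustrative parameters throughout).
import numpy as np

x = np.linspace(0, 10, 1001)
x1, x2 = 4.0, 6.0                  # fixed structure positions

def frame(ti, width):
    """Two non-moving Gaussians whose amplitudes evolve in antiphase;
    `width` plays the role of the effective spatial resolution."""
    a1, a2 = 1.0 - ti, ti
    return (a1 * np.exp(-((x - x1) / width) ** 2)
            + a2 * np.exp(-((x - x2) / width) ** 2))

for ti in (0.1, 0.3, 0.5, 0.7, 0.9):
    resolved = frame(ti, 0.3)      # sharp: two separate fixed peaks
    degraded = frame(ti, 1.8)      # blurred: one merged, drifting bump
    print(f"t={ti:.1f}: resolved peak at x={x[resolved.argmax()]:.2f}, "
          f"degraded peak at x={x[degraded.argmax()]:.2f}")
```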
Hierarchical wireless framework for real-time collaborative generation and distribution of telemetry data
This project introduces a novel multidisciplinary approach, combining Vehicular Ad Hoc Networks (VANETs) and Granular Computing, to the data-processing and information-generation problem in large urban traffic systems. It addresses the challenge of real-time information generation and dissemination in such systems by designing and investigating a hierarchical real-time information framework. The research work is complemented by the design and development of a simulator for such a system, which provides a simulation environment for the model developed. Furthermore, a new geographically constrained VANET topology for information generation is proposed, simulated and investigated.
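Purely to illustrate the hierarchical granulation idea (the project's actual framework is far richer, and this data layout is assumed): raw per-vehicle telemetry can be summarised into road-segment granules, which are in turn summarised into a coarser district granule, so each tier of the hierarchy disseminates progressively coarser information.
```python
# Illustrative two-tier granulation of vehicle telemetry (assumed schema).
from statistics import mean
from collections import defaultdict

# (vehicle_id, road_segment, speed_kmh) reports from one collection interval
reports = [("v1", "seg-A", 42), ("v2", "seg-A", 38), ("v3", "seg-A", 7),
           ("v4", "seg-B", 55), ("v5", "seg-B", 51)]

# tier 1: granulate raw reports per road segment
segments = defaultdict(list)
for _, seg, speed in reports:
    segments[seg].append(speed)
segment_granules = {seg: {"mean_speed": mean(v), "n": len(v)}
                    for seg, v in segments.items()}

# tier 2: granulate the segment granules into one district-level granule
district = {"mean_speed": mean(g["mean_speed"]
                               for g in segment_granules.values()),
            "segments": len(segment_granules)}

print(segment_granules)
print(district)
```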
Custom v. Standardized Risk Models
We discuss when and why custom multi-factor risk models are warranted and
give source code for computing some risk factors. Pension/mutual funds do not
require customization but standardization. However, using standardized risk
models in quant trading with much shorter holding horizons is suboptimal: 1)
longer horizon risk factors (value, growth, etc.) increase noise trades and
trading costs; 2) arbitrary risk factors can neutralize alpha; 3)
"standardized" industries are artificial and insufficiently granular; 4)
normalization of style risk factors is lost for the trading universe; 5)
diversifying risk models lowers P&L correlations, reduces turnover and market
impact, and increases capacity. We discuss various aspects of custom risk model
building.
Comment: 30 pages; minor improvements, more source code added; to appear in
Risk
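Independently of the paper's published source code, a sketch of one such factor computation might look as follows (the momentum definition and parameters are common conventions, assumed here): a style factor normalised cross-sectionally over the trading universe itself, which is precisely the normalisation point 4 says is lost when a standardised model is built on a different universe.
```python
# Illustrative momentum style factor with in-universe normalization
# (synthetic data; not the paper's code).
import numpy as np

rng = np.random.default_rng(3)
n_stocks, n_days = 500, 252
returns = rng.normal(0, 0.02, (n_days, n_stocks))  # daily stock returns

# momentum: trailing return, skipping the most recent 21 trading days
mom = returns[:-21].sum(axis=0)

# cross-sectional normalization over THIS trading universe, not some
# larger standardized-model universe
mom_z = (mom - mom.mean()) / mom.std()

# this factor's contribution to the risk model: loadings outer product
# scaled by an assumed factor-return variance
factor_var = 0.04
risk_from_mom = factor_var * np.outer(mom_z, mom_z)
print("normalized momentum loadings:", mom_z[:5].round(2))
```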