Solving the four index fully fuzzy transportation problem
In this paper, we solve the four-index fully fuzzy transportation problem (FFTP) with adapted classical methods. All of the problem's data are given as fuzzy numbers; to defuzzify them, we use a ranking-function procedure. Our method for solving the FFTP consists of two phases. In the first, we adapt well-known algorithms, namely the least-cost, Russell's approximation, and Vogel's approximation methods, to find an initial feasible solution. In the second, we test the optimality of that initial solution and improve it if it is not optimal. A numerical analysis of the proposed methods, carried out on examples of different sizes, shows that they are stable, robust, and efficient. A comparative study of the adapted methods identifies the one best suited to solving the FFTP.
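The two ingredients of the first phase can be sketched as follows. This is a hedged illustration, not the paper's procedure: it assumes a common Yager-style ranking function R(a, b, c) = (a + 2b + c)/4 for triangular fuzzy numbers, and for brevity it runs Vogel's approximation on a crisp two-index instance rather than the paper's four-index one.

```python
# Hypothetical sketch: defuzzify triangular fuzzy costs with a ranking
# function, then build an initial feasible solution with Vogel's
# approximation method on the resulting crisp transportation problem.

def rank(tfn):
    """Rank a triangular fuzzy number (a, b, c) as (a + 2b + c) / 4."""
    a, b, c = tfn
    return (a + 2 * b + c) / 4.0

def vogel(cost, supply, demand):
    """Vogel's approximation: repeatedly allocate in the cheapest cell of
    the row/column with the largest penalty (the gap between its two
    smallest remaining costs)."""
    supply, demand = supply[:], demand[:]
    m, n = len(supply), len(demand)
    rows, cols = set(range(m)), set(range(n))
    alloc = [[0] * n for _ in range(m)]

    def penalty(costs):
        s = sorted(costs)
        # convention: a single remaining cost is its own penalty
        return s[1] - s[0] if len(s) > 1 else s[0]

    while rows and cols:
        row_pen = {i: penalty([cost[i][j] for j in cols]) for i in rows}
        col_pen = {j: penalty([cost[i][j] for i in rows]) for j in cols}
        i_best = max(row_pen, key=row_pen.get)
        j_best = max(col_pen, key=col_pen.get)
        if row_pen[i_best] >= col_pen[j_best]:
            i = i_best
            j = min(cols, key=lambda j: cost[i][j])
        else:
            j = j_best
            i = min(rows, key=lambda i: cost[i][j])
        q = min(supply[i], demand[j])          # allocate as much as possible
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            rows.discard(i)
        if demand[j] == 0:
            cols.discard(j)
    return alloc

# fuzzy costs for a balanced 2x3 instance, as triangular fuzzy numbers
fuzzy_cost = [[(1, 2, 3), (3, 4, 5), (5, 6, 7)],
              [(0, 1, 2), (2, 3, 4), (4, 5, 6)]]
crisp_cost = [[rank(t) for t in row] for row in fuzzy_cost]
plan = vogel(crisp_cost, supply=[30, 40], demand=[20, 25, 25])
```

Because the instance is balanced, any run of the sketch exhausts every supply and demand, which is exactly the feasibility the second phase starts from.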
Fuzzy linear programming problems : models and solutions
We investigate various types of fuzzy linear programming problems in terms of their models and solution methods. First, we review fuzzy linear programming problems with fuzzy decision variables, and those with fuzzy parameters (fuzzy numbers in the objective function or constraints), along with the associated duality results. Then, we review fully fuzzy linear programming problems, in which all variables and parameters are allowed to be fuzzy. Most methods for solving such problems are based on ranking functions, alpha-cuts, duality results, or penalty functions; in all of these, the authors deal with crisp formulations of the fuzzy problems. Recently, some heuristic algorithms have also been proposed; among these, some authors solve the fuzzy problem directly, while others solve the crisp problems approximately.
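The alpha-cut approach mentioned above can be illustrated in a few lines. This is an assumed, generic formulation (not any particular method from the survey): the alpha-cut of a triangular fuzzy number (a, b, c) is the interval [a + alpha(b - a), c - alpha(c - b)], and a fuzzy objective value with nonnegative crisp variables can be bounded by interval arithmetic at each alpha level.

```python
# Illustrative sketch (assumed notation): alpha-cuts of triangular fuzzy
# coefficients give interval bounds on the objective at each membership level.

def alpha_cut(tfn, alpha):
    """Interval [a + alpha*(b-a), c - alpha*(c-b)] of a triangular (a, b, c)."""
    a, b, c = tfn
    return (a + alpha * (b - a), c - alpha * (c - b))

def objective_bounds(fuzzy_coeffs, x, alpha):
    """Lower/upper bound of sum_j c_j * x_j for nonnegative x at level alpha."""
    lo = sum(alpha_cut(c, alpha)[0] * xj for c, xj in zip(fuzzy_coeffs, x))
    hi = sum(alpha_cut(c, alpha)[1] * xj for c, xj in zip(fuzzy_coeffs, x))
    return lo, hi

coeffs = [(1, 2, 3), (0, 1, 4)]   # triangular fuzzy cost coefficients
x = [2.0, 1.0]                    # a crisp feasible point
print(objective_bounds(coeffs, x, alpha=0.0))  # widest interval
print(objective_bounds(coeffs, x, alpha=1.0))  # collapses to the modal value
```

At alpha = 1 each coefficient collapses to its modal value, recovering an ordinary crisp objective; lower alpha levels widen the interval, which is the trade-off these crisp reformulations navigate.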
Laplacian Mixture Modeling for Network Analysis and Unsupervised Learning on Graphs
Laplacian mixture models identify overlapping regions of influence in
unlabeled graph and network data in a scalable and computationally efficient
way, yielding useful low-dimensional representations. By combining Laplacian
eigenspace and finite mixture modeling methods, they provide probabilistic or
fuzzy dimensionality reductions or domain decompositions for a variety of input
data types, including mixture distributions, feature vectors, and graphs or
networks. Provable optimal recovery using the algorithm is analytically shown
for a nontrivial class of cluster graphs. Heuristic approximations for scalable
high-performance implementations are described and empirically tested.
Connections to PageRank and community detection in network analysis demonstrate
the wide applicability of this approach. The origins of fuzzy spectral methods,
beginning with generalized heat or diffusion equations in physics, are reviewed
and summarized. Comparisons to other dimensionality reduction and clustering
methods for challenging unsupervised machine learning problems are also
discussed.
Comment: 13 figures, 35 references
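The combination of Laplacian eigenspace and mixture-style soft assignment can be sketched in a few lines of numpy. This is an illustrative sketch under assumed choices (unnormalized Laplacian, Gaussian-responsibility soft assignment, centers seeded at the extremes of the Fiedler embedding), not the authors' implementation:

```python
# Sketch: embed a graph with the low eigenvectors of its Laplacian, then
# derive fuzzy cluster memberships via a soft Gaussian-style assignment.
import numpy as np

def laplacian(A):
    """Unnormalized graph Laplacian L = D - A."""
    return np.diag(A.sum(axis=1)) - A

def fuzzy_memberships(A, k=2, beta=5.0):
    L = laplacian(A)
    vals, vecs = np.linalg.eigh(L)          # eigenvalues in ascending order
    X = vecs[:, 1:k]                        # skip the constant eigenvector
    # seed one center per cluster at the extremes of the Fiedler embedding
    centers = X[[X[:, 0].argmin(), X[:, 0].argmax()]]
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    R = np.exp(-beta * d2)                  # Gaussian-style responsibilities
    return R / R.sum(axis=1, keepdims=True) # rows sum to 1: fuzzy memberships

# two triangles joined by a single bridge edge (2,3)
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[u, v] = A[v, u] = 1
M = fuzzy_memberships(A, k=2)
```

On this toy graph the two triangles receive near-crisp opposite memberships, while the bridge endpoints get the most mixed rows, which is the "overlapping regions of influence" behaviour the abstract describes.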
Linear kernels for outbranching problems in sparse digraphs
In the k-Leaf Out-Branching and k-Internal Out-Branching problems we are
given a directed graph with a designated root r and a nonnegative integer
k. The question is to determine the existence of an outbranching rooted at
r that has at least k leaves, or at least k internal vertices,
respectively. Both these problems were intensively studied from the points of
view of parameterized complexity and kernelization, and in particular for both
of them kernels with O(k^2) vertices are known on general graphs. In this
work we show that k-Leaf Out-Branching admits a kernel with O(k) vertices
on H-minor-free graphs, for any fixed family of graphs
H, whereas k-Internal Out-Branching admits a kernel with O(k)
vertices on any graph class of bounded expansion.
Comment: Extended abstract accepted for IPEC'15, 27 pages
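The objects in question can be made concrete with a toy example. This is an assumed illustration of the problem statement only, not of the paper's kernelization: an out-branching of a digraph is a spanning tree with all arcs oriented away from the root, and the two problems count its leaves or its internal vertices.

```python
# Toy illustration: grow an out-branching by BFS and count its leaves
# and internal vertices.
from collections import deque

def bfs_outbranching(adj, r):
    """Parent pointers of a BFS out-branching rooted at r, or None if
    some vertex is unreachable from r (then no out-branching exists)."""
    parent = {r: None}
    q = deque([r])
    while q:
        u = q.popleft()
        for v in adj.get(u, []):
            if v not in parent:
                parent[v] = u
                q.append(v)
    return parent if len(parent) == len(adj) else None

def leaves_and_internals(parent):
    """A vertex is internal iff it is some vertex's parent; else a leaf."""
    internal = {p for p in parent.values() if p is not None}
    leaves = set(parent) - internal
    return leaves, internal

adj = {0: [1, 2], 1: [3], 2: [], 3: []}     # a small digraph
parent = bfs_outbranching(adj, r=0)
leaves, internal = leaves_and_internals(parent)
```

Here the BFS tree rooted at 0 has leaves {2, 3} and internal vertices {0, 1}, so this digraph is a yes-instance of 2-Leaf Out-Branching; the hard part, which the paper addresses, is deciding this for large k via small kernels.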
Variable neighbourhood search based heuristic for K-harmonic means clustering
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University.
Despite the rapid development of technology and increases in computation speed, most real-world optimization problems still cannot be solved in reasonable time, and some cannot be solved optimally at all: there are many instances of real problems that computers at their present speed cannot address. In such cases a heuristic approach can be used. Heuristic research has been pursued by many researchers to meet this need; it yields a sufficiently good solution in reasonable time. The clustering problem, which arises in many applications, is one example.
In this thesis, I suggest a Variable Neighbourhood Search (VNS) to improve a recent clustering local search called K-Harmonic Means (KHM). Many experiments are presented to show the strength of my code compared with some algorithms from the literature.
Some counter-examples are introduced to show that KHM may degenerate entirely, in one or more runs. Furthermore, it degenerates and then stalls on some well-known datasets, which significantly affects the final solution. Hence, I present code that removes the degeneracy from KHM. I also apply VNS to improve the KHM code after the degeneracy has been removed.
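The two ingredients, the KHM objective and a VNS-style shake-and-accept loop, can be sketched as follows. This is a schematic toy (1-D data, p = 2, all names hypothetical), not the thesis code:

```python
# Schematic sketch: the K-Harmonic Means objective, and a minimal VNS loop
# that shakes one center and keeps the move only when the objective improves.
import random

def khm_objective(points, centers, p=2, eps=1e-12):
    """KHM(X, C) = sum_i  k / sum_j 1 / d(x_i, c_j)^p  (harmonic mean form)."""
    k = len(centers)
    return sum(k / sum(1.0 / (abs(x - c) ** p + eps) for c in centers)
               for x in points)

def vns(points, centers, iters=200, seed=0):
    rng = random.Random(seed)
    best = list(centers)
    best_val = khm_objective(points, best)
    radius = 0.5                                  # neighbourhood size
    for _ in range(iters):
        cand = list(best)
        j = rng.randrange(len(cand))
        cand[j] += rng.uniform(-radius, radius)   # "shake" one center
        val = khm_objective(points, cand)
        if val < best_val:
            best, best_val, radius = cand, val, 0.5   # success: reset radius
        else:
            radius = min(radius * 1.5, 5.0)           # failure: widen search
    return best, best_val

data = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2]             # two obvious 1-D clusters
centers, value = vns(data, centers=[1.0, 2.0])
```

The growing neighbourhood radius is the "variable" part of VNS: when the current neighbourhood yields no improvement, a wider one is tried, which is also how the search can escape the degenerate KHM configurations discussed above.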