389 research outputs found

    Uncertain Multi-Criteria Optimization Problems

    Most real-world search and optimization problems naturally involve multiple criteria as objectives. Generally, symmetry, asymmetry, and anti-symmetry are basic characteristics of the binary relations used when modeling optimization problems, and the notion of symmetry has appeared in many articles on the uncertainty theories employed in multi-criteria problems. Different solutions may produce trade-offs (conflicting scenarios) among the objectives: a solution that is better with respect to one objective may compromise the others. Many factors must be considered to address problems in multidisciplinary research, which is critical for the overall sustainability of human development and activity. In recent decades, decision-making theory has therefore been the subject of intense research activity owing to its wide applications in different areas, and the decision-theoretic approach has become an important means of providing real-time solutions to problems under uncertainty. Theories available in the existing literature, such as probability theory, fuzzy set theory, type-2 fuzzy set theory, rough set theory, and uncertainty theory, deal with such uncertainties. Nevertheless, the uncertain multi-criteria character of such problems has not yet been explored in depth, and much remains to be achieved in this direction. Hence, different mathematical models of real-life multi-criteria optimization problems can be developed in various uncertain frameworks, with special emphasis on optimization problems.
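    The trade-off idea in this abstract has a standard formalization: a solution is Pareto-optimal if no other solution is at least as good in every objective and strictly better in at least one. A minimal sketch (the objective vectors are invented for illustration, not taken from the paper):

```python
def dominates(a, b):
    """True if a dominates b, assuming all objectives are minimized."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective vectors."""
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other != s)]

# Hypothetical (cost, risk) pairs -- lower is better for both.
candidates = [(1, 9), (2, 7), (3, 8), (4, 4), (6, 3), (7, 5), (9, 1)]
print(pareto_front(candidates))  # (3, 8) and (7, 5) are dominated, the rest survive
```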

    A Comprehensive study on (α,β)-multi-granulation bipolar fuzzy rough sets under bipolar fuzzy preference relation

    The rough set (RS) and multi-granulation RS (MGRS) theories have been successfully extended to accommodate preference analysis by substituting the equivalence relation (ER) with the dominance relation (DR). On the other hand, bipolar fuzzy sets (BFSs) are effective tools for handling the bipolarity and fuzziness of data. In this study, motivated by risk decision-making problems in reality, we present (α, β)-optimistic multi-granulation bipolar fuzzified preference rough sets ((α, β)^o-MG-BFPRSs) and (α, β)-pessimistic multi-granulation bipolar fuzzified preference rough sets ((α, β)^p-MG-BFPRSs) using a bipolar fuzzy preference relation (BFPR). Subsequently, the relevant properties and results of both (α, β)^o-MG-BFPRSs and (α, β)^p-MG-BFPRSs are investigated in detail. At the same time, a relationship among the (α, β)-BFPRSs, (α, β)^o-MG-BFPRSs and (α, β)^p-MG-BFPRSs is given.
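    The constructions above generalize Pawlak's classical rough set, in which a target concept is approximated from below and above by the equivalence classes of the universe. A minimal sketch of that baseline (the partition and target set are made up for illustration; the paper's bipolar fuzzy preference machinery is far more general):

```python
def approximations(partition, target):
    """Classical rough-set lower/upper approximation of a target set,
    given the equivalence classes (blocks) of the universe."""
    target = set(target)
    lower = {e for block in partition if set(block) <= target for e in block}
    upper = {e for block in partition if set(block) & target for e in block}
    return lower, upper

classes = [{1, 2}, {3, 4}, {5}]   # partition induced by an equivalence relation
low, up = approximations(classes, {2, 3, 4})
print(low, up)  # {3, 4} and {1, 2, 3, 4}; the boundary region is up - low
```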

    Discrete Mathematics and Symmetry

    Some of the most beautiful studies in mathematics are related to symmetry and geometry. For this reason, we have selected contributions on these aspects and on discrete geometry. Symmetry in a system means invariance of its elements under transformations. For network structures, symmetry means invariance of the adjacency of nodes under permutations of the node set. Graph isomorphism is an equivalence relation on the set of graphs and therefore partitions the class of all graphs into equivalence classes. The underlying idea of isomorphism is that objects have the same structure if we omit the individual character of their components. A set of graphs isomorphic to each other is called an isomorphism class of graphs. An automorphism of a graph G is an isomorphism from G onto itself, and the family of all automorphisms of G forms a permutation group.
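    The notion of graph symmetry described above can be made concrete by brute force (practical only for tiny graphs): an automorphism is a node permutation that maps the edge set onto itself. The graph here is a made-up example, the path 0-1-2:

```python
from itertools import permutations

def automorphisms(nodes, edges):
    """All node permutations preserving adjacency of an undirected graph."""
    edge_set = {frozenset(e) for e in edges}
    autos = []
    for perm in permutations(nodes):
        mapping = dict(zip(nodes, perm))
        if {frozenset((mapping[u], mapping[v])) for u, v in edges} == edge_set:
            autos.append(mapping)
    return autos

# Path graph 0-1-2: only the identity and the reflection swapping the endpoints.
print(automorphisms([0, 1, 2], [(0, 1), (1, 2)]))
```

The two mappings returned form a group under composition, as the abstract states.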

    Model granularity in engineering design – concepts and framework

    In many engineering design contexts models are indispensable. They offer decision support and help tackle complex and interconnected design projects, capturing the underlying structure of development processes or resulting products. Because managers and engineers base many decisions on models, it is crucial to understand their properties and how these might influence their behaviour. The level of detail, or granularity, of a model is a key attribute that results from how reality is abstracted in the modelling process. Despite the direct impact granularity has on the use of a model, the general topic has so far only received limited attention and is therefore not well understood or documented. This article provides background on model theory, explores relevant terminology from a range of fields and discusses the implications for engineering design. Based on this, a classification framework is synthesised, which outlines the main manifestations of model granularity. This research contributes to theory by scrutinising the nature of model granularity. It also illustrates how this may manifest in engineering design models, using Design Structure Matrices as an example, and discusses associated challenges to provide a resource for modellers navigating decisions regarding granularity. This work was supported by an Industrial CASE studentship funded by the UK Engineering and Physical Sciences Research Council and BT [EP/K504282/1].
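    The article uses Design Structure Matrices (DSMs) to illustrate granularity. One common manifestation is coarsening: dependencies between fine-grained elements are rolled up into a coarser DSM over subsystems. A hedged sketch (the matrix and grouping are invented for illustration, not taken from the article):

```python
def coarsen(dsm, groups):
    """Aggregate a binary DSM: coarse cell (a, b) is 1 if any member of
    group a depends on any member of group b (diagonal left at 0)."""
    k = len(groups)
    coarse = [[0] * k for _ in range(k)]
    for a, ga in enumerate(groups):
        for b, gb in enumerate(groups):
            if a != b and any(dsm[i][j] for i in ga for j in gb):
                coarse[a][b] = 1
    return coarse

# Four fine-grained elements; dsm[i][j] = 1 means element i depends on j.
fine = [[0, 1, 0, 0],
        [0, 0, 1, 0],
        [0, 0, 0, 1],
        [1, 0, 0, 0]]
subsystems = [[0, 1], [2, 3]]     # merge into two coarser subsystems
print(coarsen(fine, subsystems))  # [[0, 1], [1, 0]]
```

Note that coarsening is lossy: the internal dependencies 0→1 and 2→3 disappear from the coarse model, which is exactly the kind of trade-off the article's framework addresses.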

    EXPLOITING HIGHER ORDER UNCERTAINTY IN IMAGE ANALYSIS

    Soft computing is a group of methodologies that work synergistically to provide flexible information-processing capability for handling real-life ambiguous situations. Its aim is to exploit the tolerance for imprecision, uncertainty, approximate reasoning, and partial truth in order to achieve tractability, robustness, and low-cost solutions. Soft computing methodologies (involving fuzzy sets, neural networks, genetic algorithms, and rough sets) have been successfully employed in various image processing tasks including image segmentation, enhancement and classification, both individually and in combination with other soft computing techniques. The reason for this success is that soft computing techniques provide powerful tools to describe the uncertainty naturally embedded in images, which can be exploited in various image processing tasks. The main contribution of this thesis is to present tools for handling uncertainty by means of a rough-fuzzy framework for exploiting feature-level uncertainty. The first contribution is the definition of a general framework based on the hybridization of rough and fuzzy sets, along with a new operator called the RF-product, as an effective solution to some problems in image analysis. The second and third contributions are devoted to proving the effectiveness of the proposed framework, by presenting a compression method based on vector quantization, together with its compression capabilities, and an HSV color image segmentation technique.

    Fuzzy Techniques for Decision Making 2018

    Zadeh's fuzzy set theory incorporates the imprecision of data and evaluations by assigning the degree to which each object belongs to a set. Its success fostered theories that codify the subjectivity, uncertainty, imprecision, or roughness of evaluations. Their rationale is to produce new flexible methodologies in order to model a variety of concrete decision problems more realistically. This Special Issue garners contributions addressing novel tools, techniques and methodologies for decision making (both individual and group, single- or multi-criteria) in the context of these theories. It contains 38 research articles that contribute to a variety of setups combining fuzziness, hesitancy, roughness, covering sets, and linguistic approaches. They range from fundamental and technical to applied approaches.
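    The degrees of membership mentioned above, and a simple way they feed into decision making, can be illustrated with the standard min/max operations on finite fuzzy sets. The criteria, alternatives and membership values below are invented for illustration:

```python
def fuzzy_intersection(A, B):
    """Pointwise min (the standard t-norm): degree of belonging to both sets."""
    return {x: min(A.get(x, 0.0), B.get(x, 0.0)) for x in set(A) | set(B)}

def fuzzy_union(A, B):
    """Pointwise max (the standard t-conorm): degree of belonging to either set."""
    return {x: max(A.get(x, 0.0), B.get(x, 0.0)) for x in set(A) | set(B)}

# Hypothetical evaluations of three alternatives under two criteria.
cheap = {"a": 0.9, "b": 0.4, "c": 0.1}
reliable = {"a": 0.3, "b": 0.8, "c": 0.7}

both = fuzzy_intersection(cheap, reliable)  # degree of "cheap AND reliable"
best = max(both, key=both.get)              # a simple max-min choice rule
print(best)  # 'b'
```

The max-min rule shown is only one of many aggregation schemes; the Special Issue's articles study far richer ones (hesitant, linguistic, rough-set-based, etc.).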

    A Variability Study of the Typical Red Supergiant Antares A

    Red giants and red supergiants have long been known to be variable. In the last 40 years many of the features of this variability have been associated with large convective cells. Unfortunately, due to the long timescales of these variations they are not well studied, with the exception of the bright M-class supergiant Betelgeuse (α Orionis, M2 Iab). Betelgeuse has been well studied both observationally and theoretically, and has many features that are well described by models of convection. It was these studies of Betelgeuse that provided the main motivation for this thesis. We ask if the dramatic motions seen in the atmosphere of Betelgeuse (Gray 2008, amongst others) are typical of red supergiants, and if their variations can be described by convective motions as is the case for that enigmatic star. Sixty-five spectra of the M-class supergiant Antares A (α Scorpii A, M1.5 Iab) were obtained at the Elginfield Observatory from April 2008 until July 2010. These data were combined with historical radial velocity measurements, Hipparcos photometry, and AAVSO photometry. From these data we determine four scales of variability: ~7140 days, 2167 days, ~1260 days, and ~100 days. The longest of these periods is found from the AAVSO photometry and cannot be confirmed by any of the other data. A period of similar length has been reported in one previous study but no analysis was completed. A period of this length (~7140 days; 19 years) is consistent with suspected rotation rates for red supergiants, though it is also plausible that episodic dust ejection could cause such a variation. The 2167-day variation was found from historical and present radial velocity measurements. Due to the shape of the radial velocity curve, and a phase shift compared to the temperature curve, we interpret this period as a pulsation or a long secondary period.
The ~1260-day period found from both sets of photometry and the ~100-day timescale found from our spectral analysis are both interpreted, along with a handful of periods found in the literature, as arising from convection.
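    Period searches like those described above, on unevenly sampled radial velocity and photometric series, are typically done with periodogram-style methods. A hedged, minimal version (not the thesis's actual pipeline): fit a sinusoid at each trial period by least squares and pick the period with the smallest residual. The data here are synthetic, with a 50-day signal injected:

```python
import math, random

random.seed(1)
true_period = 50.0
times = sorted(random.uniform(0, 400) for _ in range(120))  # uneven sampling
values = [math.sin(2 * math.pi * t / true_period) + random.gauss(0, 0.1)
          for t in times]

def residual(period):
    """Residual of the best-fit a*sin + b*cos at this trial period,
    obtained by solving the 2x2 normal equations."""
    s = [math.sin(2 * math.pi * t / period) for t in times]
    c = [math.cos(2 * math.pi * t / period) for t in times]
    ss = sum(x * x for x in s)
    cc = sum(x * x for x in c)
    sc = sum(a * b for a, b in zip(s, c))
    sy = sum(a * y for a, y in zip(s, values))
    cy = sum(a * y for a, y in zip(c, values))
    det = ss * cc - sc * sc
    a = (sy * cc - cy * sc) / det
    b = (cy * ss - sy * sc) / det
    return sum((y - a * si - b * ci) ** 2 for y, si, ci in zip(values, s, c))

trial_periods = [p / 10 for p in range(200, 1000)]  # 20 to 100 days
best = min(trial_periods, key=residual)
print(best)  # close to the injected 50-day period
```

Real analyses of sparse, decades-long series (as in the thesis) face aliasing and window-function effects that this sketch ignores.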

    Rough Sets and Near Sets in Medical Imaging: A Review


    A Three-Dimensional Population Balance Model of Granulation Processes Employing Mechanistic Kernels

    Granulation is an agglomeration process in which powder material is combined with a liquid binder solution to facilitate the formation of larger, free-flowing granules. Granulation has become a mainstream process across industry, with applicability in numerous areas including pharmaceuticals, mineral processing, fertilisers and the production of a range of commodity products. A major driving force for the production of granules from their ungranulated counterparts arises from the economic savings, i.e., increased bulk density permits savings in transportation and storage. Furthermore, granules may be tailored to possess certain desirable attributes suited to their final application. Granulation is an example of a process that exhibits complex interactions between the underlying granulation phenomena such as nucleation, consolidation, aggregation and breakage. In addition, the granule properties are distributed heterogeneously across the entire particle population, posing a particular challenge to generating a mathematical model that can accurately describe the granulation behaviour. The modelling approach used in this study differs from common practice, which tends to rely on heuristics and empiricism for the operation of the granulation process. This empirical approach signifies a disconnect from our understanding of the underlying physics of the process, which poses an impediment to the efficient operation of granulation processes. The work presented in this thesis attempts to address this disconnect by applying a three-dimensional population balance with mechanistic representations of the underlying granulation rate processes. The population balance framework is ideally suited to this particular process, as it enables the evolution of the granules to be tracked with respect to differentiating particle traits, e.g. the granule size distribution.
The selection of the desired properties is influenced by the importance of these particle properties to the end granule product, and also by their influence on key process mechanisms. A novel mechanistic nucleation kernel is developed incorporating fundamental material properties pertaining to the powder substrate and the liquid binder solution. The model form of the nucleation kernel is formulated by drawing a parallel with collision/transition state theory. There are few literature reports on the inclusion of nucleation phenomena in population balance models of granulation processes, let alone a mechanistic nucleation model; this study is one of the first in this regard. The recent recognition of the importance of wetting kinetics and nucleation thermodynamics to the nucleation phenomenon has been factored into the nucleation kernel by explicitly accounting for the effects of the liquid flow rate and the physicochemical properties of the materials (surface tension, contact angle, and spreading coefficient). Batch granulation experiments were conducted, obtaining granule measurements with respect to the size distribution, porosity and fractional binder content. Preliminary validation of the population balance model against the experimental measurements showed good agreement, providing partial albeit valuable validation of the model. This is also one of the first studies to model and validate a three-dimensional population balance model for granulation. Model-based analyses were also carried out under a variety of processing conditions, which included the effects of changing formulations, droplet size, feed size distribution, and powder and binder properties.
The proposed model demonstrated the interactions for a range of feed formulations in tandem with granulating operating conditions, establishing qualitative agreement with similar findings derived from past experimental studies.
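    The thesis tracks a three-dimensional property space with mechanistic kernels; as a far simpler illustration of the population balance framework it builds on, here is a discrete one-dimensional aggregation-only balance (Smoluchowski form) with a size-independent kernel, stepped with explicit Euler. All parameter values are made up:

```python
def step(n, beta, dt):
    """One Euler step of dn_i/dt = birth - death for pure aggregation.
    n[i] is the number of granules of (dimensionless) size i + 1."""
    k = len(n)
    total = sum(n)
    new = list(n)
    for i in range(k):
        # birth: collisions of sizes (j+1) and (i-j) forming size i+1
        birth = 0.5 * sum(n[j] * n[i - 1 - j] for j in range(i))
        # death: a granule of size i+1 colliding with any other granule
        death = n[i] * total
        new[i] += dt * beta * (birth - death)
    return new

n = [100.0] + [0.0] * 19          # monodisperse initial population, 20 size bins
for _ in range(200):
    n = step(n, beta=1e-4, dt=0.05)

mass = sum((i + 1) * ni for i, ni in enumerate(n))
print(mass)  # total granule mass is (nearly) conserved by pure aggregation
```

The number of granules falls while total mass stays fixed, which is the basic consistency check for any aggregation kernel; the thesis's model adds nucleation, consolidation and breakage terms on top of this structure.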