13 research outputs found

    On the New Notion of the Set-Expectation for a Random Set of Events

    The paper introduces a new notion of the set-valued mean of a random set. The means are defined as families of sets that minimize mean distances to the random set, where the distances are determined by metrics on spaces of sets or by suitable generalizations. Some examples illustrate the use of the new definitions.
    Keywords: mean random set, metrics in set space, mean distance, Aumann expectation, Fréchet expectation, Hausdorff metric, random finite set, mean set, set-median, set-expectation
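    The abstract does not reproduce the defining formula, but a minimal sketch of a distance-minimizing (Fréchet-type) set-expectation, assuming the Hausdorff metric named in the keywords, can be written as

        \[
          \mathcal{E}[X] \;=\; \operatorname*{arg\,min}_{A \in \mathcal{K}} \; \mathbb{E}\!\left[\rho_H(A, X)^{2}\right],
        \]

    where $X$ is a random compact set, $\mathcal{K}$ is a family of candidate sets and $\rho_H$ is the Hausdorff metric. In general the minimizer is a family of sets rather than a single set, which is why the paper speaks of set-valued means; the exponent and the choice of metric here are illustrative assumptions.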

    Modeling spatial-temporal change of Poyang Lake marshland based on an uncertainty theory - random sets

    Uncertainty modeling now engages the attention of researchers in spatial-temporal change analysis in remote sensing. Some studies have proposed using random sets to model the spatial uncertainty of image objects with uncertain boundaries, but none have considered the parameter determination problem for large datasets. In this paper we refined the random set models for monitoring monthly changes in wetland vegetation areas from series of images. Twelve cloud-free HJ-1A/1B images from April 2009 to March 2010 were used for monitoring spatial-temporal changes of the Poyang Lake wetlands. We applied random sets to represent the spatial uncertainty of wetland vegetation extracted from normalized difference vegetation index (NDVI) maps. The time series of random sets reflects the seasonal differences in the location and extent of the wetlands, whereas the degrees of uncertainty indicated by the SD and CV indices reflect the gradual change of the wetland vegetation in space. Results show that the uncertain extents of wetland vegetation change through the year, reaching the largest range and degree of uncertainty in autumn. This coincides with the highly heterogeneous vegetation status in autumn, since the wetland recovers gradually after flooding and young vegetation emerges at gradually changing densities, thus providing forage in different ecological zones for different types of migratory birds. We conclude that the random set model enriches spatial-temporal modeling of phenomena which are uncertain in space and dynamic in time.
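    As an illustration of the kind of computation involved (not the paper's code), the sketch below estimates a coverage function for a random set of vegetation extents from a stack of NDVI maps and summarizes its spatial uncertainty; the NDVI threshold, the coverage cut-offs and the SD summary are assumptions made for the example.

        # Minimal sketch: random-set style summary of vegetation extent from NDVI maps.
        # Thresholds and summaries are illustrative assumptions, not the paper's parameters.
        import numpy as np

        def coverage_function(ndvi_stack, threshold=0.3):
            """Fraction of images in which each pixel is classified as vegetation."""
            masks = ndvi_stack > threshold      # one binary vegetation mask per image
            return masks.mean(axis=0)           # per-pixel coverage probability in [0, 1]

        def uncertainty_band(coverage, lower=0.25, upper=0.75):
            """Pixels that are vegetated in some images but not in others."""
            return (coverage > lower) & (coverage < upper)

        # Synthetic example: 12 monthly NDVI maps of a 100 x 100 area.
        rng = np.random.default_rng(0)
        ndvi_stack = rng.uniform(-0.2, 0.8, size=(12, 100, 100))

        cov = coverage_function(ndvi_stack)
        core = cov >= 0.75                      # pixels vegetated in most images
        band = uncertainty_band(cov)            # spatially uncertain extent
        print("core area (pixels):", int(core.sum()))
        print("uncertain band area (pixels):", int(band.sum()))
        print("mean per-pixel SD of membership:", round(float(np.sqrt(cov * (1 - cov)).mean()), 3))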

    Averaging of random sets and binary images


    Uncertainty Assessment in High-Risk Environments Using Probability, Evidence Theory and Expert Judgment Elicitation

    The level of uncertainty in advanced system design is assessed by comparing the results of expert judgment elicitation to probability and evidence theory. This research shows how one type of monotone measure, namely the Dempster-Shafer theory of evidence, can expand the framework of uncertainty to provide decision makers a more robust solution space. The issues embedded in this research are focused on how the relevant predictive uncertainty produced by similar action is measured. The methodology uses established approaches from traditional probability theory and Dempster-Shafer evidence theory to combine two classes of uncertainty, aleatory and epistemic. Probability theory provides the mathematical structure traditionally used in the representation of aleatory uncertainty. The uncertainty in analysis outcomes is represented by probability distributions and typically summarized as Complementary Cumulative Distribution Functions (CCDFs). The main comparison in this research is between the probability of X in probability theory and the mass m(X) in evidence theory. Using this comparison, an epistemic model is developed to obtain the upper limits, given by the Complementary Cumulative Plausibility Function (CCPF), and the lower limits, given by the Complementary Cumulative Belief Function (CCBF), compared to the traditional probability function. A conceptual design for the Thermal Protection System (TPS) of future Crew Exploration Vehicles (CEV) is used as an initial test case. A questionnaire is tailored to elicit judgment from experts in high-risk environments. Based on its description and characteristics, the answers to the questionnaire produce information that serves as qualitative semantics for the evidence theory functions. The computational mechanism provides a heuristic approach for the compilation and presentation of the results. A follow-up evaluation serves as validation of the findings and provides useful information in terms of consistency and adaptability to other domains. The results of this methodology provide a useful and practical approach in conceptual design to aid the decision maker in assessing the level of uncertainty of the experts. The methodology presented is well suited for decision makers who work with similar conceptual design instruments.
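    To make the CCPF/CCBF terminology concrete, the sketch below computes complementary cumulative belief and plausibility curves from a basic probability assignment whose focal elements are intervals; the intervals and masses are invented for illustration and are not taken from the elicitation.

        # Minimal sketch: Dempster-Shafer belief/plausibility bounds on P(X > x).
        # Focal intervals and masses below are hypothetical, for illustration only.
        def ccbf(x, focal):
            # Bel(X > x): total mass of focal intervals lying entirely above x.
            return sum(m for (lo, hi), m in focal if lo > x)

        def ccpf(x, focal):
            # Pl(X > x): total mass of focal intervals intersecting (x, infinity).
            return sum(m for (lo, hi), m in focal if hi > x)

        # Hypothetical elicited evidence about a performance margin (masses sum to 1).
        focal = [((0.0, 0.4), 0.2), ((0.2, 0.7), 0.5), ((0.5, 1.0), 0.3)]

        for x in (0.1, 0.3, 0.6):
            print(f"x={x}:  CCBF={ccbf(x, focal):.2f}  <=  CCPF={ccpf(x, focal):.2f}")

    For any x the CCBF and CCPF bracket the complementary cumulative probability that a single distribution would assign, which is how the evidence-theory results are compared with the traditional CCDF.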

    Development of a New Local Adaptive Thresholding Method and Classification Algorithms for X-ray Machine Vision Inspection of Pecans

    This study evaluated selected local adaptive thresholding methods for pecan defect segmentation and proposed a new method, Reverse Water Flow. Good pecan nuts and fabricated defective pecan nuts were used for comparison, in addition to images from published research articles. For detailed comparison, defective and good pecans, 100 each, were collected from a mechanical sorter operating at the Pecan Research Farm, Oklahoma State University. To improve classification accuracy and reduce decision time, AdaBoost and support vector machine classifiers were applied and compared with a Bayesian classifier. The data set was randomly divided into training and validation sets, and 300 such runs were made. A new local adaptive thresholding method is proposed, based on a new hypothesis, reversing the simulated water flow, together with a simpler thresholding criterion. The new hypothesis reduced the computational time by 40-60% as compared to the existing fastest Oh method. The proposed method could segment both lar…
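    The Reverse Water Flow method itself is not described in the abstract; as a reference point for the class of local adaptive thresholding methods being compared, the sketch below applies a standard Niblack-style local threshold to flag dark regions, with the window size and parameter k chosen only for illustration.

        # Minimal sketch: Niblack-style local adaptive thresholding (a representative
        # baseline for comparison, NOT the proposed Reverse Water Flow method).
        import numpy as np
        from scipy.ndimage import uniform_filter

        def niblack_threshold(image, window=25, k=-0.2):
            """Flag pixels darker than the local threshold T = local mean + k * local std."""
            img = image.astype(float)
            mean = uniform_filter(img, size=window)
            sq_mean = uniform_filter(img * img, size=window)
            std = np.sqrt(np.maximum(sq_mean - mean * mean, 0.0))
            return img < (mean + k * std)

        # Synthetic X-ray-like image: bright background with a darker defect-like patch.
        rng = np.random.default_rng(1)
        xray = rng.normal(180.0, 10.0, size=(200, 200))
        xray[80:120, 80:120] -= 60.0
        defect_mask = niblack_threshold(xray)
        print("flagged pixels:", int(defect_mask.sum()))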

    Image Processing and Simulation Toolboxes of Microscopy Images of Bacterial Cells

    Recent advances in microscopy imaging technology have allowed the characterization of the dynamics of cellular processes at the single-cell and single-molecule level. Particularly in bacterial cell studies, using E. coli as a case study, these techniques have been used to detect and track internal cell structures such as the nucleoid and the cell wall, and fluorescently tagged molecular aggregates such as FtsZ proteins, Min system proteins, inclusion bodies and the different types of RNA molecules. These studies have been performed using multi-modal, multi-process, time-lapse microscopy, producing both morphological and functional images. To facilitate the finding of relationships between cellular processes, from small scale, such as gene expression, to large scale, such as cell division, an image processing toolbox was implemented with several automatic and/or manual features, such as cell segmentation and tracking, intra-modal and inter-modal image registration, as well as the detection, counting and characterization of several cellular components. Two segmentation algorithms for cellular components were implemented, the first based on the Gaussian distribution and the second based on thresholding and morphological structuring functions. These algorithms were used to segment nucleoids and to identify the different stages of FtsZ ring formation (allied with the use of machine learning algorithms), which made it possible to understand how temperature influences the physical properties of the nucleoid and to correlate those properties with the exclusion of protein aggregates from the center of the cell. Another study used the segmentation algorithms to study how temperature affects the formation of the FtsZ ring. The validation of the developed image processing methods and techniques has been based on benchmark databases manually produced and curated by experts. When dealing with thousands of cells and hundreds of images, these manually generated datasets can become the biggest cost in a research project. To expedite these studies and lower the cost of the manual labour, an image simulation toolbox was implemented to generate realistic artificial images. The proposed image simulation toolbox can generate biologically inspired objects that mimic the spatial and temporal organization of bacterial cells and their processes, such as cell growth and division, cell motility and cell morphology (shape, size and cluster organization). The image simulation toolbox was shown to be useful in the validation of three cell tracking algorithms: Simple Nearest-Neighbour, Nearest-Neighbour with Morphology and a DBSCAN cluster identification algorithm. It was shown that Simple Nearest-Neighbour still performed with great reliability when simulating objects with small velocities, while the other algorithms performed better for higher velocities and when larger clusters were present.
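    As an illustration of the simplest of the three tracking algorithms, the sketch below links cell centroids in consecutive frames by nearest-neighbour assignment with a maximum displacement; it is a generic version written for this summary, not the toolbox's implementation, and max_dist is an assumed parameter.

        # Minimal sketch: simple nearest-neighbour tracking between two frames.
        import numpy as np

        def nearest_neighbour_links(prev_centroids, curr_centroids, max_dist=5.0):
            """Map each current cell to its closest previous cell, or -1 if none is near."""
            prev = np.asarray(prev_centroids, dtype=float)
            curr = np.asarray(curr_centroids, dtype=float)
            links = {}
            for i, c in enumerate(curr):
                d = np.linalg.norm(prev - c, axis=1)      # distances to all previous cells
                j = int(np.argmin(d))
                links[i] = j if d[j] <= max_dist else -1  # -1: treated as a new cell
            return links

        # Example: two cells drift slightly between frames; a third appears.
        prev = [(10.0, 10.0), (30.0, 12.0), (50.0, 40.0)]
        curr = [(11.0, 10.5), (31.5, 12.2), (70.0, 70.0)]
        print(nearest_neighbour_links(prev, curr))        # {0: 0, 1: 1, 2: -1}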

    Application of Random Sets to Image Analysis

    Abstract not provided