
    Certain and possible rules for decision making using rough set theory extended to fuzzy sets

    Uncertainty may be caused by ambiguity in the terms used to describe a specific situation. It may also be caused by skepticism about the rules used to describe a course of action, or by missing and/or erroneous data. To deal with uncertainty, techniques other than classical logic need to be developed. Although statistics may be the best tool available for handling likelihood, it is not always adequate for dealing with knowledge acquisition under uncertainty. Inadequacies caused by estimating probabilities in statistical processes can be alleviated through use of the Dempster-Shafer theory of evidence. Fuzzy set theory is another tool used to deal with uncertainty where ambiguous terms are present. Other methods include rough sets, the theory of endorsements, and nonmonotonic logic. J. Grzymala-Busse has defined the concept of the lower and upper approximation of a (crisp) set and has used that concept to extract rules from a set of examples. We will define the fuzzy analogs of lower and upper approximations and use these to obtain certain and possible rules from a set of examples where the data are fuzzy. Central to these concepts will be the degree to which a fuzzy set A is contained in another fuzzy set B, and the degree of intersection of a set A with a set B. These concepts will also give meaning to the statement "A implies B" in two senses: (1) if x is certainly in A then it is certainly in B, and (2) if x is possibly in A then it is possibly in B. Next, classification will be examined, and it will be shown that if a classification is well externally definable then it is well internally definable, and if it is poorly externally definable then it is poorly internally definable, thus generalizing a result of Grzymala-Busse. Finally, some ideas on how to define consensus and group opinions to form clusters of rules will be given.
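
    A minimal illustration of the two central degrees, for finite fuzzy sets given as membership dictionaries. The Kleene-Dienes implication and the min/max norms used below are one common choice, assumed here since the abstract does not fix the operators.

        # Containment and intersection degrees for finite fuzzy sets.
        # Operator choice (Kleene-Dienes implication, min/max norms) is
        # an assumption; the paper's exact definitions are not given here.

        def containment_degree(a, b):
            """Degree to which fuzzy set a is contained in fuzzy set b.

            a, b: dicts mapping a common universe to memberships in [0, 1].
            """
            return min(max(1.0 - a[x], b[x]) for x in a)

        def intersection_degree(a, b):
            """Degree to which fuzzy sets a and b intersect."""
            return max(min(a[x], b[x]) for x in a)

        # Certain rule "A implies B": high containment degree.
        # Possible rule "A implies B": high intersection degree.
        A = {"x1": 0.9, "x2": 0.3, "x3": 0.0}
        B = {"x1": 0.8, "x2": 0.6, "x3": 0.1}
        print(containment_degree(A, B))   # 0.7 -> supports a certain rule
        print(intersection_degree(A, B))  # 0.8 -> supports a possible rule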

    A Dimension Reduction Scheme for the Computation of Optimal Unions of Subspaces

    Given a set of points $\mathcal{F}$ in a high-dimensional space, the difficulty of finding a union of subspaces $\bigcup_i V_i \subset \mathbb{R}^N$ that best explains the data $\mathcal{F}$ increases dramatically with the dimension of $\mathbb{R}^N$. In this article, we study a class of transformations that map the problem into another one in lower dimension. We use the best model in the low-dimensional space to approximate the best solution in the original high-dimensional space. We then estimate the error produced between this solution and the optimal solution in the high-dimensional space.

    Comment: 15 pages. Some corrections were added; in particular, the title was changed. It will appear in "Sampling Theory in Signal and Image Processing".
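
    A minimal sketch of the reduce-then-solve idea for the simplest case of a single subspace; the union-of-subspaces case alternates between assigning points to classes and fitting each class the same way. The Gaussian random projection and all names below are illustrative assumptions, not the paper's specific class of transformations.

        import numpy as np

        def best_subspace(points, dim):
            """Best-fit dim-dimensional subspace through the origin, via SVD.

            points: array of shape (num_points, ambient_dim).
            Returns an orthonormal basis of shape (ambient_dim, dim).
            """
            _, _, vt = np.linalg.svd(points, full_matrices=False)
            return vt[:dim].T

        rng = np.random.default_rng(0)
        N, r, n_pts = 1000, 3, 200      # ambient dim, subspace dim, sample size

        # Synthetic data near a random r-dimensional subspace of R^N.
        basis = np.linalg.qr(rng.standard_normal((N, r)))[0]
        F = rng.standard_normal((n_pts, r)) @ basis.T \
            + 0.01 * rng.standard_normal((n_pts, N))

        # Map the problem into a lower dimension with a random projection.
        d = 20                          # reduced dimension, d << N
        P = rng.standard_normal((d, N)) / np.sqrt(d)
        F_low = F @ P.T

        # Solve in low dimension and compare residuals with the direct fit.
        V_low, V_high = best_subspace(F_low, r), best_subspace(F, r)
        res_low = np.linalg.norm(F_low - F_low @ V_low @ V_low.T)
        res_high = np.linalg.norm(F - F @ V_high @ V_high.T)
        print(res_low, res_high)        # the low-dimensional fit tracks the optimum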

    Closed sets of correlations: answers from the zoo

    We investigate the conditions under which a set of multipartite nonlocal correlations can describe the distributions achievable by distant parties conducting experiments in a consistent universe. Several questions are posed, such as: are all such sets "nested", i.e., contained in one another? Are they discrete, or do they form a continuum? How many of them are supraquantum? Are there non-trivial polytopes among them? We answer some of these questions, or relate them to established conjectures in complexity theory, by introducing a "zoo" of physically consistent sets which can be characterized efficiently via either linear or semidefinite programming. As a bonus, we use the zoo to derive, for the first time, concrete impossibility results in nonlocality distillation.

    Comment: 24 pages, 5 figures
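
    The smallest member of any such zoo is the set of local (classical) correlations, whose linear-programming characterization the abstract alludes to. The sketch below tests membership in the local polytope for the two-input, two-output (CHSH) scenario; this is standard material, assumed as an illustration rather than taken from the paper.

        import numpy as np
        from itertools import product
        from scipy.optimize import linprog

        # Deterministic local strategies: Alice outputs a = f(x), Bob b = g(y),
        # with f, g ranging over all functions {0,1} -> {0,1}.
        strategies = []
        for f in product([0, 1], repeat=2):
            for g in product([0, 1], repeat=2):
                p = np.zeros(16)
                for x, y in product([0, 1], repeat=2):
                    p[8 * x + 4 * y + 2 * f[x] + g[y]] = 1.0  # index of p(ab|xy)
                strategies.append(p)
        D = np.array(strategies).T      # 16 x 16: columns are deterministic points

        def is_local(p):
            """True iff behavior p(ab|xy) is a convex mix of deterministic points."""
            res = linprog(c=np.zeros(D.shape[1]),
                          A_eq=np.vstack([D, np.ones(D.shape[1])]),
                          b_eq=np.append(p, 1.0),
                          bounds=(0, None))
            return res.success

        # The Popescu-Rohrlich box (a ^ b = x & y with probability 1) is the
        # standard supraquantum example, hence nonlocal:
        pr = np.zeros(16)
        for x, y, a, b in product([0, 1], repeat=4):
            if (a ^ b) == (x & y):
                pr[8 * x + 4 * y + 2 * a + b] = 0.5
        print(is_local(pr))             # False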

    Towards cohomology of renormalization: bigrading the combinatorial Hopf algebra of rooted trees

    The renormalization of quantum field theory twists the antipode of a noncocommutative Hopf algebra of rooted trees, decorated by an infinite set of primitive divergences. The Hopf algebra of undecorated rooted trees, ${\cal H}_R$, generated by a single primitive divergence, solves a universal problem in Hochschild cohomology. It has two nontrivial closed Hopf subalgebras: the cocommutative subalgebra ${\cal H}_{\rm ladder}$ of pure ladder diagrams and the Connes-Moscovici noncocommutative subalgebra ${\cal H}_{\rm CM}$ of noncommutative geometry. These three Hopf algebras admit a bigrading by $n$, the number of nodes, and an index $k$ that specifies the degree of primitivity. In each case, we use iterations of the relevant coproduct to compute the dimensions of subspaces with modest values of $n$ and $k$ and infer a simple generating procedure for the remainder. The results for ${\cal H}_{\rm ladder}$ are familiar from the theory of partitions, while those for ${\cal H}_{\rm CM}$ involve novel transforms of partitions. Most beautiful is the bigrading of ${\cal H}_R$, the largest of the three. Thanks to Sloane's {\tt superseeker}, we discovered that it saturates all possible inequalities. We prove this by using the universal Hochschild-closed one-cocycle $B_+$, which plugs one set of divergences into another, and by generalizing the concept of natural growth beyond that entailed by the Connes-Moscovici case. We emphasize the yet greater challenge of handling the infinite set of decorations of realistic quantum field theory.

    Comment: 21 pages, LaTeX
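
    The grading of ${\cal H}_R$ by $n$ rests on the count of undecorated rooted trees with $n$ nodes (OEIS A000081). The sketch below evaluates the standard Euler-transform recurrence for these counts; the refinement by the primitivity index $k$ is beyond this illustration.

        # Number of rooted trees on n nodes (OEIS A000081), via the standard
        # recurrence r(n+1) = (1/n) * sum_{k=1..n} (sum_{d|k} d*r(d)) * r(n-k+1).

        def rooted_trees(n_max):
            """Return [r(1), ..., r(n_max)]."""
            r = [0, 1]                  # r[0] is a placeholder; r[1] = 1
            for n in range(1, n_max):
                total = sum(
                    sum(d * r[d] for d in range(1, k + 1) if k % d == 0)
                    * r[n - k + 1]
                    for k in range(1, n + 1)
                )
                r.append(total // n)    # the division is always exact here
            return r[1:]

        print(rooted_trees(8))          # [1, 1, 2, 4, 9, 20, 48, 115]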

    Efficient Random Assignment under a Combination of Ordinal and Cardinal Information on Preferences

    Consider a collection of m indivisible objects to be allocated to n agents, where m = n. Each agent falls into one of two distinct categories: either he (a) has a complete ordinal ranking over the set of individual objects, or (b) has a set of “plausible” benchmark von Neumann-Morgenstern (vNM) utility functions in whose non-negative span his “true” utility is known to lie. An allocation is undominated if there does not exist a preference-compatible profile of vNM utilities at which it is Pareto dominated by another feasible allocation. Given an undominated allocation, we use the tools of linear duality theory to construct a profile of vNM utilities at which it is ex-ante welfare maximizing. A finite set of preference-compatible vNM utility profiles is exhibited such that every undominated allocation is ex-ante welfare maximizing with respect to at least one of them. Given an arbitrary allocation, we provide an interpretation of the constructed vNM utilities as subgradients of a function which measures worst-case domination.

    Keywords: Random Assignment, Efficiency, Duality, Linear Programming
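
    The domination test behind the duality argument is itself a linear program. The sketch below checks whether a random assignment is ex-ante Pareto dominated at one fixed profile of vNM utilities; the doubly stochastic feasibility set and all names are illustrative assumptions, not the paper's construction.

        import numpy as np
        from scipy.optimize import linprog

        def is_dominated(U, X, tol=1e-9):
            """U, X: (n, n) vNM utilities and a doubly stochastic assignment.

            Maximizes the total utility gain subject to no agent losing;
            X is dominated iff the optimal gain is strictly positive.
            """
            n = U.shape[0]
            base = (U * X).sum(axis=1)          # each agent's utility at X
            c = -U.ravel()                      # maximize sum_i U_i . Y_i
            A_eq, b_eq = [], []                 # rows and columns of Y sum to 1
            for i in range(n):
                row = np.zeros(n * n); row[i * n:(i + 1) * n] = 1
                col = np.zeros(n * n); col[i::n] = 1
                A_eq += [row, col]; b_eq += [1.0, 1.0]
            A_ub = np.zeros((n, n * n))         # no agent worse off:
            for i in range(n):                  # -U_i . Y_i <= -base_i
                A_ub[i, i * n:(i + 1) * n] = -U[i]
            res = linprog(c, A_ub=A_ub, b_ub=-base, A_eq=np.array(A_eq),
                          b_eq=np.array(b_eq), bounds=(0, 1))
            return res.success and -res.fun - base.sum() > tol

        U = np.array([[2.0, 1.0], [1.0, 2.0]])
        X_bad = np.array([[0.0, 1.0], [1.0, 0.0]])  # each gets the worse object
        print(is_dominated(U, X_bad))               # True: swapping dominates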

    Anticipation and the Non-linear Dynamics of Meaning-Processing in Social Systems

    Social order does not exist as a stable phenomenon but can be considered an "order of reproduced expectations." When anticipations operate upon one another, they can generate a non-linear dynamics which processes meaning. Although specific meanings can be stabilized, for example in social institutions, all meaning arises from a global horizon of possible meanings. Using Luhmann's (1984) social systems theory and Rosen's (1985) theory of anticipatory systems, I submit algorithms for modeling the non-linear dynamics of meaning in social systems. First, a self-referential system can use a model of itself for the anticipation. Under the condition of functional differentiation, the social system can be expected to entertain a set of models; each model can also contain a model of the other models. Two anticipatory mechanisms are then possible: a transversal one between the models, and a longitudinal one providing the system with a variety of meanings. A system containing two anticipatory mechanisms can become hyper-incursive. Without making decisions, however, a hyper-incursive system would be overloaded with uncertainty. Under this pressure, informed decisions tend to replace the "natural preferences" of agents, and a knowledge-based order can increasingly be shaped.
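
    The anticipatory mechanisms invoked here are usually modeled with Dubois-style incursive equations (an attribution assumed from the surrounding literature, not stated in this abstract). A minimal sketch: the incursive logistic map has a unique forward solution, while the hyper-incursive map yields two roots at every step, so the system must "decide" between them; the random decision rule below is an illustrative assumption.

        import math
        import random

        def incursive_step(x, a=4.0):
            """Incursive logistic x(t) = a*x(t-1)*(1 - x(t)), solved for x(t)."""
            return a * x / (1.0 + a * x)

        def hyperincursive_step(x, a=4.0, decide=random.random):
            """Hyper-incursive x(t) = a*x(t+1)*(1 - x(t+1)), solved forward.

            The quadratic has two roots; a decision must select one of them.
            """
            root = math.sqrt(max(0.0, 1.0 - 4.0 * x / a))
            return 0.5 * (1.0 + root) if decide() < 0.5 else 0.5 * (1.0 - root)

        x = y = 0.3
        for t in range(5):
            x, y = incursive_step(x), hyperincursive_step(y)
            print(t, round(x, 4), round(y, 4))
        # The incursive trajectory converges to a fixed point; the
        # hyper-incursive one keeps branching unless decisions stabilize it.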