
    Iterative Decoding and Turbo Equalization: The Z-Crease Phenomenon

    Iterative probabilistic inference, popularly dubbed the soft-iterative paradigm, has found great use in a wide range of communication applications, including turbo decoding and turbo equalization. The classic analysis of such iterative methods inevitably uses statistical and information-theoretic tools with an ensemble-average flavor. This paper considers the per-block error-rate performance and analyzes it using nonlinear dynamical theory. By modeling the iterative processor as a nonlinear dynamical system, we report a universal "Z-crease phenomenon": the zig-zag, up-and-down fluctuation -- rather than the monotonic decrease -- of the per-block errors as the number of iterations increases. Using the turbo decoder as an example, we also report several interesting motion phenomena, not previously reported, which appear to correspond well with the notions of "pseudo codewords" and "stopping/trapping sets." We further propose a heuristic stopping criterion to control Z-crease and identify the best iteration. Our stopping criterion is most useful for controlling the worst-case per-block errors, and helps to significantly reduce the average number of iterations.
    Comment: 6 pages
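
    The paper's stopping criterion is not spelled out in the abstract; the following is a minimal sketch, assuming access to a single-iteration decoder that returns soft LLRs, of the general idea of tracking a per-block reliability proxy and keeping the best iteration rather than the last one, so that Z-crease fluctuations do not hurt the final decision. The decoder interface and the mean-|LLR| proxy are illustrative assumptions, not the paper's actual rule.

```python
# Hedged sketch: pick the "best" iteration of a soft-iterative decoder by
# tracking a reliability proxy, instead of trusting the final iteration.
# The decoder interface (`decode_iteration`) and the proxy (mean |LLR|) are
# illustrative assumptions, not the criterion proposed in the paper.
import numpy as np

def run_with_best_iteration(decode_iteration, llr_init, max_iters=20, patience=3):
    """decode_iteration(llr) -> updated LLRs after one turbo iteration."""
    llr = llr_init.copy()
    best_llr, best_score, best_iter = llr.copy(), -np.inf, 0
    stall = 0
    for it in range(1, max_iters + 1):
        llr = decode_iteration(llr)
        score = np.mean(np.abs(llr))   # proxy for block reliability (assumption)
        if score > best_score:
            best_llr, best_score, best_iter = llr.copy(), score, it
            stall = 0
        else:
            stall += 1                 # Z-crease: quality went down, not up
        if stall >= patience:          # stop once no improvement is seen
            break
    # Hard decisions from the most reliable iteration, plus the iteration index.
    return np.where(best_llr < 0, 1, 0), best_iter
```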

    Scaling near Quantum Chaos Border in Interacting Fermi Systems

    The emergence of quantum chaos for interacting Fermi systems is investigated by numerical calculation of the level-spacing distribution $P(s)$ as a function of the interaction strength $U$ and the excitation energy $\epsilon$ above the Fermi level. As $U$ increases, $P(s)$ undergoes a transition from Poissonian (nonchaotic) to Wigner-Dyson (chaotic) statistics, and the transition is described by a single scaling parameter $Z = (U\epsilon^{\alpha} - u_0)\,\epsilon^{1/2\nu}$, where $u_0$ is a constant. While the exponent $\alpha$, which determines the global change of the chaos border, remains undetermined within a broad range of $0.9 \sim 2.0$, the finite value of $\nu$, which comes from the increase of the Fock-space size with $\epsilon$, suggests that the transition becomes sharp as $\epsilon$ increases.
    Comment: 4 pages, 4 figures, to appear in Phys. Rev. E (Rapid Communication)
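
    The diagnostic used here, the nearest-neighbor level-spacing distribution $P(s)$, is standard and easy to reproduce. The sketch below computes $P(s)$ from a given spectrum and compares it with the Poisson and Wigner-Dyson (GOE surmise) forms; the random symmetric matrix used as input is only a placeholder for the interacting-fermion Hamiltonian studied in the paper.

```python
# Sketch: nearest-neighbor level-spacing statistics P(s) for a given spectrum,
# compared against the Poisson and Wigner-Dyson (GOE surmise) forms used to
# diagnose the crossover to quantum chaos. The spectrum below comes from a
# random symmetric matrix, a stand-in for the actual many-body Hamiltonian.
import numpy as np

def spacing_distribution(levels, bins=40):
    E = np.sort(np.asarray(levels))
    s = np.diff(E)
    s = s / s.mean()                      # crude unfolding: unit mean spacing
    hist, edges = np.histogram(s, bins=bins, range=(0.0, 4.0), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, hist

def poisson(s):                           # uncorrelated (nonchaotic) levels
    return np.exp(-s)

def wigner_dyson(s):                      # GOE surmise for chaotic levels
    return (np.pi / 2) * s * np.exp(-np.pi * s**2 / 4)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    H = rng.standard_normal((500, 500))
    H = (H + H.T) / 2                     # GOE-like placeholder Hamiltonian
    centers, hist = spacing_distribution(np.linalg.eigvalsh(H))
    print(np.round(hist[:5], 3))
    print(np.round(wigner_dyson(centers[:5]), 3))   # should be close for GOE
```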

    Complementary cooperation, minimal winning coalitions, and power indices

    We introduce a new simple game, referred to as the complementary weighted multiple majority game (C-WMMG for short). C-WMMG models a basic cooperation rule, the complementary cooperation rule, and can be taken as a sister model of the famous weighted majority game (WMG for short). In this paper, we concentrate on the two-dimensional C-WMMG. An interesting property of this case is that there are at most $n+1$ minimal winning coalitions (MWCs for short), and they can be enumerated in time $O(n\log n)$, where $n$ is the number of players. This property makes the two-dimensional C-WMMG more tractable than WMG. In particular, we prove that the main power indices, i.e. the Shapley-Shubik index, the Penrose-Banzhaf index, the Holler-Packel index, and the Deegan-Packel index, are all polynomially computable. By comparison, WMG may have exponentially many MWCs, and none of the four power indices is polynomially computable for it (unless P=NP). Still for the two-dimensional case, we show that local monotonicity holds for all four power indices; in WMG, this property is possessed by the Shapley-Shubik and Penrose-Banzhaf indices, but not by the Holler-Packel or Deegan-Packel indices. Since our model fits the cooperation and competition in team sports very well, we hope it can be applied to measuring the value of players in team sports, for example helping to rank NBA players and select MVPs more objectively, and consequently bring new insights into contest theory and the more general field of sports economics. It may also shed light on the design of non-additive voting mechanisms. Last but not least, the threshold version of C-WMMG is a generalization of WMG, and natural variants of it are closely related to the famous airport game and the stable marriage/roommates problem.
    Comment: 60 pages
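
    The Holler-Packel and Deegan-Packel indices are defined directly from the set of minimal winning coalitions, so once the at most $n+1$ MWCs of a two-dimensional C-WMMG have been enumerated, computing them is straightforward. The sketch below takes an already enumerated MWC list as input; the $O(n\log n)$ enumeration itself is the paper's contribution and is not reproduced here.

```python
# Sketch: Holler-Packel and Deegan-Packel indices computed directly from an
# enumerated list of minimal winning coalitions (MWCs). The MWC list is
# assumed given; for two-dimensional C-WMMG the paper shows it has at most
# n+1 elements and can be produced in O(n log n) time.
from collections import defaultdict

def holler_packel(n_players, mwcs):
    counts = defaultdict(int)
    for S in mwcs:
        for i in S:
            counts[i] += 1                      # number of MWCs containing i
    total = sum(counts.values())
    return [counts[i] / total for i in range(n_players)]

def deegan_packel(n_players, mwcs):
    scores = defaultdict(float)
    for S in mwcs:
        for i in S:
            scores[i] += 1.0 / len(S)           # equal split within each MWC
    total = sum(scores.values())
    return [scores[i] / total for i in range(n_players)]

# Toy example with three players and MWCs {0,1} and {0,2}:
print(holler_packel(3, [{0, 1}, {0, 2}]))       # [0.5, 0.25, 0.25]
print(deegan_packel(3, [{0, 1}, {0, 2}]))       # [0.5, 0.25, 0.25]
```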

    Worst-Case Analysis of Process Flexibility Designs

    Theoretical studies of process flexibility designs have mostly focused on expected sales. In this paper, we take a different approach by studying process flexibility designs from the worst-case point of view. To study worst-case performance, we introduce the plant cover indices (PCIs), defined by bottlenecks in flexibility designs containing a fixed number of products. We prove that, given a flexibility design, a general class of worst-case performance measures can be expressed as functions of the design's PCIs and the given uncertainty set. This result has several major implications. First, it suggests a method to compare the worst-case performance of different flexibility designs without needing to know the specifics of the uncertainty sets. Second, we prove that under symmetric uncertainty sets and a large class of worst-case performance measures, the long chain, a celebrated sparse design, is superior to a large class of sparse flexibility designs, including any design that has a degree of two on each of its product nodes. Third, we show that under stochastic demand, the classical Jordan and Graves (JG) index can be expressed as a function of the PCIs. Furthermore, the PCIs motivate a modified JG index that is shown to be more effective in our numerical study. Finally, the PCIs lead to a heuristic for finding sparse flexibility designs that perform well under expected sales and have lower risk measures in our computational study.
    Funding: National Science Foundation (U.S.) (Grant CMMI-0758069); Masdar Institute of Science and Technology; Ford-MIT Alliance; Natural Sciences and Engineering Research Council of Canada (Postgraduate Scholarship)
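
    For concreteness, the long chain mentioned above can be written down as a plant-to-products adjacency map; the sketch below constructs it and checks the degree-two property on product nodes that defines the class of sparse designs it is compared against. The plant cover indices themselves depend on details not given in the abstract and are not implemented here.

```python
# Sketch: the long chain flexibility design on n plants and n products,
# represented as a plant -> products adjacency map. The check below verifies
# the structural property highlighted in the abstract: every product node
# has degree two.
def long_chain(n):
    """Plant i serves product i and product (i + 1) mod n."""
    return {i: {i, (i + 1) % n} for i in range(n)}

def product_degrees(design, n):
    deg = {j: 0 for j in range(n)}
    for products in design.values():
        for j in products:
            deg[j] += 1
    return deg

n = 6
chain = long_chain(n)
assert all(d == 2 for d in product_degrees(chain, n).values())
print(chain)   # e.g. plant 0 -> {0, 1}, ..., plant 5 -> {5, 0}
```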

    Aggregation of Coarse Preferences

    We consider weak preference orderings over a set $A_n$ of $n$ alternatives. An individual preference is of refinement l ...
    Keywords: individual preferences; voting rules; aggregation

    Pseudorandom Generators for Low Sensitivity Functions

    A Boolean function is said to have maximal sensitivity $s$ if $s$ is the largest number of Hamming neighbors of a point that differ from it in function value. We initiate the study of pseudorandom generators fooling low-sensitivity functions as an intermediate step towards settling the sensitivity conjecture. We construct a pseudorandom generator with seed length $2^{O(s^{1/2})}\log(n)$ that fools Boolean functions on $n$ variables with maximal sensitivity at most $s$. Prior to our work, the best (implicit) pseudorandom generators for this class of functions required seed length $2^{O(s)}\log(n)$.
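
    The maximal sensitivity used here can be computed by brute force directly from its definition; the sketch below does exactly that for small $n$, which is handy for sanity-checking examples (parity has maximal sensitivity $n$, and OR reaches $n$ at the all-zeros point).

```python
# Sketch: brute-force maximal sensitivity of a Boolean function f on n bits,
# following the definition in the abstract: the largest, over all points x,
# number of Hamming neighbors of x on which f differs from f(x).
from itertools import product

def max_sensitivity(f, n):
    best = 0
    for x in product((0, 1), repeat=n):
        fx = f(x)
        flips = sum(
            f(x[:i] + (1 - x[i],) + x[i + 1:]) != fx   # flip bit i
            for i in range(n)
        )
        best = max(best, flips)
    return best

# Examples: parity has maximal sensitivity n; OR attains n at the all-zeros
# point; a constant function has sensitivity 0.
print(max_sensitivity(lambda x: sum(x) % 2, 4))   # 4
print(max_sensitivity(lambda x: int(any(x)), 4))  # 4
print(max_sensitivity(lambda x: 0, 4))            # 0
```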