42,517 research outputs found

    How to Couple from the Past Using a Read-Once Source of Randomness

    We give a new method for generating perfectly random samples from the stationary distribution of a Markov chain. The method is related to coupling from the past (CFTP), but only runs the Markov chain forwards in time, and never restarts it at previous times in the past. The method is also related to an idea known as PASTA (Poisson arrivals see time averages) in the operations research literature. Because the new algorithm can be run using a read-once stream of randomness, we call it read-once CFTP. The memory and time requirements of read-once CFTP are on par with the requirements of the usual form of CFTP, and for a variety of applications the requirements may be noticeably less. Some perfect sampling algorithms for point processes are based on an extension of CFTP known as coupling into and from the past; for completeness, we give a read-once version of coupling into and from the past, but it remains impractical. For these point process applications, we give an alternative coupling method with which read-once CFTP may be efficiently used. Comment: 28 pages, 2 figures
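    The construction itself is simple to sketch: draw i.i.d. blocks of randomness, seed the sample with the image of the first block that coalesces the whole state space, push that sample forward through the following blocks, and output the state held when the next coalescing block arrives. Below is a minimal Python sketch of this scheme on a toy monotone chain (a reflecting random walk, whose stationary law is uniform); the chain, N, and BLOCK_LEN are illustrative assumptions, not taken from the paper.

```python
import random

# Toy monotone chain: reflecting random walk on {0, ..., N-1}. All
# states share the same uniform u each step, so the coupling is
# monotone and coalescence of 0 and N-1 implies total coalescence.
# N, BLOCK_LEN, and the chain itself are illustrative assumptions.
N = 8
BLOCK_LEN = 64   # steps of randomness per block (a tuning choice)

def step(x, u):
    return max(x - 1, 0) if u < 0.5 else min(x + 1, N - 1)

def apply_block(x, block):
    for u in block:
        x = step(x, u)
    return x

def coalescent(block):
    """A block coalesces iff the top and bottom chains meet under it."""
    lo, hi = apply_block(0, block), apply_block(N - 1, block)
    return lo == hi, lo

def read_once_cftp():
    # Seed with the image of the first coalescing block...
    while True:
        block = [random.random() for _ in range(BLOCK_LEN)]
        ok, x = coalescent(block)
        if ok:
            break
    # ...then run forwards; the state held when the NEXT coalescing
    # block arrives (without applying it) is the exact sample.
    while True:
        block = [random.random() for _ in range(BLOCK_LEN)]
        ok, _ = coalescent(block)
        if ok:
            return x
        x = apply_block(x, block)

print([read_once_cftp() for _ in range(10)])  # ~uniform on {0,...,7}
```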

    Low-Complexity Quantized Switching Controllers using Approximate Bisimulation

    In this paper, we consider the problem of synthesizing low-complexity controllers for incrementally stable switched systems. For that purpose, we establish a new approximation result for the computation of symbolic models that are approximately bisimilar to a given switched system. The main advantage over existing results is that it allows us to design naturally quantized switching controllers for safety or reachability specifications; these can be pre-computed offline, and therefore the online execution time is reduced. Then, we present a technique to reduce the memory needed to store the control law by borrowing ideas from algebraic decision diagrams for compact function representation and by exploiting the non-determinism of the synthesized controllers. We show the merits of our approach by applying it to a simple model of temperature regulation in a building.
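    As a rough picture of why quantized controllers are cheap to execute, the sketch below pre-computes a lookup table over a uniform state grid for a toy one-room temperature model, then runs it online with a single quantization and table read. The one-step safe-mode check stands in for the paper's symbolic fixed-point synthesis, and every constant (grid step, dynamics, safe band) is an illustrative assumption.

```python
# One-room toy: two modes (heater off/on). The controller is a plain
# lookup table over a uniform state grid, built offline so that the
# online step is one quantization plus one table read. The one-step
# safety check below is a stand-in for the paper's fixed-point
# synthesis; all constants are illustrative assumptions.

ETA = 0.1                    # state quantization step (deg C)
T_MIN, T_MAX = 15.0, 25.0    # modelled temperature range
SAFE = (19.0, 23.0)          # safety specification
TAU = 0.5                    # sampling period

def dynamics(T, mode):
    """Toy switched dynamics: mode 1 = heater on, mode 0 = off."""
    alpha, T_env, T_heat = 0.05, 10.0, 50.0
    dT = alpha * (T_env - T) + (alpha * (T_heat - T) if mode else 0.0)
    return T + TAU * dT

def quantize(T):
    return round((T - T_MIN) / ETA)

# Offline: pick, per grid cell, any mode whose one-step successor is
# safe. Several modes may qualify (controller non-determinism);
# storing just one per cell is what keeps the table small.
table = {}
for q in range(int((T_MAX - T_MIN) / ETA) + 1):
    T = T_MIN + q * ETA
    for mode in (0, 1):
        if SAFE[0] <= dynamics(T, mode) <= SAFE[1]:
            table[q] = mode
            break

def controller(T, default=0):
    """Online execution: one table lookup, no optimization."""
    return table.get(quantize(T), default)

print(controller(19.2), controller(22.8))  # heater on low, off high
```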

    State-Dependent Computation Using Coupled Recurrent Networks

    Although conditional branching between possible behavioral states is a hallmark of intelligent behavior, very little is known about the neuronal mechanisms that support this processing. In a step toward solving this problem, we demonstrate by theoretical analysis and simulation how networks of richly interconnected neurons, such as those observed in the superficial layers of the neocortex, can embed reliable, robust finite state machines. We show how a multistable neuronal network containing a number of states can be created very simply by coupling two recurrent networks whose synaptic weights have been configured for soft winner-take-all (sWTA) performance. These two sWTAs have simple, homogeneous, locally recurrent connectivity except for a small fraction of recurrent cross-connections between them, which are used to embed the required states. This coupling between the maps allows the network to continue to express the current state even after the input that elicited that state is withdrawn. In addition, a small number of transition neurons implement the necessary input-driven transitions between the embedded states. We provide simple rules to systematically design and construct neuronal state machines of this kind. The significance of our finding is that it offers a method whereby the cortex could construct networks supporting a broad range of sophisticated processing by applying only small specializations to the same generic neuronal circuit.
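    A minimal rate-model sketch of this idea, with invented parameters rather than the paper's: two soft-WTA maps with sparse cross-coupling hold a selected state after the cue is withdrawn, and a single transition neuron, which fires only when its source state and an input symbol are both present, implements one edge of the state machine.

```python
import numpy as np

# Two coupled soft-WTA maps r1, r2 embedding K states; the gains
# below are illustrative assumptions chosen so that a winner persists
# (ALPHA + C - GAMMA > 1) while losers decay to zero.
K = 3
ALPHA, C, GAMMA = 1.4, 0.6, 0.8  # self-excitation, coupling, inhibition

def act(x):
    return np.clip(x, 0.0, 1.0)  # saturating piecewise-linear rate

r1 = np.zeros(K)
r2 = np.zeros(K)

for t in range(120):
    cue = 1.0 if t < 5 else 0.0          # brief cue selecting state 0
    sym = 1.0 if 60 <= t < 70 else 0.0   # later input symbol
    # Transition neuron for the edge state0 --symbol--> state1: it
    # needs BOTH the held state (read from the partner map, which
    # lags) and the symbol to exceed its threshold.
    t01 = act(r2[0] + sym - 1.5)
    d1 = ALPHA * r1 + C * r2 - GAMMA * r1.sum()
    d1[0] += cue - 2.0 * t01   # cue selects state 0; t01 suppresses it
    d1[1] += 2.0 * t01         # ...and drives the target state 1
    d2 = ALPHA * r2 + C * r1 - GAMMA * r2.sum()
    r1, r2 = act(d1), act(d2)  # synchronous update of both maps
    if t in (30, 110):
        print(f"step {t}: active state = {np.argmax(r1)}")
# expected: state 0 persists after the cue, state 1 after the symbol
```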

    Capacity of a Nonlinear Optical Channel with Finite Memory

    The channel capacity of a nonlinear, dispersive fiber-optic link is revisited. To this end, the popular Gaussian noise (GN) model is extended with a parameter to account for the finite memory of realistic fiber channels. This finite-memory model is harder to analyze mathematically but, in contrast to previous models, it is valid also for nonstationary or heavy-tailed input signals. For uncoded transmission and standard modulation formats, the new model gives the same results as the regular GN model when the memory of the channel is about 10 symbols or more, confirming earlier findings that the GN model is accurate for uncoded transmission. However, when coding is considered, the results obtained using the finite-memory model are very different from those obtained by previous models, even when the channel memory is large. In particular, the peaky behavior of the channel capacity, which has been reported for numerous nonlinear channel models, appears to be an artifact of applying models derived for independent input in a coded (i.e., dependent) scenario.
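    The "peak" in question is easy to reproduce with the memoryless GN model, where nonlinear interference grows cubically in launch power, so the Gaussian-input rate estimate rises and then falls. A short sketch with invented constants (N_ASE and ETA are assumptions, not fitted values):

```python
import numpy as np

# Memoryless GN-model rate estimate: nonlinear interference scales as
# ETA * P**3, so log2(1 + SNR) peaks at a finite launch power.
N_ASE = 1e-4   # ASE noise power (assumption, normalized units)
ETA = 1e-3     # nonlinear interference coefficient (assumption)

P = np.logspace(-3, 1, 200)        # launch power sweep (normalized)
snr = P / (N_ASE + ETA * P**3)     # GN-model effective SNR
rate = np.log2(1.0 + snr)          # Gaussian-input rate estimate

i = np.argmax(rate)
print(f"peak rate {rate[i]:.2f} bit/symbol at P = {P[i]:.3g}")
```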