13 research outputs found

    Channels That Die

    Full text link
    Given the possibility of communication systems failing catastrophically, we investigate limits to communicating over channels that fail at random times. These channels are finite-state semi-Markov channels. We show that communication with arbitrarily small probability of error is not possible. Making use of results in finite blocklength channel coding, we determine sequences of blocklengths that optimize the transmission volume communicated at fixed maximum message error probabilities. We provide a partial ordering of communication channels. A dynamic programming formulation is used to show the structural result that channel state feedback does not improve performance.
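A minimal sketch of the blocklength-sequence optimization the abstract describes, under assumed stand-ins: a geometric (memoryless) failure time rather than the paper's semi-Markov model, and the Gaussian normal approximation for per-block log-volume. All names and parameter values here are hypothetical, not taken from the paper.

```python
import math
from functools import lru_cache
from statistics import NormalDist

def block_volume(n, eps=1e-3, C=0.5, V=0.25):
    """Normal-approximation log-volume of one length-n block (illustrative):
    n*C - sqrt(n*V) * Qinv(eps)."""
    return max(0.0, n * C - math.sqrt(n * V) * NormalDist().inv_cdf(1 - eps))

def best_expected_volume(horizon=1000, p_die=0.001, choices=(50, 100, 200, 400)):
    """Dynamic program over blocklength sequences for a channel that dies at a
    geometric random time (toy stand-in for the semi-Markov failure model).
    A block's volume counts only if the channel survives all n of its uses."""
    @lru_cache(maxsize=None)
    def go(t):
        if t >= horizon:
            return 0.0
        # Expected value: survive this block, collect its volume, then recurse.
        return max((1.0 - p_die) ** n * (block_volume(n) + go(t + n))
                   for n in choices)
    return go(0)
```

Because failure here is memoryless, observing the channel state cannot change the optimal schedule, loosely echoing the feedback result the abstract reports.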

    On the Throughput of Channels that Wear Out

    Full text link
    This work investigates the fundamental limits of communication over a noisy discrete memoryless channel that wears out, in the sense of signal-dependent catastrophic failure. In particular, we consider a channel that starts as a memoryless binary-input channel and, when the number of transmitted ones causes a sufficient amount of damage, ceases to convey signals. Constant composition codes are adopted to obtain an achievability bound, and the left-concave right-convex inequality is then refined to obtain a converse bound on the log-volume throughput for channels that wear out. Since infinite blocklength codes will always wear out the channel for any finite failure threshold and therefore cannot convey information at positive rates, we analyze the performance of finite blocklength codes to determine the maximum expected transmission volume at a given level of average error probability. We show that this maximization problem has a recursive form and can be solved by dynamic programming. Numerical results demonstrate that a sequence of block codes is preferred to a single block code for streaming sources.
    Comment: 23 pages, 1 table, 11 figures, submitted to IEEE Transactions on Communication
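To make the recursive structure concrete, here is a toy dynamic program in which each transmitted block consumes part of a wear budget, standing in for the threshold on transmitted ones. The volume function and the wear-per-symbol model are hypothetical illustrations, not the paper's actual bounds.

```python
import math
from functools import lru_cache
from statistics import NormalDist

WEAR_PER_SYMBOL = 0.5  # assumed average damage per channel use (toy model)

def block_volume(n, eps=1e-3, C=0.5, V=0.25):
    """Illustrative normal-approximation log-volume of one length-n block."""
    return max(0.0, n * C - math.sqrt(n * V) * NormalDist().inv_cdf(1 - eps))

@lru_cache(maxsize=None)
def max_volume(budget):
    """Maximum total log-volume before wear-out: a length-n block spends
    about n * WEAR_PER_SYMBOL units of the damage budget, so the recursion
    chooses a blocklength and continues with whatever budget remains."""
    best = 0.0
    n = 10
    while int(n * WEAR_PER_SYMBOL) <= budget:
        best = max(best,
                   block_volume(n) + max_volume(budget - int(n * WEAR_PER_SYMBOL)))
        n += 10
    return best
```

The recursion terminates precisely because any block spends a strictly positive slice of the finite budget, mirroring why infinite blocklength codes convey no information at positive rates here.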

    Communication Strategies for Low-Latency Trading

    Full text link
    The possibility of latency arbitrage in financial markets has led to the deployment of high-speed communication links between distant financial centers. These links are noisy, so there is a need for coding. In this paper, we develop a game-theoretic model of trading behavior in which two traders compete to capture latency arbitrage opportunities using binary signalling. Different coding schemes are strategies that trade off reliability against latency. When one trader has a better channel, the second trader should not compete. With statistically identical channels, we find two regimes of channel noise: in one, there is a unique Nash equilibrium yielding ties; in the other, there are two Nash equilibria with different winners.
    Comment: Will appear in IEEE International Symposium on Information Theory (ISIT), 201
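A hedged sketch of the strategic structure: each trader picks a blocklength, trading reliability against latency, and pure Nash equilibria are found by brute-force enumeration. The reliability curve and payoff rules below are assumptions for illustration, not the paper's model.

```python
def success(n, p=0.3):
    """Toy reliability curve: longer codes decode correctly more often."""
    return 1.0 - p ** n

def payoffs(na, nb, p=0.3):
    """Expected win probabilities: the faster (shorter-blocklength) trader
    acts first and wins if it decodes correctly; otherwise the slower trader
    gets its chance. Simultaneous correct decodings split the prize."""
    sa, sb = success(na, p), success(nb, p)
    if na < nb:
        return sa, (1.0 - sa) * sb
    if nb < na:
        return (1.0 - sb) * sa, sb
    tie = 0.5 * sa * sb
    return tie + sa * (1.0 - sb), tie + sb * (1.0 - sa)

def pure_nash(strategies=range(1, 6), p=0.3):
    """Enumerate pure-strategy Nash equilibria over blocklength choices."""
    eqs = []
    for na in strategies:
        for nb in strategies:
            ua, ub = payoffs(na, nb, p)
            if all(payoffs(x, nb, p)[0] <= ua + 1e-12 for x in strategies) and \
               all(payoffs(na, y, p)[1] <= ub + 1e-12 for y in strategies):
                eqs.append((na, nb))
    return eqs
```

With these toy numbers the unique pure equilibrium is the shortest code on both sides, a tie regime resembling the one the abstract identifies; other noise levels in the paper's model instead produce two equilibria with different winners.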

    The third-order term in the normal approximation for singular channels

    Full text link
    For a singular and symmetric discrete memoryless channel with positive dispersion, the third-order term in the normal approximation is shown to be upper bounded by a constant. This finding completes the characterization of the third-order term for symmetric discrete memoryless channels. The proof method is extended to asymmetric, singular channels with constant composition codes, and its connection to existing results, as well as its limitations in the error exponents regime, are discussed.
    Comment: Submitted to IEEE Trans. Inform. Theor
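For context, the normal approximation in question expands the maximum code size $M^*(n,\varepsilon)$ achievable at blocklength $n$ and error probability $\varepsilon$; the result says that for singular symmetric channels the third-order term is $O(1)$, i.e. bounded by a constant rather than growing like $\tfrac{1}{2}\log n$:

```latex
\log M^*(n,\varepsilon) = nC - \sqrt{nV}\, Q^{-1}(\varepsilon) + \underbrace{O(1)}_{\text{third-order term}}
```

where $C$ is the capacity, $V$ the (positive) channel dispersion, and $Q^{-1}$ the inverse Gaussian tail function.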

    Connectivity Solutions in Automated Trading

    Get PDF
    The study analyzes the architecture and deployment of direct market access (DMA) solutions for automated trading of securities. It provides an overview of automated trading systems, including trading floor architecture, trading environment connectivity, and DMA solutions. Among a range of factors influencing operational capacity, round-trip latency has been recognized as the key quality differentiator of an automated trading floor. The study identifies the potential opportunity costs of latency as a major driver of technological progress in trading under highly liquid market conditions.

    Unreliable and resource-constrained decoding

    Get PDF
    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2010. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Cataloged from student-submitted PDF version of thesis. Includes bibliographical references (p. 185-213).

    Traditional information theory and communication theory assume that decoders are noiseless and operate without transient or permanent faults. Decoders are also traditionally assumed to be unconstrained in physical resources like material, memory, and energy. This thesis studies how constraining reliability and resources in the decoder limits the performance of communication systems. Five communication problems are investigated. Broadly speaking, these are communication using decoders that are wiring cost-limited, that are memory-limited, that are noisy, that fail catastrophically, and that simultaneously harvest information and energy. For each of these problems, fundamental trade-offs between communication system performance and reliability or resource consumption are established.

    For decoding repetition codes using consensus decoding circuits, the optimal trade-off between decoding speed and quadratic wiring cost is defined and established. Designing optimal circuits is shown to be NP-complete, but is carried out for small circuit sizes. The natural relaxation of the integer circuit design problem is shown to be a reverse convex program. Random circuit topologies are also investigated. Uncoded transmission is investigated when a population of heterogeneous sources must be categorized due to decoder memory constraints. Quantizers that are optimal for mean Bayes risk error, a novel fidelity criterion, are designed. Human decision making in segregated populations is also studied with this framework. The ratio between the costs of false alarms and missed detections is also shown to fundamentally affect the essential nature of discrimination.

    The effect of noise on iterative message-passing decoders for low-density parity-check (LDPC) codes is studied. Concentration of decoding performance around its average is shown to hold. Density evolution equations for noisy decoders are derived. Decoding thresholds degrade smoothly as decoder noise increases, and in certain cases, arbitrarily small final error probability is achievable despite decoder noisiness. Precise information storage capacity results for reliable memory systems constructed from unreliable components are also provided. Limits to communicating over systems that fail at random times are established. Communication with arbitrarily small probability of error is not possible, but schemes that optimize transmission volume communicated at fixed maximum message error probabilities are determined. System state feedback is shown not to improve performance. For optimal communication with decoders that simultaneously harvest information and energy, a coding theorem is proven that establishes the fundamental trade-off between the rates at which energy and reliable information can be transmitted over a single line. The capacity-power function is computed for several channels; it is non-increasing and concave.

    by Lav R. Varshney. Ph.D.
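The density evolution result mentioned above can be sketched in its simplest setting: a regular LDPC ensemble on the binary erasure channel, with internal decoder noise modeled as an extra erasure probability delta injected each iteration. This is an illustrative simplification, not the thesis's exact noisy-decoder model.

```python
def density_evolution_bec(eps, dv=3, dc=6, delta=0.0, iters=200):
    """Track the message erasure probability of a regular (dv, dc) LDPC
    ensemble on a BEC(eps) under iterative message passing; delta adds
    decoder-internal erasures each iteration (toy noisy-decoder model)."""
    x = eps
    for _ in range(iters):
        # Standard BEC density evolution update, plus the decoder-noise term.
        x = min(1.0, eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1) + delta)
    return x
```

Below the (3,6) ensemble's noiseless threshold on the BEC, the recursion drives the erasure rate to zero; with delta > 0 it settles near a small residual floor instead, illustrating the smooth threshold degradation described above.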

    Landscape evolution and holocene climate change in mountain areas of the northern Highlands, Scotland.

    Get PDF
    Holocene landscape evolution in the tectonically quiet mountain areas of the Northern Highlands of Scotland has been attributed largely to postglacial relaxation, which has left a legacy of stable, relict landscapes, disturbed only by intrinsic local response, and modified to an uncertain extent by human activity. A review of this model was prompted by improved understanding of a) the variability of the Holocene climate in mid latitudes, b) the responsiveness of some geological and geomorphological systems to low-amplitude climate fluctuations, and c) a small number of field studies from the region reporting mid- and late-Holocene slope mass movement unrelated to anthropogenic impact. Sixteen catchments were explored using fieldwork and aerial photographic analysis. Slope activity since the end of the last glacial was investigated at five sites which contained evidence of long-sequence shallow slope failure, gully transport of slope debris, and debris fan formation. At two of these, stratigraphic sections, together with sediment and facies analysis, were combined with radiocarbon dating in order to elucidate slope processes and construct a chronostratigraphy. Results confirmed widespread Holocene lower-slope re-organisation, with mid- and late-Holocene landscape rejuvenation occurring millennia after apparent adjustment to postglacial conditions at the two dated localities. Mass movement on slopes was found to have parallels in floodplain aggradation and incision. These transformations appear to have operated on several different time scales, and across a strong regional precipitation gradient. Since they are a function of the glacial inheritance of these landscapes, the potential for further transformations exists. Mid- and late-Holocene events are only poorly accounted for by paraglacial relaxation. A more robust model of landscape evolution in this setting incorporates climate change (specifically, precipitation shifts), interacting with progressive weathering and vegetation cover, as a critical environmental variable. Although no justification was found for the use of dated slope mass movements as palaeoclimate proxies, changes in event frequency on a time scale of 10^ years may contain a climatic signal.