Asymptotic Optimality of Antidictionary Codes
An antidictionary code is a lossless compression algorithm that uses an
antidictionary, i.e., the set of minimal forbidden words of an input string:
words that do not occur as substrings of the string, although all of their
proper substrings do. The code was proposed by Crochemore et al. in 2000, and
its asymptotic optimality has so far been proved only for a specific
information source, the balanced binary source: a binary Markov source in
which every state transition occurs with probability 1/2 or 1. In this
paper, we prove the optimality of both static and dynamic antidictionary codes
with respect to a stationary ergodic Markov source on a finite alphabet such
that a state transition occurs with probability .

Comment: 5 pages, to appear in the proceedings of the 2010 IEEE International
Symposium on Information Theory (ISIT 2010)
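To make the notion of an antidictionary concrete, here is a brute-force sketch (an illustrative helper, not the authors' construction; practical implementations build the antidictionary in linear time with suffix automata) that computes the minimal forbidden words of a short binary string:

```python
# A word w is a minimal forbidden word of s if w never occurs in s but both
# its longest proper prefix w[:-1] and longest proper suffix w[1:] do occur.
# Brute force: every minimal forbidden word is a factor of s extended by one
# letter, so it suffices to test those candidates.
def minimal_absent_words(s, alphabet="01"):
    # All substrings ("factors") of s, plus the empty word.
    factors = {s[i:j] for i in range(len(s)) for j in range(i + 1, len(s) + 1)}
    factors.add("")
    candidates = {f + a for f in factors for a in alphabet}
    return sorted(
        (w for w in candidates
         if w not in factors and w[1:] in factors and w[:-1] in factors),
        key=lambda w: (len(w), w),
    )
```

For example, `minimal_absent_words("0101")` returns `["00", "11", "1010"]`: none of these occur in `0101`, yet every proper substring of each does.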
Quantization as histogram segmentation: globally optimal scalar quantizer design in network systems
We propose a polynomial-time algorithm for optimal scalar quantizer design on discrete-alphabet sources. Special cases of the proposed approach yield optimal design algorithms for fixed-rate and entropy-constrained scalar quantizers, multi-resolution scalar quantizers, multiple description scalar quantizers, and Wyner-Ziv scalar quantizers. The algorithm guarantees globally optimal solutions for fixed-rate and entropy-constrained scalar quantizers and constrained optima for the other coding scenarios. We derive the algorithm by demonstrating the connection between scalar quantization, histogram segmentation, and the shortest path problem in a certain directed acyclic graph.
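The histogram-segmentation / shortest-path connection can be sketched as follows (illustrative names and structure, assuming the fixed-rate case under squared-error distortion): each quantizer cell is a contiguous run of sorted source values, its cost is the weighted variance attained at the conditional mean, and the best K-cell partition is a shortest path through a layered DAG, found here by dynamic programming:

```python
# Fixed-rate scalar quantizer design as a shortest path over histogram
# segments. x: sorted source values; p: their probabilities; levels: number
# of quantizer cells. Runs in polynomial time in len(x).
def design_quantizer(x, p, levels):
    n = len(x)

    def cell_cost(i, j):  # cost and centroid of one cell covering x[i:j]
        mass = sum(p[i:j])
        if mass == 0:
            return 0.0, 0.0
        mean = sum(pi * xi for pi, xi in zip(p[i:j], x[i:j])) / mass
        return sum(pi * (xi - mean) ** 2
                   for pi, xi in zip(p[i:j], x[i:j])), mean

    INF = float("inf")
    dist = [[INF] * (n + 1) for _ in range(levels + 1)]   # dist[k][j]: best
    back = [[None] * (n + 1) for _ in range(levels + 1)]  # cost, k cells, x[:j]
    dist[0][0] = 0.0
    for k in range(1, levels + 1):
        for j in range(1, n + 1):
            for i in range(j):
                if dist[k - 1][i] == INF:
                    continue
                c, _ = cell_cost(i, j)
                if dist[k - 1][i] + c < dist[k][j]:
                    dist[k][j] = dist[k - 1][i] + c
                    back[k][j] = i
    # Backtrack the shortest path to recover cells and reproduction levels.
    cuts, j = [], n
    for k in range(levels, 0, -1):
        i = back[k][j]
        cuts.append((i, j))
        j = i
    cuts.reverse()
    codebook = [cell_cost(i, j)[1] for i, j in cuts]
    return dist[levels][n], codebook
```

For instance, two levels on the values `[0, 1, 10, 11]` with uniform probabilities split the source into `{0, 1}` and `{10, 11}` with codebook `[0.5, 10.5]`.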
On the Vocabulary of Grammar-Based Codes and the Logical Consistency of Texts
The article presents a new interpretation for Zipf-Mandelbrot's law in
natural language which rests on two areas of information theory. Firstly, we
construct a new class of grammar-based codes and, secondly, we investigate
properties of strongly nonergodic stationary processes. The motivation for the
joint discussion is to prove a proposition with a simple informal statement: If
a text of length describes independent facts in a repetitive way
then the text contains at least different words, under
suitable conditions on . In the formal statement, two modeling postulates
are adopted. Firstly, the words are understood as nonterminal symbols of the
shortest grammar-based encoding of the text. Secondly, the text is assumed to
be emitted by a finite-energy strongly nonergodic source whereas the facts are
binary IID variables predictable in a shift-invariant way.

Comment: 24 pages, no figures
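As one concrete member of the class of grammar-based codes, the following toy Re-Pair-style transform (illustrative names; the paper works with the *shortest* grammar, which this greedy heuristic only approximates) repeatedly replaces the most frequent adjacent pair of symbols with a fresh nonterminal, so that the grammar's nonterminals play the role of the "words" in the informal statement:

```python
# Greedy pairwise grammar transform: each round introduces one nonterminal
# Rk rewriting the currently most frequent adjacent pair, until no pair
# occurs twice. Returns the compressed sequence and the rule set.
from collections import Counter

def repair_grammar(text):
    seq, rules, next_id = list(text), {}, 0
    while True:
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs or pairs.most_common(1)[0][1] < 2:
            break                                  # no repeated pair: done
        pair = pairs.most_common(1)[0][0]
        nt = f"R{next_id}"
        next_id += 1
        rules[nt] = pair                           # nt -> (left, right)
        out, i = [], 0
        while i < len(seq):                        # left-to-right replacement
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(nt)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
    return seq, rules

def expand_symbol(sym, rules):
    """Invert the transform: expand a symbol back into terminal text."""
    if sym not in rules:
        return sym
    a, b = rules[sym]
    return expand_symbol(a, rules) + expand_symbol(b, rules)
```

On `"abababab"` this produces the two-symbol sequence `[R1, R1]` with rules `R0 -> ab` and `R1 -> R0 R0`, and expanding the result recovers the original text.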
Infocast: A New Paradigm for Collaborative Content Distribution from Roadside Units to Vehicular Networks Using Rateless Codes
In this paper, we address the problem of distributing a large amount of bulk
data to a sparse vehicular network from roadside infostations, using efficient
vehicle-to-vehicle collaboration. Due to the highly dynamic nature of the
underlying vehicular network topology, we depart from architectures requiring
centralized coordination, reliable MAC scheduling, or global network state
knowledge, and instead adopt a distributed paradigm with simple protocols. In
other words, we investigate the problem of reliable dissemination from multiple
sources when each node in the network shares a limited amount of its resources
for cooperating with others. By using \emph{rateless} coding at the Road Side
Unit (RSU) and using vehicles as data carriers, we describe an efficient way to
achieve reliable dissemination to all nodes (even disconnected clusters in the
network). In a nutshell, we exploit vehicles as mobile storage devices. We
then develop a method to maintain the density of rateless coded packets, as a
function of distance from the RSU, at the level required for a target decoding
distance. We investigate various tradeoffs involving buffer size, maximum
capacity, and the mobility parameter of the vehicles.
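The rateless-coding ingredient can be sketched with a minimal LT-style encoder and peeling decoder (toy degree distribution and hypothetical function names; the paper's actual protocol and degree design are not reproduced here). Each coded packet is the XOR of a random subset of source blocks, and a receiver recovers blocks by repeatedly resolving degree-1 packets:

```python
import random

def lt_encode(blocks, seed):
    """Emit one rateless coded packet: XOR of a random subset of blocks."""
    rng = random.Random(seed)
    d = rng.choice([1, 1, 2, 2, 2, 3, 4])         # toy degree distribution
    idx = rng.sample(range(len(blocks)), d)
    val = 0
    for i in idx:
        val ^= blocks[i]
    return set(idx), val

def lt_decode(packets, n):
    """Peeling decoder: strip known blocks, then harvest degree-1 packets."""
    decoded = [None] * n
    work = [[set(idx), val] for idx, val in packets]
    progress = True
    while progress:
        progress = False
        for pkt in work:
            idx, val = pkt
            for i in list(idx):                   # remove already-decoded blocks
                if decoded[i] is not None:
                    val ^= decoded[i]
                    idx.discard(i)
            pkt[1] = val
            if len(idx) == 1:                     # degree 1: block revealed
                i = idx.pop()
                if decoded[i] is None:
                    decoded[i] = val
                    progress = True
    return decoded
```

Because encoding is stateless and seeded, an RSU can keep emitting fresh packets without feedback, and any sufficiently large collection of packets (gathered from any mix of vehicles) lets the peeling decoder recover the data.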
Quantization as Histogram Segmentation: Optimal Scalar Quantizer Design in Network Systems
An algorithm for scalar quantizer design on discrete-alphabet sources is proposed. The proposed algorithm can be used to design fixed-rate and entropy-constrained conventional scalar quantizers, multiresolution scalar quantizers, multiple description scalar quantizers, and Wyner–Ziv scalar quantizers. The algorithm guarantees globally optimal solutions for conventional fixed-rate scalar quantizers and entropy-constrained scalar quantizers. For the other coding scenarios, the algorithm yields the best code among all codes that meet a given convexity constraint. In all cases, the algorithm run-time is polynomial in the size of the source alphabet. The algorithm derivation arises from a demonstration of the connection between scalar quantization, histogram segmentation, and the shortest path problem in a certain directed acyclic graph.
The Evolution of Work
The division of labor first increased during industrialization and then decreased again after 1970 as job roles have expanded. We explain these trends in the organization of work through a simple model where (a) machines require standardization to exploit economies of scale and (b) more customized products are subject to trends and fashions which make production tasks less predictable and a strict division of labor impractical. At the onset of industrialization, the market supports only a small number of generic varieties which can be mass-produced under a strict division of labor. Thanks to productivity growth, niche markets gradually expand, producers eventually move into customized production, and the division of labor decreases again. The model predicts capital-skill substitutability during industrialization and capital-skill complementarity in the maturing industrial economy. Moreover, conventional calculations of the factor content of trade underestimate the impact of globalization because they do not take into account changes in product market competition induced by trade. We test our model by exploiting the time-lags in the introduction of bar-coding in three-digit SIC manufacturing industries in the US. We find that increases in both computer investment and bar-coding have led to skill upgrading. However, consistent with our model, bar-coding has mainly affected the center of the skill distribution, shifting demand away from the high-school educated toward the less-than-college educated.
From Packet to Power Switching: Digital Direct Load Scheduling
At present, the power grid has tight control over its dispatchable generation
capacity but a very coarse control on the demand. Energy consumers are shielded
from making price-aware decisions, which degrades the efficiency of the market.
This state of affairs tends to favor fossil fuel generation over renewable
sources. Because of the technological difficulties of storing electric energy,
the quest for mechanisms that would make the demand for electricity
controllable on a day-to-day basis is gaining prominence. The goal of this
paper is to provide one such mechanism, which we call Digital Direct Load
Scheduling (DDLS). DDLS is a direct load control mechanism in which we unbundle
individual requests for energy and digitize them so that they can be
automatically scheduled in a cellular architecture. Specifically, rather than
storing energy or interrupting the job of appliances, we choose to hold
requests for energy in queues and optimize the service time of individual
appliances belonging to a broad class which we refer to as "deferrable loads".
The function of each neighborhood scheduler is to optimize the time at which
these appliances start to function. This process is intended to shape the
aggregate load profile of the neighborhood so as to optimize an objective
function which incorporates the spot price of energy, and also allows
distributed energy resources to supply part of the generation dynamically.

Comment: Accepted by the IEEE Journal on Selected Areas in Communications
(JSAC), Smart Grid Communications series; to appear
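The queue-and-schedule idea behind DDLS can be illustrated with a toy scheduler (the data, names, and the per-request greedy rule are illustrative assumptions, not the paper's optimization): each deferrable-load request specifies a run length and an allowed start window, and the scheduler picks the start hour that minimizes its cost against the spot-price curve:

```python
# Toy deferrable-load scheduler. requests: list of
# (duration_h, earliest_start, latest_start); prices: spot price per hour.
# Each request gets the cheapest feasible start hour within its window.
def schedule_deferrable(requests, prices):
    schedule = []
    for duration, earliest, latest in requests:
        best_start, best_cost = None, float("inf")
        # Only start times whose whole run fits inside the price horizon.
        for start in range(earliest, min(latest, len(prices) - duration) + 1):
            cost = sum(prices[start:start + duration])
            if cost < best_cost:
                best_start, best_cost = start, cost
        schedule.append((best_start, best_cost))
    return schedule
```

With prices `[5, 1, 1, 5, 2, 2]`, a 2-hour load that may start anywhere in hours 0-4 is scheduled at hour 1 for a cost of 2, i.e., demand migrates toward the cheap hours, which is the load-shaping effect the paper targets (its neighborhood scheduler additionally optimizes the aggregate profile rather than each request in isolation).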