The Trapping Redundancy of Linear Block Codes
We generalize the notion of the stopping redundancy in order to study the
smallest size of a trapping set in Tanner graphs of linear block codes. In this
context, we introduce the notion of the trapping redundancy of a code, which
quantifies the relationship between the number of redundant rows in any
parity-check matrix of a given code and the size of its smallest trapping set.
Trapping sets with certain parameter sizes are known to cause error-floors in
the performance curves of iterative belief propagation decoders, and it is
therefore important to identify decoding matrices that avoid such sets. Bounds
on the trapping redundancy are obtained using probabilistic and constructive
methods, and the analysis covers both general and elementary trapping sets.
Numerical values for these bounds are computed for the [2640,1320] Margulis
code and the class of projective geometry codes, and compared with some new
code-specific trapping set size estimates.
Comment: 12 pages, 4 tables, 1 figure, accepted for publication in IEEE
Transactions on Information Theory
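The abstract's central object, an (a, b) trapping set, is a set of a variable nodes whose induced subgraph in the Tanner graph contains exactly b check nodes of odd degree. A minimal sketch of computing this profile for a candidate set; the parity-check matrix (here the [7,4] Hamming code) and the chosen set are illustrative, not taken from the paper:

```python
import numpy as np

def trapping_set_profile(H, T):
    """Return (a, b) for a variable-node set T in the Tanner graph of H:
    a = |T|, b = number of check nodes joined to T an odd number of
    times (the odd-degree check nodes of the induced subgraph)."""
    H = np.asarray(H)
    check_degrees = H[:, sorted(T)].sum(axis=1)  # degrees in the induced subgraph
    return len(T), int(np.sum(check_degrees % 2 == 1))

# Illustrative only: a parity-check matrix of the [7,4] Hamming code
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
a, b = trapping_set_profile(H, {0, 1})  # here {0, 1} is a (2, 2) trapping set
```

Adding redundant rows to H changes these profiles, which is exactly the degree of freedom the trapping redundancy quantifies.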
State-of-the-art report on nonlinear representation of sources and channels
This report consists of two complementary parts, related to the modeling of two important sources of nonlinearities in a communications system. In the first part, an overview is provided of important past work on the estimation, compression and processing of sparse data through the use of nonlinear models. In the second part, the current state of the art on the representation of wireless channels in the presence of nonlinearities is summarized. In addition to the characteristics of the nonlinear wireless fading channel, some information is also provided on recent approaches to the sparse representation of such channels.
Distributed signal processing using nested lattice codes
Multi-Terminal Source Coding (MTSC) addresses the problem of compressing correlated sources
without communication links among them. In this thesis, a constructive approach to this problem
is considered in an algebraic framework, and a system design is provided that is applicable
in a variety of settings. The Wyner-Ziv problem is first investigated: coding of an independent
and identically distributed (i.i.d.) Gaussian source with side information available only at the
decoder, in the form of a noisy version of the source to be encoded. Theoretical models are first
established for calculating distortion-rate functions. Then several novel practical code
implementations are proposed using the strategy of multi-dimensional nested lattice/trellis
coding. By investigating various lattices in the dimensions considered, an analysis is given of
how lattice properties affect performance. Methods for choosing good sublattices in multiple
dimensions are also proposed. By introducing scaling factors, the relationship between distortion
and scaling factor is examined for various rates. The best high-dimensional lattice using our
scale-rotate method can achieve performance within 1 dB of the Wyner-Ziv limit at low rates, and
random nested ensembles can achieve a 1.87 dB gap from the limit. Moreover, the code design is
extended to incorporate distributed compressive sensing (DCS). A theoretical framework is
proposed, and practical designs using nested lattice/trellis codes are presented for various
scenarios. Using a nested trellis, simulations show a 3.42 dB gap from our derived bound for the
DCS plus Wyner-Ziv framework.
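The nested-lattice strategy above can be illustrated in one dimension with nested integer lattices: the encoder quantizes to a fine lattice but transmits only the coset index of that point modulo a coarse sublattice, and the decoder uses the side information to pick the right coset member. A toy sketch, with illustrative parameter values (the thesis itself works with multi-dimensional lattices and trellises):

```python
import numpy as np

def wz_encode(x, q_fine, nesting):
    """Quantize x to the fine lattice q_fine*Z, then send only the coset
    index of that point modulo the coarse lattice (nesting*q_fine)*Z,
    i.e. log2(nesting) bits instead of a full quantizer index."""
    fine_pt = np.round(x / q_fine)
    return int(fine_pt % nesting)

def wz_decode(coset, y, q_fine, nesting):
    """Reconstruct by choosing the member of the coset closest to the
    side information y (the decoder's correlated observation)."""
    q_coarse = nesting * q_fine
    base = coset * q_fine                  # one representative of the coset
    k = np.round((y - base) / q_coarse)    # coarse-lattice shift toward y
    return base + k * q_coarse

x, y = 2.37, 2.5   # toy correlated source sample and side information
idx = wz_encode(x, q_fine=0.25, nesting=4)
x_hat = wz_decode(idx, y, q_fine=0.25, nesting=4)  # recovers the fine point 2.25
```

The design tension the abstract describes is visible even here: the fine lattice sets the distortion, the nesting ratio sets the rate, and decoding fails if the side information is too far from the source relative to the coarse cell.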
Identifying network ties from panel data: theory and an application to tax competition
Social interactions determine many economic behaviors, but information on social ties
does not exist in most publicly available and widely used datasets. We present results on the
identification of social networks from observational panel data that contains no information on
social ties between agents. In the context of a canonical social interactions model, we provide
sufficient conditions under which the social interactions matrix and the endogenous and exogenous
social effect parameters are all globally identified. While this result is relevant across different
estimation strategies, we then describe how high-dimensional estimation techniques can be
used to estimate the interactions model based on the Adaptive Elastic Net GMM method. We
employ the method to study tax competition across US states. We find the identified social
interactions matrix implies tax competition differs markedly from the common assumption
of competition between geographically neighboring states, providing further insights for the
long-standing debate on the relative roles of factor mobility and yardstick competition in
driving tax-setting behavior across states. Most broadly, our identification and application
show the analysis of social interactions can be extended to economic realms where no network
data exists. JEL Codes: C31, D85, H71
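The Adaptive Elastic Net GMM estimator itself is beyond a short snippet, but the core idea — recovering one sparse row of the social interactions matrix by penalized regression of a unit's outcome on the other units' outcomes — can be sketched with a plain elastic net fitted by proximal gradient descent. All data and tuning parameters below are hypothetical, and the plain elastic net is a simplified stand-in for the paper's adaptive GMM version:

```python
import numpy as np

def elastic_net(X, y, lam1=0.02, lam2=0.01, iters=500):
    """Minimize (1/2T)||y - Xw||^2 + lam1*||w||_1 + (lam2/2)*||w||^2
    by proximal gradient descent (ISTA with the elastic-net prox)."""
    T, p = X.shape
    w = np.zeros(p)
    eta = T / np.linalg.norm(X, 2) ** 2           # step size 1/L
    for _ in range(iters):
        grad = X.T @ (X @ w - y) / T
        v = w - eta * grad
        # soft-threshold (L1 part), then shrink (L2 part)
        w = np.sign(v) * np.maximum(np.abs(v) - eta * lam1, 0) / (1 + eta * lam2)
    return w

rng = np.random.default_rng(0)
T, p = 200, 5
w_true = np.array([0.0, 0.0, 0.5, 0.0, -0.3])    # sparse ties (hypothetical)
X = rng.normal(size=(T, p))                      # other units' outcomes over T periods
y = X @ w_true + 0.1 * rng.normal(size=T)        # this unit's outcome
w_hat = elastic_net(X, y)
ties = np.flatnonzero(np.abs(w_hat) > 0.1)       # estimated network ties
```

The L1 term delivers sparsity (most candidate ties are zeroed out), while the L2 term stabilizes the fit when candidate regressors are correlated — the situation panel data on interacting agents typically presents.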
Wyner-Ziv coding based on TCQ and LDPC codes and extensions to multiterminal source coding
Driven by a host of emerging applications (e.g., sensor networks and wireless
video), distributed source coding (i.e., Slepian-Wolf coding, Wyner-Ziv coding, and
various other forms of multiterminal source coding) has recently become a very active
research area.
In this thesis, we first design a practical coding scheme for the quadratic Gaussian
Wyner-Ziv problem, because in this special case, no rate loss is suffered due to
the unavailability of the side information at the encoder. In order to approach the
Wyner-Ziv distortion limit D_WZ(R), the trellis coded quantization (TCQ) technique
is employed to quantize the source X, and an irregular LDPC code is used to implement
Slepian-Wolf coding of the quantized source input Q(X) given the side information
Y at the decoder. An optimal non-linear estimator is devised at the joint decoder
to compute the conditional mean of the source X given the dequantized version of
Q(X) and the side information Y. Assuming ideal Slepian-Wolf coding, our scheme
performs only 0.2 dB away from the Wyner-Ziv limit D_WZ(R) at high rate, which
mirrors the performance of entropy-coded TCQ in classic source coding. Practical
designs perform 0.83 dB away from D_WZ(R) at medium rates. With 2-D trellis-coded
vector quantization, the performance gap to D_WZ(R) is only 0.66 dB at 1.0 b/s and
0.47 dB at 3.3 b/s.
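The non-linear estimator described above computes the conditional mean E[X | Q(X), Y]. For a scalar Gaussian source with additive Gaussian side-information noise, a simplified stand-in can evaluate this conditional mean by numerical integration over the quantizer cell; the cell, the observation y, and the noise level below are illustrative:

```python
import numpy as np

def cond_mean(cell, y, sigma_z, grid=2000):
    """E[X | X in cell, Y = y] for X ~ N(0,1) and Y = X + Z with
    Z ~ N(0, sigma_z^2), via numerical integration over the quantizer
    cell. The posterior over the cell is prior(x) * likelihood(y | x)."""
    a, b = cell
    x = np.linspace(a, b, grid)
    post = np.exp(-x**2 / 2) * np.exp(-(y - x)**2 / (2 * sigma_z**2))
    return float(np.sum(x * post) / np.sum(post))

# A plain dequantizer would output the cell midpoint 0.75; the side
# information y = 0.4 pulls the estimate toward the lower end of the cell.
x_hat = cond_mean(cell=(0.5, 1.0), y=0.4, sigma_z=0.3)
```

This illustrates why the estimator is non-linear: the output depends on where within the cell the posterior mass concentrates, not just on the cell index.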
We then extend the proposed Wyner-Ziv coding scheme to the quadratic Gaussian
multiterminal source coding problem with two encoders. Both direct and indirect
settings of multiterminal source coding are considered. An asymmetric code design
containing one classical source coding component and one Wyner-Ziv coding component
is first introduced and shown to be able to approach the corner points on the
theoretically achievable limits in both settings. To approach any point on the theoretically
achievable limits, a second approach based on source splitting is then described.
One classical source coding component, two Wyner-Ziv coding components, and a
linear estimator are employed in this design. Proofs are provided to show the achievability
of any point on the theoretical limits in both settings by assuming that both
the source coding and the Wyner-Ziv coding components are optimal. The performance
of practical schemes is only 0.15 b/s away from the theoretical limits for the
asymmetric approach, and up to 0.30 b/s away from the limits for the source splitting
approach.
Information-theoretic analysis of human-machine mixed systems
Many recent information technologies such as crowdsourcing and social decision-making systems are designed based on (near-)optimal information processing techniques for machines. However, in such applications, some parts of the systems that process information are humans, so the systems are affected by the bounded rationality of human behavior and overall performance is suboptimal. In this dissertation, we consider systems that include humans and study their information-theoretic limits. We investigate four problems in this direction and show fundamental limits in terms of capacity, Bayes risk, and rate-distortion.
A system with queue-length-dependent service quality, motivated by crowdsourcing platforms, is investigated. Since human service quality changes depending on workload, a job designer must take the level of work into account. We model the workload using queueing theory and characterize Shannon's information capacity for single-user and multiuser systems.
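As a toy illustration of queue-length-dependent service quality (not the dissertation's capacity analysis), one can combine the stationary M/M/1 queue-length distribution with a quality profile that degrades as the backlog grows; the arrival rate, service rate, and quality values below are hypothetical:

```python
import numpy as np

def mm1_quality(lam, mu, quality):
    """Long-run average service quality when quality depends on queue
    length: weight quality[n] by the stationary M/M/1 probability
    pi_n = (1 - rho) * rho^n, truncated to len(quality) states."""
    rho = lam / mu                      # server utilization, must be < 1
    n = np.arange(len(quality))
    pi = (1 - rho) * rho ** n
    pi /= pi.sum()                      # renormalize after truncation
    return float(pi @ np.asarray(quality))

# Hypothetical worker whose accuracy drops as the backlog of jobs grows
q = mm1_quality(lam=0.6, mu=1.0, quality=[0.95, 0.9, 0.8, 0.7, 0.6])
```

The sketch shows the coupling the abstract points to: raising the arrival rate raises utilization, shifts probability mass toward long queues, and drags average quality down, which is why the job designer must account for workload.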
We also investigate social learning as sequential binary hypothesis testing. We find somewhat counterintuitively that unlike basic binary hypothesis testing, the decision threshold determined by the true prior probability is no longer optimal and biased perception of the true prior could outperform the unbiased perception system. The fact that the optimal belief curve resembles the Prelec weighting function from cumulative prospect theory gives insight, in the era of artificial intelligence (AI), into how to design machine AI that supports a human decision.
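The Prelec weighting function mentioned above has the closed form w(p) = exp(-beta * (-ln p)^alpha); for alpha < 1 it overweights small probabilities and underweights large ones, giving the inverse-S shape from cumulative prospect theory. A quick sketch with illustrative parameter values:

```python
import numpy as np

def prelec(p, alpha=0.65, beta=1.0):
    """Prelec probability weighting w(p) = exp(-beta * (-ln p)^alpha).
    With beta = 1 the curve crosses the diagonal at p = 1/e; alpha < 1
    inflates small probabilities and deflates large ones."""
    p = np.asarray(p, dtype=float)
    return np.exp(-beta * (-np.log(p)) ** alpha)

probs = np.array([0.01, 0.5, 0.99])
weights = prelec(probs)   # small prob is overweighted, large underweighted
```

In the dissertation's framing, a decision rule whose effective prior follows such a curve is a biased perception of the true prior — and the result above is that this bias can outperform the unbiased threshold in sequential social learning.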
The traditional CEO problem is a good model of collaborative decision-making. We extend the CEO problem to two continuous-alphabet settings, with general rth power of difference and logarithmic distortions, and study matching asymptotics of the distortion as the number of agents and the sum rate grow without bound.