Source-channel coding for robust image transmission and for dirty-paper coding
In this dissertation, we study two seemingly unrelated, but conceptually
related, problems in terms of source-channel coding: 1) wireless image transmission
and 2) Costa ("dirty-paper") code design.
In the first part of the dissertation, we consider progressive image transmission
over a wireless system employing space-time coded OFDM. The space-time coded
OFDM system based on a newly built broadband MIMO fading model is theoretically
evaluated by assuming perfect channel state information (CSI) at the receiver for
coherent detection. Then an adaptive modulation scheme is proposed to pick the
constellation size that offers the best reconstructed image quality for each average
signal-to-noise ratio (SNR).
A more practical scenario is also considered without the assumption of perfect
CSI. We employ low-complexity decision-feedback decoding for differentially space-
time coded OFDM systems to exploit transmitter diversity. For joint source-channel coding (JSCC), we adopt a
product channel code structure that is proven to provide powerful error protection and
bursty error correction. To further improve the system performance, we also apply
the powerful iterative (turbo) coding techniques and propose the iterative decoding
of differentially space-time coded multiple descriptions of images.
The second part of the dissertation deals with practical dirty-paper code designs. We first invoke an information-theoretic interpretation of algebraic binning and
motivate the code design guidelines in terms of source-channel coding. Then two
dirty-paper code designs are proposed. The first is a nested turbo construction based
on soft-output trellis-coded quantization (SOTCQ) for source coding and turbo trellis-
coded modulation (TTCM) for channel coding. A novel procedure is devised to
balance the dimensionalities of the equivalent lattice codes corresponding to SOTCQ
and TTCM. The second dirty-paper code design employs TCQ and IRA codes for
near-capacity performance. This is done by synergistically combining TCQ with IRA
codes so that they work together as well as they do individually. Our TCQ/IRA
design approaches the dirty-paper capacity limit in the low-rate regime (e.g., < 1.0
bit/sample), while our nested SOTCQ/TTCM scheme provides the best performance so
far at medium-to-high rates (e.g., >= 1.0 bit/sample). Thus the two proposed practical
code designs are complementary to each other.
Nested turbo codes for the costa problem
Driven by applications in data-hiding, MIMO broadcast channel coding, precoding for interference cancellation, and transmitter cooperation in wireless networks, Costa coding has lately become a very active research area. In this paper, we first offer code design guidelines in terms of source-channel coding for algebraic binning. We then address practical code design based on nested lattice codes and propose nested turbo codes using turbo-like trellis-coded quantization (TCQ) for source coding and turbo trellis-coded modulation (TTCM) for channel coding. Compared to TCQ, turbo-like TCQ offers structural similarity between the source and channel coding components, leading to more efficient nesting with TTCM and better source coding performance. Due to the difference in effective dimensionality between turbo-like TCQ and TTCM, there is a performance tradeoff between these two components when they are nested together, meaning that the performance of turbo-like TCQ worsens as the TTCM code becomes stronger and vice versa. Optimization of this performance tradeoff leads to our code design that outperforms existing TCQ/TCM and TCQ/TTCM constructions and exhibits a gap of 0.94, 1.42, and 2.65 dB to the Costa capacity at 2.0, 1.0, and 0.5 bits/sample, respectively.
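For context, the Costa capacity against which these gaps are measured is the classical writing-on-dirty-paper result (a standard fact, stated here for reference rather than taken from the paper above): with transmit power P, additive Gaussian noise of power N, and interference S known non-causally only at the encoder, random binning with the auxiliary variable U = X + alpha*S achieves

```latex
% Achievable rate of Costa's binning scheme with auxiliary variable
% U = X + \alpha S (standard result, stated for reference):
R(\alpha) = I(U;Y) - I(U;S),
\qquad
\alpha^{*} = \frac{P}{P + N},
\qquad
R(\alpha^{*}) = \frac{1}{2}\log_2\!\Bigl(1 + \frac{P}{N}\Bigr).
```

At the optimal alpha the rate equals the interference-free AWGN capacity, independent of the interference power; the 0.94, 1.42, and 2.65 dB gaps above are measured against this limit.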
Wide spread spectrum watermarking with side information and interference cancellation
Nowadays, a popular method for additive watermarking is wide spread
spectrum. It consists of adding a spread signal to the host document. This
signal is obtained by the sum of a set of carrier vectors, which are modulated
by the bits to be embedded. To extract these embedded bits, weighted
correlations between the watermarked document and the carriers are computed.
Unfortunately, even without any attack, the extracted set of bits can be
corrupted by interference with the host signal (host interference) and
by interference with the other carriers (inter-symbol
interference (ISI), due to the non-orthogonality of the carriers). Some recent
watermarking algorithms deal with host interference using side-informed
methods, but the inter-symbol interference problem is still open. In this paper,
we deal with interference cancellation methods, and we propose to consider ISI
as side information and to integrate it into the host signal. This leads to a
great improvement in extraction performance in terms of signal-to-noise ratio
and/or watermark robustness.
Comment: 12 pages, 8 figures
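The embedding and correlation-based extraction described above can be sketched in a few lines of Python (a minimal illustration with made-up names and parameters, not the authors' implementation). At a sufficiently strong embedding amplitude the host-interference and ISI terms are unlikely to flip any bit; at a small amplitude, errors appear even without any attack, which is the problem the paper addresses.

```python
import random

def embed(host, carriers, bits, alpha):
    """Add the sum of bit-modulated carriers to the host signal."""
    marked = list(host)
    for bit, carrier in zip(bits, carriers):
        symbol = 1.0 if bit else -1.0        # map {0,1} -> {-1,+1}
        for i, chip in enumerate(carrier):
            marked[i] += alpha * symbol * chip
    return marked

def extract(marked, carriers):
    """Decide each bit from the sign of the correlation with its carrier."""
    return [1 if sum(y * c for y, c in zip(marked, carrier)) >= 0 else 0
            for carrier in carriers]

random.seed(1)
n, k = 1024, 8                               # illustrative sizes
host = [random.gauss(0.0, 1.0) for _ in range(n)]        # host document
carriers = [[random.choice((-1.0, 1.0)) for _ in range(n)]
            for _ in range(k)]               # non-orthogonal in general
bits = [random.getrandbits(1) for _ in range(k)]
recovered = extract(embed(host, carriers, bits, alpha=1.0), carriers)
# At this embedding strength the host and ISI correlation terms are
# far smaller than the signal term, so extraction succeeds; shrinking
# alpha makes host interference and ISI dominate.
```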
Writing on Dirty Paper with Resizing and its Application to Quasi-Static Fading Broadcast Channels
This paper studies a variant of the classical problem of ``writing on dirty
paper'' in which the sum of the input and the interference, or dirt, is
multiplied by a random variable that models resizing, known to the decoder but
not to the encoder. The achievable rate of Costa's dirty paper coding (DPC)
scheme is calculated and compared to the case of the decoder's also knowing the
dirt. In the ergodic case, the corresponding rate loss vanishes asymptotically
in the limits of both high and low signal-to-noise ratio (SNR), and is small at
all finite SNR for typical distributions like Rayleigh, Rician, and Nakagami.
In the quasi-static case, the DPC scheme is lossless at all SNR in terms of
outage probability. Quasi-static fading broadcast channels (BC) without
transmit channel state information (CSI) are investigated as an application of
the robustness properties. It is shown that the DPC scheme leads to an outage
achievable rate region that strictly dominates that of time division.
Comment: To appear in IEEE International Symposium on Information Theory 200
Digital watermark technology in security applications
With the rising emphasis on security and the number of fraud-related crimes
around the world, authorities are looking for new technologies to tighten
identity security. Among many modern electronic technologies, digital
watermarking has unique advantages to enhance the document authenticity.
At the current stage of development, digital watermarking technologies
are not as mature as other competing technologies that support identity authentication
systems. This work presents performance improvements in
two classes of digital watermarking techniques and investigates the issue of
watermark synchronisation.
Optimal performance can be obtained if the spreading sequences are designed
to be orthogonal to the cover vector. In this thesis, two classes of
orthogonalisation methods that generate binary sequences quasi-orthogonal
to the cover vector are presented. One method, namely "Sorting and Cancelling",
generates sequences that have a high level of orthogonality to the
cover vector. The Hadamard Matrix based orthogonalisation method, namely
"Hadamard Matrix Search" is able to realise overlapped embedding, thus the
watermarking capacity and image fidelity can be improved compared to using
short watermark sequences. The results are compared with traditional
pseudo-randomly generated binary sequences. The advantages of both classes
of orthogonalisation methods are significant.
Another watermarking method introduced in the thesis is based
on writing-on-dirty-paper theory. The method is presented with biorthogonal
codes, which have the best robustness. The advantages and trade-offs of
using biorthogonal codes with this watermark coding method are analysed
comprehensively. The comparisons between orthogonal and non-orthogonal
codes that are used in this watermarking method are also made. It is found
that fidelity and robustness are contradictory and it is not possible to optimise
them simultaneously.
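A biorthogonal code of the kind discussed above can be built from a Sylvester-type Hadamard matrix: the 2n codewords are the n Hadamard rows and their negations, and decoding picks the codeword with maximum correlation. The following is a minimal sketch of that standard construction (the thesis's exact code parameters are not specified, so the length n = 8 here is only illustrative):

```python
def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = [[1]]
    while len(H) < n:
        H = ([row + row for row in H] +
             [row + [-x for x in row] for row in H])
    return H

def biorthogonal(n):
    """Rows of H_n and their negations: 2n codewords whose pairwise
    correlations are 0 or -n."""
    H = hadamard(n)
    return H + [[-x for x in row] for row in H]

def decode(received, code):
    """Maximum-correlation decoding: return the index of the codeword
    best correlated with the received vector."""
    corr = [sum(r * c for r, c in zip(received, cw)) for cw in code]
    return corr.index(max(corr))

code = biorthogonal(8)                 # 16 codewords of length 8
received = list(code[3])
received[0] = -received[0]             # one chip corrupted
decoded = decode(received, code)       # correlation 6 with codeword 3,
                                       # at most 2 with any other
```

Maximum-correlation decoding tolerates this single-chip error because the corrupted codeword still correlates far more strongly with itself than with any other codeword.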
Comparisons are also made between all proposed methods. The comparisons
are focused on three major performance criteria, fidelity, capacity and
robustness. From two different viewpoints, the conclusions are not the same. From
the fidelity-centric viewpoint, the dirty-paper coding methods using biorthogonal
codes have a very strong advantage in preserving image fidelity, and the advantage
in capacity performance is also significant. However, from the power
ratio point of view, the orthogonalisation methods demonstrate a significant
advantage in capacity and robustness. The conclusions are contradictory
but together, they summarise the performance generated by different design
considerations.
The synchronisation of the watermark is first provided by high-contrast
frames around the watermarked image. Edge detection filters are used
to detect the high contrast borders of the captured image. By scanning
the pixels from the border to the centre, the locations of detected edges
are stored. The optimal linear regression algorithm is used to estimate the
watermarked image frames. Estimating the regression function provides the
rotation angle as the slope of the rotated frame. The scaling is corrected by
re-sampling the upright image to the original size. A theoretically studied
method that can synchronise the captured image to sub-pixel accuracy
is also presented. By using invariant transforms and the "symmetric
phase-only matched filter", the captured image can be corrected accurately
to its original geometric size. The method embeds repeating watermarks to form an
array in the spatial domain of the watermarked image; the locations of the
array elements reveal information about rotation, translation,
and scaling through two filtering processes.
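The frame-based rotation estimate described above reduces to an ordinary least-squares line fit through the detected edge pixels, with the slope of the fitted line giving the rotation angle. A minimal sketch on synthetic, noise-free edge points (illustrative only; the thesis's "optimal linear regression algorithm" and edge detector may differ):

```python
import math

def fit_line(points):
    """Ordinary least-squares fit of y = a + b*x to (x, y) edge samples."""
    n = float(len(points))
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def rotation_angle(points):
    """The slope of the fitted frame edge gives the rotation angle (radians)."""
    _, slope = fit_line(points)
    return math.atan(slope)

# Synthetic top edge of a frame rotated by 5 degrees (noise-free samples).
theta = math.radians(5.0)
edge = [(x, math.tan(theta) * x + 20.0) for x in range(0, 200, 5)]
est = rotation_angle(edge)
```

With real edge detections the fit averages out pixel-quantisation noise along the frame, which is why a regression over many border pixels is preferred to measuring the angle from two points.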