2,802 research outputs found
An improved algorithm of generating shortening patterns for polar codes
Rate matching in polar codes becomes necessary when non-conventional codewords of length N≠2n are required. Shortening is employed to design arbitrary-rate codes from a mother code with a given rate. Under the conventional shortening scheme, the length of the constructed polar codes is limited. In this paper, we demonstrate the existence of favorable and unfavorable shortening patterns. The structure of polar codes is leveraged to eliminate unfavorable shortening patterns, thereby reducing the search space. We generate an auxiliary matrix through a likelihood criterion and subsequently select the shortening bits from this matrix. Unlike existing methods that offer only a single shortening pattern, our algorithm generates multiple favorable shortening patterns, encompassing all possible favorable configurations. The algorithm has reduced complexity and suboptimal performance, yet effectively identifies shortening patterns and sets of frozen symbols for any polar code. Simulation results underscore that the shortened polar codes exhibit performance closely aligned with that of the mother codes. By generating multiple shortening patterns for any polar code, our algorithm also addresses a security concern: it becomes more difficult for an attacker to obtain the information set and frozen symbols.
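As background, the conventional shortening scheme the abstract builds on can be sketched in a few lines: in a mother code of length N = 2^n, freezing the last S input bits to zero forces the last S codeword bits to zero (the polar transform matrix is lower triangular), so those positions are simply not transmitted. This is an illustrative sketch of that baseline only; the paper's algorithm for enumerating multiple favorable patterns is not reproduced here:

```python
import numpy as np

def polar_transform(u):
    """Compute x = u * F^{(tensor)n} over GF(2), with kernel F = [[1,0],[1,1]]."""
    x = u.copy()
    step = 1
    while step < x.size:
        for i in range(0, x.size, 2 * step):
            x[i:i + step] ^= x[i + step:i + 2 * step]
        step *= 2
    return x

def shorten(u, s):
    """Conventional shortening: zero the last s inputs, drop the last s outputs.

    Because the transform matrix is lower triangular with a unit diagonal,
    the last s code bits are then guaranteed zeros and can be punctured,
    yielding a codeword of length N - s.
    """
    u = u.copy()
    u[-s:] = 0                  # shortened input bits are frozen to 0
    x = polar_transform(u)
    assert not x[-s:].any()     # the shortened code bits are known zeros
    return x[:-s]               # transmit only the first N - s bits
```

For example, `shorten(np.array([1, 0, 1, 1, 0, 0, 0, 0], dtype=np.uint8), 3)` produces a length-5 codeword from the N = 8 mother code.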
A Balanced Tree Approach to Construction of Length-Compatible Polar Codes
From a tree perspective, we design a length-flexible coding scheme. For
an arbitrary code length, we first construct a balanced binary tree (BBT) where
the root node represents a transmitted codeword, the leaf nodes represent
either active bits or frozen bits, and a parent node is related to its child
nodes by a length-adaptive (U+V|V) operation. Both the encoding and the
successive cancellation (SC)-based decoding can be implemented over the
constructed coding tree. For code construction, we propose a signal-to-noise
ratio (SNR)-dependent method and two SNR-independent methods, all of which
evaluate the reliabilities of leaf nodes and then select the most reliable leaf
nodes as the active nodes. Numerical results demonstrate that our proposed
codes can have comparable performance to the 5G polar codes. To reduce the
decoding latency, we propose a partitioned successive cancellation (PSC)-based
decoding algorithm, which can be implemented over a sub-tree obtained by
pruning the coding tree. Numerical results show that the PSC-based decoding can
achieve similar performance to the conventional SC-based decoding.
Comment: 30 pages, 10 figures
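The length-adaptive (U+V|V) operation described above can be sketched as a recursive encoder over the balanced binary tree. Note that the split of a length-m parent into children of lengths ceil(m/2) and floor(m/2), and the zero-padding of the shorter child in the U+V sum, are assumptions of this sketch rather than details taken from the paper:

```python
def encode_tree(leaves, n):
    """Encode n leaf bits (active or frozen) by a length-adaptive (U+V|V) rule.

    Each parent of length m combines its children U (length ceil(m/2)) and
    V (length floor(m/2)) into (U+V | V), padding V with zeros when shorter.
    For n a power of two this reduces to the standard polar transform.
    """
    leaves = iter(leaves)

    def enc(m):
        if m == 1:
            return [next(leaves)]                 # leaf node: one bit
        u = enc((m + 1) // 2)                     # left child (assumed split)
        v = enc(m // 2)                           # right child
        uv = [u[i] ^ (v[i] if i < len(v) else 0)  # U+V, zero-padded
              for i in range(len(u))]
        return uv + v                             # (U+V | V)

    return enc(n)
```

For instance, `encode_tree([1, 0, 1, 1], 4)` reproduces the length-4 polar transform, while `encode_tree([1, 0, 1], 3)` yields a length-3 codeword that no mother code of length 2^n produces directly.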
Modeling rotating stars in two dimensions
In this lecture I present the way stars can be modeled in two dimensions and
especially the fluid flows that are driven by rotation. I discuss some of the
various ways of taking into account turbulence and conclude this contribution
by a short presentation of some of the first results obtained with the ESTER
code on the modeling of interferometrically observed fast rotating early-type
stars.
Comment: 16 pages, 2 figures; to appear in Proceedings of the Evry Schatzman
School 2012 of PNPS and CNRS/INSU on the "Role and mechanisms of angular
momentum transport during the formation and early evolution of stars", Eds.
P. Hennebelle & C. Charbonne
Development and evaluation of a Hadamard transform imaging spectrometer and a Hadamard transform thermal imager
A spectrometric imager and a thermal imager, which achieve multiplexing by the use of binary optical encoding masks, were developed. The masks are based on orthogonal, pseudorandom digital codes derived from Hadamard matrices. Spatial and/or spectral data is obtained in the form of a Hadamard transform of the spatial and/or spectral scene; computer algorithms are then used to decode the data and reconstruct images of the original scene. The hardware, algorithms and processing/display facility are described. A number of spatial and spatial/spectral images are presented. The achievement of a signal-to-noise improvement due to the signal multiplexing was also demonstrated. An analysis of the results indicates both the situations for which the multiplex advantage may be gained, and the limitations of the technique. A number of potential applications of the spectrometric imager are discussed.
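The encode/decode cycle described above can be illustrated in a few lines: each mask corresponds to one row of a 0/1 S-matrix derived from a Hadamard matrix, each detector reading is the sum of the unmasked scene elements, and the scene is recovered by inverting the mask matrix. This is a minimal sketch with a simulated 7-element scene; the mask construction details of the actual instruments may differ:

```python
import numpy as np

def sylvester_hadamard(m):
    """Hadamard matrix of order 2^m via the Sylvester construction."""
    H = np.array([[1]])
    for _ in range(m):
        H = np.block([[H, H], [H, -H]])
    return H

def s_matrix(m):
    """0/1 mask matrix of order 2^m - 1: drop the first row and column of the
    Hadamard matrix and map +1 -> 0 (blocked), -1 -> 1 (open)."""
    H = sylvester_hadamard(m)
    return ((1 - H[1:, 1:]) // 2).astype(float)

# Simulated scene (e.g. one spectral scan) and its multiplexed measurement
S = s_matrix(3)                  # 7x7 mask matrix; each row is one mask
x = np.arange(1.0, 8.0)          # 7 scene elements
y = S @ x                        # each reading sums the open mask slots
x_hat = np.linalg.solve(S, y)    # decode: invert the Hadamard-transform encoding
```

Because each measurement gathers roughly half the scene's light instead of a single element, detector noise is averaged down in the decoded image — the multiplex (Fellgett) advantage the abstract analyzes.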
Spacetime approach to force-free magnetospheres
Force-Free Electrodynamics (FFE) describes magnetically dominated
relativistic plasma via non-linear equations for the electromagnetic field
alone. Such plasma is thought to play a key role in the physics of pulsars and
active black holes. Despite its simple covariant formulation, FFE has primarily
been studied in 3+1 frameworks, where spacetime is split into space and time.
In this article we systematically develop the theory of force-free
magnetospheres taking a spacetime perspective. Using a suite of spacetime tools
and techniques (notably exterior calculus) we cover 1) the basics of the
theory, 2) exact solutions that demonstrate the extraction and transport of the
rotational energy of a compact object (in the case of a black hole, the
Blandford-Znajek mechanism), 3) the behavior of current sheets, 4) the general
theory of stationary, axisymmetric magnetospheres and 5) general properties of
pulsar and black hole magnetospheres. We thereby synthesize, clarify and
generalize known aspects of the physics of force-free magnetospheres, while
also introducing several new results.
Comment: v2: numerous improvements; v3: further improvements, matches
published version
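For reference, the covariant force-free system alluded to above is commonly written as follows (a standard formulation with a particular sign and unit convention, not necessarily the one used in the article): the field strength F obeys the sourced Maxwell equations while the Lorentz force density on the plasma vanishes,

```latex
\mathrm{d}F = 0, \qquad \mathrm{d}{\star}F = {\star}J, \qquad F_{ab}\,J^{b} = 0,
```

where the last (force-free) condition with a nonzero current implies degeneracy, F ∧ F = 0, and magnetic domination is imposed as F_{ab}F^{ab} > 0.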
On the performance of helper data template protection schemes
The use of biometrics looks promising, as it is already being applied in electronic passports, ePassports, on a global scale. Because the biometric data has to be stored as a reference template on either a central or personal storage device, its widespread use introduces new security and privacy risks such as (i) identity fraud, (ii) cross-matching, (iii) irrevocability and (iv) leaking sensitive medical information. Mitigating these risks is essential to obtain acceptance from the subjects of the biometric systems and thereby facilitate successful implementation on a large scale. A solution to mitigate these risks is to use template protection techniques. The required protection properties of the stored reference template, according to ISO guidelines, are (i) irreversibility, (ii) renewability and (iii) unlinkability. A known template protection scheme is the helper data system (HDS). The fundamental principle of the HDS is to bind a key with the biometric sample by means of helper data and cryptography, such that the key can be reproduced or released given another biometric sample of the same subject. The identity check is then performed in a secure way by comparing the hash of the key. Hence, the size of the key determines the amount of protection. This thesis extensively investigates the HDS, namely (i) the theoretical classification performance, (ii) the maximum key size, (iii) the irreversibility and unlinkability properties, and (iv) the optimal multi-sample and multi-algorithm fusion method. The theoretical classification performance of the biometric system is determined by assuming that the features extracted from the biometric sample are Gaussian distributed. With this assumption we investigate the influence of the bit-extraction scheme on the classification performance. Using this theoretical framework, the maximum size of the key is determined by assuming the error-correcting code to operate on Shannon's bound.
We also show three vulnerabilities of the HDS that affect the irreversibility and unlinkability properties and propose solutions. Finally, we study the optimal level of applying multi-sample and multi-algorithm fusion with the HDS: at feature, score, or decision level.
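The key-binding principle of the HDS described above (often called a fuzzy commitment) can be sketched as follows. The repetition code here is a toy stand-in for the capacity-approaching error-correcting code assumed in the thesis, and all function names are illustrative:

```python
import hashlib

REP = 3  # repetition factor: toy stand-in for a real error-correcting code

def ecc_encode(key_bits):
    """Repeat each key bit REP times."""
    return [b for b in key_bits for _ in range(REP)]

def ecc_decode(code_bits):
    """Majority vote within each block of REP bits."""
    return [int(2 * sum(code_bits[i:i + REP]) > REP)
            for i in range(0, len(code_bits), REP)]

def enroll(key_bits, template_bits):
    """Bind the key to the biometric template; store only helper data + hash."""
    helper = [c ^ t for c, t in zip(ecc_encode(key_bits), template_bits)]
    return helper, hashlib.sha256(bytes(key_bits)).hexdigest()

def verify(helper, key_hash, probe_bits):
    """Release the key from a fresh (noisy) sample and compare hashes."""
    key_bits = ecc_decode([h ^ p for h, p in zip(helper, probe_bits)])
    return hashlib.sha256(bytes(key_bits)).hexdigest() == key_hash
```

A probe from the same subject differs from the template in only a few bits, which the error-correcting code absorbs, so the correct key is released; neither the template nor the key is stored in the clear, and the stored hash reveals the key only up to its size, which is why the key size bounds the protection level.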