Compressed Shaping: Concept and FPGA Demonstration
Probabilistic shaping (PS) has been widely studied and applied to optical
fiber communications. The encoder of PS expands the number of bit slots and
controls the probability distribution of channel input symbols. Most works on
optical fiber communications, including studies of PS, have so far assumed
source uniformity (i.e. equal probability of marks and spaces). In general,
however, the source information is nonuniform unless bit-scrambling or other
source coding techniques are applied to balance the bit probability.
Interestingly, one can exploit the source nonuniformity to reduce
the entropy of the channel input symbols with the PS encoder, which leads to
smaller required signal-to-noise ratio at a given input logic rate. This
benefit is equivalent to a combination of data compression and PS, and thus we
call this technique compressed shaping. In this work, we explain its
theoretical background in detail, and verify the concept by both numerical
simulation and a field programmable gate array (FPGA) implementation of such a
system. In particular, we find that compressed shaping can reduce power
consumption in forward error correction decoding by up to 90% in nonuniform
source cases. The additional hardware resources required for compressed shaping
are not significant compared with forward error correction coding, and an error
insertion test is successfully demonstrated with the FPGA.

Comment: 10 pages, 12 figures
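The entropy reduction that compressed shaping exploits can be illustrated with a short calculation (my own illustration, not code from the paper): a binary source with mark probability p carries H(p) = -p log2 p - (1-p) log2(1-p) bits per symbol, so a nonuniform source (p far from 0.5) carries well under 1 bit per symbol, leaving headroom for a lower-entropy channel input.

```python
import math

def binary_entropy(p):
    """Entropy in bits of a binary source with mark probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A uniform source carries exactly 1 bit per symbol...
print(binary_entropy(0.5))  # 1.0
# ...while a nonuniform source (e.g. 10% marks) carries far less (~0.469),
# which a PS encoder can translate into lower channel-input entropy.
print(binary_entropy(0.1))
```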
Data and Predictive Analytics Use for Logistics and Supply Chain Management
Purpose
The purpose of this paper is to explore the social process of Big Data and predictive analytics (BDPA) use for logistics and supply chain management (LSCM), focusing on interactions among technology, human behavior and organizational context that occur at the technology’s post-adoption phases in retail supply chain (RSC) organizations.

Design/methodology/approach
The authors follow a grounded theory approach for theory building based on interviews with senior managers of 15 organizations positioned across multiple echelons in the RSC.

Findings
Findings reveal how user involvement shapes BDPA to fit organizational structures and how changes made to the technology retroactively affect its design and institutional properties. Findings also reveal previously unreported aspects of BDPA use for LSCM. These include the presence of temporal and spatial discontinuities in the technology use across RSC organizations.

Practical implications
This study unveils that it is impossible to design a BDPA technology ready for immediate use. The emergent process framework shows that institutional and social factors require BDPA use specific to the organization, as the technology comes to reflect the properties of the organization and the wider social environment, beyond what its designers originally intended. BDPA is, thus, not easily transferrable among collaborating RSC organizations and requires managerial attention to the institutional context within which its usage takes place.

Originality/value
The literature describes why organizations will use BDPA but fails to provide adequate insight into how BDPA use occurs. The authors address the “how” and bring a social perspective into a technology-centric area.
RAPID CLOCK RECOVERY ALGORITHMS FOR DIGITAL MAGNETIC RECORDING AND DATA COMMUNICATIONS
SIGLE. Available from British Library Document Supply Centre (BLDSC), DSC:DXN024293, United Kingdom.
A Novel Seed Based Random Interleaving for OFDM System and Its PHY Layer Security Implications
Wireless channels are characterized by multipath and fading, which can often cause
long bursts of errors. Although many sophisticated error correcting codes have been
designed to date, none can handle long bursts of errors efficiently. An interleaver,
a device that disperses a burst of errors, possibly caused by a deep fade, so that
they appear as simple random errors, therefore proves to be a very useful technique
when used in conjunction with an efficient error correcting code.
In this work, a novel near optimal seed based random interleaver is designed. An optimal
interleaver scatters a given burst of errors uniformly over a fixed block of data - a
property that is measured by so called ‘spread’. The design makes use of a unique seed
based pseudo-random sequence generator or logistic map based chaotic sequence
generator to scramble the given block of data. Since the proposed design is based on a
seed based scrambler, the nature of the input is irrelevant. Therefore, the
proposed interleaver can interleave bits, symbols, packets, or even frames.
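The seed-keyed scrambling described above can be sketched as follows (an illustrative reconstruction, not the thesis implementation; the logistic-map parameter r and all function names are my own choices). The chaotic trajectory is ranked to obtain a permutation, so the same seed deterministically regenerates the same interleaving pattern:

```python
def logistic_permutation(seed, n, r=3.99):
    """Derive an interleaving permutation of length n from a chaotic
    logistic-map trajectory x -> r*x*(1-x) started at the secret seed."""
    x = seed
    values = []
    for _ in range(n):
        x = r * x * (1 - x)
        values.append(x)
    # Ranking the chaotic values yields the permutation.
    return sorted(range(n), key=lambda i: values[i])

def interleave(block, perm):
    """Scatter block elements according to the permutation."""
    return [block[i] for i in perm]

def deinterleave(block, perm):
    """Invert the permutation; requires knowledge of the seed-derived perm."""
    out = [None] * len(perm)
    for pos, i in enumerate(perm):
        out[i] = block[pos]
    return out
```

Because the permutation depends only on the seed, the same routine applies unchanged to bits, symbols, packets, or frames, and a receiver without the seed cannot reproduce the pattern.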
Accordingly, in this work, we analyze the suitability of the interleaver when
introduced before or after the modulation in single carrier communication systems
and show that interleaving the bits before modulation and interleaving the symbols
after modulation offer the same advantage. We further show that, in an orthogonal
frequency division multiplexing (OFDM) system, the position of the interleaver,
whether before or after the constellation mapper, has no significance and is
interchangeable. However, scrambling symbols is computationally less expensive
than scrambling bits.
For the purpose of analyzing the performance of the proposed seed based random
interleaver, simulations are carried out in MATLAB®. Results show that our proposed
seed based random interleaver has near optimal properties of ‘spread’ and ‘dispersion’.
Furthermore, the proposed interleaver is evaluated in terms of bit error rate (BER)
versus length of burst error in a single carrier system, both before and after
modulation. The proposed interleaver outperforms the built-in RANDINTLV function in
MATLAB® when used in the same system, converting a greater number of burst errors
into simple random errors. The proposed interleaver is also tested in an IEEE
802.16e based WiMAX system with Stanford University Interim (SUI) channels to
compare the average BER versus SNR for both pre-modulation and post-modulation
interleavers. Results show that the two placements have the same performance.
This seed based interleaver has a side advantage: it generates a variety of unique
random-looking interleaving sequences. Only a receiver that knows the input seed
can regenerate a given sequence. If the interleaving patterns are kept secure, they
can be used to introduce an extra layer of security at the physical (PHY) layer,
building an additional entry barrier at no extra cost. This property has been
investigated through key sensitivity analysis, which shows that key-guessing
attacks are futile: a difference at the 4th decimal place in the initial condition
leads to an entirely different scrambling sequence.
Bubble memory module
The design, fabrication and test of a partially populated prototype recorder using 100 kilobit serial chips is described. The electrical interface, operating modes, and mechanical design of several module configurations are discussed. Fabrication and test of the module demonstrated the practicality of multiplexing, resulting in lower power, weight, and volume. This effort resulted in the completion of a module consisting of a fully engineered printed circuit storage board populated with 5 of 8 possible cells and a wire wrapped electronics board. The module interface is 16 bits parallel at a maximum data rate of 1.33 megabits per second on either of two interface buses.
Interference Cancellation in a Full duplex System
In a full duplex system such as WCDMA, a mobile phone transmits and receives at the same time but at different frequencies. The transmitted signal causes interference in the receiver, which must be suppressed to avoid degraded receiver sensitivity. This Master Thesis was carried out at Ericsson Mobile Platforms in Lund, and its purpose was to examine a method to suppress the interference in the digital domain of a WCDMA transceiver. The method is based on feeding information from the transmitter forward to the receiver, so that a replica of the interference can be recreated and subtracted from the desired signal. Further, an adaptive least mean square (LMS) algorithm is used to estimate the correct amount of interference and to provide tracking of temperature variations. A simulator model was developed in MATLAB to analyze the interference and design a proper cancellation block between the transmitter and the receiver. This simulator model was designed with complexity reductions that did not affect the study of the phenomena. According to simulations, the LMS algorithm turned out to be a sufficient choice concerning rate of convergence, misadjustment and robustness. The main limitation of the improvement from the cancellation block was instead the distortion in the transmitter. The trend today is towards lower and lower distortion in the uplink, making this method increasingly interesting.
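The feed-forward cancellation idea can be sketched as follows (my own minimal illustration under simplifying assumptions: real-valued baseband samples, a short FIR leakage path, and no desired signal present; this is not the thesis simulator). The known transmit samples are filtered by adaptive LMS weights to estimate the leaked interference, which is then subtracted from the received signal:

```python
import numpy as np

def lms_cancel(x, d, num_taps=4, mu=0.01):
    """Adaptively estimate TX-to-RX leakage from the known transmit
    samples x and subtract it from the received signal d using LMS."""
    w = np.zeros(num_taps)                      # adaptive filter weights
    e = np.zeros(len(d))                        # residual after cancellation
    for n in range(num_taps - 1, len(d)):
        xs = x[n - num_taps + 1:n + 1][::-1]    # newest transmit sample first
        y = w @ xs                              # interference estimate
        e[n] = d[n] - y                         # cancelled output (error signal)
        w += mu * e[n] * xs                     # LMS weight update
    return e, w
```

The error signal drives the update, so the weights track slow changes in the leakage path, which is the same mechanism the abstract relies on for tracking temperature variations.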
Radial Velocity Prospects Current and Future: A White Paper Report prepared by the Study Analysis Group 8 for the Exoplanet Program Analysis Group (ExoPAG)
[Abridged] The Study Analysis Group 8 of the NASA Exoplanet Analysis Group
was convened to assess the current capabilities and the future potential of the
precise radial velocity (PRV) method to advance the NASA goal to "search for
planetary bodies and Earth-like planets in orbit around other stars" (U.S.
National Space Policy, June 28, 2010). PRVs complement other exoplanet
detection methods, for example offering a direct path to obtaining the bulk
density and thus the structure and composition of transiting exoplanets. Our
analysis builds upon previous community input, including the ExoPlanet
Community Report chapter on radial velocities in 2008, the 2010 Decadal Survey
of Astronomy, the Penn State Precise Radial Velocities Workshop response to the
Decadal Survey in 2010, and the NSF Portfolio Review in 2012. The
radial-velocity detection of exoplanets is strongly endorsed by both the Astro
2010 Decadal Survey "New Worlds, New Horizons" and the NSF Portfolio Review,
and the community has recommended robust investment in PRVs. The demands on
telescope time for the above mission support, especially for systems of small
planets, will exceed the number of nights available using instruments now in
operation by a factor of at least several for TESS alone. Pushing down towards
true Earth twins will require more photons (i.e. larger telescopes), more
stable spectrographs than are currently available, better calibration, and
better correction for stellar jitter. We outline four hypothetical situations
for PRV work necessary to meet NASA mission exoplanet science objectives.

Comment: ExoPAG SAG 8 final report, 112 pages, fixed author name only
Variability in Singing and in Song in the Zebra Finch
Variability is a defining feature of the oscine song learning process, reflected in song and in the neural pathways involved in song learning. For the zebra finch, juveniles learning to sing typically exhibit a high degree of vocal variability, and this variability appears to be driven by a key brain nucleus. It has been suggested that this variability is a necessary part of a trial-and-error learning process in which the bird must search for possible improvements to its song. Our work examines the role this variability plays in learning in two ways: through behavioral experiments with juvenile zebra finches, and through a computational model of parts of the oscine brain. Previous studies have shown that some finches exhibit less variability during the learning process than others by producing repetitive vocalizations. A constantly changing song model was played to juvenile zebra finches to determine whether auditory stimuli can affect this behavior. This stimulus was shown to cause an overall increase in repetitiveness; furthermore, there was a correlation between repetitiveness at an early stage in the learning process and the length of time a bird is repetitive overall, and birds that were repetitive tended to repeat the same thing over an extended period of time. The role of a key brain nucleus involved in song learning was examined through computational modeling. Previous studies have shown that this nucleus produces variability in song, but can also bias the song of a bird in such a way as to reduce errors while singing. Activity within this nucleus during singing is predominantly uncorrelated with the timing of the song; however, a portion of this activity is correlated in such a manner. The modeling experiments consider the possibility that this persistent signal is part of a trial-and-error search and contrast this with the possibility that the persistent signal is the product of some mechanism to directly improve song.
Simulation results show that a mixture of timing-dependent and timing-independent activity in this nucleus produces optimal learning results for the case where the persistent signal is a key component of a trial-and-error search, but not in the case where this signal will directly improve song. Although a mixture of timing-locked and timing-independent activity produces optimal results, the ratio found to be optimal within the model differs from what has been observed in vivo. Finally, novel methods for the analysis of birdsong, motivated by the high variability of juvenile song, are presented. These methods are designed to work with sets of song samples rather than through pairwise comparison. The utility of these methods is demonstrated, as well as results illustrating how such methods can be used as the basis for aggregate measures of song such as repertoire complexity.