58 research outputs found

    High-throughput variable-to-fixed entropy codec using selective, stochastic code forests

    Get PDF
    Efficient high-throughput (HT) compression algorithms are paramount to meet the stringent constraints of present and upcoming data storage, processing, and transmission systems. In particular, latency, bandwidth, and energy requirements are critical for those systems. Most HT codecs are designed to maximize compression speed and, secondarily, to minimize compressed lengths. However, decompression speed is often equally or more critical than compression speed, especially in scenarios where decompression is performed multiple times and/or at critical parts of a system. In this work, an algorithm to design variable-to-fixed (VF) codes is proposed that prioritizes decompression speed. Stationary Markov analysis is employed to generate multiple, jointly optimized codes (denoted code forests). Their average compression efficiency is on par with the state of the art in VF codes, e.g., within 1% of Yamamoto et al.'s algorithm. The proposed code forest structure enables the implementation of highly efficient codecs, with decompression speeds 3.8 times faster than other state-of-the-art HT entropy codecs with equal or better compression ratios for natural data sources. Compared to these HT codecs, the proposed forests yield similar compression efficiency and speeds.
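
    The speed claim rests on the defining property of VF codes: every codeword has the same fixed length, so decoding is one table lookup per codeword. The sketch below illustrates this with a toy Tunstall-style dictionary for a memoryless source; it is illustrative only, since the paper instead builds jointly optimized forests of such dictionaries from stationary Markov statistics.

        # Toy variable-to-fixed (VF) coding sketch (Tunstall-style,
        # memoryless source). Not the paper's code-forest construction;
        # it only shows why VF decoding reduces to table lookups.
        import heapq

        def build_dictionary(probs, codeword_bits):
            # Repeatedly split the most probable parse-tree leaf until
            # 2**codeword_bits leaves exist.
            max_leaves = 2 ** codeword_bits
            heap = [(-p, s) for s, p in probs.items()]
            heapq.heapify(heap)
            while len(heap) + len(probs) - 1 <= max_leaves:
                neg_p, word = heapq.heappop(heap)
                for s, p in probs.items():
                    heapq.heappush(heap, (neg_p * p, word + s))
            return sorted(word for _, word in heap)

        def encode(words, text):
            # Greedy longest-match parse; each parsed word becomes one
            # fixed-length index.
            index = {w: i for i, w in enumerate(words)}
            prefixes = {w[:k] for w in words for k in range(1, len(w) + 1)}
            out, i = [], 0
            while i < len(text):
                j = i + 1
                while j < len(text) and text[i:j + 1] in prefixes:
                    j += 1
                out.append(index[text[i:j]])
                i = j
            return out

        def decode(words, indices):
            # Decompression is one table lookup per fixed-length codeword.
            return "".join(words[k] for k in indices)

        words = build_dictionary({"a": 0.7, "b": 0.3}, codeword_bits=2)
        msg = "aaabab"          # parses cleanly into dictionary words
        assert decode(words, encode(words, msg)) == msg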

    Roadmap of ultrafast x-ray atomic and molecular physics

    Get PDF
    X-ray free-electron lasers (XFELs) and table-top sources of x-rays based upon high harmonic generation (HHG) have revolutionized the field of ultrafast x-ray atomic and molecular physics, largely due to an explosive growth in capabilities in the past decade. XFELs now provide unprecedented intensity (10²⁰ W cm⁻²) of x-rays at wavelengths down to ∼1 Ångstrom, and HHG provides unprecedented time resolution (∼50 attoseconds) and a correspondingly large coherent bandwidth at longer wavelengths. For context, timescales can be referenced to the Bohr orbital period in the hydrogen atom of 150 attoseconds and the hydrogen-molecule vibrational period of 8 femtoseconds; wavelength scales can be referenced to the chemically significant carbon K-edge at a photon energy of ∼280 eV (44 Ångstroms) and the bond length in methane of ∼1 Ångstrom. With these modern x-ray sources one now has the ability to focus on individual atoms, even when embedded in a complex molecule, and to view electronic and nuclear motion on their intrinsic scales (attoseconds and Ångstroms). These sources have enabled coherent diffractive imaging, where one can image non-crystalline objects in three dimensions on ultrafast timescales, potentially with atomic resolution. The unprecedented intensity available with XFELs has opened new fields of multiphoton and nonlinear x-ray physics, where the behavior of matter under extreme conditions can be explored. The unprecedented time resolution and pulse synchronization provided by HHG sources have kindled fundamental investigations of time delays in photoionization, charge migration in molecules, and dynamics near conical intersections that are foundational to AMO physics and chemistry. This roadmap coincides with the year when three new XFEL facilities operating at Ångstrom wavelengths opened for users (European XFEL, Swiss-FEL, and PAL-FEL in Korea), almost doubling the worldwide number of XFELs, and documents the remarkable progress in HHG capabilities since its discovery roughly 30 years ago, showcasing experiments in AMO physics and other applications. Here we capture the perspectives of 17 leading groups and organize the contributions into four categories: ultrafast molecular dynamics; multidimensional x-ray spectroscopies; high-intensity x-ray phenomena; and attosecond x-ray science.
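
    The quoted scales are easy to sanity-check from standard constants; a minimal back-of-envelope script (not part of the roadmap itself):

        import math

        a0 = 5.29177e-11      # Bohr radius, m
        alpha = 1 / 137.036   # fine-structure constant
        c = 2.99792e8         # speed of light, m/s

        # Classical Bohr orbital period of hydrogen: T = 2*pi*a0 / (alpha*c)
        T_bohr = 2 * math.pi * a0 / (alpha * c)
        print(f"Bohr orbital period: {T_bohr * 1e18:.0f} as")  # ~152 as

        # Photon wavelength at the carbon K-edge, ~280 eV
        h_eV = 4.13567e-15    # Planck constant, eV*s
        lam = h_eV * c / 280.0
        print(f"280 eV photon: {lam * 1e10:.1f} Angstrom")     # ~44 Angstrom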

    The AGORA high-resolution galaxy simulations comparison project. III. Cosmological zoom-in simulation of a Milky Way-mass Halo

    Full text link
    Article written by a large number of authors; only the first-listed author, the name of the collaboration group, if any, and the authors affiliated with the UA are referenced.

    Deep Learning Designs for Physical Layer Communications

    Get PDF
    Wireless communication systems and their underlying technologies have undergone unprecedented advances over the last two decades to assuage the ever-increasing demands of various applications and emerging technologies. However, the traditional signal processing schemes and algorithms for wireless communications cannot handle the surging complexity associated with fifth-generation (5G) and beyond communication systems due to network expansion, new emerging technologies, high data rates, and the ever-tightening demands for low latency. This thesis extends the traditional downlink transmission schemes to deep learning-based precoding and detection techniques that are hardware-efficient and of lower complexity than the current state-of-the-art. The thesis focuses on precoding/beamforming in massive multiple-input multiple-output (MIMO) systems, signal detection, and lightweight neural network (NN) architectures for precoder and decoder designs. We introduce a learning-based precoder design via constructive interference (CI) that performs the precoding on a symbol-by-symbol basis. Instead of conventionally training an NN without considering the specifics of the optimisation objective, we unfold a power minimisation symbol-level precoding (SLP) formulation based on the interior-point-method (IPM) proximal ‘log’ barrier function. Furthermore, we propose a concept of NN compression, where the weights are quantised to lower numerical precision formats based on binary and ternary quantisations. We further introduce a stochastic quantisation technique, where parts of the NN weight matrix are quantised while the rest remain at full precision. Finally, we propose a systematic complexity scaling of deep neural network (DNN) based MIMO detectors. The model uses a fraction of the DNN inputs by scaling their values through weights that follow monotonically non-increasing functions. Furthermore, we investigate performance-complexity tradeoffs via regularisation constraints on the layer weights such that, at inference, parts of the network layers can be removed with minimal impact on the detection accuracy. Simulation results show that our proposed learning-based techniques offer better complexity-vs-BER (bit-error-rate) and complexity-vs-transmit-power performance compared to the state-of-the-art MIMO detection and precoding techniques.
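
    For intuition, a minimal NumPy sketch of threshold-based ternary quantisation and the stochastic partial-quantisation idea described above; the 0.7 threshold scale and 50% quantised fraction are common-practice assumptions, not the thesis' exact scheme:

        import numpy as np

        def ternarize(w, delta_scale=0.7):
            # Threshold-based ternary quantisation: entries become
            # {-alpha, 0, +alpha}, with alpha fitted to the surviving weights.
            delta = delta_scale * np.mean(np.abs(w))
            mask = np.abs(w) > delta
            alpha = np.abs(w[mask]).mean() if mask.any() else 0.0
            return alpha * np.sign(w) * mask

        def stochastic_ternarize(w, frac=0.5, rng=None):
            # Stochastic variant: quantise a random fraction of the weight
            # matrix, leaving the remaining entries at full precision.
            rng = rng or np.random.default_rng(0)
            pick = rng.random(w.shape) < frac
            return np.where(pick, ternarize(w), w)

        w = np.random.default_rng(1).normal(size=(4, 4))
        print(ternarize(w))
        print(stochastic_ternarize(w, frac=0.5))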

    Rhythms and Evolution: Effects of Timing on Survival

    Get PDF
    The evolution of metabolism regulation is an intertwined process, where different strategies are constantly being developed towards a cognitive ability to perceive and respond to an environment. Organisms depend on the orchestration of a complex set of chemical reactions: maintaining homeostasis with a changing environment while simultaneously sending material and energetic resources to where they are needed. The success of an organism requires efficient metabolic regulation, highlighting the connection between evolution, population dynamics, and the underlying biochemistry. In this work, I represent organisms as coupled information-processing networks, that is, gene-regulatory networks receiving signals from the environment and acting on chemical reactions, eventually affecting material flows. I discuss the mechanisms through which metabolism control is improved during evolution and how the nonlinearities of competition influence this solution-searching process. The propagation of the populations through the resulting landscapes generally points to the role of the rhythm of cell division as an essential phenotypic feature driving evolution. Subsequently, different representations of organisms as oscillators are constructed to indicate more precisely how the interplay between competition, maturation timing, and cell-division synchronisation affects the expected evolutionary outcomes, which do not always lead to the "survival of the fastest".
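
    As a toy illustration of the last point (an assumption-laden sketch, not the thesis' model), two lineages with different division periods can be competed under a shared logistic resource constraint, with per-capita growth set by the division rhythm:

        import numpy as np

        def compete(period_a, period_b, steps=2000, dt=0.01, cap=1000.0):
            n = np.array([10.0, 10.0])          # population sizes
            rates = np.log(2) / np.array([period_a, period_b])  # division rhythm -> growth rate
            for _ in range(steps):
                crowding = 1.0 - n.sum() / cap  # shared carrying capacity
                n += dt * rates * n * crowding
            return n / n.sum()                  # final population shares

        # In this simplified setting the faster divider always wins; the
        # thesis' point is that maturation timing and synchronisation can
        # break exactly this naive expectation.
        print(compete(period_a=1.0, period_b=1.5))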

    Beyond Transmitting Bits: Context, Semantics, and Task-Oriented Communications

    Full text link
    Communication systems to date primarily aim at reliably communicating bit sequences. Such an approach provides efficient engineering designs that are agnostic to the meaning of the messages or to the goal that the message exchange aims to achieve. Next-generation systems, however, can potentially be enriched by folding message semantics and the goals of communication into their design. Further, these systems can be made cognizant of the context in which communication exchange takes place, providing avenues for novel design insights. This tutorial summarizes the efforts to date, from early adaptations to semantic-aware and task-oriented communications, covering the foundations, algorithms, and potential implementations. The focus is on approaches that use information theory to provide the foundations, as well as on the significant role of learning in semantic- and task-aware communications.

    Stochastic Simulation and System Identification of large Signal Transduction Networks in Cells

    Get PDF
    New approaches are required for the mathematical modelling and system identification of complex networks, which are characterized by a large number of unknown parameters, uncertain network topologies, partially poorly understood mechanisms, and significant stochastic effects. Networks with such properties are ubiquitous in many fields of science, especially in molecular cell biology, where, for example, large signal transduction networks are formed, by which cells transfer and process information based on the biochemical interactions between signal transduction molecules. Complexity arises from the high number of different molecule species involved and the diversity of sub-processes interacting with each other. Previous attempts to model signal transduction were often limited to small systems or based on qualitative data only. One goal of this thesis is to reduce this complexity to enable system identification on the basis of experimental data. The concept of ‘Sensitivity of Sensitivities’, presented here for the first time and based on the evaluation of stochastically generated parameter-set ensembles, reveals two important inherent system properties: high robustness and a modular structure of the dependency between state variables and parameters. This is the key to drastically reducing the dimensionality of the parameter identification problem. The approach is applied to the signalling pathway of CD95-induced apoptosis, also called programmed cell death. Defects in the regulation of apoptosis result in a number of serious diseases such as cancer. Despite the ever-increasing number of studies of the molecular mechanisms of apoptotic signalling, a systemic understanding of this complex pathway is still missing. With the model and the estimated parameters of this thesis, it becomes possible to reproduce the observed system behaviour and to predict important system properties. The predictions have been experimentally confirmed and are used for the planning of further experiments. Thereby, a novel regulatory mechanism was revealed: a threshold between cell death and cell survival. High fluctuations and extremely low particle numbers of crucial molecule species require exact stochastic simulations. Computational problems arise from the huge differences among the timescales on which the reactions occur. Therefore, a stochastic hybrid algorithm is developed by combining the exact Gillespie algorithm with a system of stochastic differential equations. This enables stochastically accurate and highly efficient simulations for large reaction systems and for any other kind of Markov process. In summary, this thesis provides a methodology specifically suited for highly underdetermined networks. This is of high relevance for the newly emerging field of systems biology, extending far beyond the present application to programmed cell death.
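
    A minimal exact Gillespie (SSA) sketch for a birth-death process, for orientation; the thesis' hybrid scheme couples this kind of exact event stepping with stochastic differential equations for the fast, high-copy-number reactions:

        import random

        def gillespie_birth_death(k_birth=1.0, k_death=0.1, x0=10, t_end=50.0, seed=0):
            rng = random.Random(seed)
            t, x, trace = 0.0, x0, [(0.0, x0)]
            while t < t_end:
                a_birth, a_death = k_birth, k_death * x  # reaction propensities
                a_total = a_birth + a_death
                if a_total == 0.0:
                    break
                t += rng.expovariate(a_total)            # exponential waiting time
                if rng.random() < a_birth / a_total:     # choose reaction by propensity
                    x += 1
                else:
                    x -= 1
                trace.append((t, x))
            return trace

        # Copy number fluctuates around k_birth / k_death = 10
        print(gillespie_birth_death()[-1])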

    Predictable markets? A news-driven model of the stock market

    Get PDF
    We attempt to explain stock market dynamics in terms of the interaction among three variables: market price, investor opinion, and information flow. We propose a framework for this interaction and apply it to build a model of stock market dynamics, which we study both empirically and theoretically. We demonstrate that this model replicates observed market behavior on all relevant timescales (from days to years) reasonably well. Using the model, we obtain and discuss a number of results that pose implications for current market theory and offer potential practical applications.
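
    A toy discrete-time sketch of that three-variable loop (price, investor opinion, information flow); the couplings and coefficients below are illustrative assumptions, not the authors' fitted model:

        import random

        def simulate(steps=250, seed=0):
            rng = random.Random(seed)
            price, opinion, path = 0.0, 0.0, []
            for _ in range(steps):
                news = rng.gauss(0.0, 1.0)            # exogenous information flow
                opinion = 0.9 * opinion + 0.1 * news  # opinion integrates news
                price += 0.05 * (opinion - price)     # price relaxes toward opinion
                path.append(price)
            return path

        print(simulate()[-5:])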