
    Bicriteria data compression

    The advent of massive datasets (and the consequent design of high-performing distributed storage systems) has reignited the interest of the scientific and engineering community in the design of lossless data compressors that achieve an effective compression ratio together with very efficient decompression speed. Lempel-Ziv's LZ77 algorithm is the de facto choice in this scenario because of its decompression speed and its flexibility in trading decompression speed against compressed-space efficiency. Each of the existing implementations offers one fixed trade-off between space occupancy and decompression speed, so software engineers have to content themselves with picking the one that comes closest to the requirements of the application at hand. Starting from these premises, and for the first time in the literature, we address in this paper the problem of trading off these two resources optimally by introducing the Bicriteria LZ77-Parsing problem, which formalizes in a principled way what data compressors have traditionally approached by means of heuristics. The goal is to determine an LZ77 parsing which minimizes the space occupancy in bits of the compressed file, provided that the decompression time is bounded by a fixed amount (or vice versa). This way, software engineers can set their space (or time) requirements and then derive the LZ77 parsing which optimizes the decompression speed (or the space occupancy, respectively). We solve this problem efficiently in O(n log^2 n) time and optimal linear space within a small, additive approximation, by proving and deploying some specific structural properties of the weighted graph derived from the possible LZ77-parsings of the input file. A preliminary set of experiments shows that our novel proposal dominates all the highly engineered competitors, hence offering a win-win situation in theory and practice.
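The graph model the abstract alludes to can be pictured with a toy computation: text positions become nodes, candidate phrases become edges carrying a space cost (bits) and a time cost (decompression work), and a parsing is a path from position 0 to n. The sketch below (ours, not the paper's algorithm, which runs in O(n log^2 n) time) solves the resulting resource-constrained shortest-path problem by a naive pseudo-polynomial dynamic program over a time budget.

```python
# Hypothetical illustration of the bicriteria parsing model: find the
# minimum-space path through the parsing DAG subject to a total
# decompression-time budget. The paper's actual algorithm is far more
# efficient; this brute-force DP only shows what is being optimized.

def min_space_parsing(n, edges, time_budget):
    """edges: list of (i, j, space_bits, time_cost), each a phrase covering
    text[i:j]. Returns the minimum total space within the budget, or None."""
    INF = float("inf")
    # dp[pos][t] = min bits needed to parse text[:pos] using time <= t
    dp = [[INF] * (time_budget + 1) for _ in range(n + 1)]
    for t in range(time_budget + 1):
        dp[0][t] = 0.0  # the empty prefix costs nothing
    # Processing edges by increasing start position respects the DAG order,
    # since every phrase ends strictly after it starts.
    for i, j, s, c in sorted(edges):
        for t in range(c, time_budget + 1):
            cand = dp[i][t - c] + s
            if cand < dp[j][t]:
                dp[j][t] = cand
    best = min(dp[n])
    return best if best < INF else None
```

Swapping the roles of the two costs gives the symmetric "min time under a space budget" variant mentioned in the abstract.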

    Nonstationary Continuum-Armed Bandit Strategies for Automated Trading in a Simulated Financial Market

    We approach the problem of designing an automated trading strategy that can consistently profit by adapting to changing market conditions. This challenge can be framed as a Nonstationary Continuum-Armed Bandit (NCAB) problem. To solve the NCAB problem, we propose PRBO, a novel trading algorithm that uses Bayesian optimization and a "bandit-over-bandit" framework to dynamically adjust strategy parameters in response to market conditions. We use the Bristol Stock Exchange (BSE) to simulate financial markets containing heterogeneous populations of automated trading agents, and we compare PRBO with PRSH, a reference trading strategy that adapts its parameters through stochastic hill-climbing. Results show that PRBO generates significantly more profit than PRSH despite having fewer hyperparameters to tune. The code for PRBO and for reproducing the experiments is available as open source (https://github.com/HarmoniaLeo/PRZI-Bayesian-Optimisation).
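To make the NCAB framing concrete, here is a deliberately simplified stand-in (ours, not the paper's PRBO): the continuum of strategy parameters is discretized into arms, and each arm is scored by a sliding-window mean plus a UCB exploration bonus, so stale observations are forgotten as conditions drift. PRBO itself uses Bayesian optimization inside a bandit-over-bandit framework, which this sketch does not attempt.

```python
import math

# Hypothetical sketch of a nonstationary continuum-armed bandit:
# discretize the parameter range, keep only a sliding window of recent
# (arm, reward) observations, and pick the arm maximizing mean + UCB bonus.

class SlidingWindowUCB:
    def __init__(self, arms, window=30, c=1.0):
        self.arms = arms            # discretized points of the continuum
        self.window = window        # only the latest observations count
        self.c = c                  # exploration strength
        self.history = []           # list of (arm_index, reward)

    def select(self):
        recent = self.history[-self.window:]
        total = max(1, len(recent))
        best, best_score = 0, -math.inf
        for i in range(len(self.arms)):
            rewards = [r for a, r in recent if a == i]
            if not rewards:
                return i            # force-play arms absent from the window
            mean = sum(rewards) / len(rewards)
            bonus = self.c * math.sqrt(math.log(total) / len(rewards))
            if mean + bonus > best_score:
                best, best_score = i, mean + bonus
        return best

    def update(self, arm_index, reward):
        self.history.append((arm_index, reward))
```

Because old observations age out of the window, the strategy re-explores after a regime change instead of locking onto a parameter value that was only optimal in the past.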

    How social comparison influences reference price formation in a service context

    How do reference prices form when the source of price information is anonymous versus social? This article investigates the formation of reference prices given an observed sequence of past prices in a service context. An experimental study suggests that, given the same price information, consumers want to pay less if the source is social (i.e., the prices paid by colleagues). More specifically, social comparison changes the way individuals weigh information, attributing more importance to the lowest historical prices and to the range of price variations.

    Inference of Many-Taxon Phylogenies

    Phylogenetic trees are tree topologies that represent the evolutionary history of a set of organisms. In this thesis, we address computational challenges related to the analysis of large-scale datasets with Maximum-Likelihood-based phylogenetic inference. We approach this using three strategies: reducing memory requirements, reducing running time, and reducing man-hours.
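The computational kernel behind Maximum-Likelihood phylogenetic inference is Felsenstein's pruning algorithm, which computes per-site likelihoods bottom-up over the tree; its memory and running time are exactly the kind of costs such a thesis targets. The toy version below (ours, under the simple Jukes-Cantor substitution model) shows the recursion for one alignment site.

```python
import math

# Felsenstein pruning sketch under the Jukes-Cantor model: compute
# L[x] = P(observed leaf characters below a node | state x at the node)
# recursively, then average over a uniform root prior.

STATES = "ACGT"

def jc_prob(i, j, t):
    """Jukes-Cantor transition probability between states i and j
    along a branch of length t (expected substitutions per site)."""
    e = math.exp(-4.0 * t / 3.0)
    return 0.25 + 0.75 * e if i == j else 0.25 - 0.25 * e

def conditional(node):
    """node is ('leaf', char) or ('internal', (child, branch_len), (child, branch_len)).
    Returns the 4-vector of conditional likelihoods at this node."""
    if node[0] == "leaf":
        return [1.0 if s == node[1] else 0.0 for s in STATES]
    _, (left, tl), (right, tr) = node
    Ll, Lr = conditional(left), conditional(right)
    out = []
    for i in range(4):
        pl = sum(jc_prob(i, j, tl) * Ll[j] for j in range(4))
        pr = sum(jc_prob(i, j, tr) * Lr[j] for j in range(4))
        out.append(pl * pr)  # children are conditionally independent
    return out

def site_log_likelihood(root):
    L = conditional(root)
    return math.log(sum(0.25 * Li for Li in L))  # uniform root prior
```

Production inference tools repeat this recursion for every site and every candidate topology, which is why storing and recomputing these conditional vectors dominates memory and running time on many-taxon datasets.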

    Performance Evaluation - Annual Report Year 3

    This report describes the work done and the results obtained in the third year of the CATNETS project. Experiments carried out with the different configurations of the prototype are reported, and the simulation results are evaluated with the CATNETS metrics framework. The applicability of the Catallactic approach as a market model for service and resource allocation in application-layer networks is assessed based on the results and experience gained from both the prototype development and the simulations. Keywords: Grid Computing

    Solution of partial differential equations on vector and parallel computers

    The present status of numerical methods for partial differential equations on vector and parallel computers is reviewed. The relevant aspects of these computers are discussed, and a brief review of their development is included, with particular attention paid to those characteristics that influence algorithm selection. Both direct and iterative methods are covered for elliptic equations, as well as explicit and implicit methods for initial boundary value problems. The intent is to point out attractive methods, as well as areas where this class of computer architecture cannot be fully utilized because of either hardware restrictions or the lack of adequate algorithms. Application areas utilizing these computers are briefly discussed.
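The classic example of an iterative elliptic solver that suits vector and parallel architectures is Jacobi iteration: every interior grid point is updated from its neighbours' previous values only, so all updates in a sweep are independent. The sketch below (ours, as illustration of the class of methods the review surveys) applies it to the 2-D Poisson equation.

```python
# Jacobi iteration for -laplacian(u) = f on the unit square with zero
# boundary values, discretized on an (n+1) x (n+1) grid of spacing 1/n.
# Each sweep reads only the previous iterate, so every interior point
# can be updated in parallel -- the property the review highlights.

def jacobi_poisson(f, n, sweeps=200):
    """f: (n+1) x (n+1) right-hand-side grid; returns the approximate u."""
    h2 = (1.0 / n) ** 2
    u = [[0.0] * (n + 1) for _ in range(n + 1)]
    for _ in range(sweeps):
        new = [row[:] for row in u]          # boundary rows stay zero
        for i in range(1, n):
            for j in range(1, n):
                new[i][j] = 0.25 * (u[i - 1][j] + u[i + 1][j]
                                    + u[i][j - 1] + u[i][j + 1]
                                    + h2 * f[i][j])
        u = new
    return u
```

Gauss-Seidel and SOR converge faster but update points in a fixed order, which is exactly the kind of hardware/algorithm tension the review discusses.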

    High frequency trading from an evolutionary perspective: financial markets as adaptive systems

    The recent rapid growth of algorithmic high-frequency trading strategies makes this a very interesting time to revisit the long-standing debates about the efficiency of stock prices and the best way to model the actions of market participants. To evaluate the evolution of stock price predictability at the millisecond time frame, and to examine whether it is consistent with the newly formed adaptive market hypothesis, we develop three artificial stock markets using a strongly typed genetic programming (STGP) trading algorithm. We simulate real-life trading by applying STGP to millisecond data of the three highest-capitalized stocks (Apple, Exxon Mobil, and Google) and observe that profit opportunities at the millisecond time frame are better modelled through an evolutionary process involving natural selection, adaptation, learning, and dynamic evolution than through conventional analytical techniques. We use combinations of forecasting techniques as benchmarks to demonstrate that different heuristics enable artificial traders to be ecologically rational, making adaptive decisions that combine forecasting accuracy with speed.
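The evolutionary loop underlying such approaches (selection of fit forecasters, variation, repeat) can be shown with a drastically simplified stand-in of ours: a plain genetic algorithm evolving the two coefficients of an AR(2) price forecaster. STGP itself evolves typed expression trees, which this sketch does not attempt; only the selection/mutation cycle is illustrated.

```python
import random

# Toy genetic algorithm (not the paper's STGP): evolve (a, b) so that
# a * p[t-1] + b * p[t-2] forecasts the next price well, using truncation
# selection and Gaussian mutation.

def fitness(w, prices):
    """Negative squared one-step forecast error over the series."""
    a, b = w
    err = 0.0
    for t in range(2, len(prices)):
        pred = a * prices[t - 1] + b * prices[t - 2]
        err += (prices[t] - pred) ** 2
    return -err

def evolve(prices, pop_size=30, gens=60, seed=0):
    rng = random.Random(seed)
    pop = [(rng.uniform(-1, 2), rng.uniform(-1, 2)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda w: fitness(w, prices), reverse=True)
        parents = pop[: pop_size // 2]                  # truncation selection
        children = [(a + rng.gauss(0, 0.1), b + rng.gauss(0, 0.1))
                    for a, b in parents]                # Gaussian mutation
        pop = parents + children                        # elitist replacement
    return max(pop, key=lambda w: fitness(w, prices))
```

Keeping the parents alongside their mutated children makes the best fitness monotonically non-decreasing, a common elitist design choice in evolutionary trading simulations.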