
    Know When to Persist: Deriving Value from a Stream Buffer

    We consider Persistence, a new online problem concerning optimizing weighted observations in a stream of data when the observer has limited buffer capacity. Weighted items arrive one at a time at the entrance of a buffer with two holding locations. A processor (or observer) can process (observe) an item at the buffer location it chooses, thereby deriving the weight of the observed item as profit. The main constraint is that the processor can only move synchronously with the item stream; as a result, when moving from the end of the buffer to the entrance, it crosses paths with the item already there and never has the chance to process or even identify it. Persistence is the online problem of scheduling the processor's movements through the buffer so that its total derived value is maximized under this constraint. We study the performance of the straightforward heuristic Threshold, i.e., forcing the processor to "follow" an item through the whole buffer only if its value is above a threshold. We analyze both the optimal offline algorithm and Threshold in the cases where the input stream is a random permutation or its items have i.i.d. values. We show that in both cases the competitive ratio achieved by the Threshold algorithm is at least 2/3 when the only statistical knowledge of the items is the median of all possible values. We generalize our results by showing that Threshold, equipped with some minimal statistical advice about the input, achieves competitive ratios in the whole spectrum between 2/3 and 1, following the variation of a newly defined density-like measure of the input. This result is a significant improvement over the case of arbitrary input streams, since in that case we show that no online algorithm can achieve a competitive ratio better than 1/2. Comment: 17 pages, 1 figure
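
    As a rough illustration of the Threshold heuristic described above, the following Python sketch simulates a toy version of the problem. The profit mechanics here are an assumption made only for illustration (the observer earns an item's weight for each step it observes it, and the one arrival crossed on the walk back from the exit is lost); the function name threshold_profit and the uniform-weight example are likewise placeholders, not the paper's exact model or analysis.

        import random

        def threshold_profit(stream, threshold):
            # Toy mechanics (assumed for illustration): the observer sits at the
            # buffer entrance and earns the weight of every arrival it observes.
            # If an arrival's weight is at least the threshold it "persists":
            # it follows the item to the second slot, earning its weight once
            # more, but on the way back it crosses paths with (and so misses)
            # the next arrival.
            profit = 0.0
            i = 0
            while i < len(stream):
                value = stream[i]
                profit += value        # observe the arrival at the entrance
                if value >= threshold:
                    profit += value    # persist: observe it again at the exit
                    i += 2             # the arrival crossed on the way back is lost
                else:
                    i += 1             # stay at the entrance for the next arrival
            return profit

        # Example: uniform weights in [0, 1] with the threshold set at the median,
        # mirroring the "median as the only statistical advice" setting above.
        random.seed(0)
        stream = [random.random() for _ in range(10_000)]
        print(threshold_profit(stream, 0.5))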

    Low power LoRaWAN node based on FRAM microcontroller

    In the quest to improve the energy requirements of LPWAN nodes and make them more suitable for energy harvesting, a microcontroller with on-chip Ferroelectric Random Access Memory (FRAM) was used as a controller in a LoRaWAN node. Energy measurements showed that the performance of such a device is comparable to or better than that of a similar Flash-based microcontroller. Furthermore, the advantages resulting from the high endurance and low-power characteristics of FRAM memories can be used to improve the node

    Bridge monitoring system based on vibration measurements

    This work outlines the main algorithms involved in a proposed bridge monitoring system based on ambient and earthquake vibration measurements. The monitoring system can be used to predict the existence, location and size of structural modifications in the bridge by monitoring changes in the modal characteristics and updating the finite element model of the bridge based on those characteristics. Sophisticated system identification methods, combining information from a sensor network with the theoretical information built into a finite element model simulating structural behaviour, are incorporated into the monitoring system in order to track structural changes and identify the location, type and extent of these changes. Emphasis in this work is given to theoretical and computational issues relating to structural modal identification and structural model updating methods. Specifically, the work outlines the algorithms and software that have been developed for computing the modal properties using ambient and earthquake data, as well as recent methodologies and software for finite element model updating using the modal characteristics. Various issues encountered in the optimization problems involved in model updating are demonstrated, including the existence of multiple local optima and the effect of the weight values in conventional weighted modal residual methods on the selected optimal finite element model. Selected features are demonstrated using vibration measurements from a four-span bridge of the Egnatia Odos motorway in Greece.
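
    To make the role of the weight values concrete, the sketch below shows one common form of a weighted modal-residual objective for finite element model updating: a relative frequency residual plus a mode-shape residual based on the Modal Assurance Criterion (MAC). The function fe_modal, the identified data freq_id and modes_id, and the specific weights are hypothetical placeholders standing in for the software and measurements described above, not that software itself.

        import numpy as np
        from scipy.optimize import minimize

        def mac(phi_a, phi_b):
            # Modal Assurance Criterion between two mode-shape vectors.
            return np.dot(phi_a, phi_b) ** 2 / (np.dot(phi_a, phi_a) * np.dot(phi_b, phi_b))

        def modal_residual(theta, fe_modal, freq_id, modes_id, w_freq=1.0, w_mode=1.0):
            # Weighted modal residual: relative errors in the natural frequencies
            # plus (1 - MAC) terms for the identified mode shapes. fe_modal(theta)
            # is a placeholder finite element model, assumed to return the model's
            # natural frequencies and mode shapes for the parameter vector theta.
            freq_fe, modes_fe = fe_modal(theta)
            j_freq = sum(((f_fe - f_id) / f_id) ** 2 for f_fe, f_id in zip(freq_fe, freq_id))
            j_mode = sum(1.0 - mac(p_fe, p_id) for p_fe, p_id in zip(modes_fe, modes_id))
            return w_freq * j_freq + w_mode * j_mode

        # Hypothetical usage: the optimum, and hence the selected model, shifts with
        # the chosen weights, which is the sensitivity discussed in the abstract.
        # result = minimize(modal_residual, theta0,
        #                   args=(fe_modal, freq_id, modes_id, 1.0, 0.5),
        #                   method="Nelder-Mead")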

    VITAMIN-V: Virtual Environment and Tool-Boxing for Trustworthy Development of RISC-V Based Cloud Services

    VITAMIN-V is a 2023-2025 Horizon Europe project that aims to develop a complete RISC-V open-source software stack for cloud services with performance comparable to the cloud-dominant x86 counterpart, together with a powerful virtual execution environment for software development, validation, verification, and testing that considers the RISC-V ISA extensions relevant for cloud deployment. VITAMIN-V will specifically support the RISC-V extensions for virtualization, cryptography, and vectorization in three virtual environments: QEMU, gem5, and cloud FPGA prototype platforms. The project will focus on European Processor Initiative (EPI) based RISC-V designs and accelerators. VITAMIN-V will also support the ISA extensions by adding compiler and toolchain support. Furthermore, it will develop novel software validation, verification, and testing approaches to ensure software trustworthiness. To enable the execution of complete cloud stacks, VITAMIN-V will port all necessary machine-dependent modules in relevant open-source cloud software distributions, focusing on three cloud setups. Finally, VITAMIN-V will demonstrate and benchmark these three cloud setups using relevant AI, big-data, and serverless applications. VITAMIN-V aims to match the software performance of its x86 equivalent while contributing to RISC-V open-source virtual environments, software validation, and cloud software suites.

    Search for dark matter produced in association with bottom or top quarks in √s = 13 TeV pp collisions with the ATLAS detector

    A search for weakly interacting massive particle dark matter produced in association with bottom or top quarks is presented. Final states containing third-generation quarks and missing transverse momentum are considered. The analysis uses 36.1 fb−1 of proton–proton collision data recorded by the ATLAS experiment at √s = 13 TeV in 2015 and 2016. No significant excess of events above the estimated backgrounds is observed. The results are interpreted in the framework of simplified models of spin-0 dark-matter mediators. For colour-neutral spin-0 mediators produced in association with top quarks and decaying into a pair of dark-matter particles, mediator masses below 50 GeV are excluded assuming a dark-matter candidate mass of 1 GeV and unitary couplings. For scalar and pseudoscalar mediators produced in association with bottom quarks, the search sets limits on the production cross-section of 300 times the predicted rate for mediators with masses between 10 and 50 GeV and assuming a dark-matter mass of 1 GeV and unitary coupling. Constraints on colour-charged scalar simplified models are also presented. Assuming a dark-matter particle mass of 35 GeV, mediator particles with mass below 1.1 TeV are excluded for couplings yielding a dark-matter relic density consistent with measurements.

    Jet energy measurement and its systematic uncertainty in proton-proton collisions at root s=7 TeV with the ATLAS detector

    The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton-proton collision data with a centre-of-mass energy of √s = 7 TeV corresponding to an integrated luminosity of 4.7 fb−1. Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6, and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a Z boson, for 20 ≤ pT(jet) < 1000 GeV and pseudorapidities |η| < 4.5. The effect of multiple proton-proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty of less than 1% is found in the central calorimeter region (|η| < 1.2) for jets with 55 ≤ pT(jet) < 500 GeV. For central jets at lower pT, the uncertainty is about 3%. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton-proton collisions and test-beam data, which also provide the estimate for pT(jet) > 1 TeV. The calibration of forward jets is derived from dijet pT balance measurements. The resulting uncertainty reaches its largest value of 6% for low-pT jets at |η| = 4.5. Additional JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5-3%.
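
    The in situ residual correction mentioned above can be illustrated, in a much simplified form, by a direct pT-balance calculation: the jet response pT(jet)/pT(reference) is averaged in bins of reference pT for data and for MC simulation, and the MC/data ratio of the mean responses gives a per-bin residual correction factor for data jets. The sketch below is only a schematic of this idea with simulated toy input (the function name, binning, and toy responses are assumptions); the actual ATLAS calibration combines several in situ techniques with a full uncertainty treatment.

        import numpy as np

        def residual_jes_correction(pt_ref, pt_jet, is_data, bins):
            # Simplified direct-balance illustration: mean response R = pT(jet)/pT(ref)
            # per reference-pT bin, computed separately for data and MC. The residual
            # correction applied to data jets in each bin is R_MC / R_data.
            response = pt_jet / pt_ref
            idx = np.digitize(pt_ref, bins) - 1
            corrections = np.ones(len(bins) - 1)
            for b in range(len(bins) - 1):
                in_bin = idx == b
                r_data = response[in_bin & is_data].mean()
                r_mc = response[in_bin & ~is_data].mean()
                corrections[b] = r_mc / r_data
            return corrections

        # Toy input: reference-object and jet pT in GeV with a data/MC flag, and bins
        # roughly spanning the 20-1000 GeV range quoted above.
        rng = np.random.default_rng(1)
        pt_ref = rng.uniform(20, 1000, 20_000)
        is_data = rng.random(20_000) < 0.5
        true_resp = np.where(is_data, 0.97, 1.00)   # pretend data jets respond slightly low
        pt_jet = pt_ref * true_resp * rng.normal(1.0, 0.1, 20_000)
        bins = np.array([20, 55, 100, 300, 500, 1000])
        print(residual_jes_correction(pt_ref, pt_jet, is_data, bins))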

    DAO Dynamics: Treasury and Market Cap Interaction

    This study examines the dynamics between treasury and market capitalization in two Decentralized Autonomous Organization (DAO) projects: OlympusDAO and KlimaDAO. The relationship between market capitalization and treasuries in these projects is analyzed using vector autoregression (VAR), Granger causality tests, and vector error correction models (VECM), incorporating an exogenous variable to account for the comovement of decentralized finance assets. Additionally, a Generalized Autoregressive Conditional Heteroskedasticity (GARCH) model is employed to assess the impact of carbon offset tokens on the conditional variance of KlimaDAO's market capitalization returns. The findings suggest a connection between market capitalization and treasuries in the analyzed projects, underscoring the importance of the treasury and of carbon offset tokens in shaping a DAO's market capitalization and its variance. The results also carry implications for predictive modeling, highlighting the distinct behaviors observed in OlympusDAO and KlimaDAO. Investors and policymakers can leverage these results to refine investment strategies and adjust treasury allocation to align with market trends. Finally, the study addresses the importance of responsible investing, advocating for the inclusion of sustainable investment assets, and provides a foundational framework for informed investment decisions and future studies in the field, offering novel insights into decentralized finance dynamics and the role of tokenized assets within the crypto-asset ecosystem.
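
    As a rough sketch of the modelling pipeline described above, the following Python fragment fits a VAR model, runs a Granger causality test, and fits a GARCH(1,1) model to returns using statsmodels and arch. The series are simulated placeholders standing in for the treasury and market-capitalization data; the exogenous DeFi comovement variable, the carbon offset regressor, the VECM step, and the model orders used in the study are omitted or simplified here.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR
        from statsmodels.tsa.stattools import grangercausalitytests
        from arch import arch_model

        # Simulated placeholder series standing in for (log) treasury value and
        # (log) market capitalization; in the study these would be the DAO data.
        rng = np.random.default_rng(0)
        n = 500
        treasury = np.cumsum(rng.normal(0, 1, n))
        market_cap = 0.6 * treasury + np.cumsum(rng.normal(0, 1, n))
        df = pd.DataFrame({"treasury": treasury, "market_cap": market_cap})

        # VAR on first differences (the simulated levels are non-stationary).
        d = df.diff().dropna()
        var_res = VAR(d).fit(maxlags=8, ic="aic")
        print(var_res.summary())

        # Does the treasury Granger-cause market capitalization?
        granger = grangercausalitytests(d[["market_cap", "treasury"]], maxlag=4)

        # GARCH(1,1) on market-cap changes for the conditional-variance analysis.
        returns = 100 * d["market_cap"]
        garch_res = arch_model(returns, vol="GARCH", p=1, q=1).fit(disp="off")
        print(garch_res.summary())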