
    Physically Dense Server Architectures.

    Distributed, in-memory key-value stores have emerged as one of today's most important data center workloads. Because key-value stores are critical to the scalability of modern web services, vast resources are dedicated to them to ensure that quality-of-service guarantees are met. These resources include many server racks to store terabytes of key-value data, the power necessary to run all of the machines, networking equipment and bandwidth, and the data center warehouses used to house the racks. There is, however, a mismatch between the key-value store software and the commodity servers on which it runs, leading to inefficient use of resources. The primary cause of inefficiency is the overhead incurred from processing individual network packets, which typically carry small payloads and require minimal compute resources. Thus, one of the key challenges as we enter the exascale era is how best to adjust to the paradigm shift from compute-centric to storage-centric data centers. This dissertation presents a hardware/software solution that addresses the inefficiency issues present in the modern data centers on which key-value stores are currently deployed. First, it proposes two physical server designs, both of which use 3D-stacking technology and low-power CPUs to improve density and efficiency. The first 3D architecture---Mercury---consists of stacks of low-power CPUs with 3D-stacked DRAM. The second architecture---Iridium---replaces DRAM with 3D NAND Flash to improve density. The second portion of this dissertation proposes an enhanced version of the Mercury server design---called KeyVault---that incorporates integrated, zero-copy network interfaces along with an integrated switching fabric. In order to utilize the integrated networking hardware, as well as reduce the response time of requests, a custom networking protocol is proposed.
Unlike prior work on accelerating key-value stores---e.g., by completely bypassing the CPU and OS when processing requests---this work bypasses the CPU and OS only when placing network payloads into a process's memory. The insight is that because most of the overhead comes from processing packets in the OS kernel---and not from the request processing itself---direct placement of a packet's payload is sufficient to provide higher throughput and lower latency than prior approaches.

PhD dissertation, Computer Science and Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies
http://deepblue.lib.umich.edu/bitstream/2027.42/111414/1/atgutier_1.pd

    Lifelong Intelligence Beyond the Edge using Hyperdimensional Computing

    On-device learning has emerged as a prevailing trend that avoids the slow response time and costly communication of cloud-based learning. The ability to learn continuously and indefinitely in a changing environment, and with resource constraints, is critical for real sensor deployments. However, existing designs are inadequate for practical scenarios with (i) streaming data input, (ii) a lack of supervision, and (iii) limited on-board resources. In this paper, we design and deploy the first on-device lifelong learning system, called LifeHD, for general IoT applications with limited supervision. LifeHD is designed based on a novel neurally-inspired and lightweight learning paradigm called Hyperdimensional Computing (HDC). We utilize a two-tier associative memory organization to intelligently store and manage high-dimensional, low-precision vectors, which represent the historical patterns as cluster centroids. We additionally propose two variants of LifeHD to cope with scarce labeled inputs and power constraints. We implement LifeHD on off-the-shelf edge platforms and perform extensive evaluations across three scenarios. Our measurements show that LifeHD improves the unsupervised clustering accuracy by up to 74.8% compared to the state-of-the-art NN-based unsupervised lifelong learning baselines, with as much as 34.3x better energy efficiency. Our code is available at https://github.com/Orienfish/LifeHD.

Comment: Accepted by IPSN'2
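The storage and matching of "high-dimensional, low-precision vectors" described above rests on standard Hyperdimensional Computing primitives. Below is a minimal pure-Python sketch of those primitives---random bipolar hypervectors, majority-vote bundling into a centroid, and normalized dot-product similarity. The dimension, function names, and tie-breaking rule are illustrative choices, not LifeHD's actual implementation:

```python
import random

DIM = 10_000  # hypervectors are typically very high-dimensional (~10k)

def random_hypervector(rng):
    """Random bipolar (+1/-1) hypervector; near-orthogonal to any other w.h.p."""
    return [rng.choice((-1, 1)) for _ in range(DIM)]

def bundle(vectors):
    """Superpose hypervectors componentwise by majority vote (ties -> +1).
    The result acts as a cluster centroid resembling all of its members."""
    return [1 if sum(comps) >= 0 else -1 for comps in zip(*vectors)]

def similarity(a, b):
    """Normalized dot product: ~0 for unrelated vectors, ~1 for identical."""
    return sum(x * y for x, y in zip(a, b)) / DIM

rng = random.Random(0)
a, b, c = (random_hypervector(rng) for _ in range(3))
centroid = bundle([a, b])
# a member of the bundle stays much closer to the centroid than an outsider
print(similarity(a, centroid) > similarity(c, centroid))  # expect True
```

In an HDC lifelong learner, an incoming encoded sample is compared against the stored centroids and either merged into the best match or kept as a new pattern, which is why low-precision bipolar vectors suffice.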

    Characterising and Verifying the Core in Concurrent Multi-Player Mean-Payoff Games (Full Version)

    Concurrent multi-player mean-payoff games are important models for systems of agents with individual, non-dichotomous preferences. Whilst these games have been extensively studied in terms of their equilibria in non-cooperative settings, this paper explores an alternative solution concept: the core from cooperative game theory. This concept is particularly relevant for cooperative AI systems, as it enables the modelling of cooperation among agents, even when their goals are not fully aligned. Our contribution is twofold. First, we provide a characterisation of the core using discrete geometry techniques and establish a necessary and sufficient condition for its non-emptiness. We then use the characterisation to prove the existence of polynomial witnesses in the core. Second, we use the existence of such witnesses to solve key decision problems in rational verification and provide tight complexity bounds for the problem of checking whether some/every equilibrium in a game satisfies a given LTL or GR(1) specification. Our approach is general and can be adapted to handle other specifications expressed in various fragments of LTL without incurring additional computational costs.

Comment: This is the full version of the paper with the same title that appears in the CSL'24 proceedings
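The preferences in these games are given by mean payoffs: the limit-average weight an agent accrues along an infinite play. For an ultimately periodic play (a "lasso"), the limit average ignores any finite prefix and equals the average weight of the repeated cycle. A minimal illustrative sketch of that fact, with hypothetical weights rather than anything from the paper:

```python
from fractions import Fraction

def mean_payoff(prefix, cycle):
    """Mean payoff of the ultimately periodic play prefix . cycle^omega.
    The limit average ignores any finite prefix: only the cycle matters."""
    assert cycle, "the repeated cycle must be non-empty"
    return Fraction(sum(cycle), len(cycle))

# Hypothetical play: transient rewards [5, 0], then the loop [1, 2, 3] forever.
print(mean_payoff([5, 0], [1, 2, 3]))  # prefix is irrelevant: (1+2+3)/3 = 2
```

This is why, in the finite-game setting, solution concepts over mean payoffs can be analysed via the simple cycles of the game graph.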

    MTF1, a classic metal sensing transcription factor, promotes myogenesis in response to copper [preprint]

    MTF1 is a conserved metal-binding transcription factor in eukaryotes that binds to conserved DNA sequence motifs termed metal response elements (MREs). MTF1 responds to metal excess and deprivation, protects cells from oxidative and hypoxic stresses, and is required for embryonic development in vertebrates. We used multiple strategies to identify an unappreciated role for MTF1 and copper (Cu) in cell differentiation. Upon initiation of myogenesis from primary myoblasts, MTF1 expression increased, as did its nuclear localization. Mtf1 knockdown impaired differentiation, while addition of non-toxic concentrations of Cu+ enhanced MTF1 expression and promoted myogenesis. Cu+ bound stoichiometrically to a tetra-cysteine site in the C-terminus of MTF1. MTF1 bound to chromatin at the promoter regions of myogenic genes, and this binding was stimulated by copper. MTF1 formed a complex at myogenic promoters with MyoD, the master transcriptional regulator of the myogenic lineage. These studies establish novel mechanisms by which copper and MTF1 regulate gene expression during myoblast differentiation.

    Where Are the Newly Diagnosed HIV Positives in Kenya? Time to Consider Geo-Spatially Guided Targeting at a Finer Scale to Reach the “First 90”

    Background: The UNAIDS 90-90-90 Fast-Track targets provide a framework for assessing coverage of HIV testing services (HTS) and awareness of HIV status – the “first 90.” In Kenya, the bulk of HIV testing targets are aligned to the five highest HIV-burden counties. However, we do not know whether most of the new HIV diagnoses occur in these five highest-burden counties or elsewhere. Methods: We analyzed facility-level HTS data in Kenya from 1 October 2015 to 30 September 2016 to assess the spatial distribution of newly diagnosed HIV-positives. We used Moran's Index (Moran's I) to assess global and local spatial autocorrelation of newly diagnosed HIV-positive tests, and Kulldorff spatial scan statistics to detect hotspots of newly diagnosed HIV-positive tests. For aggregated data, we used the Kruskal-Wallis equality-of-populations non-parametric rank test to compare absolute numbers across classes. Results: Of 4,021 HTS sites, 3,969 (98.7%) had geocodes available. Most facilities (3,034, 76.4%) were not spatially autocorrelated for the number of newly diagnosed HIV-positives. For the rest, clustering occurred as follows: 438 (11.0%) were high-high (HH), 66 (1.7%) high-low (HL), 275 (6.9%) low-high (LH), and 156 (3.9%) low-low (LL). Of the HH sites, 301 (68.7%) were in high HIV-burden counties. Of the 123 clusters with a significantly high number of newly diagnosed HIV-infected persons, over half (73, 59.3%) were not in the five highest HIV-burden counties. Clusters with a high number of newly diagnosed persons had about twice as many positives per 1,000,000 tests as clusters with lower numbers (29,856 vs. 14,172). Conclusions: Although high HIV-burden counties contain clusters of sites with a high number of newly diagnosed HIV-infected persons, we detected many such clusters in low-burden counties as well.
To expand HTS where it is most needed and reach the “first 90” targets, geospatial analyses and mapping can identify and describe localized epidemic patterns in a spatially dispersed epidemic like Kenya's, and consequently help reorient and prioritize HTS strategies.
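For context, the global Moran's I used above compares each site's deviation from the mean against those of its spatially weighted neighbours. A minimal sketch of the statistic with a hypothetical binary adjacency matrix; the data are invented for illustration, not the Kenyan HTS data:

```python
def morans_i(values, weights):
    """Global Moran's I for values x_i under spatial weights w[i][j].
    I > 0: similar values cluster; I < 0: dissimilar neighbours;
    I near -1/(n-1): no spatial autocorrelation."""
    n = len(values)
    mean = sum(values) / n
    dev = [x - mean for x in values]            # deviations from the mean
    s0 = sum(sum(row) for row in weights)       # total weight
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / s0) * (num / den)

# Hypothetical: 4 sites on a line, rook adjacency, high values next to high.
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
x = [10, 9, 1, 2]
print(round(morans_i(x, w), 3))  # -> 0.323, positive: clustering
```

The local (site-level) variant decomposes this sum per site, which is what yields the HH/HL/LH/LL cluster labels reported in the abstract.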

    Development of an automated DNA purification module using a micro-fabricated pillar chip

    We present a fully automated DNA purification module comprised of a micro-fabricated chip and a sequential injection analysis system that is designed for use within autonomous instruments that continuously monitor the environment for the presence of biological threat agents. The chip has an elliptical flow channel containing a bed (3.5 × 3.5 mm) of silica-coated pillars with height, width and center-to-center spacing of 200, 15, and 30 µm, respectively, which provides a relatively large surface area (ca. 3 cm²) for DNA capture in the presence of chaotropic agents. We have characterized the effect of various fluidic parameters on extraction performance, including sample input volume, capture flow rate, and elution volume. The flow-through design made the pillar chip completely reusable; carryover was eliminated by flushing lines with sodium hypochlorite and deionized water between assays. A mass balance was conducted to determine the fate of input DNA not recovered in the eluent. The device was capable of purifying and recovering Bacillus anthracis genomic DNA (input masses from 0.32 to 320 pg) from spiked environmental aerosol samples, for subsequent analysis using polymerase chain reaction-based assays.

    Bayesian analysis of weak gravitational lensing and Sunyaev-Zel'dovich data for six galaxy clusters

    We present an analysis of observations made with the Arcminute Microkelvin Imager (AMI) and the Canada-France-Hawaii Telescope (CFHT) of six galaxy clusters in a redshift range of 0.16–0.41. The cluster gas is modelled using the Sunyaev–Zel'dovich (SZ) data provided by AMI, while the total mass is modelled using the lensing data from the CFHT. In this paper, we: i) find very good agreement between SZ measurements (assuming large-scale virialisation and a gas-fraction prior) and lensing measurements of the total cluster masses out to r_200; ii) perform the first multiple-component weak-lensing analysis of A115; iii) confirm the unusual separation between the gas and mass components in A1914; iv) jointly analyse the SZ and lensing data for the relaxed cluster A611, confirming our use of a simulation-derived mass-temperature relation for parameterizing measurements of the SZ effect.

Comment: 22 pages, 12 figures, 12 tables, published by MNRA

    Measurement of the cross-section and charge asymmetry of W bosons produced in proton-proton collisions at √s = 8 TeV with the ATLAS detector

    This paper presents measurements of the W^+ → μ^+ν and W^- → μ^-ν cross-sections and the associated charge asymmetry as a function of the absolute pseudorapidity of the decay muon. The data were collected in proton–proton collisions at a centre-of-mass energy of 8 TeV with the ATLAS experiment at the LHC and correspond to a total integrated luminosity of 20.2 fb−1. The precision of the cross-section measurements varies between 0.8% and 1.5% as a function of the pseudorapidity, excluding the 1.9% uncertainty on the integrated luminosity. The charge asymmetry is measured with an uncertainty between 0.002 and 0.003. The results are compared with predictions based on next-to-next-to-leading-order calculations with various parton distribution functions and are sensitive enough to discriminate between them.

Comment: 38 pages in total, author list starting page 22, 5 figures, 4 tables, submitted to EPJC. All figures including auxiliary figures are available at https://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/STDM-2017-13
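The charge asymmetry referred to above is conventionally defined per pseudorapidity bin as A = (σ_{W+} − σ_{W−}) / (σ_{W+} + σ_{W−}). A small sketch of that definition with first-order uncertainty propagation, using invented cross-section values rather than the ATLAS measurements:

```python
import math

def charge_asymmetry(xs_plus, xs_minus, err_plus=0.0, err_minus=0.0):
    """Charge asymmetry A = (s+ - s-)/(s+ + s-) with propagated uncertainty,
    treating the two cross-section errors as uncorrelated."""
    total = xs_plus + xs_minus
    a = (xs_plus - xs_minus) / total
    # Partial derivatives: dA/ds+ = 2 s- / total^2,  dA/ds- = -2 s+ / total^2
    da = 2.0 / total**2 * math.hypot(xs_minus * err_plus, xs_plus * err_minus)
    return a, da

# Hypothetical differential cross-sections (pb) in one |eta| bin.
a, da = charge_asymmetry(600.0, 450.0, err_plus=6.0, err_minus=5.0)
print(f"A = {a:.4f} +/- {da:.4f}")
```

Because common normalisation uncertainties (such as the luminosity) cancel in the ratio, the asymmetry can be measured more precisely than the individual cross-sections, as the quoted 0.002-0.003 uncertainty reflects.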

    Search for new phenomena in final states with an energetic jet and large missing transverse momentum in pp collisions at √s = 8 TeV with the ATLAS detector

    Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb−1 of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with pT > 120 GeV and no leptons. Nine signal regions are considered with increasing missing transverse momentum requirements between E_T^miss > 150 GeV and E_T^miss > 700 GeV. Good agreement is observed between the number of events in data and Standard Model expectations. The results are translated into exclusion limits on models with either large extra spatial dimensions, pair production of weakly interacting dark matter candidates, or production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.
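The exclusion limits above come from counting experiments in each signal region. As a simplified illustration of the underlying logic---a plain frequentist Poisson upper limit, not the CLs procedure ATLAS actually uses---one can scan for the smallest signal yield that the observed count would exclude:

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu)."""
    return math.exp(-mu) * sum(mu**k / math.factorial(k) for k in range(n + 1))

def upper_limit(n_obs, bkg, cl=0.95, step=0.001):
    """Smallest signal yield s such that P(N <= n_obs | s + bkg) <= 1 - cl;
    any larger signal is excluded at confidence level cl."""
    s = 0.0
    while poisson_cdf(n_obs, s + bkg) > 1.0 - cl:
        s += step
    return s

# Textbook check: zero observed events, zero background.
print(round(upper_limit(0, 0.0), 2))  # -> 3.0, i.e. -ln(0.05), the "rule of three"
```

Real analyses fold in background uncertainties and use CLs to avoid excluding signals to which the search has no sensitivity; this sketch only shows why larger observed-vs-expected agreement translates into tighter limits.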