83 research outputs found

    Design and deployment of eHealth interventions using behavior change techniques, BPMN2 and OpenEHR


    Dynamic Set Intersection

    Consider the problem of maintaining a family $F$ of dynamic sets subject to insertions, deletions, and set-intersection reporting queries: given $S, S' \in F$, report every member of $S \cap S'$ in any order. We show that in the word RAM model, where $w$ is the word size, given a cap $d$ on the maximum size of any set, we can support set intersection queries in $O(\frac{d}{w/\log^2 w})$ expected time, and updates in $O(\log w)$ expected time. Using this algorithm we can list all $t$ triangles of a graph $G=(V,E)$ in $O(m+\frac{m\alpha}{w/\log^2 w}+t)$ expected time, where $m=|E|$ and $\alpha$ is the arboricity of $G$. This improves a 30-year-old triangle enumeration algorithm of Chiba and Nishizeki running in $O(m\alpha)$ time. We provide an incremental data structure on $F$ that supports intersection *witness* queries, where we only need to find *one* $e \in S \cap S'$. Both queries and insertions take $O(\sqrt{\frac{N}{w/\log^2 w}})$ expected time, where $N=\sum_{S\in F}|S|$. Finally, we provide time/space tradeoffs for the fully dynamic set intersection reporting problem. Using $M$ words of space, each update costs $O(\sqrt{M\log N})$ expected time, each reporting query costs $O(\frac{N\sqrt{\log N}}{\sqrt{M}}\sqrt{op+1})$ expected time, where $op$ is the size of the output, and each witness query costs $O(\frac{N\sqrt{\log N}}{\sqrt{M}}+\log N)$ expected time. Comment: Accepted to WADS 201
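    The word-RAM idea underlying these bounds — packing set elements into machine words so that an intersection reduces to a single bitwise AND — can be illustrated with a minimal Python sketch. The `BitSet` class and the small universe are illustrative assumptions, not the paper's data structure:

    ```python
    # Minimal sketch: sets over a small integer universe stored as bitmasks,
    # so S ∩ S' becomes one bitwise AND on the packed representation.
    class BitSet:
        def __init__(self, elems=()):
            self.bits = 0
            for e in elems:
                self.bits |= 1 << e          # insert element e

        def delete(self, e):
            self.bits &= ~(1 << e)           # remove element e

        def intersect_report(self, other):
            """Report every member of the intersection, in any order."""
            x = self.bits & other.bits       # one word-level operation
            out = []
            while x:
                low = x & -x                 # isolate the lowest set bit
                out.append(low.bit_length() - 1)
                x ^= low
            return out

        def intersect_witness(self, other):
            """Return one element of the intersection, or None."""
            x = self.bits & other.bits
            return (x & -x).bit_length() - 1 if x else None

    s = BitSet({1, 4, 7, 9})
    t = BitSet({4, 5, 9})
    print(sorted(s.intersect_report(t)))     # [4, 9]
    print(s.intersect_witness(t))            # 4
    ```

    Reporting walks only the set bits of the AND, which mirrors how the paper's output-sensitive bounds depend on the intersection size.
    
    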

    Algorithms in the Ultra-Wide Word Model

    The effective use of parallel computing resources to speed up algorithms in current multi-core parallel architectures remains a difficult challenge, with ease of programming playing a key role in the eventual success of various parallel architectures. In this paper we consider an alternative view of parallelism in the form of an ultra-wide word processor. We introduce the Ultra-Wide Word architecture and model, an extension of the word-RAM model that allows for constant time operations on thousands of bits in parallel. Word parallelism as exploited by the word-RAM model does not suffer from the more difficult aspects of parallel programming, namely synchronization and concurrency. For the standard word-RAM algorithms, the speedups obtained are moderate, as they are limited by the word size. We argue that a large class of word-RAM algorithms can be implemented in the Ultra-Wide Word model, obtaining speedups comparable to multi-threaded computations while keeping the simplicity of programming of the sequential RAM model. We show that this is the case by describing implementations of Ultra-Wide Word algorithms for dynamic programming and string searching. In addition, we show that the Ultra-Wide Word model can be used to implement a nonstandard memory architecture, which enables the sidestepping of lower bounds of important data structure problems such as priority queues and dynamic prefix sums. While similar ideas about operating on large words have been mentioned before in the context of multimedia processors [Thorup 2003], it is only recently that an architecture like the one we propose has become feasible and that details can be worked out. Comment: 28 pages, 5 figures; minor change
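    Word-level bit parallelism for string searching of the kind the model widens is familiar from the classic Shift-And algorithm, sketched below in Python. This is a standard word-RAM technique, not the paper's implementation; an ultra-wide word would simply let the one-word state cover much longer patterns:

    ```python
    def shift_and(text, pattern):
        """Bit-parallel Shift-And matching: returns start positions of matches.

        Bit i of `state` is set iff pattern[:i+1] matches a suffix of the
        text read so far, so one shift-and-mask step processes a character.
        """
        m = len(pattern)
        mask = {}                              # per-character bit masks
        for i, c in enumerate(pattern):
            mask[c] = mask.get(c, 0) | (1 << i)
        state, goal, hits = 0, 1 << (m - 1), []
        for j, c in enumerate(text):
            # shift in a 1, keep only prefixes consistent with character c
            state = ((state << 1) | 1) & mask.get(c, 0)
            if state & goal:
                hits.append(j - m + 1)         # start index of a match
        return hits

    print(shift_and("abracadabra", "abra"))    # [0, 7]
    ```

    All pattern prefixes advance in a single word operation per text character, which is exactly the kind of step the Ultra-Wide Word model performs on thousands of bits at once.
    
    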

    Particle Backtracking Improves Breeding Subpopulation Discrimination and Natal-Source Identification in Mixed Populations

    We provide a novel method to improve the use of natural tagging approaches for subpopulation discrimination and source-origin identification in aquatic and terrestrial animals with a passive dispersive phase. Our method integrates observed site-referenced biological information on individuals in mixed populations with a particle-tracking model to retrace likely dispersal histories prior to capture (i.e., particle backtracking). To illustrate and test our approach, we focus on western Lake Erie's yellow perch (Perca flavescens) population during 2006-2007, using microsatellite DNA and otolith microchemistry from larvae and juveniles as natural tags. Particle backtracking showed that not all larvae collected near a presumed hatching location may have originated there, owing to passive drift during the larval stage that was influenced by strong river- and wind-driven water circulation. Re-assigning larvae to their most probable hatching site (based on probabilistic dispersal trajectories from the particle backtracking model) improved the use of genetics and otolith microchemistry to discriminate among local breeding subpopulations. This enhancement, in turn, altered (and likely improved) the estimated contributions of each breeding subpopulation to the mixed population of juvenile recruits. Our findings indicate that particle backtracking can complement existing tools used to identify the origin of individuals in mixed populations, especially in flow-dominated systems.
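    The backtracking step itself amounts to integrating the circulation field backwards in time from the capture location. The toy sketch below uses a made-up uniform velocity field and simple reverse Euler stepping, standing in for the study's hydrodynamic model:

    ```python
    def backtrack(position, velocity_field, days, dt=0.1):
        """Retrace a particle's likely path by stepping backwards in time
        through a flow field (reverse Euler integration)."""
        x, y = position
        steps = int(round(days / dt))
        for k in range(steps):
            t = days - k * dt                 # current time, moving backwards
            u, v = velocity_field(x, y, t)
            x -= u * dt                       # step against the flow
            y -= v * dt
        return (x, y)

    # Toy field: uniform eastward drift of 1 unit/day.
    field = lambda x, y, t: (1.0, 0.0)
    origin = backtrack((10.0, 5.0), field, days=4.0)
    print(origin)                             # approximately (6.0, 5.0)
    ```

    With a realistic, spatially varying field, repeating this from each capture point yields the probabilistic dispersal trajectories used to re-assign larvae to hatching sites.
    
    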

    Partial Sums on the Ultra-Wide Word RAM

    We consider the classic partial sums problem on the ultra-wide word RAM model of computation. This model extends the classic $w$-bit word RAM model with special ultrawords of length $w^2$ bits that support standard arithmetic and boolean operations, and with scattered memory access operations that can access $w$ (non-contiguous) locations in memory. The ultra-wide word RAM model captures (and idealizes) modern vector processor architectures. Our main result is a new in-place data structure for the partial sums problem that stores only a constant number of ultrawords in addition to the input and supports operations in doubly logarithmic time. This matches the best known time bounds for the problem (among polynomial space data structures) while improving the space from superlinear to a constant number of ultrawords. Our results are based on a simple and elegant in-place word RAM data structure, known as the Fenwick tree. Our main technical contribution is a new efficient parallel ultra-wide word RAM implementation of the Fenwick tree, which is likely of independent interest. Comment: Extended abstract appeared at TAMC 202
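    The Fenwick tree the construction builds on is easy to state on its own; below is the standard sequential word-RAM version in Python (logarithmic-time point update and prefix sum, not the parallel ultraword variant the paper develops):

    ```python
    class FenwickTree:
        """In-place partial sums: point update and prefix sum in O(log n)."""
        def __init__(self, n):
            self.tree = [0] * (n + 1)         # 1-indexed

        def update(self, i, delta):
            """Add delta to element i."""
            while i < len(self.tree):
                self.tree[i] += delta
                i += i & -i                   # climb to the next covering node

        def prefix_sum(self, i):
            """Sum of elements 1..i."""
            s = 0
            while i > 0:
                s += self.tree[i]
                i -= i & -i                   # drop the lowest set bit
            return s

    ft = FenwickTree(8)
    for i, v in enumerate([3, 1, 4, 1, 5, 9, 2, 6], start=1):
        ft.update(i, v)
    print(ft.prefix_sum(5))                   # 3+1+4+1+5 = 14
    ```

    Note the structure is stored entirely in the array itself, which is the in-place property the paper's constant-ultraword result preserves.
    
    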

    Dynamic Compressed Strings with Random Access

    We consider the problem of storing a string $S$ in dynamic compressed form, while permitting operations directly on the compressed representation of $S$: access a substring of $S$; replace, insert or delete a symbol in $S$; count how many occurrences of a given symbol appear in any given prefix of $S$ (called the rank operation); and locate the position of the $i$-th occurrence of a symbol inside $S$ (called the select operation). We discuss the time complexity of several combinations of these operations along with the entropy space bounds of the corresponding compressed indexes. In this way, we extend or improve the bounds of previous work by Ferragina and Venturini [TCS, 2007], Jansson et al. [ICALP, 2012], and Nekrich and Navarro [SODA, 2013].
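    For concreteness, the rank and select operations can be pinned down with a naive, uncompressed Python sketch; the compressed indexes in the paper answer the same queries in far less space and time:

    ```python
    def rank(s, c, i):
        """Number of occurrences of symbol c in the prefix s[:i]."""
        return s[:i].count(c)

    def select(s, c, k):
        """Position of the k-th (1-based) occurrence of c in s, or -1."""
        count = 0
        for pos, ch in enumerate(s):
            if ch == c:
                count += 1
                if count == k:
                    return pos
        return -1

    s = "abracadabra"
    print(rank(s, "a", 5))    # 2: 'a' occurs at positions 0 and 3 within s[:5]
    print(select(s, "a", 3))  # 5: the third 'a' is at index 5
    ```

    Rank and select are inverses of each other in the sense that `rank(s, c, select(s, c, k) + 1) == k`, which is why compressed indexes typically support both from one structure.
    
    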

    A Simple Linear-Space Data Structure for Constant-Time Range Minimum Query

    We revisit the range minimum query problem and present a new O(n)-space data structure that supports queries in O(1) time. Although previous data structures exist whose asymptotic bounds match ours, our goal is to introduce a new solution that is simple, intuitive, and practical without increasing asymptotic costs for query time or space.
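    A simple baseline for constant-time RMQ is the classic sparse table, sketched below in Python. It uses O(n log n) space rather than the O(n) the paper achieves, which is exactly the cost linear-space structures are designed to avoid:

    ```python
    def build_sparse_table(a):
        """table[j][i] = index of the minimum of a[i : i + 2^j]."""
        n = len(a)
        table = [list(range(n))]              # 2^0-length ranges
        j = 1
        while (1 << j) <= n:
            prev = table[j - 1]
            row = []
            for i in range(n - (1 << j) + 1):
                # a 2^j range is the union of two overlapping 2^(j-1) ranges
                left, right = prev[i], prev[i + (1 << (j - 1))]
                row.append(left if a[left] <= a[right] else right)
            table.append(row)
            j += 1
        return table

    def rmq(a, table, lo, hi):
        """Index of the minimum of a[lo : hi+1], in O(1) after preprocessing."""
        j = (hi - lo + 1).bit_length() - 1    # largest power of two in the range
        left, right = table[j][lo], table[j][hi - (1 << j) + 1]
        return left if a[left] <= a[right] else right

    a = [5, 2, 4, 7, 1, 3]
    t = build_sparse_table(a)
    print(rmq(a, t, 0, 3))                    # 1 (a[1] = 2 minimizes a[0:4])
    print(rmq(a, t, 2, 5))                    # 4 (a[4] = 1 minimizes a[2:6])
    ```

    Any query range is covered by two precomputed power-of-two ranges, so each query is two table lookups and a comparison.
    
    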

    Fear expression is suppressed by tyrosine administration

    Animal studies have demonstrated that catecholamines regulate several aspects of fear conditioning. In humans, however, pharmacological manipulations of the catecholaminergic system have been scarce, and their primary focus has been on interfering with catecholaminergic activity after fear acquisition or expression had taken place, using L-Dopa primarily as the catecholaminergic precursor. Here, we sought to determine whether putative increases in presynaptic dopamine and norepinephrine induced by tyrosine administered before conditioning could affect fear expression. Electrodermal activity (EDA) of 46 healthy participants (24 placebo, 22 tyrosine) was measured in an instructed fear task. Results showed that tyrosine abolished fear expression compared to placebo. Importantly, tyrosine did not affect EDA responses to the aversive stimulus (UCS) or alter participants' mood. Therefore, the effect of tyrosine on fear expression cannot be attributed to these factors. Taken together, these findings provide evidence that the catecholaminergic system influences fear expression in humans.