854 research outputs found

    The Missing Elements of Change. A Response to Youth Change Agents: Comparing the Sociopolitical Identities of Youth Organizers and Youth Commissioners

    By establishing a set of theoretical frameworks for viewing and comparing the work of youth organizers and youth commissioners, and through personal interviews, the authors of the paper “Youth Change Agents: Comparing the Sociopolitical Identities of Youth Organizers and Youth Commissioners” presented their explanation of the development of the sociopolitical identities and civic commitments of each group. This response paper questions the authors’ limited use of context and complexity in explaining how their youth arrived at their opinions, perspectives, and ultimately their sociopolitical identities. It also asks how and why civic engagement and social activism took place, given the evidence provided of actual changes that occurred. Finally, it raises methodological concerns about relying on youth memories of their tenure in these two groups, recalled years after the fact and uncoupled from any interactive variables, as well as about the absence of triangulated data that would further substantiate the findings.

    Quantum Algorithm for the Collision Problem

    In this note, we give a quantum algorithm that finds collisions in arbitrary r-to-one functions after only O((N/r)^(1/3)) expected evaluations of the function. Assuming the function is given by a black box, this is more efficient than the best possible classical algorithm, even allowing probabilism. We also give a similar algorithm for finding claws in pairs of functions. Furthermore, we exhibit a space-time tradeoff for our technique. Our approach uses Grover's quantum searching algorithm in a novel way. Comment: 8 pages, LaTeX2
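    The structure of this table-then-search approach can be sketched classically: tabulate f on a random subset K, then search the rest of the domain for an input that collides with K (the step Grover's search accelerates). A minimal sketch, with the Grover step replaced by a classical scan (so no quantum speedup) and all names illustrative:

```python
import random

def find_collision(f, domain, k):
    """Sketch of the collision-finding structure: (1) build a table
    of f on a random subset K of size k, (2) search the rest of the
    domain for an input colliding with K.  In the quantum algorithm,
    step (2) is a Grover search; here it is a classical scan."""
    K = random.sample(domain, k)
    table = {}
    for x in K:
        y = f(x)
        if y in table and table[y] != x:
            return (table[y], x)          # collision already inside K
        table[y] = x
    for x in domain:
        y = f(x)
        if y in table and table[y] != x:  # Grover would find this x
            return (table[y], x)
    return None

# toy 2-to-1 function on {0, ..., 15}: f(x) = x mod 8
pair = find_collision(lambda x: x % 8, list(range(16)), k=4)
```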

    A Protocol for Generating Random Elements with their Probabilities

    We give an AM protocol that allows the verifier to sample elements x from a probability distribution P, which is held by the prover. If the prover is honest, the verifier outputs (x, P(x)) with probability close to P(x). In case the prover is dishonest, one may hope for the following guarantee: if the verifier outputs (x, p), then the probability that the verifier outputs x is close to p. Simple examples show that this cannot be achieved. Instead, we show that the following weaker condition holds (in a well defined sense) on average: if (x, p) is output, then p is an upper bound on the probability that x is output. Our protocol yields a new transformation to turn interactive proofs where the verifier uses private random coins into proofs with public coins. The verifier has better running time compared to the well-known Goldwasser-Sipser transformation (STOC, 1986). For constant-round protocols, we only lose an arbitrarily small constant in soundness and completeness, while our public-coin verifier calls the private-coin verifier only once.

    Parameter estimation in mathematical models of lung cancer

    The goal of this thesis is to improve upon existing mathematical models of lung cancer that inform policy decisions related to lung cancer screening. Construction of stochastic, population-based models of lung cancer relies upon careful statistical estimation of biological parameters from diverse data sources. In this thesis, we focus specifically on two distinct aspects of parameter estimation. First, we propose a model-based framework to estimate lung cancer risk due to repeated low-dose radiation exposures using the two-stage clonal expansion (TSCE) model. We incorporate the TSCE model into a Bayesian framework and formulate a likelihood function for randomized screening data. The likelihood function depends on model-based risk correlates and effectively penalizes parameter values that correspond to model-based contradictions. The net result is that both the sensitivity and specificity of parameter estimation relating to excess lung cancer risk are increased. This methodology is applied to data from the Mayo Lung Project, and estimates of 10-year excess lung cancer risk as a function of age at enrollment and number of screens are derived. Second, we describe a new statistical approach aimed at improving our understanding of the natural course of lung cancer. Specifically, we are interested in evaluating the evidence for, or against, the bi-modal hypothesis, which proposes that lung cancers fall into two categories: either slow-growing and non-invasive (and prone to over-diagnosis) or rapidly-growing and highly aggressive. We represent the growth trajectory of lung tumors using the evolutionary parameters of cancer stem cell branching fraction (f) and cell mutation rate (mu). While concern over widespread implementation of lung cancer screening has focused primarily on the extent of over-diagnosis, these results are consistent with the presence of a high percentage of rapidly-growing, aggressive cancers.
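    The stochastic dynamics behind the TSCE model can be sketched with a Gillespie-style simulation: normal cells are initiated into an intermediate compartment, intermediate cells divide, die, or transform to malignancy. A minimal sketch with hypothetical placeholder rates (the thesis's actual Bayesian estimation framework is far richer than this):

```python
import random

def tsce_time_to_malignancy(nu_X=0.1, alpha=1.0, beta=0.9, mu=0.005,
                            t_max=200.0, rng=random.Random(1)):
    """Gillespie simulation of the two-stage clonal expansion model.
    Events: initiation (rate nu_X), and, per intermediate cell,
    division (alpha), death (beta) and malignant transformation (mu).
    Returns the time the first malignant cell appears, or None if
    none appears before t_max.  All rates are hypothetical."""
    t, I = 0.0, 0  # time, number of intermediate cells
    while t < t_max:
        rate = nu_X + I * (alpha + beta + mu)
        t += rng.expovariate(rate)
        if t >= t_max:
            return None
        u = rng.random() * rate
        if u < nu_X:
            I += 1                        # initiation: new intermediate cell
        elif u < nu_X + I * alpha:
            I += 1                        # clonal expansion (division)
        elif u < nu_X + I * (alpha + beta):
            I -= 1                        # cell death
        else:
            return t                      # malignant transformation
    return None

t_first = tsce_time_to_malignancy()
```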

    On formal verification of arithmetic-based cryptographic primitives

    Cryptographic primitives are fundamental for information security: they are used as basic components for cryptographic protocols or public-key cryptosystems. In many cases, their security proofs consist of showing that they are reducible to computationally hard problems. Those reductions can be subtle and tedious, and thus not easily checkable. In previous work, we implemented on top of the proof assistant Coq a toolbox for writing and checking game-based security proofs of cryptographic primitives. In this paper we describe its extension with number-theoretic capabilities, so that it is now possible to write and check arithmetic-based cryptographic primitives in our toolbox. We illustrate our work by machine-checking the game-based proofs of unpredictability of the pseudo-random bit generator of Blum, Blum and Shub, and of semantic security of the public-key cryptographic scheme of Goldwasser and Micali. Comment: 13 page
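    The Blum-Blum-Shub generator whose unpredictability proof is machine-checked here is simple to state: square modulo n = pq (p, q primes congruent to 3 mod 4) and emit the least significant bit of each state. A toy sketch with deliberately small, insecure parameters:

```python
def bbs(seed, p, q, count):
    """Blum-Blum-Shub pseudo-random bit generator: square modulo
    n = p*q, where p and q are Blum primes (both == 3 mod 4), and
    output the least significant bit of each state.  Toy parameters
    only -- real use needs p and q to be large secret primes."""
    assert p % 4 == 3 and q % 4 == 3
    n = p * q
    x = seed * seed % n          # start from a quadratic residue
    bits = []
    for _ in range(count):
        x = x * x % n
        bits.append(x & 1)
    return bits

# toy run with small Blum primes (insecure, for illustration only)
stream = bbs(seed=3, p=11, q=23, count=8)
```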

    Moloney murine leukemia virus decay mediated by retroviral reverse transcriptase degradation of genomic RNA

    Retroviral vectors are powerful tools for the introduction of transgenes into mammalian cells and for long-term gene expression. However, their application is often limited by a rapid loss of bioactivity: retroviruses spontaneously lose activity at 37 °C, with a half-life of 4 to 9 h depending on the retrovirus type. We sought to determine which components of the retrovirus are responsible for this loss in bioactivity and to obtain a quantitative characterization of their stability. To this end, we focused on RNA and viral proteins, two major components that we hypothesized may undergo degradation and negatively influence viral infectivity. Reverse transcription PCR (RT-PCR) targeting RNA encoding portions of the viral genome clearly demonstrated time-dependent degradation of RNA, which correlated with the loss in viral bioactivity. Circular dichroism spectroscopy, SDS-PAGE and two-dimensional SDS-PAGE analyses of viral proteins did not show any change in secondary structure or evidence of proteolysis. The mechanism underlying the degradation of viral RNA was investigated by site-directed mutagenesis of proteins encoded by the viral genome. Reverse transcriptase and protease mutants exhibited enhanced RNA stability in comparison to wild-type recombinant virus, suggesting that the degradation of RNA, and the corresponding loss of viral activity, is mediated by the reverse transcriptase enzyme.

    Quantum Interactive Proofs with Competing Provers

    This paper studies quantum refereed games, which are quantum interactive proof systems with two competing provers: one that tries to convince the verifier to accept and the other that tries to convince the verifier to reject. We prove that every language having an ordinary quantum interactive proof system also has a quantum refereed game in which the verifier exchanges just one round of messages with each prover. A key part of our proof is the fact that there exists a single quantum measurement that reliably distinguishes between mixed states chosen arbitrarily from disjoint convex sets having large minimal trace distance from one another. We also show how to reduce the probability of error for some classes of quantum refereed games. Comment: 13 pages, to appear in STACS 200
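    The state-distinguishing fact at the heart of such proofs is quantified by the Helstrom bound: two equiprobable mixed states rho and sigma can be told apart by a single measurement with probability 1/2 + (1/4)||rho - sigma||_1. A small numerical sketch (two individual states rather than convex sets, for illustration only):

```python
import numpy as np

def helstrom_success(rho, sigma):
    """Optimal probability of distinguishing two equiprobable mixed
    states rho and sigma with a single measurement (Helstrom bound):
    1/2 + (1/4) * ||rho - sigma||_1, with ||.||_1 the trace norm,
    computed here from the eigenvalues of the Hermitian difference."""
    eigs = np.linalg.eigvalsh(rho - sigma)
    trace_norm = np.abs(eigs).sum()
    return 0.5 + 0.25 * trace_norm

# two qubit states: |0><0| and the maximally mixed state;
# their trace distance is 1/2, so the optimal success probability is 3/4
rho = np.array([[1.0, 0.0], [0.0, 0.0]])
sigma = np.eye(2) / 2
p_success = helstrom_success(rho, sigma)
```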

    Almost Perfect Privacy for Additive Gaussian Privacy Filters

    We study the maximal mutual information about a random variable Y (representing non-private information) displayed through an additive Gaussian channel when guaranteeing that only ε bits of information are leaked about a random variable X (representing private information) that is correlated with Y. Denoting this quantity by g_ε(X,Y), we show that for perfect privacy, i.e., ε = 0, one has g_0(X,Y) = 0 for any pair of absolutely continuous random variables (X,Y), and then derive a second-order approximation for g_ε(X,Y) for small ε. This approximation is shown to be related to the strong data processing inequality for mutual information under suitable conditions on the joint distribution P_{XY}. Next, motivated by an operational interpretation of data privacy, we formulate the privacy-utility tradeoff in the same setup using estimation-theoretic quantities and obtain explicit bounds for this tradeoff when ε is sufficiently small, using the approximation formula derived for g_ε(X,Y). Comment: 20 pages. To appear in Springer-Verla
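    For the special case of jointly Gaussian (X, Y), the leakage and utility of an additive Gaussian filter Z = Y + N have closed forms, which makes the tradeoff easy to illustrate numerically; this is only a special-case sketch, not the paper's general bound:

```python
import math

def gaussian_tradeoff(rho, var_y, var_n):
    """Closed-form leakage/utility for jointly Gaussian (X, Y) with
    correlation rho, passed through the additive Gaussian filter
    Z = Y + N with N ~ N(0, var_n).  Returns (I(X;Z), I(Y;Z)) in nats,
    using I = -(1/2) log(1 - corr^2) for jointly Gaussian pairs."""
    utility = 0.5 * math.log(1.0 + var_y / var_n)
    r2 = rho * rho * var_y / (var_y + var_n)   # squared corr of X and Z
    leakage = 0.5 * math.log(1.0 / (1.0 - r2))
    return leakage, utility

# as the noise grows, the leakage about X decays toward zero --
# consistent with g_0(X,Y) = 0 under perfect privacy
leak_lo, util_lo = gaussian_tradeoff(rho=0.8, var_y=1.0, var_n=0.1)
leak_hi, util_hi = gaussian_tradeoff(rho=0.8, var_y=1.0, var_n=10.0)
```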

    Searching a bitstream in linear time for the longest substring of any given density

    Given an arbitrary bitstream, we consider the problem of finding the longest substring whose ratio of ones to zeroes equals a given value. The central result of this paper is an algorithm that solves this problem in linear time. The method involves (i) reformulating the problem as a constrained walk through a sparse matrix, and then (ii) developing a data structure for this sparse matrix that allows us to perform each step of the walk in amortised constant time. We also give a linear time algorithm to find the longest substring whose ratio of ones to zeroes is bounded below by a given value. Both problems have practical relevance to cryptography and bioinformatics. Comment: 22 pages, 19 figures; v2: minor edits and enhancement
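    For a rational target ratio p:q there is a standard expected-linear-time variant using prefix sums and hashing: weight each 1 as +q and each 0 as -p, so a substring has the target ratio exactly when its weights sum to zero. This is not the paper's sparse-matrix walk, just an illustration of the problem it solves:

```python
def longest_ratio_substring(bits, p, q):
    """Longest substring of `bits` whose ones:zeros ratio is exactly
    p:q.  Weight each 1 as +q and each 0 as -p; a substring has the
    target ratio iff its weight sums to zero, so we record the first
    index at which each prefix sum occurs and take the widest repeat.
    Expected O(n) time with hashing."""
    first = {0: -1}          # prefix sum -> earliest index seen
    s, best = 0, (0, 0, 0)   # (length, start, end)
    for i, b in enumerate(bits):
        s += q if b else -p
        if s in first:
            length = i - first[s]
            if length > best[0]:
                best = (length, first[s] + 1, i + 1)
        else:
            first[s] = i
    return bits[best[1]:best[2]]

# equal ones and zeros (ratio 1:1): the whole string qualifies here
run = longest_ratio_substring([1, 1, 0, 1, 0, 0, 0, 1], p=1, q=1)
```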