
    Catalytic space: Non-determinism and hierarchy

    Catalytic computation, defined by Buhrman, Cleve, Koucký, Loff and Speelman (STOC 2014), is a space-bounded computation where, in addition to our working memory, we have an exponentially larger auxiliary memory which is already full; the auxiliary memory may be used throughout the computation, but it must be restored to its initial content by the end of the computation. Motivated by the surprising power of this model, we set out to study the non-deterministic version of catalytic computation. We establish that non-deterministic catalytic log-space is contained in ZPP, which is the same bound known for its deterministic counterpart, and we prove that non-deterministic catalytic space is closed under complement (under a standard derandomization assumption). Furthermore, we establish hierarchy theorems for non-deterministic and deterministic catalytic computation.
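
    As a hedged illustration (our toy sketch, not a construction from the paper): the defining constraint of catalytic computation is that the full auxiliary memory may be modified freely during the computation but must be returned, bit for bit, to its original content before halting. The Python snippet below demonstrates that restoration discipline with reversible XOR updates; the function name and the trivial sum computation are illustrative assumptions.

        # Toy sketch of the catalytic-memory constraint: the auxiliary memory is
        # "full" (holds arbitrary data we must not destroy), yet it may be written
        # to, provided every change is undone before the computation ends.
        import random

        def catalytic_sum(xs, aux):
            """Compute sum(xs) while scribbling on aux, restoring aux exactly."""
            snapshot = list(aux)                 # kept only to check restoration
            for i, x in enumerate(xs):           # reversibly fold inputs into aux
                aux[i % len(aux)] ^= x
            result = sum(xs)                     # the actual (trivial) computation
            for i in reversed(range(len(xs))):   # undo every XOR, restoring aux
                aux[i % len(aux)] ^= xs[i]
            assert aux == snapshot               # auxiliary memory is unchanged
            return result

        aux_memory = [random.getrandbits(32) for _ in range(8)]  # arbitrary full memory
        print(catalytic_sum([3, 1, 4, 1, 5, 9, 2, 6], aux_memory))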

    Linking international clinical research with stateless populations to justice in global health

    BACKGROUND: In response to calls to expand the scope of research ethics to address justice in global health, recent scholarship has sought to clarify how external research actors from high-income countries might discharge their obligation to reduce health disparities between and within countries. An ethical framework, 'research for health justice', was derived from a theory of justice (the health capability paradigm) and specifies how international clinical research might contribute to improved health and research capacity in host communities. This paper examines whether and how external funders, sponsors, and researchers can fulfill their obligations under the framework. METHODS: Case study research was undertaken on the Shoklo Malaria Research Unit's (SMRU) vivax malaria treatment trial, which was performed on the Thai-Myanmar border with Karen and Myanmar refugees and migrants. We conducted nineteen in-depth interviews with trial stakeholders, including investigators, trial participants, community advisory board members, and funder representatives; carried out direct observation at trial sites over a five-week period; and collected trial-related documents for analysis. RESULTS: The vivax malaria treatment trial drew attention to contextual features that, when present, rendered the 'research for health justice' framework's guidance partially incomplete. These insights allowed us to extend the framework to consider external research actors' obligations to stateless populations. Data analysis then showed that the framework's requirements were largely fulfilled in relation to the vivax malaria treatment trial by the Wellcome Trust (funder), Oxford University (sponsor), and the investigators. At the same time, this study demonstrates that it may be difficult for long-term collaborations to shift the focus of their research agendas in accordance with the changing burden of illness in their host communities, and to build the independent research capacity of host populations when working with refugees and migrants. Obstructive factors included the research funding environment and staff turnover due to resettlement or migration. CONCLUSIONS: Our findings show that obligations for selecting research targets, strengthening research capacity, and providing post-trial benefits, which link clinical trials to justice in global health, can be upheld by external research actors from high-income countries when working with stateless populations in low- and middle-income countries (LMICs). However, meeting certain framework requirements for long-term collaborations may not be entirely feasible.

    A foundation for real recursive function theory

    The class of recursive functions over the reals, denoted by REC(R), was introduced by Cristopher Moore in his seminal 1995 paper. Since then many subsequent investigations have brought new results: the class REC(R) was related to the class of functions generated by Claude Shannon's General Purpose Analogue Computer; classical digital computation was embedded in several ways into the new model of computation; and restrictions of REC(R) were shown to represent different classes of recursive functions, e.g., the recursive, primitive recursive and elementary functions, as well as structures such as the Ritchie and the Grzegorczyk hierarchies. The class of real recursive functions was then stratified in a natural way, and REC(R) and the analytic hierarchy were recently recognised as two faces of the same mathematical concept. In this article, we provide strong foundational support for Real Recursive Function Theory, rooted in Mathematical Analysis, in such a way that the reader can easily recognise both its intrinsic mathematical beauty and its extreme simplicity. The new paradigm is now robust and smooth enough to be taught. To achieve this, some concepts had to change and some new results were added.
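
    As a rough numerical illustration (our assumption, not a construction from the article): in Moore-style real recursion, functions such as exp are obtained by differential recursion, i.e. as solutions of ODEs built from previously defined functions. The sketch below only approximates that operator with Euler integration; the function names and step size are illustrative, and the theory itself takes exact solutions rather than approximations.

        # Hedged sketch: "differential recursion" defines h by h(0) = h0 and
        # h'(t) = g(t, h(t)), where g is an already-defined real recursive function.
        # Here the operator is approximated numerically with Euler steps.

        def differential_recursion(g, h0, x, steps=100_000):
            """Approximate h(x) where h(0) = h0 and h'(t) = g(t, h(t))."""
            t, h = 0.0, h0
            dt = x / steps
            for _ in range(steps):
                h += dt * g(t, h)
                t += dt
            return h

        # exp arises from h' = h with h(0) = 1; sin and cos arise from a
        # two-dimensional recursion, omitted here for brevity.
        print(differential_recursion(lambda t, h: h, 1.0, 1.0))  # ~ e = 2.71828...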

    Limits of Quantum Speed-Ups for Computational Geometry and Other Problems: Fine-Grained Complexity via Quantum Walks

    Many computational problems are subject to a quantum speed-up: one might find that a problem having an O(n^3)-time or O(n^2)-time classical algorithm can be solved by a known O(n^1.5)-time or O(n)-time quantum algorithm. The question naturally arises: how much quantum speed-up is possible? The area of fine-grained complexity allows us to prove optimal lower bounds on the complexity of various computational problems, based on the conjectured hardness of certain natural, well-studied problems. This theory has recently been extended to the quantum setting, in two independent papers by Buhrman, Patro and Speelman [7], and by Aaronson, Chia, Lin, Wang, and Zhang [1]. In this paper, we further extend the theory of fine-grained complexity to the quantum setting. A fundamental conjecture in the classical setting states that the 3SUM problem cannot be solved by (classical) algorithms in time O(n^{2-ε}), for any ε > 0. We formulate an analogous conjecture, the Quantum-3SUM-Conjecture, which states that there exist no sublinear O(n^{1-ε})-time quantum algorithms for the 3SUM problem. Based on the Quantum-3SUM-Conjecture, we show new lower bounds on the time complexity of quantum algorithms for several computational problems. Most of our lower bounds are optimal, in that they match known upper bounds, and hence they imply tight limits on the quantum speed-up that is possible for these problems. These results are proven by adapting to the quantum setting known classical fine-grained reductions from the 3SUM problem. This adaptation is not trivial, however, since the original classical reductions require pre-processing the input in various ways, e.g. by sorting it according to some order, and this pre-processing (provably) cannot be done in sublinear quantum time. We overcome this bottleneck by combining a quantum walk with a classical dynamic data structure having a certain “history-independence” property. This type of construction has been used in the past to prove upper bounds, and here we use it for the first time as part of a reduction. This general proof strategy allows us to prove tight lower bounds on several computational-geometry problems, on Convolution-3SUM and on the 0-Edge-Weight-Triangle problem, conditional on the Quantum-3SUM-Conjecture. We believe this proof strategy will be useful in proving tight (conditional) lower bounds, and limits on quantum speed-ups, for many other problems.
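
    For context on the conjecture's classical baseline (our illustrative sketch, not part of the paper): 3SUM asks whether a list of n integers contains three entries summing to zero, and the standard sort-plus-two-pointers algorithm below runs in O(n^2) time; the Quantum-3SUM-Conjecture above asserts that no O(n^{1-ε})-time quantum algorithm exists for this problem.

        # Hedged sketch of the classical O(n^2) baseline for 3SUM
        # (sorting followed by a two-pointer scan for each anchor element).

        def has_3sum(nums):
            """Return True if some three (distinct-index) elements sum to zero."""
            a = sorted(nums)                   # O(n log n) preprocessing
            n = len(a)
            for i in range(n - 2):
                lo, hi = i + 1, n - 1
                while lo < hi:                 # O(n) two-pointer scan per anchor
                    s = a[i] + a[lo] + a[hi]
                    if s == 0:
                        return True
                    if s < 0:
                        lo += 1
                    else:
                        hi -= 1
            return False

        print(has_3sum([-5, 1, 4, 7, -2]))     # True: -5 + 1 + 4 == 0
        print(has_3sum([1, 2, 3]))             # False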