65,334 research outputs found

    Arithmetic on a Distributed-Memory Quantum Multicomputer

    Full text link
    We evaluate the performance of quantum arithmetic algorithms run on a distributed quantum computer (a quantum multicomputer), varying the node capacity, the I/O capabilities, and the network topology. We examine the tradeoff between executing gates remotely via "teleported gates" on entangled pairs of qubits (telegate) and exchanging the relevant qubits via quantum teleportation before executing the algorithm with local gates (teledata). We show that the teledata approach performs better, and that carry-ripple adders perform well when the teleportation block is decomposed so that the key quantum operations can be parallelized. A node size of only a few logical qubits performs adequately provided that the nodes have two transceiver qubits. A linear network topology performs acceptably for a broad range of system sizes and performance parameters. We therefore recommend pursuing small, high-I/O-bandwidth nodes and a simple network. Such a machine will run Shor's algorithm for factoring large numbers efficiently.
    Comment: 24 pages, 10 figures, ACM transactions format. Extended version of Int. Symp. on Comp. Architecture (ISCA) paper; v2, corrects one circuit error, numerous small changes for clarity, adds a reference
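
    A hedged illustration (not code from the paper): the telegate/teledata comparison hinges on how often the adder's carry chain must cross node boundaries, since each crossing costs either one EPR-mediated remote gate (telegate) or one qubit teleportation (teledata). The Python sketch below walks a classical ripple-carry chain, which has the same dependency structure quantum carry-ripple adders follow, and counts those crossings; node_size is a hypothetical parameter standing in for the node capacity the paper varies.

        # Illustrative sketch: a classical ripple-carry adder whose carry
        # chain mirrors the dependency structure of quantum carry-ripple
        # adders. node_size is a hypothetical knob: each time the carry
        # leaves a node, a distributed machine would pay one telegate or
        # one teledata operation.
        def ripple_add(a, b, node_size=4):
            """Add two little-endian bit lists; count carry hops across nodes."""
            assert len(a) == len(b)
            carry, out, crossings = 0, [], 0
            for i in range(len(a)):
                out.append(a[i] ^ b[i] ^ carry)          # sum bit uses incoming carry
                carry = (a[i] & b[i]) | (carry & (a[i] ^ b[i]))  # carry-out
                # A carry produced at a node's last position must be sent on
                # before the next node can compute its sum bits.
                if (i + 1) % node_size == 0 and i + 1 < len(a):
                    crossings += 1
            return out + [carry], crossings

        bits = lambda n, w: [(n >> i) & 1 for i in range(w)]
        total, hops = ripple_add(bits(13, 8), bits(7, 8), node_size=2)
        print(sum(b << i for i, b in enumerate(total)), "carry hops:", hops)  # 20 carry hops: 3

    The sketch only counts classical carry hops; the paper's findings that teledata wins and that decomposing the teleportation block enables parallelism are quantum-circuit results the sketch does not capture.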

    Web Science: expanding the notion of Computer Science

    No full text
    Academic disciplines which practice in the context of rapid external change face particular problems when seeking to maintain timely, current and relevant teaching programs. In different institutions faculty will tune and update individual component courses, while more radical revisions are typically department-wide strategic responses to perceived needs. Internationally, the ACM has sought to define curriculum recommendations since the 1960s and recognizes the diversity of the computing disciplines with its 2005 overview volume. The consequent rolling program of revisions is demanding in terms of time and effort, but an inevitable response to the change inherent in our family of specialisms. Preparation for the Computer Curricula 2013 is underway, so it seems appropriate to ask what place Web Science will have in the curriculum landscape. Web Science has been variously described; the most concise definition is the ‘science of decentralized information systems’. Web Science is fundamentally interdisciplinary, encompassing the study of the technologies and engineering which constitute the Web, alongside emerging associated human, social and organizational practices. Furthermore, to date little teaching of Web Science is at undergraduate level. Some questions emerge: is Web Science a transient artifact? Can Web Science claim a place in the ACM family? Is Web Science an exotic relative with a home elsewhere? This paper discusses the role and place of Web Science in the context of the computing disciplines. It provides an account of work which has been established towards defining an initial curriculum for Web Science, with plans for future developments utilizing novel methods to support and elaborate curriculum definition and review. The findings of a desk survey of existing related curriculum recommendations are presented. The paper concludes with recommendations for future activities which may help us determine whether we should expand the notion of computer science.

    Limits on Fundamental Limits to Computation

    Full text link
    An indispensable part of our lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the last fifty years. Such Moore scaling now requires increasingly heroic efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and enrich our understanding of integrated-circuit scaling, we review fundamental limits to computation: in manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, we recall how some limits were circumvented and compare loose and tight limits. We also point out that engineering difficulties encountered by emerging technologies may indicate yet-unknown limits.
    Comment: 15 pages, 4 figures, 1 table

    Human-computer interaction for international development: past, present and future

    Get PDF
    Recent years have seen a burgeoning interest in research into the use of information and communication technologies (ICTs) in the context of developing regions, particularly into how such ICTs might be appropriately designed to meet the unique user and infrastructural requirements that we encounter in these cross-cultural environments. This emerging field, known to some as HCI4D, is the product of a diverse set of origins. As such, it can often be difficult to navigate prior work, or to piece together a broad picture of what the field looks like as a whole. In this paper, we aim to contextualize HCI4D: to give it some historical background, to review its existing literature spanning a number of research traditions, to discuss some of its key issues arising from the work done so far, and to suggest some major research objectives for the future.