
    Formal Verification vs. Quantum Uncertainty

    Quantum programming is hard: quantum programs are necessarily probabilistic and impossible to examine without disrupting their execution. In response to this challenge, we and a number of other researchers have written tools to verify quantum programs against their intended semantics. This is not enough. Verifying an idealized semantics against a real-world quantum program doesn't allow you to confidently predict the program's output. For verification that works, you need both an error semantics tied to the hardware at hand (this is necessarily low level) and certified compilation to that same hardware. Once we have these two things, we can talk about an approach to quantum programming where we start by writing and verifying programs at a high level, attempt to verify properties of the compiled code, and repeat as necessary.
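
    To make the opening claim concrete, here is a minimal sketch assuming the open-source Qiskit SDK and its Aer simulator (not the verification tools the abstract describes): a one-qubit program whose measured output varies from run to run, which is why an idealized semantics alone cannot predict a real program's output.

```python
# Minimal sketch, assuming qiskit and qiskit-aer are installed; this
# illustrates probabilistic output only, not the authors' tooling.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(1, 1)
qc.h(0)           # Hadamard: equal superposition of |0> and |1>
qc.measure(0, 0)  # measurement disturbs the state; outcomes are random

counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)  # ideally ~{'0': 500, '1': 500}; real hardware adds error
```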

    Multi-cultural visualization: how functional programming can enrich visualization (and vice versa)

    The past two decades have seen visualization flourish as a research field in its own right, with advances on the computational challenges of faster algorithms, new techniques for datasets too large for in-core processing, and advances in understanding the perceptual and cognitive processes recruited by visualization systems and, through this, how to improve the representation of data. However, progress within visualization has sometimes proceeded in parallel with that in other branches of computer science, and there is a danger that when novel solutions ossify into 'accepted practice' the field can easily overlook significant advances elsewhere in the community. In this paper we describe recent advances in the design and implementation of pure functional programming languages that, significantly, contain important insights into questions raised by the recent NIH/NSF report on Visualization Challenges. We argue and demonstrate that modern functional languages combine high-level, mathematically based specifications of visualization techniques, concise implementation of algorithms through fine-grained composition, support for writing correct programs through strong type checking, and a different kind of modularity inherent in the abstractive power of these languages. And to cap it off, we have initial evidence that in some cases functional implementations are faster than their imperative counterparts.
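
    As an illustrative aside: the fine-grained composition the abstract credits to pure functional languages can be sketched even in Python (the paper itself concerns languages such as Haskell; every name below is hypothetical).

```python
# Hypothetical sketch of building a visualization stage by composing
# small, pure functions; type hints stand in for strong static typing.
from functools import reduce
from typing import Callable

Field = list[float]  # a 1-D scalar field, purely for illustration

def compose(*fns: Callable) -> Callable:
    """Right-to-left composition: compose(f, g)(x) == f(g(x))."""
    return lambda x: reduce(lambda acc, f: f(acc), reversed(fns), x)

def normalize(field: Field) -> Field:
    lo, hi = min(field), max(field)
    span = (hi - lo) or 1.0
    return [(v - lo) / span for v in field]

def threshold(t: float) -> Callable[[Field], Field]:
    return lambda field: [v for v in field if v >= t]

# One pipeline assembled from reusable pieces, with no mutable state.
select_high = compose(threshold(0.5), normalize)
print(select_high([3.0, 7.0, 1.0, 9.0]))  # -> [0.75, 1.0]
```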

    Report on the Second Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2)

    This technical report records and discusses the Second Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2). The report includes a description of the alternative, experimental submission and review process, two workshop keynote presentations, a series of lightning talks, a discussion on sustainability, and five discussions from the topic areas of exploring sustainability; software development experiences; credit & incentives; reproducibility & reuse & sharing; and code testing & code review. For each topic, the report includes a list of tangible actions that were proposed and that would lead to potential change. The workshop recognized that reliance on scientific software is pervasive in all areas of world-leading research today. The workshop participants then proceeded to explore different perspectives on the concept of sustainability. Key enablers and barriers of sustainable scientific software were identified from their experiences. In addition, recommendations, including new mechanisms such as software credit files and software prize frameworks, were outlined for improving practices in sustainable software engineering. There was also broad consensus that formal training in software development or engineering was rare among the practitioners. Significant strides need to be made in building a sense of community via training in software and technical practices, in increasing the size and scope of such training, and in better integrating it directly into graduate education programs. Finally, journals can define and publish policies to improve reproducibility, whereas reviewers can insist that authors provide sufficient information and access to data and software to allow them to reproduce the results in the paper. Hence a list of criteria is compiled for journals to provide to reviewers so as to make it easier to review software submitted for publication as a “Software Paper.”

    Many-Task Computing and Blue Waters

    This report discusses many-task computing (MTC) generically and in the context of the proposed Blue Waters systems, which is planned to be the largest NSF-funded supercomputer when it begins production use in 2012. The aim of this report is to inform the BW project about MTC, including understanding aspects of MTC applications that can be used to characterize the domain and understanding the implications of these aspects for middleware and policies. Many MTC applications do not neatly fit the stereotypes of high-performance computing (HPC) or high-throughput computing (HTC) applications. Like HTC applications, by definition MTC applications are structured as graphs of discrete tasks, with explicit input and output dependencies forming the graph edges. However, MTC applications have significant features that distinguish them from typical HTC applications. In particular, different engineering constraints for hardware and software must be met in order to support these applications. HTC applications have traditionally run on platforms such as grids and clusters, through either workflow systems or parallel programming systems. MTC applications, in contrast, will often demand a short time to solution, may be communication intensive or data intensive, and may comprise very short tasks. Therefore, hardware and software for MTC must be engineered to support the additional communication and I/O and must minimize task dispatch overheads. The hardware of large-scale HPC systems, with its high degree of parallelism and support for intensive communication, is well suited for MTC applications. However, HPC systems often lack a dynamic resource-provisioning feature, are not ideal for task communication via the file system, and have an I/O system that is not optimized for MTC-style applications. Hence, additional software support is likely to be required to gain full benefit from the HPC hardware.
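
    A minimal sketch of the task-graph structure described above, using Python's standard-library graphlib; the task names are hypothetical, and the print call stands in for a real low-overhead dispatcher.

```python
# Sketch: MTC applications as graphs of discrete tasks whose edges are
# explicit input/output dependencies, dispatched as they become ready.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks whose outputs it consumes.
deps = {
    "preprocess": set(),
    "simulate_a": {"preprocess"},
    "simulate_b": {"preprocess"},
    "aggregate":  {"simulate_a", "simulate_b"},
}

ts = TopologicalSorter(deps)
ts.prepare()
while ts.is_active():
    for task in ts.get_ready():  # all dependencies satisfied
        print("dispatch", task)  # an MTC system would run these in parallel
        ts.done(task)            # marking done may release further tasks
```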

    The first ICASE/LARC industry roundtable: Session proceedings

    The first 'ICASE/LaRC Industry Roundtable' was held on October 3-4, 1994, in Williamsburg, Virginia. The main purpose of the roundtable was to draw the attention of ICASE/LaRC scientists to industrial research agendas. The roundtable was attended by about 200 scientists: 30% from NASA Langley, 20% from universities, 17% from NASA Langley contractors (including ICASE personnel), and the remainder from federal agencies other than NASA Langley. The technical areas covered reflected the major research programs in ICASE and closely associated NASA branches. About 80% of the speakers were from industry. This report is a compilation of the session summaries prepared by the session chairmen.

    A historical and practical survey of quantum computing using QISKit

    Quantum computing has been a part of the computer science literature since the 1960s, but the call for quantum mechanical computation came when renowned theoretical physicist Richard Feynman said, “nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical, and by golly it's a wonderful problem, because it doesn't look so easy.” (Feynman, 486) His words inspired people to begin work on the problem immediately. Although the best ideas have not yet been found, there is much active research and experimentation under way to learn how best to use these new quantum systems. The better computer scientists understand quantum computing, the better the algorithms become, although many anticipated applications have yet to be realized. How did quantum computing develop? And how do people use quantum computing in practical applications? Research suggests quantum computing developed in the early 1980s, after Feynman's call, as researchers such as David Deutsch, Peter Shor, and Lov Grover outlined theoretical aspects of the field, defined fundamental algorithms, and predicted a quantum advantage, in some cases an exponential speedup over classical algorithms. This project provides historical context and pertinent background information. Computer scientists can use quantum computing for their own research and to extend the range of quantum computing itself. An experiment performed with the IBM resource known as QISKit shows how members of the public can perform their own practical applications of quantum computing, while also illustrating the limitations that remain in the field of quantum computers.
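
    As a hedged illustration of the kind of experiment the abstract mentions (not the author's actual one), here is the canonical two-qubit Bell-state circuit in Qiskit (the modern spelling of QISKit), runnable on a local simulator or, with an account, on IBM's public quantum hardware.

```python
# Illustrative Bell-state experiment; assumes qiskit and qiskit-aer.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

bell = QuantumCircuit(2, 2)
bell.h(0)                     # superposition on qubit 0
bell.cx(0, 1)                 # entangle qubit 1 with qubit 0
bell.measure([0, 1], [0, 1])

counts = AerSimulator().run(bell, shots=1024).result().get_counts()
print(counts)  # ideally only '00' and '11'; noisy hardware shows others
```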

    Dagstuhl News January - December 2000

    "Dagstuhl News" is a publication edited especially for the members of the Foundation "Informatikzentrum Schloss Dagstuhl" to thank them for their support. The News give a summary of the scientific work being done in Dagstuhl. Each Dagstuhl Seminar is presented by a small abstract describing the contents and scientific highlights of the seminar as well as the perspectives or challenges of the research topic
