
    A review of High Performance Computing foundations for scientists

    The increase of existing computational capabilities has made simulation emerge as a third discipline of Science, lying midway between the experimental and purely theoretical branches [1, 2]. Simulation enables the evaluation of quantities which would otherwise not be accessible, helps to improve experiments, and provides new insights into the systems being analysed [3-6]. Knowing the fundamentals of computation can be very useful for scientists, for it can help them to improve the performance of their theoretical models and simulations. This review includes some technical essentials that can be useful to this end, and it is devised as a complement for researchers whose education is focused on scientific issues rather than on technological aspects. In this document we attempt to discuss the fundamentals of High Performance Computing (HPC) [7] in a way which is easy to understand without much previous background. We sketch the way standard computers and supercomputers work, discuss distributed computing, and cover essential aspects to take into account when running scientific calculations on computers. Comment: 33 pages
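The parallelism fundamentals such a review covers can be illustrated with a minimal sketch. The workload, pool size, and use of Python's `multiprocessing` are illustrative choices here, not taken from the paper:

```python
# Minimal sketch of task parallelism, a core HPC concept: the same
# independent tasks run serially on one core or spread across workers.
from multiprocessing import Pool

def heavy_task(n):
    """Stand-in for an expensive scientific calculation."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    workloads = [100_000] * 8
    # Serial execution: one task after another on a single core.
    serial = [heavy_task(n) for n in workloads]
    # Parallel execution: tasks distributed across worker processes.
    with Pool(processes=4) as pool:
        parallel = pool.map(heavy_task, workloads)
    assert serial == parallel
```

The speedup from such a scheme is bounded by the fraction of the program that can actually run in parallel, one of the "essential aspects" the review discusses.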

    Software Engineering Education Needs More Engineering

    To what extent is “software engineering” really “engineering” as this term is commonly understood? A hallmark of the products of the traditional engineering disciplines is trustworthiness based on dependability. But in his keynote presentation at ICSE 2006, Barry Boehm pointed out that the dependency of individuals, organizations, and society on software is becoming increasingly critical, yet dependability is generally not the top priority for producers of software-intensive systems. Continuing in an uncharacteristically pessimistic vein, Professor Boehm said that this situation will likely continue until a major software-induced system catastrophe, similar in impact to the 9/11 World Trade Center catastrophe, stimulates action toward establishing accountability for software dependability. He predicts that such a software-induced catastrophe is highly likely to occur between now and 2025. It is widely understood that software, i.e., computer programs, is intrinsically different from traditionally engineered products, but in one respect they are identical: the extent to which the well-being of individuals, organizations, and society in general increasingly depends on them. As wardens of the future, through our mentoring of the next generation of software developers, we believe it is our responsibility to at least address Professor Boehm’s predicted catastrophe. Traditional engineering has continually addressed its social responsibility through the evolution of the education, practice, and professional certification and licensing of professional engineers. To be included in the fraternity of professional engineers, software engineering must do the same. To get a rough idea of where software engineering currently stands on some of these issues, we conducted two surveys. Our main survey was sent to software engineering academics in the U.S., Canada, and Australia; among other items, it sought detailed information on their software engineering programs. Our auxiliary survey was sent to U.S. engineering institutions to gauge how software engineering programs compare with those in the established engineering disciplines of Civil, Electrical, and Mechanical Engineering. Summaries of our findings can be found in the last two sections of our paper.

    High-Precision Numerical Simulations of Rotating Black Holes Accelerated by CUDA

    Hardware accelerators (such as Nvidia's CUDA GPUs) hold tremendous promise for computational science, because they can deliver large gains in performance at relatively low cost. In this work, we focus on the use of Nvidia's Tesla GPU for high-precision (double, quadruple and octal precision) numerical simulations in the area of black hole physics -- more specifically, solving a partial differential equation using finite differencing. We describe our approach in detail and present the final performance results as compared with a single-core desktop processor and with the Cell BE. We obtain mixed results -- order-of-magnitude gains in overall performance in some cases and negligible gains in others. Comment: 6 pages, 1 figure, 1 table. Accepted for publication in the International Conference on High Performance Computing Systems (HPCS 2010)
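The paper solves its black-hole PDE in CUDA; as a hedged illustration of the finite-differencing technique it names (in Python rather than CUDA, and with a simple 1D heat equation rather than the paper's equation), one explicit time step looks like this:

```python
import numpy as np

def heat_step(u, alpha, dx, dt):
    """One explicit finite-difference step of the 1D heat equation
    u_t = alpha * u_xx, with fixed (Dirichlet) boundary values."""
    u_new = u.copy()
    # Central second difference approximates u_xx at interior points.
    u_new[1:-1] = u[1:-1] + alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u_new

# Diffuse an initial spike; explicit stability needs alpha*dt/dx**2 <= 0.5.
u = np.zeros(11)
u[5] = 1.0
for _ in range(50):
    u = heat_step(u, alpha=1.0, dx=1.0, dt=0.25)
```

Each grid point updates from only its immediate neighbours, which is exactly the locality that makes such stencils map well onto GPU threads.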

    Wasco Environmental Chamber System Manifold

    This report details the process of designing and producing a manifold that connects ultra-high purity (UHP) pressure and vacuum switches to an air supply so that the parts may be tested in an environmental chamber. Using the DMAIC (define, measure, analyze, improve, and control) methodology of problem solving, progress on the project includes thorough research into the problem, preliminary design solutions, iterative prototyping, and testing of specific functionalities. Manufacturing engineering topics of tooling, fixturing, metrology, quality, and machining with both manual and computer numerical control (CNC) mills and lathes are applied during the course of the project. The report concludes with a final manifold design and a comparison against the old manifold.

    Part 1 - Overview and tools

    Embedded systems (ES) education requires a broad set of knowledge, abilities and skills, including informatics and electronics concepts, in order to develop highly creative and imaginative applications based on analytical studies. Moreover, in an effort to improve education quality, it needs to be complemented with intense hands-on laboratories. This paper presents a new approach for embedded systems courses, appropriate for both high school and undergraduate classrooms, that has been conceived and designed to accomplish these goals while motivating and equipping the next generation of engineers to rise to future challenges. The course structure was defined so as to be easy to understand and to provide a logical flow through the topics, as it mostly progresses from simple topics to more advanced ones. The developed materials include slides for classroom teaching, explanatory documents for students' and educators' future reference, laboratories, tests, programs and application examples after each chapter. Each module is dedicated to a specific aspect of the MSP430 device, including the description of a range of peripherals. This is the first part of the paper, presenting the outline of the course. In particular, it identifies the need for the course, presents its structure, and covers the initial subjects: an introductory overview of logic design and embedded processors, and a description of the available software and hardware development tools for the MSP430.

    Towards Constructing a Flexible Multimedia Environment for Teaching the History of Arts

    Multimedia production displays two faces: a multimedia product is the result of programming as well as of publishing. Constructing a multimedia environment for teaching a course in the history of arts suggests that requirements elicitation has many facets and is central to success; it therefore has to be dealt with in a particularly careful way, the more so since we wanted the art historians to work in a specific and custom-tailored environment. The problem was technically resolved by constructing a dedicated markup language based on XML. We discuss the process of requirements elicitation that led to the markup language and show how it is used as a cornerstone in the development of such an experimental environment.

    Undergraduate Curriculum and Academic Policy Committee Minutes, March 1, 8, 15, and 31, 2011

    Minutes from the Wright State University Faculty Senate Undergraduate Curriculum and Academic Policy Committee meetings held on March 1, 8, 15, and 31, 2011.

    Speeding up ecological and evolutionary computations in R; essentials of high performance computing for biologists

    Computation has become a critical component of research in biology. A risk has emerged that computational and programming challenges may limit research scope, depth, and quality. We review various solutions to common computational efficiency problems in ecological and evolutionary research. Our review pulls together material that is currently scattered across many sources and emphasizes those techniques that are especially effective for typical ecological and environmental problems. We demonstrate how straightforward it can be to write efficient code and implement techniques such as profiling or parallel computing. We supply a newly developed R package (aprof) that helps to identify computational bottlenecks in R code and determine whether optimization can be effective. Our review is complemented by a practical set of examples and detailed Supporting Information material (S1–S3 Texts) that demonstrate large improvements in computational speed (ranging from 10.5 times to 14,000 times faster). By improving computational efficiency, biologists can feasibly solve more complex tasks, ask more ambitious questions, and include more sophisticated analyses in their research.
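The aprof package described here is for R; as an illustrative analogue of the same profile-then-optimise workflow (not the authors' tooling), Python's standard cProfile module can be used in a similar way:

```python
# Hypothetical Python analogue of the aprof workflow: profile first,
# then spend optimisation effort only on the hotspot the profile reveals.
import cProfile
import io
import pstats

def slow_mean(values):
    # Deliberately inefficient: explicit indexing loop instead of sum().
    total = 0.0
    for i in range(len(values)):
        total += values[i]
    return total / len(values)

def analysis():
    data = list(range(100_000))
    for _ in range(20):
        slow_mean(data)

profiler = cProfile.Profile()
profiler.enable()
analysis()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()
# The report attributes nearly all the time to slow_mean, so that is
# where optimisation (e.g. replacing the loop with sum(values)) pays off.
```

This mirrors the review's central point: measuring before optimising tells you whether optimisation can be effective at all.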

    Illinois Technograph v. 079, iss. 6 Mar. 1964

    published or submitted for publication.
