
    Parallelization and Visual Analysis of Multidimensional Fields: Application to Ozone Production, Destruction, and Transport in Three Dimensions

    This final report has four sections. We first describe the scientific results attained by our research team, followed by a description of the high performance computing research that enhanced those results and was prompted by the scientific tasks being undertaken. Next, we describe our research in data and program visualization, motivated by the scientific research and also enabling it. Last, we comment on the indirect effects this research effort has had on our work, in terms of follow-up or additional funding, student training, etc.

    Global Grids and Software Toolkits: A Study of Four Grid Middleware Technologies

    Grid is an infrastructure that involves the integrated and collaborative use of computers, networks, databases and scientific instruments owned and managed by multiple organizations. Grid applications often involve large amounts of data and/or computing resources that require secure resource sharing across organizational boundaries, which makes Grid application management and deployment a complex undertaking. Grid middlewares provide users with seamless computing ability and uniform access to resources in the heterogeneous Grid environment. Several such software toolkits and systems, most of them the results of academic research projects, have been developed all over the world. This chapter focuses on four of these middlewares: UNICORE, Globus, Legion and Gridbus. It also presents our implementation of a resource broker for UNICORE, as this functionality was not supported in it. A comparison of these systems on the basis of architecture, implementation model and several other features is included.

    Status Report of the DPHEP Study Group: Towards a Global Effort for Sustainable Data Preservation in High Energy Physics

    Data from high-energy physics (HEP) experiments are collected with significant financial and human effort and are mostly unique. An inter-experimental study group on HEP data preservation and long-term analysis was convened as a panel of the International Committee for Future Accelerators (ICFA). The group was formed by large collider-based experiments and investigated the technical and organisational aspects of HEP data preservation. An intermediate report was released in November 2009 addressing the general issues of data preservation in HEP. This paper includes and extends the intermediate report. It provides an analysis of the research case for data preservation and a detailed description of the various projects at experiment, laboratory and international levels. In addition, the paper provides a concrete proposal for an international organisation in charge of the data management and policies in high-energy physics.

    The TeraGyroid Experiment

    The TeraGyroid experiment at SC'03 addressed a large-scale problem of genuine scientific interest while also showing how intercontinental grids enable new paradigms for collaborative computational science that can dramatically reduce the time to insight. TeraGyroid used computational steering to accelerate the exploration of parameter space in condensed matter simulations. The scientific objective was to study the self-assembly, defect pathways and dynamics of liquid crystalline cubic gyroid mesophases using the largest set of lattice-Boltzmann (LB) simulations ever performed, involving in some cases lattices of over one billion sites and highly extended simulation times. We describe the application in sufficient detail to reveal how it uses the grid to support interactions between its distributed parts, where the interfaces exist between the application and the middleware infrastructure, what grid services and capabilities are used, and why important design decisions were made. We also describe how the resources of high-end computing services were federated with the UK e-Science Grid and the US TeraGrid to form the TeraGyroid testbed, and summarise the lessons learned during the experiment.

    Report of the Indiana University Research Data Management Taskforce

    The “data deluge” in the sciences—the ability to create massive streams of digital data—has been discussed at great length in the academic and lay press. The ease with which scientists can now produce data has transformed scientific practice, so that creating data is now less of a challenge in many disciplines than making use of, properly analyzing, and properly storing such data. Two aspects of the data deluge are not as widely appreciated. One is that the data deluge is not confined to the sciences. Humanities scholars and artists are generating data at prodigious rates as well, through massive scanning projects; digitization of still photographs, video, and music; and the creation of new musical and visual art forms that are inherently digital. A second underappreciated factor is that data collected now are potentially valuable forever. The genomic DNA sequences of a particular organism are what they are. They are known precisely. Or, more properly, the sequences of the contigs that are assembled to create the sequence are known precisely, while there may be dispute about the proper assembly. Such data will be of value indefinitely; for example, to the extent that we wonder whether environmental changes are altering the population genetics of various organisms, data on the frequency of particular genetic variations in populations will remain valuable. Similarly, video and audio of an American folk musician, a speaker of an endangered language, or a ballet performance will be of value indefinitely, although argument might well go on regarding the interpretation and annotation of that video and audio. Such images and associated audio can never be recreated, and are thus of use indefinitely.

    I-Light Applications Workshop 2002 Proceedings

    Editing for this document was provided by Gregory Moore and Craig A. Stewart. Indiana Governor Frank O'Bannon symbolically lit the fiber of the I-Light network on December 11, 2001. I-Light is a unique, high-speed fiber optic network connecting Indiana University Bloomington, Indiana University–Purdue University Indianapolis, and Purdue University West Lafayette with each other and with Abilene, the national high-speed Internet2 research and education network. This university-owned high-speed network connects three of Indiana's great research campuses. One year after the lighting of the network, we invited researchers from Indiana University and Purdue University to come together to discuss some of the research and instructional achievements made possible in just one short year of I-Light's existence. The results were dramatic: on December 4, 2002, more than 150 researchers gathered in Indianapolis to discuss research and instructional breakthroughs made possible by I-Light. The I-Light Applications Workshop 2002 was sponsored by the Office of the Vice President for Information Technology and CIO, Indiana University; and the Office of the Vice President for Information Technology and CIO, Purdue University. I-Light was made possible by a special appropriation by the State of Indiana. The research described at the I-Light Applications Workshop has been supported by numerous grants from several sources, mentioned in the individual presentations included in this proceedings volume. Many of the scientific research projects discussed in this volume have been supported by the National Science Foundation and/or the National Institutes of Health, and some Purdue projects also received support from Indiana's 21st Century Fund. Multiple presentations featured work supported by the Lilly Endowment, Inc., through grants to Indiana University in support of the Pervasive Technology Laboratories and the Indiana Genomics Initiative, both at Indiana University. Any opinions, findings and conclusions, or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the granting agencies.

    GA4GH: International policies and standards for data sharing across genomic research and healthcare.

    The Global Alliance for Genomics and Health (GA4GH) aims to accelerate biomedical advances by enabling the responsible sharing of clinical and genomic data through both harmonized data aggregation and federated approaches. The decreasing cost of genomic sequencing (along with other genome-wide molecular assays) and increasing evidence of its clinical utility will soon drive the generation of sequence data from tens of millions of humans, with increasing levels of diversity. In this perspective, we present the GA4GH strategies for addressing the major challenges of this data revolution. We describe the GA4GH organization, which is fueled by the development efforts of eight Work Streams and informed by the needs of 24 Driver Projects and other key stakeholders. We present the GA4GH suite of secure, interoperable technical standards and policy frameworks and review the current status of standards, their relevance to key domains of research and clinical care, and the future plans of GA4GH. Broad international participation in building, adopting, and deploying GA4GH standards and frameworks will catalyze an unprecedented effort in data sharing that will be critical to advancing genomic medicine and ensuring that all populations can access its benefits.

    Orbital Angular Momentum Waves: Generation, Detection and Emerging Applications

    Orbital angular momentum (OAM) has aroused widespread interest in many fields, especially in telecommunications, due to its potential for unleashing new capacity in the severely congested spectrum of commercial communication systems. Beams carrying OAM have a helical phase front and a field strength with a singularity along the axial center, which can be used for information transmission, imaging and particle manipulation. The number of orthogonal OAM modes in a single beam is theoretically infinite, and each mode is an element of a complete orthogonal basis that can be employed for multiplexing different signals, thus greatly improving spectrum efficiency. In this paper, we comprehensively summarize and compare the methods for generation and detection of optical OAM, radio OAM and acoustic OAM. We then present the applications and technical challenges of OAM in communications, including free-space optical communications, optical fiber communications, radio communications and acoustic communications. To complete our survey, we also discuss the state of the art of particle manipulation and target imaging with OAM beams.
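    The orthogonality property that the abstract credits for OAM multiplexing can be checked numerically. The following is a minimal illustrative sketch, not drawn from the paper itself: it samples the azimuthal phase factor exp(i*l*phi) of an OAM mode of order l and shows that distinct integer modes have (numerically) zero overlap over one full turn of the azimuthal angle, so each mode can carry an independent signal. The mode orders and sample count are arbitrary choices for illustration.

```python
import cmath
import math

# Number of azimuthal sample points on [0, 2*pi); arbitrary but large.
N = 4096

def oam_mode(l, k):
    """Sample k of the unit-amplitude azimuthal phase profile exp(i*l*phi)."""
    phi = 2 * math.pi * k / N
    return cmath.exp(1j * l * phi)

def overlap(l1, l2):
    """Normalized inner product of modes l1 and l2 over one azimuthal period."""
    return sum(oam_mode(l1, k).conjugate() * oam_mode(l2, k) for k in range(N)) / N

print(abs(overlap(3, 3)))   # same mode: magnitude ~ 1.0
print(abs(overlap(3, -2)))  # distinct integer modes: magnitude ~ 0
```

Because the integrand for two distinct integer modes is a pure complex exponential with a whole number of cycles per turn, its average vanishes, which is the discrete analogue of the continuous orthogonality relation underlying OAM-based channel multiplexing.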