
    Montage: a grid-enabled engine for delivering custom science-grade mosaics on demand

    This paper describes the design of a grid-enabled version of Montage, an astronomical image mosaic service, suitable for large-scale processing of the sky. All the re-projection jobs can be added to a pool of tasks and performed by as many processors as are available, exploiting the parallelization inherent in the Montage architecture. We show how we can describe the Montage application in terms of an abstract workflow so that a planning tool such as Pegasus can derive an executable workflow that can be run in the Grid environment. The execution of the workflow is performed by the workflow manager DAGMan and the associated Condor-G. The grid processing will support tiling of images to a manageable size when the input images can no longer be held in memory. Montage will ultimately run operationally on the TeraGrid. We describe science applications of Montage, including its application to science product generation by Spitzer Legacy Program teams and large-scale, all-sky image processing projects.
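As a rough illustration of the abstract-workflow idea described above (not Pegasus's or DAGMan's actual interfaces; task names here are hypothetical), the mosaic job can be sketched as a dependency graph in which all re-projections are independent and a final co-addition waits on them:

```python
# Minimal sketch of an abstract workflow as a dependency graph.
# Task names are illustrative; Pegasus and DAGMan use their own formats.

def build_mosaic_workflow(images):
    """Return a mapping: task -> list of tasks it depends on."""
    workflow = {}
    reproject_tasks = []
    for img in images:
        task = f"reproject_{img}"
        workflow[task] = []          # re-projections are independent
        reproject_tasks.append(task)
    # the final co-addition depends on every re-projection
    workflow["coadd_mosaic"] = reproject_tasks
    return workflow

def ready_tasks(workflow, done):
    """Tasks whose dependencies are all finished -- these can run in parallel."""
    return [t for t, deps in workflow.items()
            if t not in done and all(d in done for d in deps)]

wf = build_mosaic_workflow(["m31_a.fits", "m31_b.fits"])
print(ready_tasks(wf, done=set()))
# both reprojection tasks are runnable at once; coadd_mosaic waits
```

A grid scheduler would repeatedly hand the `ready_tasks` set to whatever processors are available, which is exactly the pool-of-tasks parallelism the abstract describes.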

    Design of Video Compression Approach in Grid Environment

    Grid offers an optimal solution to problems requiring large storage and/or processing power. Grid provides direct access to computers, data, software and many other resources. Sharing between these resources is highly controlled and done with the consensus of both resource providers and consumers. Video compression is a lengthy and compute-intensive task, involving compression of video media from one format to another. Video compression refers to reducing the quantity of data used to represent digital video images, and it is a combination of spatial image compression and temporal motion compensation. A video compression system maintains high picture quality while reducing the data quantity by removing redundancies. In Grid computing the task is split up into smaller chunks, and the resources of many computers in a network can be applied to a single problem at the same time independently. These factors make the distribution of video compression viable in a Grid environment. This paper provides a discussion of the design of video compression in a Grid environment, the Grid environment itself and its Globus Toolkit 4.0.
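The split-and-distribute scheme in the abstract can be sketched as follows. This is a toy illustration, not the paper's design: `compress_chunk` is a stand-in run-length encoder rather than a real codec, and a thread pool stands in for grid worker nodes, which a real deployment would reach through a scheduler such as the Globus Toolkit:

```python
# Sketch of splitting a "video" (a list of frames) into chunks and
# compressing the chunks in parallel. A thread pool stands in for grid
# workers; compress_chunk is a toy stand-in for a real codec.
from concurrent.futures import ThreadPoolExecutor

def compress_chunk(frames):
    """Toy redundancy removal: run-length encode repeated frames."""
    out = []
    for f in frames:
        if out and out[-1][0] == f:
            out[-1] = (f, out[-1][1] + 1)
        else:
            out.append((f, 1))
    return out

def split(frames, n_chunks):
    """Cut the frame list into roughly equal, independent chunks."""
    size = max(1, len(frames) // n_chunks)
    return [frames[i:i + size] for i in range(0, len(frames), size)]

def compress_parallel(frames, n_workers=4):
    """Compress every chunk concurrently, as a grid would across nodes."""
    chunks = split(frames, n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(compress_chunk, chunks))
```

Because the chunks are independent, each one can be assigned to a different machine; only the final reassembly step needs all results back.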

    An OGSA Middleware for Managing Medical Images Using Ontologies

    The final publication is available at Springer via http://dx.doi.org/10.1007/s10877-005-0675-0. This article presents a middleware based on Grid technologies that addresses the problem of sharing, transferring and processing DICOM medical images in a distributed environment, using an ontological schema to create virtual communities and to define common targets. It defines a distributed storage that builds up virtual repositories integrating different individual image repositories, providing global searching, progressive transmission, automatic encryption and pseudo-anonymisation, and a link to remote processing services. Users from a Virtual Organisation can share the cases that are relevant for their communities or research areas, epidemiological studies or even deeper analysis of complex individual cases. A software architecture has been defined for solving the problems exposed above. Briefly, the architecture comprises five layers (from the most physical to the most logical) based on Grid technologies. The lowest layers (the Core Middleware Layer and the Server Services Layer) are composed of Grid Services that implement the global management of resources. The Middleware Components Layer provides a transparent view of the Grid environment and has been the main objective of this work. Finally, the uppermost layer (the Application Layer) comprises the applications; a simple application has been implemented for testing the components developed in the Middleware Components Layer. Other side results of this work are the services developed in the Middleware Components Layer for managing DICOM images, creating virtual DICOM storages, progressive transmission, and automatic encryption and pseudo-anonymisation depending on the ontologies. Other results, such as the Grid Services developed in the lowest layers, are also described in this article. Finally, a brief performance analysis and several snapshots from the applications developed are shown.
The performance analysis proves that the components developed in this work provide image processing applications with new possibilities for large-scale sharing, management and processing of DICOM images. The results show that the components fulfil the objectives proposed. The extensibility of the system is achieved by the use of open methods and protocols, so new components can be easily added.
Blanquer Espert, I.; Hernández García, V.; Segrelles Quilis, J. D. (2005). An OGSA Middleware for Managing Medical Images Using Ontologies. Journal of Clinical Monitoring and Computing 19:295-305. doi:10.1007/s10877-005-0675-0
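The "virtual repository" idea above can be sketched in miniature: several physical catalogues are merged into one searchable view keyed by an ontology term. This is a hypothetical illustration only; the field names, site names and the merging function are invented and do not reflect the DICOM or OGSA interfaces:

```python
# Hypothetical sketch of a virtual DICOM storage: per-site image
# catalogues merged into one index grouped by an ontology key.
# All names here are illustrative, not the article's actual services.

def build_virtual_storage(repositories, ontology_key):
    """Merge per-site catalogues into one index keyed by an ontology term."""
    index = {}
    for site, catalogue in repositories.items():
        for image_id, metadata in catalogue.items():
            term = metadata.get(ontology_key, "unclassified")
            index.setdefault(term, []).append((site, image_id))
    return index

repos = {
    "hospital_a": {"img1": {"modality": "MR", "study": "neuro"},
                   "img2": {"modality": "CT", "study": "thorax"}},
    "hospital_b": {"img9": {"modality": "MR", "study": "neuro"}},
}
virtual = build_virtual_storage(repos, ontology_key="study")
# virtual["neuro"] now lists matching images from both sites
```

A global search then becomes a lookup in the merged index, while the physical images stay at their home sites until transfer is requested.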

    Combining Image Processing with Signal Processing to Improve Transmitter Geolocation Estimation

    This research develops an algorithm which combines image processing with signal processing to improve transmitter geolocation capability. A building extraction algorithm is compiled from current techniques in order to provide the locations of rectangular buildings within an aerial, orthorectified, RGB image to a geolocation algorithm. The geolocation algorithm relies on measured TDOA data from multiple ground sensors to locate a transmitter by searching a grid of possible transmitter locations within the image region. At each evaluated grid point, theoretical TDOA values are computed for comparison to the measured TDOA values. To compute the theoretical values, the shortest path length between the transmitter and each of the sensors is determined. The building locations are used to determine whether the LOS path between these two points is obstructed and what the shortest reflected path length would be. The grid location producing theoretical TDOA values closest to the measured TDOA values is the result of the algorithm. Measured TDOA data is simulated in this thesis. The performance of the thesis method is compared to that of a current geolocation method that uses Taylor series expansion to solve for the intersection of hyperbolic curves created by the TDOA data. The average online runtime of thesis simulations ranges from around 20 seconds to around 2 minutes, while the Taylor series method only takes about 0.02 seconds. The thesis method also includes an offline runtime of up to 30 minutes for a given image region and sensor configuration. The thesis method improves transmitter geolocation error by an average of 44 m, or 53%, in the obstructed simulation cases when compared with the current Taylor series method. However, in cases when all sensors have a direct LOS, the current method performs more accurately. Therefore, the thesis method is most applicable to missions requiring tracking of slower-moving targets in an urban environment with stationary sensors.
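The grid-search step of the abstract can be sketched as follows, under simplifying assumptions: straight-line (direct LOS) paths only, with no building obstruction or reflected-path handling, which is the part the thesis adds on top. The sensor layout and grid spacing here are invented for illustration:

```python
# Sketch of TDOA grid-search geolocation: for each candidate grid point,
# compute theoretical TDOAs from path lengths to the sensors and keep the
# point whose TDOAs best match the measurements. Direct LOS only --
# the thesis additionally models building obstructions and reflections.
import math

C = 3.0e8  # propagation speed, m/s

def tdoa(point, sensors):
    """TDOAs relative to the first sensor, assuming direct line of sight."""
    d = [math.dist(point, s) for s in sensors]
    return [(di - d[0]) / C for di in d[1:]]

def grid_search(measured, sensors, grid):
    """Return the grid point whose theoretical TDOAs best fit the data."""
    best, best_err = None, float("inf")
    for p in grid:
        theo = tdoa(p, sensors)
        err = sum((t - m) ** 2 for t, m in zip(theo, measured))
        if err < best_err:
            best, best_err = p, err
    return best

sensors = [(0, 0), (100, 0), (0, 100)]      # illustrative layout, metres
measured = tdoa((40, 70), sensors)          # simulated measurement
grid = [(x, y) for x in range(0, 101, 10) for y in range(0, 101, 10)]
print(grid_search(measured, sensors, grid))  # -> (40, 70)
```

Refining the grid trades the offline/online runtime mentioned above against localisation accuracy, which is why the thesis precomputes path information per image region and sensor configuration.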


    Global resource information database

    The United Nations Environment Programme (UNEP) is responsible for initiating and stimulating environmental action and awareness at all levels of society worldwide and for coordinating the environmental work of all United Nations organizations and agencies. Within this framework, UNEP has established the Global Resource Information Database (GRID) to provide the world community with access to timely, usable environmental data and access to the geographic information system, satellite image processing, and telecommunication technology necessary for each data recipient to make the best use of these data and for global science applications, wise resource management, and sustainable development planning. Through GRID, UNEP will address environmental issues at global, regional, and national levels to bridge the gap between scientific understanding of earth processes and sound management of the environment. The long-term objectives of the GRID activity are to ensure that (1) all pertinent global and regional environmental data are available through the GRID network to a range of users from students to scientists to politicians; (2) all United Nations specialized agencies and most major intergovernmental organizations will have access to modern technology and the opportunity to provide the necessary information-management support within their own organizations for the description, understanding, and solution of environment-related problems; (3) all countries will have access to GRID data and technology, with most having functioning GRID-compatible monitoring and assessment centers for national environmental assessment and management support. GRID is designed to become a network of cooperating centers in various regions of the world. At present there are GRID centers in Geneva, Switzerland; Warsaw, Poland; Arendal, Norway; Nairobi, Kenya; Bangkok, Thailand; Kathmandu, Nepal; Tsukuba, Japan; and Sioux Falls, United States.
Soon there will be GRID centers in Brazil, Russia, Germany, the Caribbean, and the South Pacific. Each of these centers has specific functions within the network. Certain centers deal with sectorial or discipline-specific information; other centers have responsibility for specific geographic areas; still others deal with new technology and general data services.