33,902 research outputs found

    Beyond Reuse Distance Analysis: Dynamic Analysis for Characterization of Data Locality Potential

    Emerging computer architectures will feature drastically decreased memory bandwidth relative to peak processing rate (bytes/flop), as highlighted by recent studies on Exascale architectural trends. Further, flops are getting cheaper while the energy cost of data movement is increasingly dominant. Understanding and characterizing the data locality properties of computations is therefore critical to guiding efforts to enhance data locality. Reuse distance analysis of memory address traces is a valuable tool for data locality characterization of programs. A single reuse distance analysis can estimate the number of cache misses in a fully associative LRU cache of any size, thereby providing estimates of the minimum bandwidth required at different levels of the memory hierarchy to avoid being bandwidth bound. However, such an analysis only holds for the particular execution order that produced the trace. It cannot estimate the potential improvement in data locality from dependence-preserving transformations that change the execution schedule of the operations in the computation. In this article, we develop a novel dynamic analysis approach to characterize the inherent locality properties of a computation and thereby assess the potential for data locality enhancement via dependence-preserving transformations. The execution trace of a code is analyzed to extract a computational directed acyclic graph (CDAG) of the data dependences. The CDAG is then partitioned into convex subsets, and the convex partitioning is used to reorder the operations in the execution trace to enhance data locality. The approach enables us to go beyond reuse distance analysis of a single specific execution order when characterizing a computation's data locality properties.
    It can serve a valuable role in identifying promising code regions for manual transformation, as well as in assessing the effectiveness of compiler transformations for data locality enhancement. We demonstrate the effectiveness of the approach on a number of benchmarks, including case studies where the potential shown by the analysis is exploited to achieve lower data movement costs and better performance. Comment: ACM Transactions on Architecture and Code Optimization (2014)
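The abstract's core premise, that a single reuse distance profile yields miss counts for a fully associative LRU cache of any size, can be sketched in a few lines. This is a minimal illustration of standard reuse (stack) distance analysis, not the paper's CDAG machinery; the function names and the O(N·M) list-based stack are our own simplifications (production profilers use tree-based structures for efficiency):

```python
def reuse_distances(trace):
    """LRU reuse (stack) distance of each access: the number of
    *distinct* addresses touched since the previous access to the
    same address, or infinity on first use."""
    stack = []            # LRU stack, most recently used at the end
    dists = []
    for addr in trace:
        if addr in stack:
            pos = stack.index(addr)
            dists.append(len(stack) - 1 - pos)  # distinct addrs in between
            stack.pop(pos)
        else:
            dists.append(float("inf"))          # cold miss
        stack.append(addr)
    return dists

def lru_misses(dists, cache_size):
    """Misses in a fully associative LRU cache of `cache_size` lines:
    an access misses iff its reuse distance >= cache_size."""
    return sum(1 for d in dists if d >= cache_size)

trace = ["a", "b", "c", "a", "b", "d", "a"]
d = reuse_distances(trace)          # [inf, inf, inf, 2, 2, inf, 2]
print(lru_misses(d, 2))             # -> 7: every reuse distance is >= 2
print(lru_misses(d, 4))             # -> 4: only the cold misses remain
```

One pass over the trace thus answers the miss-count question for every cache size at once, which is exactly why the abstract calls the technique valuable and why its schedule-dependence is the limitation the paper targets.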

    3D/2D Registration of Mapping Catheter Images for Arrhythmia Interventional Assistance

    Radiofrequency (RF) catheter ablation has transformed treatment for tachyarrhythmias and has become first-line therapy for some tachycardias. The precise localization of the arrhythmogenic site and the positioning of the RF catheter over that site are problematic: they can impair the efficiency of the procedure and are time-consuming (several hours). Electroanatomic mapping technologies are available that enable the display of the cardiac chambers and the relative position of ablation lesions. However, these are expensive and use custom-made catheters. The proposed methodology makes use of standard catheters and inexpensive technology in order to create a 3D volume of the heart chamber affected by the arrhythmia. Further, we propose a novel method that uses a priori 3D information about the mapping catheter to estimate the 3D locations of multiple electrodes from single-view C-arm images. The monoplane algorithm is tested for feasibility on computer simulations and initial canine data. Comment: International Journal of Computer Science Issues, IJCSI, Volume 4, Issue 2, pp. 10-19, September 200

    Research and Education in Computational Science and Engineering

    Over the past two decades the field of computational science and engineering (CSE) has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers of all persuasions with algorithmic inventions and software systems that transcend disciplines and scales. Carried on a wave of digital technology, CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society, and the CSE community is at the core of this transformation. However, a combination of disruptive developments---including the architectural complexity of extreme-scale computing, the data revolution that engulfs the planet, and the specialization required to follow the applications to new frontiers---is redefining the scope and reach of the CSE endeavor. This report describes the rapid expansion of CSE and the challenges to sustaining its bold advances. The report also presents strategies and directions for CSE research and education for the next decade. Comment: Major revision, to appear in SIAM Review

    International Veterinary Epilepsy Task Force recommendations for systematic sampling and processing of brains from epileptic dogs and cats

    Traditionally, histological investigations of the epileptic brain are required to identify epileptogenic brain lesions, to evaluate the impact of seizure activity, to search for mechanisms of drug resistance, and to look for comorbidities. In many instances, however, neuropathological studies fail to add substantial data on patients with a complete clinical work-up. This may be due to sparse training in epilepsy pathology and/or to a lack of neuropathological guidelines for companion animals. The protocols introduced herein shall facilitate systematic sampling and processing of epileptic brains and thereby increase the efficacy, reliability, and reproducibility of morphological studies in animals suffering from seizures. Brain dissection protocols of two neuropathological centres with a research focus in epilepsy have been optimised with regard to their diagnostic yield and accuracy, their practicability, and their feasibility concerning clinical research requirements. The recommended guidelines allow for easy, standardised, and ubiquitous collection of the brain regions relevant for seizure generation. Tissues harvested in the prescribed way will increase the diagnostic efficacy and provide reliable material for scientific investigations.

    Opportunities and limitations of crop phenotyping in southern european countries

    Review. The Mediterranean climate is characterized by hot, dry summers and frequent droughts. Mediterranean crops are frequently subjected to high evapotranspiration demands, soil water deficits, high temperatures, and photo-oxidative stress. These conditions will become more severe due to global warming, which poses major challenges to the sustainability of the agricultural sector in Mediterranean countries. Selection of crop varieties adapted to future climatic conditions and more tolerant of extreme climatic events is urgently required. Plant phenotyping is a crucial approach to address these challenges. High-throughput plant phenotyping (HTPP) helps to monitor the performance of improved genotypes and is one of the most effective strategies to improve the sustainability of agricultural production. In spite of the remarkable progress in the basic knowledge and technology of plant phenotyping, there are still several practical, financial, and political constraints on implementing HTPP approaches in field and controlled conditions across the Mediterranean. The European panorama of phenotyping is heterogeneous, and integration of phenotyping data across different scales and translation of “phytotron research” to the field, and from model species to crops, remain major challenges. Moreover, solutions specifically tailored to Mediterranean agriculture (e.g., crops and environmental stresses) are in high demand, as the region is vulnerable to climate change and to desertification processes. The specific phenotyping requirements of Mediterranean crops have not yet been fully identified. The high cost of HTPP infrastructures is a major limiting factor, though the limited availability of skilled personnel may also impair implementation in Mediterranean countries. We propose that the lack of suitable phenotyping infrastructures is hindering the development of new Mediterranean agricultural varieties and will negatively affect the future competitiveness of the agricultural sector.
    We provide an overview of the heterogeneous panorama of phenotyping within Mediterranean countries, describing the state of the art of agricultural production, breeding initiatives, and phenotyping capabilities in five countries: Italy, Greece, Portugal, Spain, and Turkey. We characterize some of the main impediments to the development of plant phenotyping in those countries and identify strategies to overcome barriers and maximize the benefits of phenotyping and modeling approaches for Mediterranean agriculture and the related sustainability.

    On Characterizing the Data Movement Complexity of Computational DAGs for Parallel Execution

    Technology trends are making the cost of data movement increasingly dominant, both in terms of energy and time, over the cost of performing arithmetic operations in computer systems. The fundamental ratio of aggregate data movement bandwidth to total computational power (also referred to as the machine balance parameter) in parallel computer systems is decreasing. It is therefore of considerable importance to characterize the inherent data movement requirements of parallel algorithms, so that the minimal architectural balance parameters required to support them on future systems can be well understood. In this paper, we develop an extension of the well-known red-blue pebble game to derive lower bounds on the data movement complexity of the parallel execution of computational directed acyclic graphs (CDAGs) on parallel systems. We model multi-node multi-core parallel systems, with the total physical memory distributed across the nodes (which are connected through some interconnection network) and with a multi-level shared cache hierarchy for the processors within a node. We also develop new techniques for lower bound characterization of non-homogeneous CDAGs. We demonstrate the use of the methodology by analyzing the CDAGs of several numerical algorithms, developing lower bounds on data movement for their parallel execution.
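In the red-blue pebble game, red pebbles model a fast memory of S slots, blue pebbles model slow memory, and recoloring a pebble is one I/O. As a rough single-node sketch (our own simplified model, with LRU eviction, no store counting, and hypothetical names; the paper's multi-node, multi-level extension is much richer), the I/O count of one particular schedule gives an upper bound to set against such lower bounds:

```python
def count_io(dag, schedule, S):
    """I/Os (slow <-> fast memory transfers) incurred when executing
    `schedule` (a topological order) of `dag` (vertex -> predecessor
    list) with S fast-memory slots and LRU eviction.

    Simplifications: source vertices are loaded from slow memory,
    stores back are not counted, and S must exceed the max in-degree.
    """
    red, io = [], 0                  # LRU list: oldest value first
    for v in schedule:
        if not dag[v]:
            io += 1                  # load an input from slow memory
        for u in dag[v]:
            if u in red:
                red.remove(u)        # hit: refresh LRU position
            else:
                io += 1              # miss: reload evicted operand
                if len(red) >= S:
                    red.pop(0)       # evict least recently used
            red.append(u)
        if len(red) >= S:
            red.pop(0)               # make room for v's result
        red.append(v)
    return io

# A tiny CDAG: a reduction over a 1-D stencil of 4 inputs.
dag = {"x0": [], "x1": [], "x2": [], "x3": [],
       "y0": ["x0", "x1"], "y1": ["x1", "x2"], "y2": ["x2", "x3"],
       "z": ["y0", "y1", "y2"]}
schedule = ["x0", "x1", "x2", "x3", "y0", "y1", "y2", "z"]
print(count_io(dag, schedule, 3))   # -> 10: a small fast memory forces reloads
print(count_io(dag, schedule, 8))   # -> 4: everything fits; only input loads
```

Minimizing such counts over all valid schedules and eviction policies is what the pebble-game lower bounds capture analytically; a simulator like this only exhibits the cost of one schedule.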

    GLOBE: Science and Education

    This article provides a brief overview of the GLOBE Program and describes its benefits to scientists, teachers, and students. The program itself is designed to use environmental research as a means to improve student achievement in basic science, mathematics, geography, and use of technology. Linking of students and scientists as collaborators is seen as a fundamental part of the process. GLOBE trains teachers to teach students how to take measurements of environmental parameters at quality levels acceptable for scientific research. Teacher training emphasizes a hands-on, inquiry-based methodology. Student-collected GLOBE data are universally accessible through the Web. An annual review over the past six years indicates that GLOBE has had a positive impact on students' abilities to use scientific data in decision-making and on students' scientifically informed awareness of the environment. Educational levels: Graduate or professional

    EcoGIS – GIS tools for ecosystem approaches to fisheries management

    Executive Summary: The EcoGIS project was launched in September 2004 to investigate how Geographic Information Systems (GIS), marine data, and custom analysis tools can better enable fisheries scientists and managers to adopt Ecosystem Approaches to Fisheries Management (EAFM). EcoGIS is a collaborative effort between NOAA’s National Ocean Service (NOS) and National Marine Fisheries Service (NMFS), and four regional Fishery Management Councils. The project has focused on four priority areas: Fishing Catch and Effort Analysis, Area Characterization, Bycatch Analysis, and Habitat Interactions. Of these four functional areas, the project team first focused on developing a working prototype for catch and effort analysis: the Fishery Mapper Tool. This ArcGIS extension creates time-and-area summarized maps of fishing catch and effort from logbook, observer, or fishery-independent survey data sets. Source data may come from Oracle, Microsoft Access, or other file formats. Feedback from beta-testers of the Fishery Mapper was used to debug the prototype, enhance performance, and add features. This report describes the four priority functional areas, the development of the Fishery Mapper tool, and several themes that emerged through the parallel evolution of the EcoGIS project, the concept and implementation of the broader field of Ecosystem Approaches to Management (EAM), data management practices, and other EAM toolsets. In addition, a set of six succinct recommendations is proposed on page 29. One major conclusion from this work is that there is no single “super-tool” to enable Ecosystem Approaches to Management; as such, tools should be developed for specific purposes with attention given to interoperability and automation. Future work should be coordinated with other GIS development projects in order to provide “value added” and minimize duplication of effort.
    In addition to custom tools, the development of cross-cutting Regional Ecosystem Spatial Databases will enable access to quality data to support the analyses required by EAM. GIS tools will be useful in developing Integrated Ecosystem Assessments (IEAs) and in providing pre- and post-processing capabilities for spatially explicit ecosystem models. Continued funding will enable the EcoGIS project to develop GIS tools that are immediately applicable to today’s needs. These tools will enable simplified and efficient data query, the ability to visualize data over time, and ways to synthesize multidimensional data from diverse sources. These capabilities will provide new information for analyzing issues from an ecosystem perspective, which will ultimately result in better understanding of fisheries and better support for decision-making. (PDF file contains 45 pages.)