
    Representation of Earth scientific information by the Google(TM) Earth

    Google(TM) Earth is a GIS application provided by Google, with versatile, high-performance visualization and manipulation capabilities for geographic information. Google Earth is also a multi-platform application, so the installation and running cost for research and education sites is relatively low. Earth scientific information also follows a geographic information scheme, so Google Earth has the potential to support education and research in the Earth sciences. In this report, Earth scientific information was converted with the R language and its libraries for representation in Google Earth. Three-dimensional representation of the information can help users understand the characteristics of the data in Earth scientific research and education.
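    The report performs the conversion in R; the same idea can be sketched in Python by emitting KML, the XML format Google Earth loads. The station names, coordinates, and values below are illustrative, not data from the report.

```python
# Minimal KML writer: wraps point observations (lon, lat, value) in
# Placemark elements that Google Earth can display. All data are made up.
observations = [
    ("Station A", 139.69, 35.68, 25.1),  # name, lon, lat, measured value
    ("Station B", 135.50, 34.69, 23.4),
]

def to_kml(points):
    placemarks = "".join(
        f"<Placemark><name>{name} ({value})</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
        for name, lon, lat, value in points
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2">'
            f"<Document>{placemarks}</Document></kml>")

kml = to_kml(observations)  # save as a .kml file and open in Google Earth
```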

    Research and Education in Computational Science and Engineering

    Over the past two decades the field of computational science and engineering (CSE) has penetrated both basic and applied research in academia, industry, and laboratories to advance discovery, optimize systems, support decision-makers, and educate the scientific and engineering workforce. Informed by centuries of theory and experiment, CSE performs computational experiments to answer questions that neither theory nor experiment alone is equipped to answer. CSE provides scientists and engineers of all persuasions with algorithmic inventions and software systems that transcend disciplines and scales. Carried on a wave of digital technology, CSE brings the power of parallelism to bear on troves of data. Mathematics-based advanced computing has become a prevalent means of discovery and innovation in essentially all areas of science, engineering, technology, and society; and the CSE community is at the core of this transformation. However, a combination of disruptive developments, including the architectural complexity of extreme-scale computing, the data revolution that engulfs the planet, and the specialization required to follow the applications to new frontiers, is redefining the scope and reach of the CSE endeavor. This report describes the rapid expansion of CSE and the challenges to sustaining its bold advances. The report also presents strategies and directions for CSE research and education for the next decade. Comment: Major revision, to appear in SIAM Review.

    36M-pixel synchrotron radiation micro-CT for whole secondary pulmonary lobule visualization from a large human lung specimen

    A micro-CT system was developed using a 36M-pixel digital single-lens reflex camera as a cost-effective modality for large human lung specimen imaging. Scientific-grade cameras used for biomedical x-ray imaging are much more expensive than consumer-grade cameras. During the past decade, advances in image sensor technology for consumer appliances have spurred the development of biomedical x-ray imaging systems using commercial digital single-lens reflex cameras fitted with high-megapixel CMOS image sensors. This micro-CT system is highly specialized for visualizing whole secondary pulmonary lobules in a large human lung specimen. The secondary pulmonary lobule, a fundamental unit of the lung structure, reproduces the lung in miniature. The lung specimen is set in an acrylic cylindrical case of 36 mm diameter and 40 mm height. The field of view (FOV) of the micro-CT is 40.6 mm wide × 15.1 mm high with a 3.07 μm pixel size, using offset CT scanning to enlarge the FOV. We constructed a 13,220 × 13,220 × 4912 voxel image with 3.07 μm isotropic voxel size for three-dimensional visualization of the whole secondary pulmonary lobule. Furthermore, synchrotron radiation has proved to be a powerful high-resolution imaging tool. This micro-CT system using a single-lens reflex camera and synchrotron radiation provides the practical benefits of high-resolution and wide-field performance, but at low cost.
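    The reported geometry is internally consistent: at 3.07 μm per voxel, the 13,220 × 13,220 × 4912 volume spans the stated 40.6 mm × 15.1 mm field of view. A quick arithmetic check using only the numbers given in the abstract:

```python
# Verify that voxel counts times voxel pitch reproduce the stated FOV.
pixel_um = 3.07          # isotropic voxel size, micrometers
width_px = 13220         # in-plane voxels
height_px = 4912         # axial voxels

fov_width_mm = width_px * pixel_um / 1000.0    # ~40.6 mm, as reported
fov_height_mm = height_px * pixel_um / 1000.0  # ~15.1 mm, as reported
```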

    Low-Cost Air Quality Monitoring Tools: From Research to Practice (A Workshop Summary).

    In May 2017, a two-day workshop was held in Los Angeles (California, U.S.A.) to gather practitioners who work with low-cost sensors used to make air quality measurements. The community of practice included individuals from academia, industry, non-profit groups, community-based organizations, and regulatory agencies. The group gathered to share knowledge developed from a variety of pilot projects in hopes of advancing the collective knowledge about how best to use low-cost air quality sensors. Panel discussion topics included: (1) best practices for deployment and calibration of low-cost sensor systems, (2) data standardization efforts and database design, (3) advances in sensor calibration, data management, and data analysis and visualization, and (4) lessons learned from research/community partnerships to encourage purposeful use of sensors and create change/action. Panel discussions summarized knowledge advances and project successes while also highlighting the questions, unresolved issues, and technological limitations that remain within the low-cost air quality sensor arena.
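    The calibration topic in panel (1) and (3) typically starts from the simplest case: fitting a linear correction between a low-cost sensor and a co-located reference monitor. A minimal sketch with made-up readings (the workshop summary itself prescribes no specific method):

```python
# Ordinary least-squares linear calibration: corrected = a * raw + b.
# Readings are synthetic, for illustration only.
raw = [10.0, 20.0, 30.0, 40.0]   # low-cost sensor (e.g. ug/m3)
ref = [12.5, 22.0, 31.5, 41.0]   # co-located reference monitor

n = len(raw)
mean_raw = sum(raw) / n
mean_ref = sum(ref) / n
a = sum((x - mean_raw) * (y - mean_ref) for x, y in zip(raw, ref)) / \
    sum((x - mean_raw) ** 2 for x in raw)   # slope
b = mean_ref - a * mean_raw                 # intercept
corrected = [a * x + b for x in raw]        # calibrated readings
```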

    Functional requirements document for the Earth Observing System Data and Information System (EOSDIS) Scientific Computing Facilities (SCF) of the NASA/MSFC Earth Science and Applications Division, 1992

    Five scientists at MSFC/ESAD have EOS SCF investigator status. Each SCF has unique tasks which require the establishment of a computing facility dedicated to accomplishing those tasks. An SCF Working Group was established at ESAD with the charter of defining the computing requirements of the individual SCFs and recommending options for meeting these requirements. The primary goal of the working group was to determine which computing needs can be satisfied using either shared resources or separate but compatible resources, and which needs require unique individual resources. The requirements investigated included CPU-intensive vector and scalar processing, visualization, data storage, connectivity, and I/O peripherals. A review of computer industry directions and a market survey of computing hardware provided information regarding important industry standards and candidate computing platforms. It was determined that the total SCF computing requirements might be most effectively met using a hierarchy consisting of shared and individual resources. This hierarchy is composed of five major system types: (1) a supercomputer-class vector processor; (2) a high-end scalar multiprocessor workstation; (3) a file server; (4) a few medium- to high-end visualization workstations; and (5) several low- to medium-range personal graphics workstations. Specific recommendations for meeting the needs of each of these types are presented.

    HPC Cloud for Scientific and Business Applications: Taxonomy, Vision, and Research Challenges

    High Performance Computing (HPC) clouds are becoming an alternative to on-premise clusters for executing scientific applications and business analytics services. Most research efforts in HPC cloud aim to understand the cost-benefit of moving resource-intensive applications from on-premise environments to public cloud platforms. Industry trends show hybrid environments are the natural path to get the best of the on-premise and cloud resources: steady (and sensitive) workloads can run on on-premise resources, and peak demand can leverage remote resources in a pay-as-you-go manner. Nevertheless, there are plenty of questions to be answered in HPC cloud, which range from how to extract the best performance of an unknown underlying platform to what services are essential to make its usage easier. Moreover, the discussion on the right pricing and contractual models to fit small and large users is relevant for the sustainability of HPC clouds. This paper presents a survey and taxonomy of efforts in HPC cloud and a vision of what we believe is ahead of us, including a set of research challenges that, once tackled, can help advance businesses and scientific discoveries. This becomes particularly relevant due to the fast-growing wave of new HPC applications coming from big data and artificial intelligence. Comment: 29 pages, 5 figures, published in ACM Computing Surveys (CSUR).
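    The hybrid pattern the survey describes, steady load on-premise with peaks bursting to pay-as-you-go cloud, can be captured in a toy cost model. The capacity, price, and demand figures below are hypothetical, not taken from the survey:

```python
# Toy model of cloud bursting: only demand above on-premise capacity
# is sent to the cloud and billed per node-hour. All numbers are made up.
onprem_capacity = 100                   # node-hours/day available in-house
cloud_rate = 2.0                        # $/node-hour, hypothetical price
daily_demand = [80, 95, 140, 210, 90]   # node-hours needed each day

burst = [max(0, d - onprem_capacity) for d in daily_demand]
cloud_cost = sum(burst) * cloud_rate    # only the peak overflow is billed
```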