The generic mapping tools version 6
The Generic Mapping Tools (GMT) software is ubiquitous in the Earth and ocean sciences. As a cross-platform tool producing high-quality maps and figures, it is used by tens of thousands of scientists around the world. The basic syntax of GMT scripts has evolved very slowly since the 1990s, despite the fact that GMT is generally perceived to have a steep learning curve with many pitfalls for beginners and experienced users alike. Reducing these pitfalls means changing the interface, which would break compatibility with thousands of existing scripts. With the latest GMT version 6, we solve this conundrum by introducing a new "modern mode" to complement the interface used in previous versions, which GMT 6 now calls "classic mode." GMT 6 defaults to classic mode and thus is a recommended upgrade for all GMT 5 users. Nonetheless, new users should take advantage of modern mode to make shorter scripts, quickly access commonly used global data sets, and take full advantage of the new tools to draw subplots, place insets, and create animations.
Funding Agency: National Science Foundation (NSF) (appeared in article as U.S. National Science Foundation); MSU Geological Sciences Endowment
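As a minimal sketch of the modern-mode style the abstract describes (assuming GMT 6 is installed; the figure name and plotting options are illustrative choices, not from the article), a script becomes a begin/end session rather than a chain of commands piping PostScript:

```shell
#!/usr/bin/env sh
# GMT 6 "modern mode": the session manages the output file,
# so individual commands no longer redirect PostScript themselves.
gmt begin world png                       # start a figure named "world", PNG output
  gmt coast -Rg -JH15c -Gchocolate -Baf   # global coastlines, Hammer projection, auto frame
gmt end show                              # finalize the figure and display it
```

The same plot in classic mode would require explicit `-R`/`-J` bookkeeping and output redirection on every command, which is the source of many of the pitfalls the abstract mentions.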
Analysing I/O bottlenecks in LHC data analysis on grid storage resources
We describe recent I/O testing frameworks that we have developed and applied within the UK GridPP Collaboration, the ATLAS experiment and the DPM team, for a variety of distinct purposes. These include benchmarking vendor-supplied storage products, discovering scaling limits of SRM solutions, tuning of storage systems for experiment data analysis, evaluating file access protocols, and exploring I/O read patterns of experiment software and their underlying event data models. With multiple grid sites now dealing with petabytes of data, such studies are becoming essential. We describe how the tests build, and improve, on previous work and contrast how the use-cases differ. We also detail the results obtained and the implications for storage hardware, middleware and experiment software.
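The read-pattern studies the abstract mentions can be illustrated generically. The following is a minimal, hypothetical micro-benchmark (not the GridPP framework itself) that contrasts sequential and random block reads over a local file, the basic distinction such tests exercise against grid storage:

```python
import os
import random
import time

def benchmark_reads(path, block_size=4096, n_blocks=256, seed=0):
    """Time sequential vs. random block reads of the same file.

    Returns a dict mapping pattern name to elapsed seconds. On real
    storage systems the gap between the two patterns exposes seek
    and readahead behavior; on a warm page cache it will be small.
    """
    size = os.path.getsize(path)
    offsets = [i * block_size for i in range(min(n_blocks, size // block_size))]
    results = {}
    for pattern in ("sequential", "random"):
        order = list(offsets)
        if pattern == "random":
            random.Random(seed).shuffle(order)  # fixed seed for repeatability
        start = time.perf_counter()
        with open(path, "rb") as f:
            for off in order:
                f.seek(off)
                f.read(block_size)
        results[pattern] = time.perf_counter() - start
    return results
```

Real event-data access patterns are sparser and column-oriented, but this sequential-versus-random skeleton is the starting point for the protocol and tuning comparisons described above.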
Photogrammetry for 3D Reconstruction in SOLIDWORKS and its Applications in Industry
Indiana University-Purdue University Indianapolis (IUPUI)
Close-range, image-based photogrammetry and LIDAR laser scanning are commonly used techniques for capturing real objects. 3D models of existing parts can be reconstructed by laser scanning or photogrammetry, and these models are useful in applications such as quality inspection and reverse engineering. Each technique has its merits and limitations: laser scanners offer higher accuracy but require a higher initial investment, whereas close-range photogrammetry is known for its simplicity, versatility, and effective detection of complex surfaces and 3D measurement of parts, and can be adopted at a much lower initial cost with acceptable accuracy.
Currently, many industries use photogrammetry for reverse engineering and quality inspection, but they rely on different software packages for photogrammetric object reconstruction: commercial or open-source codes for reconstruction, and separate stand-alone software for reverse engineering and mesh deviation analysis. The problem statement for this thesis is therefore to integrate photogrammetry, reverse engineering, and deviation analysis into one state-of-the-art workflow.
The objectives of this thesis are as follows:
1. Comparative study of the available source codes to identify a suitable and stable code for integration, and to understand the photogrammetry methodology of that particular code.
2. To create a taskpane add-in using the SOLIDWORKS API to integrate the selected photogrammetry methodology and provide access to its parameters.
3. To demonstrate the photogrammetric workflow, followed by reverse engineering case studies, to showcase the potential of the integration.
4. Parametric study of number of images vs. accuracy.
5. Comparison of scan results and photogrammetry results with the actual CAD data.
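The deviation comparison in objective 5 can be sketched in a few lines. This is an illustrative computation (brute-force nearest-neighbor distances, not the thesis's actual code) between a reconstructed point cloud and sample points from the reference CAD surface:

```python
import math

def deviation_stats(reconstructed, reference):
    """For each reconstructed point, compute the distance to the nearest
    reference (CAD) sample point, then summarize mean and max deviation.

    Both inputs are sequences of (x, y, z) tuples. Brute force is O(n*m);
    production tools use spatial indexes and true point-to-surface distance.
    """
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    deviations = [min(dist(p, q) for q in reference) for p in reconstructed]
    return {
        "mean": sum(deviations) / len(deviations),
        "max": max(deviations),
    }
```

A mesh deviation map, as produced by the stand-alone inspection software mentioned above, is essentially this per-point quantity rendered as a color scale over the reconstructed mesh.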
Parthenon -- a performance portable block-structured adaptive mesh refinement framework
On the path to exascale, the landscape of computer device architectures and
corresponding programming models has become much more diverse. While various
low-level performance portable programming models are available, support at the
application level lags behind. To address this issue, we present the
performance portable block-structured adaptive mesh refinement (AMR) framework
Parthenon, derived from the well-tested and widely used Athena++ astrophysical
magnetohydrodynamics code, but generalized to serve as the foundation for a
variety of downstream multi-physics codes. Parthenon adopts the Kokkos
programming model, and provides various levels of abstractions from
multi-dimensional variables, to packages defining and separating components, to
launching of parallel compute kernels. Parthenon allocates all data in device
memory to reduce data movement, supports the logical packing of variables and
mesh blocks to reduce kernel launch overhead, and employs one-sided,
asynchronous MPI calls to reduce communication overhead in multi-node
simulations. Using a hydrodynamics miniapp, we demonstrate weak and strong
scaling on various architectures including AMD and NVIDIA GPUs, Intel and AMD
x86 CPUs, IBM Power9 CPUs, as well as Fujitsu A64FX CPUs. At the largest scale
on Frontier (the first TOP500 exascale machine), the miniapp reaches a total of
zone-cycles/s on 9,216 nodes (73,728 logical GPUs) at ~92%
weak scaling parallel efficiency (starting from a single node). In combination
with being an open, collaborative project, this makes Parthenon an ideal
framework to target exascale simulations in which the downstream developers can
focus on their specific application rather than on the complexity of handling
massively-parallel, device-accelerated AMR.
Comment: 17 pages, 11 figures, accepted for publication in IJHPCA. Codes available at https://github.com/parthenon-hpc-la
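The ~92% weak-scaling efficiency quoted above is conventionally computed as per-node throughput normalized to a baseline run, since in weak scaling the problem size grows with node count and ideal total throughput grows linearly. A minimal sketch, with illustrative numbers rather than the paper's measurements:

```python
def weak_scaling_efficiency(base_throughput, base_nodes, throughput, nodes):
    """Weak-scaling parallel efficiency: per-node throughput of the
    scaled run divided by per-node throughput of the baseline run.
    1.0 means perfect scaling."""
    return (throughput / nodes) / (base_throughput / base_nodes)

# Illustrative numbers only: a single-node baseline of 2.0e9 zone-cycles/s
# and a 9,216-node run totaling 1.7e13 zone-cycles/s would give ~0.92.
eff = weak_scaling_efficiency(2.0e9, 1, 1.7e13, 9216)
```

The same formula with wall time instead of throughput is `base_time / time`, since in a weak-scaling study the per-node work is held fixed.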
The Third Annual NASA Science Internet User Working Group Conference
The NASA Science Internet (NSI) User Support Office (USO) sponsored the Third Annual NSI User Working Group (NSIUWG) Conference March 30 through April 3, 1992, in Greenbelt, MD. Approximately 130 NSI users attended to learn more about the NSI, hear from projects which use NSI, and receive updates about new networking technologies and services. This report contains material relevant to the conference: copies of the agenda, meeting summaries, presentations, and descriptions of exhibitors. Plenary sessions featured a variety of speakers, including NSI project management, scientists, and NSI user project managers whose projects and applications effectively use NSI, and notable citizens of the larger Internet community. The conference also included exhibits of advanced networking applications; tutorials on internetworking, computer security, and networking technologies; and user subgroup meetings on the future direction of the conference, networking, and user services and applications.
Standards and practices for reporting plankton and other particle observations from images
This technical manual guides the user through the process of creating a data table for the submission of taxonomic and morphological information for plankton and other particles from images to a repository. Guidance is provided on producing the documentation that should accompany the submission of plankton and other particle data to a repository, on describing data collection and processing techniques, and on creating the data file. Field names include scientificName, which represents the lowest-level taxonomic classification (e.g., genus if not certain of species, family if not certain of genus), and scientificNameID, the unique identifier from a reference database such as the World Register of Marine Species or AlgaeBase. The data table described here includes the field names associatedMedia, scientificName/scientificNameID for both automated and manual identification, biovolume, area_cross_section, length_representation and width_representation. Additional steps that instruct the user on how to format their data for a submission to the Ocean Biodiversity Information System (OBIS) are also included. Examples of documentation and data files are provided for the user to follow. The documentation requirements and data table format are approved by both NASA's SeaWiFS Bio-optical Archive and Storage System (SeaBASS) and the National Science Foundation's Biological and Chemical Oceanography Data Management Office (BCO-DMO).
This report was an outcome of a working group supported by the Ocean Carbon and Biogeochemistry (OCB) project office, which is funded by the US National Science Foundation (OCE1558412) and the National Aeronautics and Space Administration (NNX17AB17G). AN, SB, and CP conceived and drafted the document. IC, IST, JF, and HS contributed to the main body of the document as well as the example files. All members of the working group contributed to the content of the document, including the conceptualization of the data table and metadata format. We would also like to thank the external reviewers Cecile Rousseaux (NASA GSFC), Susanne Menden-Deuer (URI), Frank Muller-Karger (USF), and Abigail Benson (USGS) for their valuable feedback.
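The data-table format described above can be illustrated with a short sketch. The field names follow the manual (associatedMedia, scientificName, scientificNameID, biovolume, area_cross_section, length_representation, width_representation); the example values and the media URL are invented for illustration, and the LSID shown only demonstrates the identifier format, not a verified record:

```python
import csv
import io

# Field names from the reporting standard; example values are invented.
FIELDS = [
    "associatedMedia",
    "scientificName",
    "scientificNameID",
    "biovolume",
    "area_cross_section",
    "length_representation",
    "width_representation",
]

def make_data_table(rows):
    """Write particle observations (list of dicts keyed by FIELDS)
    to a CSV data table held in memory."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

example = make_data_table([{
    "associatedMedia": "https://example.org/images/img_0001.png",  # hypothetical URL
    "scientificName": "Thalassiosira",  # genus, when species is uncertain
    "scientificNameID": "urn:lsid:marinespecies.org:taxname:0000000",  # format example only
    "biovolume": 1.2e3,
    "area_cross_section": 450.0,
    "length_representation": 42.0,
    "width_representation": 18.5,
}])
```

A real submission would pair such a file with the accompanying documentation of collection and processing methods, as the manual requires.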