
    Analysis Ready Data in Analytics Optimized Data Stores for Analysis of Big Earth Data in the Cloud

    Cloud computing offers the possibility of making the analysis of Big Data approachable for a wider community due to affordable access to computing power, an ecosystem of usable tools for parallel processing, and migration of many large datasets to archives in the cloud, allowing data-proximal computing. Generally, data analysis acceleration in the cloud comes from running multiple nodes in a split-combine-apply strategy. Data systems such as the Earth Observing System Data and Information System are in a position to "pre-split" the data by storing them in a data store that is optimized for data-parallel computing, i.e., an Analytics-Optimized Data Store (AODS). A variety of approaches to AODS are possible, from highly scalable databases to scalable filesystems to data formats optimized for cloud access (e.g., zarr and cloud-optimized datasets), with the optimal choice dependent on both the types of analysis and the geospatial structure of the data. A key question is how much preprocessing of the data to do, both before splitting and as the first part of the apply step. Again, the geospatial structure of the data and the analysis type influence the decision, with the added complexity of the user type. Trans-disciplinary users who are not well-versed in the nuances of quality-filtering and georeferencing of remote sensing orbit/swath/scene data tend to ask for more highly processed data, relying on the data provider to make sensible decisions on preprocessing parameters. (This accounts for the popularity of "Level 3" gridded data, despite the lower spatial resolution it provides.) In this case, data can be preprocessed before the split, resulting in higher performance in the rest of the "apply" step, which can be transformative for use cases such as interactive data exploration at scale. Discipline researchers who are experienced with remote sensing data often prefer more flexibility in customizing the preprocessing of data into Analysis Ready Data, resulting in more need for on-the-fly preprocessing.
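    The split-combine-apply strategy described above can be sketched in a few lines. The sketch below is illustrative only: it uses an in-memory NumPy array as a stand-in for a chunked zarr store, and the temporal-mean analysis is a hypothetical example, not part of the abstract.

    ```python
    import numpy as np

    # Stand-in for an analytics-optimized store: a gridded variable
    # pre-split ("pre-chunked") along the time axis, as a zarr store would be.
    data = np.arange(24, dtype=float).reshape(4, 2, 3)  # (time, lat, lon)
    chunks = np.array_split(data, 2, axis=0)            # the "split" step

    # "Apply": compute a partial sum and count per chunk (run in parallel
    # across nodes in a real deployment).
    partials = [(c.sum(axis=0), c.shape[0]) for c in chunks]

    # "Combine": merge the partial results into the final temporal mean.
    total = sum(s for s, _ in partials)
    count = sum(n for _, n in partials)
    mean = total / count

    assert np.allclose(mean, data.mean(axis=0))
    ```

    Because the store is already chunked, the "split" step costs nothing at analysis time; each worker reads only its own chunk, which is the performance advantage an AODS provides.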

    XML-Based Generator of C++ Code for Integration With GUIs

    An open source computer program has been developed to satisfy a need for simplified organization of structured input data for scientific simulation programs. Typically, such input data are parsed in from a flat American Standard Code for Information Interchange (ASCII) text file into computational data structures. Also typically, when a graphical user interface (GUI) is used, there is a need to completely duplicate the input information while providing it to a user in a more structured form. Heretofore, the duplication of the input information has entailed duplication of software efforts and increases in susceptibility to software errors because of the concomitant need to maintain two independent input-handling mechanisms. The present program implements a method in which the input data for a simulation program are completely specified in an Extensible Markup Language (XML)-based text file. The key benefit of XML is that it stores input data in a structured manner; more importantly, XML allows not just storing the data but also describing what each data item is. The XML file therefore contains information useful for rendering the data in other applications. The program then generates data structures in the C++ language for use in the simulation program. In this method, all input data are specified in one place only, and it is easy to integrate the data structures into both the simulation program and the GUI. XML-to-C is useful in two ways: (1) as an executable, it generates the corresponding C++ classes, and (2) as a library, it automatically fills the objects with the input data values.
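    The core idea of generating C++ data structures from an XML specification can be illustrated with a minimal sketch. The XML schema, element names, and emitted struct below are hypothetical; XML-to-C's actual input format and code generator are not described in the abstract.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical input specification; names and types are illustrative.
    SPEC = """
    <input name="SimInput">
      <param name="timestep" type="double"/>
      <param name="steps" type="int"/>
    </input>
    """

    def generate_cpp(xml_text):
        """Emit a C++ struct declaration from an XML input specification."""
        root = ET.fromstring(xml_text)
        fields = "\n".join(
            f"    {p.get('type')} {p.get('name')};" for p in root.findall("param")
        )
        return f"struct {root.get('name')} {{\n{fields}\n}};"

    print(generate_cpp(SPEC))
    ```

    Because the XML file is the single source of truth, both the simulation program and a GUI can be driven from the same specification, which is the duplication-avoidance benefit the abstract describes.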

    Open-Source Software for Modeling of Nanoelectronic Devices

    The Nanoelectronic Modeling 3-D (NEMO 3-D) computer program has been upgraded to open-source status through elimination of license-restricted components. The present version functions equivalently to the version reported in "Software for Numerical Modeling of Nanoelectronic Devices" (NPO-30520), NASA Tech Briefs, Vol. 27, No. 11 (November 2003), page 37. To recapitulate: NEMO 3-D performs numerical modeling of the electronic transport and structural properties of a semiconductor device that has overall dimensions of the order of tens of nanometers. The underlying mathematical model represents the quantum-mechanical behavior of the device resolved to the atomistic level of granularity. NEMO 3-D solves the applicable quantum matrix equation on a Beowulf-class cluster computer by use of a parallel-processing matrix-vector multiplication algorithm coupled to a Lanczos and/or Rayleigh-Ritz algorithm that solves for eigenvalues. A prior upgrade of NEMO 3-D incorporated a capability for a strain treatment, parameterized for bulk material properties of GaAs and InAs, for two tight-binding submodels. NEMO 3-D has been demonstrated in atomistic analyses of effects of disorder in alloys and, in particular, in bulk In(x)Ga(1-x)As and in In(0.6)Ga(0.4)As quantum dots.
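    The Lanczos approach mentioned above relies only on matrix-vector products, which is why it parallelizes well on a cluster. Below is a minimal single-node sketch of the standard Lanczos iteration, using a toy diagonal matrix in place of NEMO 3-D's tight-binding Hamiltonian; it is not the program's actual implementation.

    ```python
    import numpy as np

    def lanczos_lowest_eigenvalue(A, k=20, seed=0):
        """Estimate the smallest eigenvalue of a symmetric matrix A with
        k Lanczos steps, using only matrix-vector products of A."""
        n = A.shape[0]
        rng = np.random.default_rng(seed)
        q = rng.standard_normal(n)
        q /= np.linalg.norm(q)
        alphas, betas = [], []
        q_prev = np.zeros(n)
        beta = 0.0
        for _ in range(k):
            w = A @ q - beta * q_prev          # the only use of A
            alpha = q @ w
            w -= alpha * q
            beta = np.linalg.norm(w)
            alphas.append(alpha)
            betas.append(beta)
            if beta < 1e-12:                    # Krylov space exhausted
                break
            q_prev, q = q, w / beta
        # Eigenvalues of the small tridiagonal matrix approximate A's extremes.
        T = np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)
        return np.linalg.eigvalsh(T)[0]

    # Toy stand-in "Hamiltonian" with known spectrum 1..20.
    H = np.diag(np.arange(1.0, 21.0))
    print(lanczos_lowest_eigenvalue(H))
    ```

    In a parallel setting, only the `A @ q` product needs to be distributed; the tridiagonal solve is tiny and runs serially, which matches the cluster structure the abstract describes.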

    Web Program for Development of GUIs for Cluster Computers

    WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: the server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and display of the generated GUI in the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.

    Federated Space-Time Query for Earth Science Data Using OpenSearch Conventions

    This slide presentation reviews a space-time query system that has been developed to assist users in finding Earth science data that fulfills a researcher's needs. It reviews the reasons why finding Earth science data can be so difficult, explains the workings of the space-time query with OpenSearch and how this system can assist researchers in finding the required data, and also reviews developments with client/server systems.
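    A space-time OpenSearch request boils down to filling a URL template with bounding-box and time-range parameters. The sketch below uses the standard `geo:box`, `time:start`, and `time:end` parameter names from the OpenSearch Geo and Time extensions; the endpoint URL is a made-up placeholder, not a real service.

    ```python
    from urllib.parse import urlencode

    # Hypothetical endpoint; parameter names follow the OpenSearch Geo and
    # Time extensions (geo:box as west,south,east,north; ISO 8601 times).
    ENDPOINT = "https://example.org/opensearch/granules"

    def space_time_query(keyword, bbox, start, end):
        """Build a space-time search URL from an OpenSearch URL template's
        geo and time parameters."""
        params = {
            "q": keyword,
            "geo:box": ",".join(str(v) for v in bbox),
            "time:start": start,
            "time:end": end,
        }
        return ENDPOINT + "?" + urlencode(params)

    url = space_time_query("sea surface temperature",
                           (-180, -90, 180, 90),
                           "2010-01-01T00:00:00Z", "2010-01-31T23:59:59Z")
    print(url)
    ```

    Federation then amounts to issuing the same parameterized query against each data provider's own OpenSearch description document and merging the results.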

    Computational Support for Technology-Investment Decisions

    Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios, and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; temporal investment recommendations; distinctions between enhancing and enabling capabilities; analysis of partial funding for enhancing capabilities; and sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.
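    Expected-utility-based portfolio selection under a budget constraint can be illustrated with a small brute-force sketch. The technologies, costs, probabilities, and values below are invented for illustration, and the exhaustive search stands in for whatever optimizer START actually uses.

    ```python
    from itertools import combinations

    # Hypothetical technologies: (name, cost, success probability, value).
    TECHS = [
        ("propulsion", 4.0, 0.6, 10.0),
        ("avionics",   3.0, 0.9,  6.0),
        ("sensors",    2.0, 0.8,  5.0),
        ("materials",  5.0, 0.5, 12.0),
    ]

    def best_portfolio(techs, budget):
        """Exhaustively pick the subset maximizing expected utility
        (sum of probability-weighted values) within the budget."""
        best, best_eu = (), 0.0
        for r in range(1, len(techs) + 1):
            for subset in combinations(techs, r):
                cost = sum(t[1] for t in subset)
                eu = sum(t[2] * t[3] for t in subset)
                if cost <= budget and eu > best_eu:
                    best, best_eu = subset, eu
        return [t[0] for t in best], best_eu

    names, eu = best_portfolio(TECHS, budget=9.0)
    print(names, eu)
    ```

    Note that the high-value but risky "materials" entry loses to three cheaper, more reliable technologies once values are probability-weighted, which is the kind of trade an expected-utility assessment surfaces.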

    Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission

    The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground truth, sensing, and retrieval chain and to assess retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. The workflow components can be hosted on any Internet-accessible machine. This has the advantages that the computations can be distributed to make best use of the available computing resources, and each workflow component can be hosted and maintained by its respective domain experts.
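    The uniform-interface chaining described above can be sketched with plain functions standing in for the wrapped Web Services. The component names and the toy arithmetic below are invented; the point is only that a workflow engine can pipe any components that share one interface.

    ```python
    # Each "service" takes and returns a state dict, so the engine can chain
    # components regardless of how each is implemented internally.

    def radiative_transfer(state):          # hypothetical component
        state["radiance"] = [r * 0.9 for r in state["reflectance"]]
        return state

    def detector(state):                    # hypothetical component
        state["signal"] = [r + 0.01 for r in state["radiance"]]
        return state

    def retrieval(state):                   # hypothetical component
        state["retrieved"] = [(s - 0.01) / 0.9 for s in state["signal"]]
        return state

    def run_workflow(components, state):
        """A trivial workflow engine: pipe each component's output downstream."""
        for component in components:
            state = component(state)
        return state

    result = run_workflow([radiative_transfer, detector, retrieval],
                          {"reflectance": [0.2, 0.5]})
    print(result["retrieved"])  # recovers the input reflectance (up to rounding)
    ```

    In the real system each call would be a remote Web Service invocation, so the components can live on different machines and be maintained independently, as the abstract notes.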

    Rapid Damage Mapping for the 2015 M_w 7.8 Gorkha Earthquake Using Synthetic Aperture Radar Data from COSMO–SkyMed and ALOS-2 Satellites

    The 25 April 2015 M_w 7.8 Gorkha earthquake caused more than 8000 fatalities and widespread building damage in central Nepal. The Italian Space Agency's COSMO–SkyMed Synthetic Aperture Radar (SAR) satellite acquired data over the Kathmandu area four days after the earthquake, and the Japan Aerospace Exploration Agency's Advanced Land Observing Satellite-2 (ALOS-2) SAR satellite acquired data over a larger area nine days after the mainshock. We used these radar observations to rapidly produce damage proxy maps (DPMs) derived from temporal changes in interferometric SAR coherence. Our DPMs were qualitatively validated through comparison with independent damage analyses by the National Geospatial-Intelligence Agency and the United Nations Institute for Training and Research's United Nations Operational Satellite Applications Programme, and through our own visual inspection of DigitalGlobe's WorldView pre- versus postevent optical imagery. Our maps were quickly released to responding agencies and the public, and were used for damage assessment, determining inspection/imaging priorities, and reconnaissance fieldwork.
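    The coherence-change idea behind a DPM can be sketched in a few lines: pixels whose interferometric coherence drops sharply between a pre-event pair and a coseismic pair are flagged as likely damage. The arrays and the 0.4 threshold below are illustrative toy values, not the published processing chain, which also masks vegetation and other decorrelation sources.

    ```python
    import numpy as np

    # Toy 2x2 coherence maps: one pre-event interferometric pair and one
    # pair spanning the earthquake.
    pre_coherence = np.array([[0.90, 0.80],
                              [0.85, 0.90]])
    co_coherence  = np.array([[0.88, 0.30],
                              [0.82, 0.25]])

    # A large coherence drop suggests surface change, e.g., collapsed buildings.
    damage_proxy = pre_coherence - co_coherence
    damage_mask = damage_proxy > 0.4   # illustrative threshold

    print(damage_mask)
    ```

    In the sketch, the two right-hand pixels lose most of their coherence across the event and are flagged, while the stable pixels are not.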

    Rapid Imaging of Earthquake Ruptures with Combined Geodetic and Seismic Analysis

    Rapid determination of the location and extent of earthquake ruptures is helpful for disaster response, as it allows prediction of the likely area of major damage from the earthquake and can help with rescue and recovery planning. With the increasing availability of near real-time data from the Global Positioning System (GPS) and other global navigation satellite system receivers in active tectonic regions, and with the shorter repeat times of many recent and newly launched satellites, geodetic data can be obtained quickly after earthquakes or other disasters. We have been building a data system that can ingest, catalog, and process geodetic data and combine it with seismic analysis to estimate the fault rupture locations and slip distributions for large earthquakes.
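    The geodetic half of such a slip estimate is commonly posed as a linear inverse problem: surface displacements d at the stations relate to slip s on fault patches through d = G s. The sketch below inverts a synthetic system by least squares; the matrix G is random stand-in data, whereas in practice it would come from an elastic dislocation (e.g., half-space) model, and real inversions add smoothing and positivity constraints.

    ```python
    import numpy as np

    # Synthetic Green's function matrix: 12 displacement observations,
    # 3 fault patches. Real G comes from an elastic dislocation model.
    rng = np.random.default_rng(1)
    G = rng.standard_normal((12, 3))
    true_slip = np.array([1.5, 0.2, 0.8])          # meters, illustrative
    d = G @ true_slip + 0.01 * rng.standard_normal(12)  # noisy geodetic data

    # Least-squares slip estimate from the observed displacements.
    slip, *_ = np.linalg.lstsq(G, d, rcond=None)
    print(slip)  # close to true_slip
    ```

    Combining this with seismic analysis then amounts to jointly fitting both data types, with the geodetic term constraining the static slip distribution.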

    Spreadsheets for Analyzing and Optimizing Space Missions

    XCALIBR (XML Capability Analysis LIBRary) is a set of Extensible Markup Language (XML) database and spreadsheet-based analysis software tools designed to assist in technology-return-on-investment analysis and optimization of technology portfolios pertaining to outer-space missions. XCALIBR is also being examined for use in planning, tracking, and documentation of projects. An XCALIBR database contains information on mission requirements and technological capabilities, which are related by use of an XML taxonomy. XCALIBR incorporates a standardized interface for exporting data and analysis templates to an Excel spreadsheet. Unique features of XCALIBR include the following: it is inherently hierarchical by virtue of its XML basis; the XML taxonomy codifies a comprehensive data structure and data dictionary that includes performance metrics for spacecraft, sensors, and spacecraft systems other than sensors; the taxonomy contains more than 700 nodes representing all levels, from system through subsystem to individual parts; all entries are searchable and machine readable; there is an intuitive Web-based user interface; the software automatically matches technologies to mission requirements; and the software automatically generates, and makes the required entries in, an Excel return-on-investment analysis tool. The results of an analysis are presented in both tabular and graphical displays.
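    The automatic matching of technologies to mission requirements via a shared taxonomy can be illustrated with a minimal sketch. The taxonomy node names, metrics, and better-direction rule below are all hypothetical; XCALIBR's actual taxonomy is far richer than this two-node toy.

    ```python
    # Hypothetical requirement and capability entries keyed by taxonomy node.
    REQUIREMENTS = {"sensor/resolution_m": 30, "power/output_w": 500}

    TECHNOLOGIES = {
        "imager-A": {"sensor/resolution_m": 15},
        "imager-B": {"sensor/resolution_m": 60},
        "rtg-1":    {"power/output_w": 650},
    }

    def match(requirements, technologies):
        """Return, per requirement node, the technologies meeting the metric
        (lower is better for resolution, higher for power, per node suffix)."""
        matches = {}
        for node, needed in requirements.items():
            lower_is_better = node.endswith("_m")
            matches[node] = [
                name for name, caps in technologies.items()
                if node in caps and
                   (caps[node] <= needed if lower_is_better
                    else caps[node] >= needed)
            ]
        return matches

    print(match(REQUIREMENTS, TECHNOLOGIES))
    ```

    Because requirements and capabilities share the same taxonomy nodes, matching reduces to a per-node metric comparison, which is what makes the process automatable.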