
    Simplex GPS and InSAR Inversion Software

    Changes in the shape of the Earth's surface can be routinely measured with sub-centimeter precision. Processes below the surface often drive these changes, so investigators require models and inversion methods to characterize the sources. Simplex inverts any combination of GPS (Global Positioning System), UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar), and InSAR (Interferometric Synthetic Aperture Radar) data simultaneously for the elastic response to fault and fluid motions. It can solve for multiple faults and parameters, all of which can be fixed or allowed to vary, and can be used to study long-term tectonic motions and the faults responsible for them or to invert for co-seismic slip from earthquakes. Solutions involving estimation of fault motion and of changes in fluid reservoirs such as magma or water are possible, and any number of faults or parameters can be considered. Simplex solves for any combination of location, geometry, fault slip, and expansion/contraction for one or more faults, inverting GPS and InSAR data for elastic dislocations in a half-space. Slip parameters include strike-slip, dip-slip, and tensile dislocations. A map interface supports both setting up models and viewing results. Results, including faults and the observed, computed, and residual displacements, are output as text, displayed in the map interface, and can be exported to KML. The software interfaces with the QuakeTables database, allowing users to select existing fault parameters or data. Simplex can be accessed through the QuakeSim portal graphical user interface or run from a UNIX command line.
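    As a minimal sketch of the kind of nonlinear inversion described above (not Simplex's own code or interfaces), the example below fits a point pressure ("Mogi") source, a common stand-in for a small magma or water reservoir, to synthetic GPS displacements using SciPy's downhill-simplex (Nelder-Mead) search. The station layout, source parameters, and misfit choice are hypothetical.

    ```python
    # Minimal sketch, not Simplex itself: recover a Mogi point pressure source
    # from synthetic GPS displacements with a Nelder-Mead (downhill simplex) search.
    # Assumptions: elastic half-space, Poisson's ratio 0.25, noise-free synthetic data.
    import numpy as np
    from scipy.optimize import minimize

    NU = 0.25                      # Poisson's ratio
    rng = np.random.default_rng(0)

    def mogi(params, x, y):
        """Surface displacements (ux, uy, uz) from a point volume change.
        params = (x0, y0, depth, dV) in metres and cubic metres."""
        x0, y0, depth, dV = params
        dx, dy = x - x0, y - y0
        R3 = (dx**2 + dy**2 + depth**2) ** 1.5
        c = (1.0 - NU) * dV / np.pi
        return c * dx / R3, c * dy / R3, c * depth / R3

    def misfit(params, x, y, obs):
        if params[2] <= 0:         # keep the source below the free surface
            return np.inf
        pred = np.concatenate(mogi(params, x, y))
        return np.sum((pred - obs) ** 2)

    # Synthetic "GPS" network and a true source to recover
    x = rng.uniform(-20e3, 20e3, 30)
    y = rng.uniform(-20e3, 20e3, 30)
    truth = (2e3, -1e3, 5e3, 1e6)                  # x0, y0, depth, dV
    obs = np.concatenate(mogi(truth, x, y))

    start = (0.0, 0.0, 3e3, 5e5)
    fit = minimize(misfit, start, args=(x, y, obs), method="Nelder-Mead")
    print("recovered (x0, y0, depth, dV):", fit.x)
    ```

    A fault inversion of the kind the abstract describes would swap in a rectangular-dislocation (Okada) forward model and add strike-slip, dip-slip, and tensile components as free parameters; the structure of the simplex search itself would be the same.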

    A High Throughput Workflow Environment for Cosmological Simulations

    The next generation of wide-area sky surveys offers the power to place extremely precise constraints on cosmological parameters and to test the source of cosmic acceleration. These observational programs will employ multiple techniques based on a variety of statistical signatures of galaxies and large-scale structure. These techniques have sources of systematic error that need to be understood at the percent level in order to fully leverage the power of next-generation catalogs. Simulations of large-scale structure provide the means to characterize these uncertainties. We are using XSEDE resources to produce multiple synthetic sky surveys of galaxies and large-scale structure in support of science analysis for the Dark Energy Survey. In order to scale production up to fifty 10^10-particle simulations, we are working to embed production control within the Apache Airavata workflow environment. We explain our methods and report how the workflow has reduced production time by 40% compared with manual management.
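    The sketch below is only a toy illustration of the stage chaining that such a workflow environment automates; it is not the Apache Airavata API, and the stage names and commands are invented placeholders.

    ```python
    # Illustrative only: a toy dependency-ordered pipeline runner showing the kind
    # of stage chaining (initial conditions -> N-body -> catalogues) that a workflow
    # system such as Apache Airavata automates. Not the Airavata API; hypothetical names.
    import subprocess
    from graphlib import TopologicalSorter  # Python 3.9+

    # stage -> (command, dependencies)
    STAGES = {
        "initial_conditions": (["echo", "make ICs"], []),
        "nbody":              (["echo", "run N-body"], ["initial_conditions"]),
        "halo_finder":        (["echo", "find halos"], ["nbody"]),
        "mock_catalogue":     (["echo", "build galaxy catalogue"], ["halo_finder"]),
    }

    def run_pipeline(stages):
        order = TopologicalSorter({name: set(spec[1]) for name, spec in stages.items()})
        for name in order.static_order():          # dependencies always run first
            cmd, _ = stages[name]
            print(f"[{name}] running: {' '.join(cmd)}")
            subprocess.run(cmd, check=True)        # fail fast so reruns are easy

    if __name__ == "__main__":
        run_pipeline(STAGES)
    ```

    A production workflow system additionally handles submission of each stage as a batch job on remote resources, monitoring, retries, and data movement, which is what removes the manual management the abstract refers to.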

    Quantum Films Adsorbed on Graphite: Third and Fourth Helium Layers

    Using a path-integral Monte Carlo method for simulating superfluid quantum films, we investigate helium layers adsorbed on a substrate consisting of graphite plus two solid helium layers. Our results for the promotion densities and the dependence of the superfluid density on coverage are in agreement with experiment. We can also explain certain features of the measured heat capacity as a function of temperature and coverage.
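    For readers unfamiliar with the method, the sketch below is a minimal single-particle path-integral Monte Carlo calculation for a 1D harmonic oscillator. It illustrates only the discretized imaginary-time path and the thermodynamic energy estimator; the film simulations described above additionally sample permutations of particle labels (needed for superfluidity) and use realistic helium-helium and helium-substrate potentials.

    ```python
    # Toy illustration only: single-particle PIMC for a 1D harmonic oscillator
    # (hbar = m = omega = 1), primitive approximation, single-bead Metropolis moves,
    # thermodynamic energy estimator. Not the many-body film code described above.
    import numpy as np

    rng = np.random.default_rng(1)
    beta, M = 1.0, 64              # inverse temperature and number of time slices
    tau = beta / M                 # imaginary-time step

    def V(x):
        return 0.5 * x**2          # harmonic potential

    path = np.zeros(M)             # closed imaginary-time path (periodic in M)
    step = np.sqrt(tau)            # trial-move size, of order the link spread

    def sweep(path):
        for j in range(M):
            left, right = path[(j - 1) % M], path[(j + 1) % M]
            old = path[j]
            new = old + rng.uniform(-step, step)
            dS = ((new - left)**2 + (right - new)**2
                  - (old - left)**2 - (right - old)**2) / (2 * tau)
            dS += tau * (V(new) - V(old))
            if dS < 0 or rng.random() < np.exp(-dS):
                path[j] = new

    def energy(path):
        # Thermodynamic estimator: M/(2*beta) - <kinetic link term> + <V>
        kin = np.sum((np.roll(path, -1) - path)**2) / (2 * tau**2 * M)
        return M / (2 * beta) - kin + np.mean(V(path))

    for _ in range(2000):          # equilibration sweeps
        sweep(path)
    samples = []
    for _ in range(20000):         # measurement sweeps
        sweep(path)
        samples.append(energy(path))
    print("PIMC energy:", np.mean(samples), " exact:", 0.5 / np.tanh(beta / 2))
    ```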

    BioDrugScreen: a computational drug design resource for ranking molecules docked to the human proteome

    BioDrugScreen is a resource for ranking molecules docked against a large number of targets in the human proteome. Nearly 1600 molecules from the freely available NCI diversity set were docked onto 1926 cavities identified on 1589 human targets, resulting in more than 3 million receptor–ligand complexes and requiring more than 200,000 CPU-hours on the TeraGrid. The targets in BioDrugScreen originated from the Human Cancer Protein Interaction Network, which we have updated, and from the Human Druggable Proteome, which we created for the purpose of this effort. This makes the BioDrugScreen resource highly valuable in drug discovery. The receptor–ligand complexes within the database can be ranked using standard, well-established scoring functions such as AutoDock, DockScore, ChemScore, X-Score, GoldScore, DFIRE, and PMF. In addition, we have scored the complexes with more computationally intensive GBSA and PBSA approaches, requiring an additional 120,000 CPU-hours on the TeraGrid. We constructed a simple interface to enable users to view top-ranking molecules and to access purchasing and other information for further experimental exploration.
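    A common way to combine such heterogeneous scoring functions is a consensus rank. The sketch below shows the idea on hypothetical data; the column names and values are invented and do not reflect the BioDrugScreen schema or results.

    ```python
    # Minimal sketch (hypothetical data, not the BioDrugScreen schema): combine
    # several docking scores into a consensus ranking by averaging per-function
    # ranks, so that differently scaled scoring functions contribute equally.
    import pandas as pd

    # Example scores for receptor-ligand complexes; lower (more negative) is better.
    scores = pd.DataFrame(
        {
            "complex":  ["T1-L07", "T1-L12", "T1-L33", "T1-L58"],
            "autodock": [-9.1, -7.4, -8.2, -6.9],
            "x_score":  [-8.3, -8.8, -7.1, -6.5],
            "dfire":    [-102.0, -95.5, -99.1, -90.2],
        }
    ).set_index("complex")

    ranks = scores.rank(ascending=True)        # rank 1 = best score in each column
    consensus = ranks.mean(axis=1).sort_values()
    print(consensus)                           # lowest mean rank = top candidate
    ```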

    Apache Airavata: Design and Directions of a Science Gateway Framework

    This paper provides an overview of the Apache Airavata software system for science gateways. Gateways use Airavata to manage application and workflow executions on a range of backend resources (grids, computing clouds, and local clusters). Airavata’s design goal is to provide component abstractions for the major tasks required to manage gateway applications. Components are not accessed directly but are instead exposed through a client Application Programming Interface. This design allows gateway developers to take full advantage of Airavata’s capabilities and allows Airavata developers (including those interested in middleware research) to modify Airavata’s implementations and behavior. This is particularly important as Airavata evolves into a scalable, elastic “platform as a service” for science gateways. We illustrate Airavata’s capabilities through a discussion of usage vignettes. As an Apache Software Foundation project, Airavata’s open community governance model is as important as its software base. We discuss how this governance works within Airavata and how it may be applicable to other distributed computing infrastructure and cyberinfrastructure efforts.
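    The components-behind-a-client-API design described above can be sketched as a simple facade. The names below are hypothetical and do not correspond to the real Airavata client API; they only illustrate how gateway code depends on one interface while registry, orchestration, and monitoring components remain replaceable behind it.

    ```python
    # Hypothetical sketch of the design pattern described above, not the real
    # Airavata client API: gateway code talks to one facade, and the components
    # that register applications, launch jobs, and monitor them stay behind it.
    from dataclasses import dataclass

    @dataclass
    class ExperimentHandle:
        experiment_id: str

    class GatewayClient:
        """Facade called by gateway code; a real implementation would delegate
        to registry, orchestration, and monitoring components."""

        def register_application(self, name: str, executable: str) -> str:
            # would store the application description in an application registry
            return f"app-{name}"

        def launch(self, app_id: str, inputs: dict, resource: str) -> ExperimentHandle:
            # would hand the request to an orchestrator targeting `resource`
            return ExperimentHandle(experiment_id=f"{app_id}-0001")

        def status(self, handle: ExperimentHandle) -> str:
            # would query a monitoring component
            return "QUEUED"

    client = GatewayClient()
    app_id = client.register_application("md-simulation", "/apps/run_md.sh")
    run = client.launch(app_id, {"input": "system.pdb"}, resource="campus-cluster")
    print(run.experiment_id, client.status(run))
    ```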

    QuakeSim 2.0

    QuakeSim 2.0 improves understanding of earthquake processes by providing modeling tools and by integrating model applications and various heterogeneous data sources within a Web services environment. QuakeSim is a multisource, synergistic, data-intensive environment for modeling the behavior of earthquake faults individually and as part of complex interacting systems. Remotely sensed geodetic data products may be explored, compared with faults and landscape features, mined by pattern analysis applications, and integrated with models and pattern analysis applications in a rich Web-based visualization environment. Integration of heterogeneous data products with pattern informatics tools enables efficient development of models. Federated database components and visualization tools allow rapid exploration of large datasets, while pattern informatics enables identification of subtle but important features in those datasets. QuakeSim is valuable for earthquake investigations and modeling in its current state, and also serves as a prototype and nucleus for broader systems under development. The framework provides access to physics-based simulation tools that model the earthquake cycle and related crustal deformation. Spaceborne GPS and Interferometric Synthetic Aperture Radar (InSAR) data provide information on near-term crustal deformation, while paleoseismic geologic data provide longer-term information on earthquake fault processes. These data sources are integrated into QuakeSim's QuakeTables database system and are accessible to users and to the various model applications. UAVSAR repeat-pass interferometry data products are added to the QuakeTables database and are available through a browsable map interface or Representational State Transfer (REST) interfaces. Model applications can retrieve data from QuakeTables or from third-party GPS velocity data services; alternatively, users can manually input parameters into the models. Pattern analysis of GPS and seismicity data has proved useful for mid-term forecasting of earthquakes and for detecting subtle changes in crustal deformation. The GPS time series analysis has also proved useful as a data-quality tool, enabling the discovery of station anomalies and of data processing and distribution errors. Improved visualization tools enable more efficient data exploration and understanding. The tools give science users the flexibility to explore data in new ways through download links, while also supporting standard, intuitive, and routine uses for science users and for end users such as emergency responders.
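    As a minimal illustration of the data-quality use mentioned above (not QuakeSim's actual algorithm), the sketch below flags anomalous days in a GPS position time series using a rolling median and the median absolute deviation; the window, threshold, and synthetic data are illustrative choices.

    ```python
    # Sketch only, not QuakeSim's algorithm: flag days in a GPS position series
    # that sit far from a rolling median, using the median absolute deviation (MAD).
    import numpy as np
    import pandas as pd

    def flag_outliers(series: pd.Series, window: int = 31, thresh: float = 5.0):
        """Boolean mask of samples more than `thresh` robust sigmas off the rolling median."""
        med = series.rolling(window, center=True, min_periods=5).median()
        resid = series - med
        mad = resid.abs().rolling(window, center=True, min_periods=5).median()
        robust_sigma = 1.4826 * mad            # MAD -> sigma for Gaussian noise
        return resid.abs() > thresh * robust_sigma

    # Synthetic daily east-component series (mm): secular rate, noise, two anomalies
    rng = np.random.default_rng(2)
    days = pd.date_range("2020-01-01", periods=365, freq="D")
    east = pd.Series(0.02 * np.arange(365) + rng.normal(0, 1.5, 365), index=days)
    east.iloc[[100, 250]] += 25.0              # simulated station anomalies
    print(east[flag_outliers(east)])           # the two injected days are flagged
    ```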

    A Workflows Roadmap for the Geosciences

    The EarthCube Workflows Community Group was formed in March 2012 as part of the NSF EarthCube initiative, in response to initial EarthCube discussions during 2011. Workflows are used to manage complex computations that have many steps or use large data. Workflow systems help scientists select models appropriate for their data, configure the models with appropriate parameters, and execute them efficiently. The EarthCube community saw great value in workflow technologies for the future of the geosciences. The goal of the EarthCube Workflows Community Group was to begin to elicit requirements for workflows in the geosciences; ascertain the state of the art and current practices; identify, through use case studies, current gaps in both the use and the capabilities of workflow systems in the earth sciences; and identify grand challenges for the next decade along with possible paths to addressing them. The group was asked to produce a roadmap for workflows in the geosciences. Three other Community Groups were formed (Data, Semantics and Ontologies, and Governance), and each was asked to create a roadmap in its area.

    The group held a series of virtual and face-to-face workshops to solicit participation from the geosciences community and other relevant researchers, and set up a public web site (https://sites.google.com/site/earthcubeworkflow/) where all of its activities were open to community participation and all documents were posted for public access and editing. Presentations and discussions were recorded and posted on the site. A key result of the group's activities in spring and summer 2012 was a workflows roadmap for the geosciences: an initial roadmap document was released in June 2012 and presented to the EarthCube community, and a revised roadmap was delivered to the community in August 2012. The roadmap is a living document, created as a group effort, with provisions and a process to update and extend it over time. This document represents the final roadmap of the NSF EarthCube Community Group for workflows in the geosciences. Community feedback is always welcome, as the roadmap will be revised and extended while EarthCube activities continue.

    This work was supported by the National Science Foundation under grant EAR-1238216 as part of the NSF EarthCube initiative. EarthCube is an innovative, long-term, cross-directorate initiative of the US National Science Foundation.

    Path integral Monte Carlo simulation of the second layer of helium-4 adsorbed on graphite

    We have developed a path-integral Monte Carlo method for simulating helium films and apply it to the second layer of helium adsorbed on graphite. We use helium-helium and helium-graphite potentials that realistically describe the interatomic interactions. The Monte Carlo sampling is over both particle positions and permutations of particle labels. From the particle configurations and static structure factor calculations, we find that this layer possesses, in order of increasing density, a superfluid liquid phase, a sqrt(7) x sqrt(7) commensurate solid phase that is registered with respect to the first layer, and an incommensurate solid phase. By applying the Maxwell construction to the dependence of the low-temperature total energy on the coverage, we are able to identify coexistence regions between the phases. From these, we deduce an effectively zero-temperature phase diagram. Our phase boundaries are in agreement with heat capacity and torsional oscillator measurements, and demonstrate that the experimentally observed disruption of the superfluid phase is caused by the growth of the commensurate phase. We further observe that the superfluid phase has a transition temperature consistent with the two-dimensional value. Promotion to the third layer occurs for densities above 0.212 atom/A^2, in good agreement with experiment. Finally, we calculate the specific heat for each phase and obtain peaks at temperatures in general agreement with experiment.
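    The Maxwell construction mentioned above can be read off numerically as a lower convex hull of the energy-versus-coverage points: coverages whose energy lies above the hull fall inside a two-phase coexistence region. The sketch below demonstrates the procedure on made-up numbers; they are not the paper's results.

    ```python
    # Sketch of a Maxwell (common-tangent) construction on energy versus coverage:
    # points above the lower convex hull lie in a two-phase coexistence region.
    # The coverages and energies below are invented for illustration only.
    import numpy as np

    def lower_hull(n, e):
        """Indices of points on the lower convex hull of (n, e); n sorted ascending."""
        hull = []
        for i in range(len(n)):
            while len(hull) >= 2:
                a, b = hull[-2], hull[-1]
                # drop b if it lies on or above the chord from a to the new point i
                if (e[b] - e[a]) * (n[i] - n[a]) >= (e[i] - e[a]) * (n[b] - n[a]):
                    hull.pop()
                else:
                    break
            hull.append(i)
        return hull

    coverage = np.array([0.04, 0.05, 0.06, 0.07, 0.08, 0.09, 0.10])          # toy units
    energy   = np.array([-1.30, -1.38, -1.42, -1.41, -1.44, -1.47, -1.46])   # toy units

    on_hull = set(lower_hull(coverage, energy))
    for i, (n, e) in enumerate(zip(coverage, energy)):
        tag = "stable" if i in on_hull else "coexistence region"
        print(f"n = {n:.2f}  e = {e:+.2f}  {tag}")
    ```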

    Community Organizations: Changing the Culture in Which Research Software Is Developed and Sustained

    Software is the key crosscutting technology that enables advances in mathematics, computer science, and domain-specific science and engineering to achieve robust simulations and analysis for science, engineering, and other research fields. However, software itself has not traditionally received focused attention from research communities; rather, it has evolved organically and inconsistently, with its development largely a by-product of other initiatives. Moreover, challenges in scientific software are expanding due to disruptive changes in computer hardware, the increasing scale and complexity of data, and demands for more complex simulations involving multiphysics, multiscale modeling, and outer-loop analysis. In recent years, community members have established a range of grassroots organizations and projects to address these growing technical and social challenges in software productivity, quality, reproducibility, and sustainability. This article provides an overview of such groups and discusses opportunities to leverage their synergistic activities while nurturing work toward emerging software ecosystems.

    Aerosols in the Pre-industrial Atmosphere

    Purpose of Review: We assess the current understanding of the state and behaviour of aerosols under pre-industrial conditions and their importance for climate.

    Recent Findings: Studies show that the magnitude of anthropogenic aerosol radiative forcing over the industrial period calculated by climate models is strongly affected by the abundance and properties of aerosols in the pre-industrial atmosphere. The low concentration of aerosol particles under relatively pristine conditions means that global mean cloud albedo may have been twice as sensitive to changes in natural aerosol emissions under pre-industrial conditions as it is under present-day conditions. Consequently, the discovery of new aerosol formation processes and revisions to aerosol emissions have large effects on simulated historical aerosol radiative forcing.

    Summary: We review what is known about the microphysical, chemical, and radiative properties of aerosols in the pre-industrial atmosphere and the processes that control them. Aerosol properties were controlled by a combination of natural emissions, modification of those natural emissions by human activities such as land-use change, and anthropogenic emissions from biofuel combustion and early industrial processes. Although aerosol concentrations were lower in the pre-industrial atmosphere than today, model simulations show that relatively high aerosol concentrations could have been maintained over continental regions due to biogenically controlled new particle formation and wildfires. Despite the importance of pre-industrial aerosols for historical climate change, the relevant processes and emissions are given relatively little consideration in climate models, and there have been very few attempts to evaluate them. Consequently, we have very low confidence in the ability of models to simulate the aerosol conditions that form the baseline for historical climate simulations. Nevertheless, it is clear that the 1850s should be regarded as an early-industrial reference period, and that the aerosol forcing calculated from this period is smaller than the forcing since 1750. Improvements in historical reconstructions of natural and early anthropogenic emissions, exploitation of new Earth system models, and a deeper understanding and evaluation of the controlling processes are key to reducing uncertainties in the future.
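    The heightened pre-industrial sensitivity quoted above is consistent with the logarithmic dependence of cloud albedo on droplet number. A standard textbook (Twomey) relation, stated here as background rather than taken from this review, makes the point:

    ```latex
    % Twomey albedo susceptibility (textbook relation, not a result of the review):
    % at fixed liquid water path, cloud optical depth scales as \tau \propto N^{1/3},
    % so a perturbation in droplet number concentration N changes cloud albedo A by
    \[
      \Delta A \;\approx\; \frac{A(1-A)}{3}\,\Delta\ln N
               \;=\; \frac{A(1-A)}{3}\,\ln\!\left(1 + \frac{\Delta N}{N}\right).
    \]
    % For a fixed absolute increase \Delta N, the logarithmic factor is larger when
    % the baseline N is small, so the same natural-emission perturbation produces a
    % larger albedo response in a pristine pre-industrial atmosphere.
    ```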