
    Autoplot: A browser for scientific data on the web

    Autoplot is software developed for the Virtual Observatories in Heliophysics to provide intelligent and automated plotting capabilities for many typical data products stored in a variety of file formats or databases. Autoplot has proven to be a flexible tool for exploring, accessing, and viewing data resources as typically found on the web, usually in the form of a directory of data files with multiple parameters in each file. Data from each source is abstracted into a common internal data model called QDataSet. Autoplot is built from individually useful components, can be extended and reused to create specialized data handling and analysis applications, and is already used in a variety of science visualization and analysis applications. Although originally developed for viewing heliophysics-related time series and spectrograms, its flexible and generic data representation model makes it potentially useful for the Earth sciences. (Comment: 16 pages.)
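
    The common-data-model architecture the abstract describes can be sketched roughly as follows. Autoplot itself is Java, and the names below are hypothetical, not Autoplot's actual API; the sketch only illustrates how readers for different formats could all return one QDataSet-like structure so that plotting components never depend on the source format.

```python
# Minimal sketch of a QDataSet-style common data model (names are illustrative).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class QDataSetLike:
    values: list                      # the measured parameter
    depend_0: Optional[list] = None   # e.g. time tags for a time series
    properties: dict = field(default_factory=dict)  # units, labels, fill value

def read_csv_source(path: str) -> QDataSetLike:
    """One of many possible readers; each maps its file format into the model."""
    times, values = [], []
    with open(path) as f:
        for line in f:
            t, v = line.split(",")
            times.append(float(t))
            values.append(float(v))
    return QDataSetLike(values, depend_0=times, properties={"label": path})

def render(ds: QDataSetLike) -> None:
    """A 'plot' component that only ever sees the abstract model."""
    for t, v in zip(ds.depend_0 or [], ds.values):
        print(f"{t:10.3f}  {v:8.2f}")

render(QDataSetLike([1.0, 2.5], depend_0=[0.0, 1.0], properties={"units": "nT"}))
```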

    The multimedia documentation of endangered and minority languages : a thesis presented in partial fulfilment of the requirements for the degree of Master of Philosophy in Linguistics at Massey University

    This thesis examines the impending loss of linguistic diversity in the world and advocates a change in emphasis in linguistic research towards the documentation of minority and endangered languages. Various models for documentation are examined, along with some of the ethical issues involved in linguistic research amongst small groups, and a new model is proposed. The new model is centred on the collection of a wide variety of high-quality data, but also includes the collection of other related materials that will be of particular use and interest to the ethnic community. The collected data and other materials are then structured as an internet-ready multimedia documentation designed for the ethnic community as its primary audience, while still catering for the needs of linguistic researchers worldwide. A pilot project is carried out using the model.

    An approach for real world data modelling with the 3D terrestrial laser scanner for built environment

    Capturing and modelling 3D information of the built environment is a big challenge. A number of techniques and technologies are now in use, including EDM, GPS, photogrammetry, remote sensing, and traditional building surveying. However, these technologies are not always practical and efficient in terms of time, cost, and accuracy. Furthermore, a multidisciplinary knowledge base, created from studies and research about the regeneration aspects, is fundamental: historical, architectural, archaeological, environmental, social, economic, etc. An adequate diagnosis for regeneration requires that buildings and their surroundings be described by means of documentation and plans. At present, however, this is far removed from the real situation, since more often than not it is extremely difficult to obtain full documentation and cartography of acceptable quality; the material on construction pathologies and systems is often insufficient or deficient (plans that simply reflect levels, isolated photographs, etc.). Sometimes the information exists, but this fact is not known, or it is not easily accessible, leading to the unnecessary duplication of effort and resources. In this paper, we discuss 3D laser scanning technology, which can acquire high-density point data accurately and quickly. Moreover, the scanner can digitize all the 3D information of a real-world object such as a building, tree, or terrain down to millimetre detail. It can therefore benefit the refurbishment process in regeneration of the Built Environment and is a potential solution to the challenges above. The paper introduces an approach for scanning buildings, processing the raw point-cloud data, and a modelling approach for CAD extraction and building-object classification by pattern matching in the IFC (Industry Foundation Classes) format; a sketch of this classification step follows below. The approach presented in this paper can lead to parametric design and Building Information Modelling (BIM) for existing structures. Two case studies, the Jactin House building in East Manchester and the Peel building on the University of Salford campus, demonstrate the use of laser scanner technology in the Built Environment; they explain the use of laser scanners and explore their integration with various technologies and systems for professionals in the Built Environment.
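
    As a rough illustration of the pattern-matching classification step mentioned above (not the paper's actual implementation), the sketch below maps a bounding box fitted to a point-cloud segment onto a plausible IFC class using simple geometric rules of thumb; all thresholds are assumptions.

```python
# Toy classifier: bounding-box proportions -> IFC entity (illustrative rules only).

def classify_ifc(width_m: float, height_m: float, depth_m: float) -> str:
    """Map a fitted bounding box to a plausible IFC entity by rule of thumb."""
    if depth_m < 0.5 and height_m > 2.0 and width_m > 1.0:
        return "IfcWall"        # thin, tall, wide
    if height_m < 0.4 and width_m > 1.0 and depth_m > 1.0:
        return "IfcSlab"        # flat, large horizontal extent
    if width_m < 0.8 and depth_m < 0.8 and height_m > 2.0:
        return "IfcColumn"      # slender vertical element
    return "IfcBuildingElementProxy"  # IFC's catch-all for unclassified objects

print(classify_ifc(4.0, 2.7, 0.2))   # -> IfcWall
```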

    Colorado Ultraviolet Transit Experiment Data Simulator

    The Colorado Ultraviolet Transit Experiment (CUTE) is a 6U NASA CubeSat carrying a low-resolution (R~2000-3000), near-ultraviolet (2500-3300 Å) spectrograph. It has a rectangular primary Cassegrain telescope to maximize the collecting area. CUTE, planned for launch in Spring 2020, is designed to monitor transiting extrasolar planets orbiting bright, nearby stars, with the aim of improving our understanding of planetary atmospheric escape and star-planet interaction processes. We present here the CUTE data simulator, which we have complemented with a basic data reduction pipeline; this pipeline will be updated once the final CUTE data reduction pipeline is developed. We show the application of the simulator to the HD209458 system and a first estimate of the precision of the transit depth measurement as a function of the temperature and magnitude of the host star. We also present estimates of the effect of spacecraft jitter on the final spectral resolution. The simulator has been developed with scalability and adaptability to other missions carrying a long-slit spectrograph in mind. The data simulator will be used to inform the CUTE target selection, choose the spacecraft and instrument settings for each observation, and construct synthetic CUTE wavelength-dependent transit light curves on which to develop the CUTE data reduction pipeline. (Comment: Accepted for publication in the Journal of Astronomical Telescopes, Instruments, and Systems.)
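
    To illustrate the kind of precision estimate described above, here is a toy sketch (not the CUTE simulator itself) that injects photon noise into a box-shaped transit and measures the scatter of the recovered depth; the count rates, depth, and exposure numbers are purely illustrative assumptions. Brighter hosts give more counts per exposure and hence a smaller depth scatter.

```python
# Toy transit-depth precision estimate under pure photon noise (illustrative).
import random
import statistics

def depth_error(counts_per_exposure: float, depth: float = 0.015,
                n_in: int = 50, n_out: int = 100) -> float:
    """Simulate one transit and return (recovered depth - true depth)."""
    def exposure(flux: float) -> float:
        mu = counts_per_exposure * flux
        return random.gauss(mu, mu ** 0.5) / counts_per_exposure  # photon noise
    out = [exposure(1.0) for _ in range(n_out)]          # out-of-transit
    inn = [exposure(1.0 - depth) for _ in range(n_in)]   # in-transit
    return (statistics.mean(out) - statistics.mean(inn)) - depth

errors = [depth_error(1e6) for _ in range(200)]
print(f"depth scatter ~ {statistics.stdev(errors):.1e}")  # shrinks as counts grow
```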

    Seismic-py: Reading Seismic Data with Python

    The field of seismic exploration of the Earth has changed dramatically over the last half-century. The Society of Exploration Geophysicists (SEG) has worked to create standards for storing the vast amounts of seismic data in a way that is portable across computer architectures. However, it has been impossible to predict the needs of the immense range of seismic data acquisition systems, and as a result vendors have had to bend the rules to accommodate the needs of new instruments and experiment types. For low-level access to seismic data, there is a need for a standard open source library that allows access to a wide range of vendor data files and can handle all of the variations. A new seismic software package, seismic-py, provides an infrastructure for creating and managing drivers for each particular format. Drivers can be derived from one of the known formats and altered to handle any slight variations; alternatively, drivers can be developed from scratch for formats that are very different from any previously defined format. Python has been the key to making driver development easy and efficient. The goal of seismic-py is to be the base system that will power a wide range of experimentation with seismic data while providing clear documentation for the historical record of seismic data formats.
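
    The driver pattern the abstract describes can be sketched as follows; the class and registry names here are hypothetical, not seismic-py's actual API. A vendor driver derives from a known format and overrides only what differs:

```python
# Sketch of a format-driver registry with derived vendor drivers (illustrative).
DRIVERS = {}

def register(fmt: str):
    def deco(cls):
        DRIVERS[fmt] = cls
        return cls
    return deco

@register("segy")
class SegyDriver:
    TRACE_HEADER_BYTES = 240
    def read_trace_header(self, raw: bytes) -> dict:
        return {"header_bytes": raw[: self.TRACE_HEADER_BYTES]}

@register("vendor_x")
class VendorXDriver(SegyDriver):
    TRACE_HEADER_BYTES = 256   # vendor "bent the rules": longer trace header

def open_driver(fmt: str):
    return DRIVERS[fmt]()      # look up the right driver for a file's format

print(type(open_driver("vendor_x")).__name__)  # -> VendorXDriver
```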

    Capturing flight system test engineering expertise: Lessons learned

    Within a few years, JPL will be challenged by the most active mission set in its history. Concurrently, flight systems are becoming increasingly complex. Presently, the knowledge needed to conduct integration and test of spacecraft and large instruments is held by a few key people, each with many years of experience. JPL is in danger of losing a significant amount of this critical expertise, through retirement, during a period when demand for it is rapidly increasing. The most critical issue at hand is to collect and retain this expertise and to develop tools that ensure the ability to successfully perform the integration and test of future spacecraft and large instruments. The proposed solution was to capture and codify a subset of existing knowledge, and to utilize this captured expertise in knowledge-based systems. First-year results and activities planned for the second year of this ongoing effort are described. Topics discussed include lessons learned in knowledge acquisition and elicitation techniques, life-cycle paradigms, and rapid prototyping of a knowledge-based advisor (Spacecraft Test Assistant) and a hypermedia browser (Test Engineering Browser). The prototype Spacecraft Test Assistant supports a subset of integration and test activities for flight systems, while the Test Engineering Browser is a hypermedia tool that allows users easy perusal of spacecraft test topics. A knowledge acquisition tool called ConceptFinder, developed to search through large volumes of data for related concepts, is also described; it is being modified to semi-automate the process of creating hypertext links.
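
    The link-suggestion idea behind ConceptFinder can be illustrated with a toy sketch (not JPL's implementation, and all document text is invented): propose hypertext links wherever two documents share a sufficiently specific term.

```python
# Toy concept-linking pass: invert a term index, then suggest cross-links.
from collections import defaultdict

docs = {
    "pyro_test": "pyrotechnic firing circuit verification before integration",
    "harness": "cable harness continuity and firing circuit isolation checks",
    "thermal": "thermal vacuum profile for instrument bakeout",
}

index = defaultdict(set)
for name, text in docs.items():
    for term in set(text.split()):
        index[term].add(name)

# Suggest a link wherever two documents share a non-stopword term.
for term, names in sorted(index.items()):
    if len(names) > 1 and term not in {"and", "for", "the", "before"}:
        print(f"link candidates via '{term}': {sorted(names)}")
```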

    Monte Carlo Particle Lists: MCPL

    A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.
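
    For illustration only, a toy binary particle-list file in the spirit of MCPL might look like the sketch below; the real MCPL layout and its C API differ, and every field choice here (magic number, record layout, units) is an assumption.

```python
# Toy binary particle list: fixed-size records behind a small header.
import struct

RECORD = struct.Struct("<i d d d d")   # PDG code, x, y, z [cm], kinetic E [MeV]

def write_particles(path: str, particles: list) -> None:
    with open(path, "wb") as f:
        f.write(b"TOYP")                         # magic number
        f.write(struct.pack("<I", len(particles)))
        for p in particles:
            f.write(RECORD.pack(*p))

def read_particles(path: str) -> list:
    with open(path, "rb") as f:
        assert f.read(4) == b"TOYP", "not a toy particle-list file"
        (n,) = struct.unpack("<I", f.read(4))
        return [RECORD.unpack(f.read(RECORD.size)) for _ in range(n)]

write_particles("demo.bin", [(2112, 0.0, 0.0, 0.0, 2.0)])  # one 2 MeV neutron
print(read_particles("demo.bin"))
```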