Regional Data Archiving and Management for Northeast Illinois
This project studies the feasibility and implementation options for establishing a regional data archiving system to help monitor and manage traffic operations and planning for the northeastern Illinois region. It aims to provide clear guidance to the regional transportation agencies, from both technical and business perspectives, on building such a comprehensive transportation information system. Several implementation alternatives are identified and analyzed. The research is carried out in three phases.
In the first phase, existing documents related to ITS deployments in the broader Chicago area are summarized, and a
thorough review is conducted of similar systems across the country. Various stakeholders are interviewed to collect
information on all data elements that they store, including the format, system, and granularity. Their perception of a data
archive system, such as potential benefits and costs, is also surveyed. In the second phase, a conceptual design of the
database is developed. This conceptual design includes system architecture, functional modules, user interfaces, and
examples of usage. In the last phase, the possible business models for the archive system to sustain itself are reviewed. We
estimate initial capital and recurring operational/maintenance costs for the system based on realistic information on the
hardware, software, labor, and resource requirements. We also identify possible revenue opportunities.
Four implementation options for the archive system are summarized in this report:
1. System hosted by a partnering agency
2. System contracted to a university
3. System contracted to a national laboratory
4. System outsourced to a service provider
The costs, advantages, and disadvantages of each of these recommended options are also provided. (ICT-R27-22)
Terrestrial applications: An intelligent Earth-sensing information system
For Abstract see A82-2214
Improving the Retrieval of Offshore-Onshore Correlation Functions With Machine Learning
The retrieval of reliable offshore‐onshore correlation functions is critical to improve our ability to predict long‐period ground motions from megathrust earthquakes. However, localized ambient seismic field sources between offshore and onshore stations can bias correlation functions and generate nonphysical arrivals. We present a two‐step method based on unsupervised learning to improve the quality of correlation functions calculated with the deconvolution technique (i.e., deconvolution functions, DFs). For a DF data set calculated between two stations over a long time period, we first reduce the data set dimensions using principal component analysis and cluster the features of the low‐dimensional space with a Gaussian mixture model. We then stack the DFs belonging to each cluster together and select the best stacked DF. We apply our technique to DFs calculated every 30 min between an offshore station located on top of the Nankai Trough, Japan, and 78 onshore receivers. Our method removes spurious arrivals and improves the signal‐to‐noise ratio of DFs. Most 30‐min DFs selected by our clustering method are generated during extreme meteorological events such as typhoons. To demonstrate that the DFs obtained with our method contain reliable phases and amplitudes, we use them to simulate the long‐period ground motions from a Mw 5.8 earthquake, which occurred near the offshore station. Results show that the earthquake long‐period ground motions are accurately simulated. Our method can easily be used as an additional processing step when calculating offshore‐onshore DFs and offers a new way to improve the prediction of long‐period ground motions from potential megathrust earthquakes.
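The two-step workflow described in the abstract (dimension reduction with principal component analysis, then Gaussian-mixture clustering of the low-dimensional features and per-cluster stacking) might be sketched as below. This is a minimal illustration, not the authors' exact pipeline: the function name, the choice of two principal components and three clusters, and the use of a plain linear mean for stacking are all assumptions for the sake of the example.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

def cluster_and_stack_dfs(dfs, n_components=2, n_clusters=3):
    """Cluster deconvolution functions (DFs) and stack each cluster.

    dfs: 2-D array of shape (n_windows, n_samples), one short-window DF per row.
    Returns (stacks, labels): one stacked DF per cluster, and the cluster
    label assigned to each input window.
    """
    # Step 1: reduce the DF data set to a low-dimensional feature space.
    features = PCA(n_components=n_components).fit_transform(dfs)
    # Step 2: cluster the low-dimensional features with a Gaussian mixture model.
    gmm = GaussianMixture(n_components=n_clusters, random_state=0)
    labels = gmm.fit_predict(features)
    # Stack (here: linearly average) the DFs belonging to each cluster;
    # the best stacked DF would then be selected, e.g., by signal-to-noise ratio.
    stacks = np.array([dfs[labels == k].mean(axis=0) for k in range(n_clusters)])
    return stacks, labels
```

In practice the selection of the "best" stacked DF (the abstract mentions signal-to-noise ratio improvement) would be an additional step applied to the returned stacks.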
Searching Data: A Review of Observational Data Retrieval Practices in Selected Disciplines
A cross-disciplinary examination of the user behaviours involved in seeking
and evaluating data is surprisingly absent from the research data discussion.
This review explores the data retrieval literature to identify commonalities in
how users search for and evaluate observational research data. Two analytical
frameworks rooted in information retrieval and science and technology studies are
used to identify key similarities in practices as a first step toward
developing a model describing data retrieval.
Intelligent Data Networking for the Earth System Science Community
Earth system science (ESS) research is generally very data intense. To enable detailed discovery and transparent access of the data stored in heterogeneous and organisationally separated data centres, common data and metadata community interfaces are needed. This paper describes the development of a coherent data discovery and data access infrastructure for the ESS community in Germany. To comprehensively and consistently describe the characteristics of geographic data required for their discovery (discovery metadata) and for their usage (use metadata), the ISO 19115 standard is adopted. Webservice technology is used to hide the details of heterogeneous data access mechanisms and preprocessing implementations. The commitment to international standards and the modular character of the approach facilitate the expandability of the infrastructure as well as the interoperability with international partners and other communities.
Telescience testbedding: An implementation approach
Telescience is the term used to describe a concept being developed by NASA's Office of Space Science and Applications (OSSA) under the Science and Applications Information System (SAIS) Program. This concept focuses on developing the ability for all OSSA users to interact remotely with all provided information system services for the Space Station era. It includes access to services provided by both flight and ground components of the system and emphasizes the accommodation of users from their home institutions. Key to the development of the telescience capability is an implementation approach called rapid-prototype testbedding. This testbedding is used to validate the concept and to test the applicability of emerging technologies and operational methodologies. Testbedding will first determine the feasibility of an idea and then its applicability to real science usage. Once a concept is deemed viable, it will be integrated into the operational system for real-time support. It is believed that this approach will greatly decrease the expense of implementing the eventual system and will enhance the capabilities of the delivered system.
Information resources management, 1984-1989: A bibliography with indexes
This bibliography contains 768 annotated references to reports and journal articles entered into the NASA scientific and technical information database from 1984 to 1989.