
    Towards a metadata standard for field spectroscopy

    This thesis identifies the core components for a field spectroscopy metadata standard to facilitate discoverability, interoperability, reliability, quality assurance and extended life cycles for datasets being exchanged in a variety of data sharing platforms. The research is divided into five parts: 1) an overview of the importance of field spectroscopy, metadata paradigms and standards, metadata quality and geospatial data archiving systems; 2) definition of a core metadataset critical for all field spectroscopy applications; 3) definition of an extended metadataset for specific applications; 4) methods and metrics for assessing metadata quality and completeness in spectral data archives; 5) recommendations for implementing a field spectroscopy metadata standard in data warehouses and ‘big data’ environments. Part 1 of the thesis reviews the importance of field spectroscopy in remote sensing; metadata paradigms and standards; field spectroscopy metadata practices; metadata quality; and geospatial data archiving systems. The unique metadata requirements for field spectroscopy are discussed, conventional definitions and metrics for measuring metadata quality are presented, and geospatial data archiving systems for data warehousing and intelligent information exchange are explained. Part 2 of the thesis presents a core metadataset for all field spectroscopy applications, derived from the results of an international expert panel survey. The survey respondents helped to identify a metadataset critical to all field spectroscopy campaigns, as well as metadata for specific applications. These results form the foundation of a field spectroscopy metadata standard that is practical, flexible enough to suit the purpose for which the data is being collected, and endowed with sufficient legacy potential for long-term sharing and interoperability with other datasets. Part 3 presents an extended metadataset for specific application areas within field spectroscopy.
The key metadata is presented for three applications: tree crown, soil, and underwater coral reflectance measurements. The performance of existing metadata standards in complying with the field spectroscopy metadataset was measured. Results show that they consistently fail to accommodate the needs of field spectroscopy scientists in general, as well as those of the three application areas. Part 4 presents criteria for measuring the quality and completeness of field spectroscopy metadata in a spectral archive. Existing methods for measuring the quality and completeness of metadata were scrutinized against the special requirements of field spectroscopy datasets, and novel field spectroscopy metadata quality parameters were defined. Two spectral libraries were examined as case studies of operationalized metadata. The case studies revealed that publicly available datasets are underperforming on the quality and completeness measures. Part 5 presents recommendations for adoption and implementation of a field spectroscopy standard, both within the field spectroscopy community and within the wider scope of IT infrastructure for storing and sharing field spectroscopy metadata within data warehouses and big data environments. The recommendations are divided into two main sections: community adoption of the standard, and integration of standardized metadatasets into data warehouses and big data platforms. This thesis has identified the core components of a metadata standard for field spectroscopy. The metadata standard serves overall to increase the discoverability, reliability, quality, and life cycle of field spectroscopy metadatasets for wide-scale data exchange.
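The completeness scoring described in Part 4 can be illustrated with a minimal sketch; the core field names below are hypothetical stand-ins for illustration, not the metadataset actually defined in the thesis:

```python
# Hypothetical core metadataset for a field spectroscopy record; the field
# names here are illustrative assumptions, not the thesis's actual standard.
CORE_FIELDS = [
    "instrument_model", "acquisition_datetime", "latitude", "longitude",
    "illumination_source", "reference_panel", "spectral_range_nm",
]

def completeness(record: dict) -> float:
    """Fraction of core fields that are present and non-empty."""
    filled = sum(1 for f in CORE_FIELDS if record.get(f) not in (None, ""))
    return filled / len(CORE_FIELDS)

record = {
    "instrument_model": "ASD FieldSpec 4",
    "acquisition_datetime": "2023-02-14T10:32:00Z",
    "latitude": -33.87, "longitude": 151.21,
    "illumination_source": "solar",
    "reference_panel": "",            # empty value: flagged by the score
    "spectral_range_nm": "350-2500",
}
print(round(completeness(record), 2))  # → 0.86
```

A real archive would weight fields by criticality and validate values (units, ranges, controlled vocabularies) rather than only checking presence.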

    Emotion and Stress Recognition Related Sensors and Machine Learning Technologies

    This book includes impactful chapters which present scientific concepts, frameworks, architectures and ideas on sensing technologies and machine learning techniques. These are relevant in tackling the following challenges: (i) the field readiness and use of intrusive sensor systems and devices for capturing biosignals, including EEG sensor systems, ECG sensor systems and electrodermal activity sensor systems; (ii) the quality assessment and management of sensor data; (iii) data preprocessing, noise filtering and calibration concepts for biosignals; (iv) the field readiness and use of nonintrusive sensor technologies, including visual sensors, acoustic sensors, vibration sensors and piezoelectric sensors; (v) emotion recognition using mobile phones and smartwatches; (vi) body area sensor networks for emotion and stress studies; (vii) the use of experimental datasets in emotion recognition, including dataset generation principles and concepts, quality assurance and emotion elicitation material and concepts; (viii) machine learning techniques for robust emotion recognition, including graphical models, neural network methods, deep learning methods, statistical learning and multivariate empirical mode decomposition; (ix) subject-independent emotion and stress recognition concepts and systems, including facial expression-based systems, speech-based systems, EEG-based systems, ECG-based systems, electrodermal activity-based systems, multimodal recognition systems and sensor fusion concepts and (x) emotion and stress estimation and forecasting from a nonlinear dynamical system perspective.
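The subject-independent evaluation in item (ix) is commonly run as a leave-one-subject-out (LOSO) protocol; a minimal sketch with made-up subject IDs and feature rows (the book's own datasets and models are not reproduced here):

```python
# Leave-one-subject-out splitting: every subject's samples are held out
# once, so the classifier is never tested on a subject it trained on.
def loso_splits(samples):
    """Yield (held_out_subject, train, test) partitions."""
    subjects = sorted({subject for subject, _ in samples})
    for held_out in subjects:
        train = [x for x in samples if x[0] != held_out]
        test = [x for x in samples if x[0] == held_out]
        yield held_out, train, test

# Toy (subject_id, feature_vector) samples, purely illustrative.
samples = [("s1", [0.2]), ("s1", [0.4]), ("s2", [0.9]), ("s3", [0.1])]
for subject, train, test in loso_splits(samples):
    print(subject, len(train), len(test))
```

Averaging a model's accuracy across these folds estimates how it generalizes to unseen subjects, which is the crux of subject-independent recognition.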

    Renewables 2019 Global Status Report


    Spatiotemporal enabled Content-based Image Retrieval


    “Blue” Hydrogen & Helium From Flare Gas Of The Bakken Formation Of The Williston Basin, North Dakota: A Novel Process

    Is it possible to curtail flaring in the Williston Basin while simultaneously sequestering carbon dioxide and harvesting economic quantities of natural gas liquids, helium and other valuable products? Utilizing a novel approach described here, diatomic hydrogen and elemental helium, as well as other products, can be profitably extracted from the gas streams produced from horizontal, hydraulically fractured Middle Bakken Member wells in the Devonian-Mississippian Bakken Formation of the Williston Basin, North Dakota, USA. However, two vastly different methods are employed to extract these gases. Hydrogen is harvested from the gas stream by reforming methane (CH4) through one of two two-stage processes: “Autothermal Reforming + Water-Gas Shift (WGS) reaction”, known as ATR, or “Steam Methane Reforming”, SMR. Both yield H2 plus CO (carbon monoxide) in the first stage, and CO2 (carbon dioxide) after the second. Elemental diatomic hydrogen (H2) can be used in fuel cells to generate electricity, or directly as primary fuel in certain internal combustion engines, primarily turbines. The produced CO2 can be captured (CCUS: Carbon Capture, Utilization and Sequestration) and injected downhole for both reservoir energy enhancement and CO2 sequestration, or sold for industrial use because of its purity. Helium, on the other hand, is inert, so it is unnecessary to expend the energy required to reform methane in order to liberate it. Several commercially available methods can economically extract 99.995% pure helium from gas streams where the helium concentration is as low as 0.010%. The extraction of crude helium from natural gas requires three processing steps. The first step, a typical gas pre-treatment process, removes impurities through amine treatment, glycol absorption, nitrogen rejection, and desiccant adsorption, which remove CO2, H2O, N2, and H2S.
The second step removes high-molecular-weight hydrocarbons (natural gas liquids), if desired, while the third step, via cryogenics, removes the final methane. The result is 75-90% pure helium. Final purification, before liquefaction, is accomplished via activated charcoal absorbers at liquid-nitrogen temperatures and high pressure, or via pressure-swing adsorption (PSA) processes. Low-temperature adsorption can yield helium purities of 99.99 percent, while PSA processes recover helium at better than 99.9999 percent purity. However, with the advent of selective zeolite or organometallic membranes, the cryogenic extraction of He from the CH4 stream can be eliminated. Heating the gas stream and passing it through selective semi-permeable membranes allows the helium, with its much smaller size and higher energy, to pass while excluding the relatively massive CH4 molecule. The helium can then be isolated and purified via pressure-swing adsorption (PSA) to achieve 99.999% purity. The heated methane can then be ported directly to a Steam Methane Reformer unit for extraction of hydrogen. Both the H2 and He extraction procedures eliminate the need for gas flaring, as both yield salable products such as LNG and NGLs, along with the opportunity to capture and sequester carbon dioxide (CO2) from the produced gas stream. This extracted, so-called “Blue Hydrogen” is slated for use in transportation via fuel cells or in internal combustion engines, and sells for approximately US$3.00/MCF, depending on the cost of the feedstock natural gas. “Metallurgical helium” or “Grade-A Helium” (i.e., > 99.9999% pure), with myriad industrial and scientific uses, brings ~US$498/MCF (02-2023). The cost of hydrogen vs. helium extraction is difficult to compare. Hydrogen production depends on the cost of natural gas as a feedstock, which is particularly variable.
The cost of helium extraction depends on the volume of gas being processed, as most helium extraction units can handle 10-12 Bakken wells simultaneously. However, as a straight market product, helium revenue exceeds hydrogen revenue by a factor of 100. Doing both concurrently from the same gas stream will enhance the revenue of each.
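The two-stage reforming route described above follows the standard stoichiometry (textbook chemistry, not equations quoted from the abstract): steam methane reforming yields syngas, the water-gas shift converts the CO, and their sum gives the net reaction.

```latex
\begin{align*}
\text{SMR (stage 1):}\quad & \mathrm{CH_4 + H_2O \rightarrow CO + 3\,H_2} \\
\text{WGS (stage 2):}\quad & \mathrm{CO + H_2O \rightarrow CO_2 + H_2} \\
\text{net:}\quad           & \mathrm{CH_4 + 2\,H_2O \rightarrow CO_2 + 4\,H_2}
\end{align*}
```

The net reaction shows why each mole of methane can deliver four moles of H2 while concentrating all of the carbon into a capturable CO2 stream.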

    Human-Nature Interactions

    This edited volume aims to widen the discussion about the diversity of human-nature relationships and valuation methods, and to stimulate the new perspectives that are needed to build a more sustainable future, especially in the face of ongoing socio-environmental changes. Conceptual and empirical approaches, including qualitative, quantitative, and mixed methodologies, have been used to highlight the importance of an integrative understanding of socio-ecological systems, in which healthy ecosystems underpin the quality of life and societal activities largely drive environmental changes. Readers will obtain a comprehensive overview of the many and diverse ways the relationships between people and nature can be characterized. This includes understanding how people assign values to nature, discussing how human-nature interactions are shaped, and providing examples of how these values and interactions can be systematically assessed across different land systems in Europe and beyond. This open access book is produced by internationally recognized scientists in the field but written in an accessible format, so as to be of interest to a large audience, including prospective students, lecturers, young professionals and scientists embarking on the interdisciplinary field of socio-ecological research and environmental valuation.

    Spatial and Temporal Sentiment Analysis of Twitter data

    The public worldwide use Twitter to express opinions. This study focuses on the spatio-temporal variation of georeferenced Tweets’ sentiment polarity, with a view to understanding how opinions evolve on Twitter over space and time and across communities of users. More specifically, the question this study tested is whether sentiment polarity on Twitter exhibits specific time-location patterns. The aim of the study is to investigate the spatial and temporal distribution of georeferenced Twitter sentiment polarity within a 1 km buffer around the Curtin Bentley campus boundary in Perth, Western Australia. Tweets posted on campus were assigned to six spatial zones and four time zones. A sentiment analysis was then conducted for each zone using the sentiment analyser tool in the Starlight Visual Information System software. The Feature Manipulation Engine was employed to convert non-spatial files into spatial and temporal feature classes. The distribution of Twitter sentiment polarity patterns over space and time was mapped using Geographic Information Systems (GIS). Some interesting results were identified. For example, the highest percentage of positive Tweets occurred in the social science area, while the science and engineering and dormitory areas had the highest percentages of negative postings. The number of negative Tweets increases in the library and science and engineering areas as the end of the semester approaches, reaching a peak around the exam period, while the percentage of negative Tweets drops at the end of the semester in the entertainment and sport and dormitory areas. This study provides some insights into understanding students’ and staff’s sentiment variation on Twitter, which could be useful for university teaching and learning management.
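The zone-and-period aggregation step can be sketched generically; the study itself used the Starlight Visual Information System and the Feature Manipulation Engine, so the tiny lexicon scorer, zone names and example tweets below are illustrative assumptions only:

```python
from collections import defaultdict

# Toy sentiment lexicon; real systems use far richer scoring.
POSITIVE = {"great", "love", "good"}
NEGATIVE = {"stress", "fail", "bad"}

def polarity(text: str) -> int:
    """Return +1 (positive), 0 (neutral) or -1 (negative)."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return (score > 0) - (score < 0)

# Hypothetical (zone, time period, text) tuples.
tweets = [
    ("library", "exam_period", "so much stress before the exam"),
    ("library", "exam_period", "this unit is great"),
    ("dormitory", "semester_end", "love the end of semester"),
]

# Tally [positive, neutral, negative] counts per (zone, period) cell.
counts = defaultdict(lambda: [0, 0, 0])
for zone, period, text in tweets:
    counts[(zone, period)][1 - polarity(text)] += 1

print(dict(counts))
```

Counts are stored as [positive, neutral, negative] per (zone, time period); the percentages reported in the study follow by dividing each cell by its zone total.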

    European Handbook of Crowdsourced Geographic Information

    This book focuses on the study of the remarkable new source of geographic information that has become available in the form of user-generated content accessible over the Internet through mobile and Web applications. The exploitation, integration and application of these sources, termed volunteered geographic information (VGI) or crowdsourced geographic information (CGI), offer scientists an unprecedented opportunity to conduct research on a variety of topics at multiple scales and for diversified objectives. The Handbook is organized in five parts, addressing the fundamental questions: What motivates citizens to provide such information in the public domain, and what factors govern/predict its validity? What methods might be used to validate such information? Can VGI be framed within the larger domain of sensor networks, in which inert and static sensors are replaced by, or combined with, intelligent and mobile humans equipped with sensing devices? What limitations are imposed on VGI by differential access to broadband Internet, mobile phones, and other communication technologies, and by concerns over privacy? How do VGI and crowdsourcing enable innovative applications that benefit human society? Chapters examine how crowdsourcing techniques and methods, and the VGI phenomenon, have motivated a multidisciplinary research community to identify both fields of application and quality criteria depending on the use of VGI. Beyond tools for harvesting and storing these data, research has paid remarkable attention to these information resources, in an age when information and participation are among the most important drivers of development. The collection opens questions and points to new research directions, in addition to the findings that each of the authors demonstrates. Despite rapid progress in VGI research, this Handbook also shows that there are technical, social, political and methodological challenges that require further study and research.