
    Internet of Things

    Manual of Digital Earth / Editors: Huadong Guo, Michael F. Goodchild, Alessandro Annoni. Springer, 2020. ISBN: 978-981-32-9915-3. Digital Earth was born with the aim of replicating the real world within the digital world. Many efforts have been made to observe and sense the Earth, both from space (remote sensing) and by using in situ sensors. Focusing on the latter, advances in Digital Earth have established vital bridges to exploit these sensors and their networks by taking location as a key element. The current era of connectivity envisions that everything is connected to everything. The concept of the Internet of Things (IoT) emerged as a holistic proposal to enable an ecosystem of varied, heterogeneous networked objects and devices to speak to and interact with each other. To make the IoT ecosystem a reality, it is necessary to understand the electronic components, communication protocols, real-time analysis techniques, and the location of the objects and devices. The IoT ecosystem and the Digital Earth (DE) jointly form interrelated infrastructures for addressing today’s pressing issues and complex challenges. In this chapter, we explore the synergies and frictions in establishing an efficient and permanent collaboration between the two infrastructures, in order to adequately address multidisciplinary and increasingly complex real-world problems. Although there are still some pending issues, the identified synergies generate optimism for a true collaboration between the Internet of Things and the Digital Earth.
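
    The chapter highlights location as the key element that lets in-situ IoT sensors feed the Digital Earth. As a minimal sketch of that idea, the following Python snippet assembles a geotagged observation and posts it to a hypothetical ingestion endpoint; the URL, field names, and sensor identifier are illustrative assumptions, not part of the chapter.

        import time

        import requests  # assumed HTTP client for the hypothetical endpoint

        # Hypothetical Digital Earth ingestion endpoint (illustrative only).
        INGEST_URL = "https://digital-earth.example.org/api/observations"

        # A geotagged reading: the location fields are what allow a Digital
        # Earth platform to place this IoT observation on the globe.
        observation = {
            "sensor_id": "station-042",
            "observed_property": "air_temperature",
            "value": 21.7,  # degrees Celsius
            "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "location": {"lat": 41.66, "lon": -0.88},  # WGS84 coordinates
        }

        requests.post(INGEST_URL, json=observation, timeout=10)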

    Oceanids C2: An Integrated Command, Control, and Data Infrastructure for the Over-the-Horizon Operation of Marine Autonomous Systems

    Long-range Marine Autonomous Systems (MAS), operating beyond the visual line-of-sight of a human pilot or research ship, are creating unprecedented opportunities for oceanographic data collection. Able to operate for up to months at a time, periodically communicating with a remote pilot via satellite, long-range MAS vehicles significantly reduce the need for an expensive research ship presence within the operating area. Heterogeneous fleets of MAS vehicles, operating simultaneously in an area for an extended period of time, are becoming increasingly popular due to their ability to provide an improved composite picture of the marine environment. However, at present, the expansion of the size and complexity of these multi-vehicle operations is limited by a number of factors: (1) custom control-interfaces require pilots to be trained in the use of each individual vehicle, with limited cross-platform standardization; (2) the data produced by each vehicle are typically in a custom vehicle-specific format, making the automated ingestion of observational data for near-real-time analysis and assimilation into operational ocean models very difficult; (3) the majority of MAS vehicles do not provide machine-to-machine interfaces, limiting the development and usage of common piloting tools, multi-vehicle operating strategies, autonomous control algorithms and automated data delivery. In this paper, we describe a novel piloting and data management system (C2) which provides a unified web-based infrastructure for the operation of long-range MAS vehicles within the UK's National Marine Equipment Pool. The system automates the archiving, standardization and delivery of near-real-time science data and associated metadata from the vehicles to end-users and Global Data Assembly Centers mid-mission. Through the use and promotion of standard data formats and machine interfaces throughout the C2 system, we seek to enable future opportunities to collaborate with both the marine science and robotics communities to maximize the delivery of high-quality oceanographic data for world-leading science.
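
    A minimal sketch of the data-standardization step described above: a vehicle-specific reading is mapped into a common, self-describing record and pushed over a machine-to-machine HTTP interface. The endpoint, field names, and variable names below are assumptions for illustration and are not the actual Oceanids C2 API.

        from datetime import datetime, timezone

        import requests  # assumed HTTP client for the machine-to-machine interface

        # Hypothetical C2-style ingestion endpoint (the real API is not described here).
        INGEST_URL = "https://c2.example.org/api/observations"

        def to_standard_record(vehicle_id, raw):
            """Map a vehicle-specific reading (parsed into a dict) to a common record."""
            return {
                "platform": vehicle_id,
                "time": datetime.now(timezone.utc).isoformat(),
                "latitude": raw["lat"],
                "longitude": raw["lon"],
                "variables": {
                    "sea_water_temperature": {"value": raw["temp_c"], "units": "degC"},
                    "sea_water_salinity": {"value": raw["sal_psu"], "units": "1e-3"},
                },
            }

        # Example vehicle-specific payload (illustrative only).
        raw_reading = {"lat": 48.5, "lon": -9.2, "temp_c": 12.4, "sal_psu": 35.2}
        requests.post(INGEST_URL, json=to_standard_record("glider-345", raw_reading), timeout=30)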

    One-operator two-machine flow shop scheduling with setup times for machines and total completion time objective

    In a manufacturing environment, when a worker or a machine switches from one type of operation to another, a setup time may be required. I propose a scheduling model with one operator and two machines. In this problem, a single operator completes a set of jobs requiring operations in a two-machine flow shop. The operator can perform only one operation at a time; when one machine is in use, the other is idle. Whenever the operator changes machines, a setup time is required. The objective is to minimize total completion time. I formulate the problem as a linear integer program with O(n³) 0-1 variables and O(n²) constraints, and introduce some classes of valid inequalities. To obtain exact solutions, Branch-and-Bound, Cut-and-Branch, and Branch-and-Cut algorithms are used. For larger problems, some heuristic procedures are proposed and the computational results are compared.
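
    To make the problem statement concrete, the following Python sketch enumerates all feasible operation sequences for a tiny instance and evaluates the total completion time by brute force. The instance data and the setup-time handling are illustrative assumptions; this is not the paper's 0-1 formulation or its branch-and-cut implementation.

        from itertools import permutations

        def total_completion_time(order, p1, p2, setup):
            """order: sequence of (job, machine) operations, machine in {1, 2}."""
            t = 0.0
            current_machine = None
            completion = {}
            for job, machine in order:
                if current_machine is not None and machine != current_machine:
                    t += setup                    # operator switches machines
                current_machine = machine
                t += p1[job] if machine == 1 else p2[job]
                if machine == 2:
                    completion[job] = t           # job finishes after machine 2
            return sum(completion.values())

        def best_schedule(p1, p2, setup):
            jobs = range(len(p1))
            ops = [(j, 1) for j in jobs] + [(j, 2) for j in jobs]
            best = None
            for order in permutations(ops):
                # Flow-shop precedence: each job's machine-1 operation must
                # precede its machine-2 operation.
                if all(order.index((j, 1)) < order.index((j, 2)) for j in jobs):
                    value = total_completion_time(order, p1, p2, setup)
                    if best is None or value < best[0]:
                        best = (value, order)
            return best

        # Tiny illustrative instance: per-machine processing times and one setup time.
        print(best_schedule(p1=[3, 2, 4], p2=[2, 5, 1], setup=1.0))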

    Microstructural effects on the mechanical properties of carburized low-alloy steels

    This study examined the effects of composition and initial microstructure on the physical, metallurgical, and mechanical properties of carburized SAE 8620 and PS-18 steels. Testing was performed on 8620 and PS-18 steels in the as-received and normalized conditions. Hardenability testing was conducted prior to additional heat treatments. Size and shape distortion, residual stress, retained austenite, and effective case depth measurements were obtained for specimens subjected to a carburizing heat treatment. Specimens subjected to a core thermal cycle heat treatment were tested to determine the tensile and Charpy impact properties of the core material of carburized components. Despite differences between the as-received and normalized materials prior to carburizing, testing revealed that normalizing did not have a significant effect on the properties of the carburized or core thermal cycle heat treated materials. PS-18 had a higher hardenability, effective case depth, and ultimate tensile strength and a lower Charpy impact toughness than 8620.

    Using sensor ontologies to create reasoning-ready sensor data for real-time hazard monitoring in a spatial decision support system

    In order to protect at-risk communities and critical infrastructure, hazard managers use sensor networks to monitor the landscapes and phenomena associated with potential hazards. This strategy can produce large amounts of data, but when investigating an often unstructured problem such as hazard detection, it can be beneficial to apply automated analysis routines and artificial intelligence techniques such as reasoning. Current sensor web infrastructure, however, is not designed to support this information-centric monitoring perspective. A generalized methodology to transform typical sensor data representations into a form that enables these analysis techniques has been created and is demonstrated through an implementation that bridges geospatial standards for sensor data and descriptions with an ontology-based monitoring environment. An ontology that describes sensors and measurements so they may be understood by a spatial decision support system (SDSS) has also been developed. These tools have been integrated into a monitoring environment, allowing the hazard manager to thoroughly investigate potential hazards.
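
    As a brief illustration of the kind of transformation described above, the following Python sketch uses rdflib to lift a plain sensor reading into RDF triples that an ontology-based reasoner can consume. The W3C SOSA vocabulary and the example namespace are stand-ins chosen here for illustration; the work described above develops its own sensor and measurement ontology.

        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF, XSD

        SOSA = Namespace("http://www.w3.org/ns/sosa/")   # W3C sensor vocabulary
        EX = Namespace("http://example.org/hazard/")     # hypothetical namespace

        def observation_to_rdf(sensor_id, observed_property, value, timestamp):
            """Turn one sensor reading into reasoning-ready RDF triples."""
            g = Graph()
            g.bind("sosa", SOSA)
            obs = EX[f"observation-{sensor_id}"]
            g.add((obs, RDF.type, SOSA.Observation))
            g.add((obs, SOSA.madeBySensor, EX[sensor_id]))
            g.add((obs, SOSA.observedProperty, EX[observed_property]))
            g.add((obs, SOSA.hasSimpleResult, Literal(value, datatype=XSD.double)))
            g.add((obs, SOSA.resultTime, Literal(timestamp, datatype=XSD.dateTime)))
            return g

        g = observation_to_rdf("rain-gauge-7", "rainfall_rate", 14.2, "2019-06-01T12:00:00Z")
        print(g.serialize(format="turtle"))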

    Geospatial Standards for Web-enabled Environmental Models

    Serving geographic information via standardized Web services has been widely accepted as a useful approach. Web-enabled environmental models simulating real-world phenomena are, however, rare. These models predict observations of the kind traditionally served by geospatial Web services compliant with well-defined standards. Using standardized Web services could support decoupling of models, comparison of similar models, and automatic integration into existing geospatial workflows. Modeling experts face several open issues when migrating existing environmental computer models to the Web. The selection of the Web service interface depends on the input parameters required for the successful execution of the computer model. The loss of control over model execution, and consequently of confidence in model results, can be addressed to a certain extent by using translucent and standardized workflow languages. Mechanisms and open problems for the implementation of geospatial Web service compositions are discussed. Two scenarios, concerning oil spills and exposure to air pollution, illustrate the impact of unconfigured model parameters on standard-compliant spatial data clients.
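
    As a minimal illustration of calling a model behind a standardized interface, the following Python sketch sends an OGC WPS 1.0.0 key-value Execute request. The endpoint, process identifier, and inputs are hypothetical; a real service would advertise them via GetCapabilities and DescribeProcess.

        import requests  # assumed HTTP client

        # Hypothetical model wrapped as an OGC Web Processing Service (WPS 1.0.0).
        WPS_URL = "https://models.example.org/wps"

        params = {
            "service": "WPS",
            "version": "1.0.0",
            "request": "Execute",
            "identifier": "OilSpillDrift",  # hypothetical process name
            # Model inputs as identifier=value pairs separated by semicolons.
            "datainputs": "spill_lat=60.1;spill_lon=2.4;wind_speed=12.5;duration_h=48",
        }

        response = requests.get(WPS_URL, params=params, timeout=60)
        response.raise_for_status()
        print(response.text[:500])  # beginning of the ExecuteResponse XML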