
    Signals and Images in Sea Technologies

    Life below water is the 14th Sustainable Development Goal (SDG) envisaged by the United Nations and is aimed at conserving and sustainably using the oceans, seas, and marine resources for sustainable development. It is not difficult to argue that signal and image technologies may play an essential role in achieving the targets linked to SDG 14. Besides increasing general knowledge of ocean health by means of data analysis, methodologies based on signal and image processing can be helpful in environmental monitoring, in protecting and restoring ecosystems, in developing new sensor technologies for green routing and eco-friendly ships, in providing tools for implementing best practices for sustainable fishing, and in defining frameworks and intelligent systems for enforcing sea law and making the sea a safer and more secure place. Imaging is also a key element in the exploration of the underwater world for various purposes, ranging from the predictive maintenance of sub-sea pipelines and other infrastructure projects to the discovery, documentation, and protection of sunken cultural heritage. The scope of this Special Issue encompasses investigations into techniques and ICT approaches, in particular the study and application of signal- and image-based methods and, in turn, exploration of the advantages of their application in the previously mentioned areas.

    Uncertainty Management of Intelligent Feature Selection in Wireless Sensor Networks

    Wireless sensor networks (WSNs) are envisioned to revolutionize the paradigm of monitoring complex real-world systems at very high resolution. However, the deployment of large numbers of unattended sensor nodes in hostile environments, frequent changes in environment dynamics, and severe resource constraints introduce uncertainties and limit the potential use of WSNs in complex real-world applications. Although uncertainty management in Artificial Intelligence (AI) is well developed and well investigated, its implications in wireless sensor environments are inadequately addressed. This dissertation addresses uncertainty management issues for spatio-temporal patterns generated from sensor data. It provides a framework for characterizing spatio-temporal patterns in WSNs. Using rough set theory and temporal reasoning, a novel formalism has been developed to characterize and quantify the uncertainties in predicting spatio-temporal patterns from sensor data. This research also uncovers the trade-offs among the uncertainty measures, which can be used to develop a multi-objective optimization model for real-time decision making in sensor data aggregation and sampling.
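
    The dissertation's formalism is not reproduced here; as a hypothetical illustration of the rough-set machinery it builds on, the lower and upper approximations of a target set can be computed as follows (node names, attributes, and the target pattern are all invented for the example):

```python
from collections import defaultdict

def approximations(objects, attrs, target):
    """Group objects into equivalence classes by their attribute values,
    then compute the rough-set lower and upper approximations of `target`."""
    classes = defaultdict(set)
    for obj, vals in objects.items():
        classes[tuple(vals[a] for a in attrs)].add(obj)
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= target:   # class lies entirely inside the target set
            lower |= cls
        if cls & target:    # class overlaps the target set
            upper |= cls
    return lower, upper

# Toy sensor table: node -> discretized readings
readings = {
    "n1": {"temp": "high", "hum": "low"},
    "n2": {"temp": "high", "hum": "low"},
    "n3": {"temp": "low",  "hum": "high"},
}
pattern = {"n1", "n3"}  # nodes exhibiting the pattern of interest
lower, upper = approximations(readings, ["temp", "hum"], pattern)
```

    Here n1 and n2 are indiscernible from their readings, so n1 can belong only to the upper approximation; the gap between the two approximations is one natural way to quantify uncertainty about a predicted pattern.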

    When Do Curricula Work in Federated Learning?

    An oft-cited open problem of federated learning is data heterogeneity across clients. One pathway to understanding the drastic accuracy drop in federated learning is to scrutinize the behavior of the clients' deep models on data with different levels of "difficulty", which has been left unaddressed. In this paper, we investigate a different and rarely studied dimension of FL: ordered learning. Specifically, we aim to investigate how ordered learning principles can help alleviate the effects of heterogeneity in FL. We present theoretical analysis and conduct extensive empirical studies on the efficacy of orderings spanning three kinds of learning: curriculum, anti-curriculum, and random curriculum. We find that curriculum learning largely alleviates non-IIDness. Interestingly, the more disparate the data distributions across clients, the more they benefit from ordered learning. We provide analysis explaining this phenomenon, specifically indicating how curriculum training appears to make the objective landscape progressively less convex, suggesting fast-converging iterations at the beginning of the training procedure. We derive quantitative convergence results for both convex and non-convex objectives by modeling curriculum training on federated devices as local SGD with locally biased stochastic gradients. In addition, inspired by ordered learning, we propose a novel client selection technique that benefits from the real-world disparity among clients. Our proposed approach to client selection has a synergistic effect when applied together with ordered learning in FL.
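
    The ordering idea itself is simple; as a minimal sketch (not the paper's implementation), a client can sort its local samples by a difficulty score, such as the current model's loss, before running local SGD: easy-first for curriculum, hard-first for anti-curriculum.

```python
def curriculum_order(samples, difficulty, anti=False):
    """Order samples easy-first (curriculum) or hard-first (anti-curriculum)."""
    return sorted(samples, key=difficulty, reverse=anti)

# Toy example: per-sample losses under the current model stand in for difficulty.
losses = {"a": 0.9, "b": 0.1, "c": 0.5}
easy_first = curriculum_order(list(losses), difficulty=losses.get)             # ['b', 'c', 'a']
hard_first = curriculum_order(list(losses), difficulty=losses.get, anti=True)  # ['a', 'c', 'b']
```

    A random curriculum would simply shuffle the samples, providing the baseline against which the two orderings are compared.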

    A Study on Data Filtering Techniques for Event-Driven Failure Analysis

    Engineering & Systems Design
    High-performance sensors and modern data-logging technology with real-time telemetry facilitate system failure analysis in a very precise manner. Fault detection, isolation, and identification are the typical steps used to analyze the root causes of failures. This systematic failure analysis provides not only useful clues for rectifying the abnormal behaviors of a system, but also key information for redesigning the current system for retrofit. The main barriers to effective failure analysis are that (i) the gathered sensor data logs, usually event logs containing massive datasets, are too large, and (ii) noise and redundant information in the gathered sensor data make precise analysis difficult. Therefore, the objective of this thesis is to develop an event-driven failure analysis method that takes into account both the functional interactions between subsystems and diverse users' behaviors. To do this, we first apply various data filtering techniques for data cleaning and reduction, and then convert the filtered data into a new format of event sequence information (a process called "eventization"). Four eventization strategies are examined for data filtering: equal-width binning, entropy-based binning, domain-expert knowledge, and probability distribution estimation, in order to extract only important information from the raw sensor data while minimizing information loss. By numerical simulation, we identify the optimal values of the eventization parameters. Finally, the event sequence information, containing the time gaps between event occurrences, is decoded to investigate the correlation between specific event sequence patterns and various system failures. These extracted patterns are stored in a failure pattern library, which is then used as the main reference source for predicting failures in real time during the failure prognosis phase.
    The efficiency of the developed procedure is examined with a terminal-box data log of marine diesel engines.
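
    As a hedged sketch of the first eventization strategy, equal-width binning maps each raw sensor value to a discrete event symbol; the function and symbol names below are illustrative, not taken from the thesis.

```python
def eventize_equal_width(values, n_bins):
    """Map raw sensor values to event symbols via equal-width binning."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0   # guard against a constant signal
    # Clamp the maximum value into the last bin.
    return [f"E{min(int((v - lo) / width), n_bins - 1)}" for v in values]

temps = [20.0, 21.5, 29.9, 35.0, 34.2]   # synthetic readings
events = eventize_equal_width(temps, 3)  # -> ['E0', 'E0', 'E1', 'E2', 'E2']
```

    The resulting symbol sequence, together with the time gaps between occurrences, is the kind of input the failure-pattern mining described above would consume.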

    Denoising distributed acoustic sensing data with self-supervised deep learning

    National Technical University of Athens -- Master's thesis. Interdisciplinary-Interdepartmental Postgraduate Studies Programme (D.P.M.S.) "Data Science and Machine Learning"

    ANALYSIS AND ASSESSMENT OF LETHALITY AND SURVIVABILITY FOR THE MARINE LITTORAL REGIMENT

    As the Marine Corps activates the Marine Littoral Regiment (MLR) to serve as the joint force’s reconnaissance and counter-reconnaissance effort, questions abound regarding the MLR’s ability to provide a persistent and lethal presence well inside the reach of our adversaries’ advanced long-range precision fires. In this study, the author uses agent-based combat simulations to inform future force design decisions, live-force experimentation, and tactics. The simulated scenario imagines a future MLR conducting sea control operations in the littorals of the Western Pacific against a peer naval threat. This research investigates the effect that a guard force of autonomous and/or semi-autonomous surface vessels, operating as the guard force of the MLR’s defense in depth, has on the survivability and lethality of the MLR’s land-based anti-ship missile platforms. Summary statistics generated by the simulation indicate that the future battlefield will see high losses on both sides. However, based on the results of 27,200 simulated engagements, this study finds that an MLR using a guard force of armed and unarmed “scouts” as described above can inflict a prohibitively high and unsustainable cost on an enemy naval force.
    Outstanding Thesis
    Major, United States Marine Corps
    Approved for public release. Distribution is unlimited.

    Enhancing the usability of Satellite Earth Observations through Data Driven Models. An application to Sea Water Quality

    Earth Observation from satellites has the potential to provide comprehensive, rapid, and inexpensive information about land and water bodies. Marine monitoring could gain in effectiveness if integrated with approaches able to collect data over wide geographic areas, such as satellite observation. Integrated with in situ measurements, satellite observations make it possible to extend the point-wise information of sampling campaigns to a synoptic view, increase the spatial and temporal coverage, and thus increase the representativeness of the natural diversity of the monitored water bodies, their inter-annual variability, and their water quality trends, providing information to support EU Member States’ action plans. Turbidity is one of the optically active water quality parameters that can be derived from satellite data, and is one of the environmental indicators considered by EU directive monitoring programmes. Turbidity is a visual property of water, related to the amount of light scattered by particles in the water, and it can act as a simple and convenient indirect measure of the concentration of suspended solids and other particulate material. A review of the state of the art shows that most traditional methods to estimate turbidity from optical satellite images are based on semi-empirical models relying on a few spectral bands. The choice of the most suitable bands is often site- and season-specific, as it is related to the type and concentration of suspended particles. When investigating wide areas or long time series that include different optical water types, the application of machine learning algorithms is promising because of its flexibility, responding to the need for a model that can adapt to varying water conditions with smooth transitions, and its ability to exploit the wealth of spectral information. Moreover, machine learning models have been shown to be less affected by atmospheric and other background factors.
    Atmospheric correction for water-leaving reflectance, in fact, still remains one of the major challenges in aquatic remote sensing. The use of machine learning for remotely sensed water quality estimation has spread in recent years thanks to advances in algorithm development, computing power, and the availability of higher-spatial-resolution data. Among all existing algorithms, the choice of model complexity derives from the nature and number of available data. The present study explores the use of Sentinel-2 MultiSpectral Instrument (MSI) Level-1C Top of Atmosphere spectral radiance to derive water turbidity through the application of Polynomial Kernel Regularized Least Squares regression. This algorithm is characterized by a simple model structure, good generalization, and a globally optimal solution, and is especially suitable for non-linear and high-dimensional problems. The study area is located in the North Tyrrhenian Sea (Italy), covering a coastline of about 100 km characterized by a varied shoreline, embracing environments worthy of protection and valuable biodiversity, but also relevant ports and three main rivers contributing flow and sediment discharge. The coastal environment in this area has been monitored since 2001 according to the 2000/60/EC Water Framework Directive, and in 2008 the EU Marine Strategy Framework Directive 2008/56/EC further strengthened the investigation of the area. A dataset combining turbidity measurements, expressed in nephelometric turbidity units (NTU), with the values of the 13 spectral bands in the pixel corresponding to each sample location was used to calibrate and validate the model. The developed turbidity model shows good agreement between the estimated satellite-derived surface turbidity and the measured values, confirming that ML techniques allow good accuracy to be reached in turbidity estimation from satellite Top of Atmosphere reflectance.
    A comparison between the turbidity estimates obtained from the model and the turbidity data from the Copernicus CMEMS dataset named ’Mediterranean Sea, Bio-Geo-Chemical, L3, daily observation’, used as a benchmark, produced consistent results. A band importance analysis revealed the contribution of the different spectral bands and the main role of the red-edge range. Finally, turbidity maps were produced from satellite imagery for the study area, showing the ability of the model to capture extreme events and, overall, how it represents an important tool to improve our understanding of the complex factors that influence water quality in our oceans.
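
    As a minimal sketch of the regression family named above (not the study's actual pipeline), polynomial kernel regularized least squares can be written in a few lines; the band values and turbidity targets below are synthetic, and the degree and regularization settings are illustrative assumptions.

```python
import numpy as np

def poly_kernel(A, B, degree=2, coef0=1.0):
    """Polynomial kernel between row-wise sample matrices."""
    return (A @ B.T + coef0) ** degree

def krls_fit(X, y, lam=1e-6, degree=2):
    """Solve (K + lam*I) alpha = y for the dual coefficients."""
    K = poly_kernel(X, X, degree)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krls_predict(X_train, alpha, X_new, degree=2):
    return poly_kernel(X_new, X_train, degree) @ alpha

rng = np.random.default_rng(0)
X = rng.random((50, 13))      # 13 TOA spectral band values per sample (synthetic)
y = 10 * X[:, 3] * X[:, 4]    # synthetic turbidity-like target (NTU)
alpha = krls_fit(X, y)
pred = krls_predict(X, alpha, X[:5])
```

    The regularization weight `lam` trades training fit against smoothness and would normally be chosen by cross-validation against in situ NTU measurements.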

    EVALUATING ARTIFICIAL INTELLIGENCE METHODS FOR USE IN KILL CHAIN FUNCTIONS

    Current naval operations require sailors to make time-critical, high-stakes decisions based on uncertain situational knowledge in dynamic operational environments. Recent tragic events have resulted in unnecessary casualties; they illustrate the decision complexity involved in naval operations and specifically highlight challenges within the OODA loop (Observe, Orient, Decide, and Act). Kill chain decisions involving the use of weapon systems are a particularly stressing category within the OODA loop, with unexpected threats that are difficult to identify with certainty, shortened decision reaction times, and lethal consequences. An effective kill chain requires the proper setup and employment of shipboard sensors; the identification and classification of unknown contacts; the analysis of contact intentions based on kinematics and intelligence; an awareness of the environment; and decision analysis and resource selection. This project explored the use of automation and artificial intelligence (AI) to improve naval kill chain decisions. The team studied naval kill chain functions and developed specific evaluation criteria for each function for determining the efficacy of specific AI methods. The team identified and studied AI methods and applied the evaluation criteria to map specific AI methods to specific kill chain functions.
    Civilian, Department of the Navy; Civilian, Department of the Navy; Civilian, Department of the Navy; Captain, United States Marine Corps; Civilian, Department of the Navy; Civilian, Department of the Navy
    Approved for public release. Distribution is unlimited.