359 research outputs found

    Mapping and Deep Analysis of Image Dehazing: Coherent Taxonomy, Datasets, Open Challenges, Motivations, and Recommendations

    Our study aims to review and analyze the most relevant studies in the image dehazing field. Many aspects have been deemed necessary to provide a broad understanding of the various studies examined through surveying the existing literature. These aspects are as follows: datasets that have been used in the literature, challenges that other researchers have faced, motivations, and recommendations for diminishing the obstacles reported in the literature. A systematic protocol is employed to search all relevant articles on image dehazing, with variations in keywords, in addition to searching for evaluation and benchmark studies. The search process covers three online databases, namely, IEEE Xplore, Web of Science (WOS), and ScienceDirect (SD), from 2008 to 2021. These indices are selected because they are sufficient in terms of coverage. After defining the inclusion and exclusion criteria, we include 152 articles in the final set. A total of 55 out of 152 articles focused on various studies that conducted image dehazing, and 13 out of 152 were review papers based on scenarios and general overviews. Finally, most of the included articles (84/152) centered on the development of image dehazing algorithms for real-time scenarios. Image dehazing removes unwanted visual effects and is often considered an image enhancement technique, which requires a fully automated algorithm that works in real-time outdoor applications, a reliable evaluation method, and datasets based on different weather conditions. Many relevant studies have been conducted to meet these critical requirements. We conducted an objective image quality assessment comparison of various image dehazing algorithms. In conclusion, unlike other review papers, our study distinctly reflects different observations on image dehazing areas. We believe that the results of this study can serve as a useful guideline for practitioners who are looking for a comprehensive view of image dehazing.
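Objective image quality assessment comparisons of this kind are typically run with full-reference metrics such as PSNR and SSIM computed against haze-free ground truth. As a minimal sketch of one such metric (not the survey's actual evaluation code; the toy images below are invented), PSNR in NumPy:

```python
import numpy as np

def psnr(reference: np.ndarray, restored: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio between a haze-free reference and a dehazed image."""
    mse = np.mean((reference.astype(np.float64) - restored.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# toy example: an 8-bit "reference" and a slightly perturbed "restoration"
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noisy = np.clip(ref.astype(int) + rng.integers(-5, 6, size=ref.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(ref, noisy):.1f} dB")
```

Higher PSNR indicates a restoration closer to the ground truth; in practice it is reported alongside perceptual metrics, since PSNR alone correlates imperfectly with visual quality.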

    A stochastic method for representation, modelling and fusion of excavated material in mining

    The ability to safely and economically extract raw materials such as iron ore from a greater number of remote, isolated and possibly dangerous locations will become more pressing over the coming decades as easily accessible deposits become depleted. An autonomous mining system has the potential to make the mining process more efficient, predictable and safe under these changing conditions. One of the key parts of the mining process is the estimation and tracking of bulk material through the mining production chain. Current state-of-the-art tracking and estimation systems use a deterministic representation for bulk material. This is problematic for wide-scale automation of mine processes as there is no measurement of the uncertainty in the estimates provided. A probabilistic representation is critical for autonomous systems to correctly interpret and fuse the available data in order to make the most informed decision given the available information without human intervention. This thesis investigates whether bulk material properties can be represented probabilistically through a mining production chain to provide statistically consistent estimates of the material at each stage of the production chain. Experiments and methods within this thesis focus on the load-haul-dump cycle. The development of a representation of bulk material using lumped masses is presented. A method for tracking and estimation of these lumped masses within the mining production chain using an 'Augmented State Kalman Filter' (ASKF) is developed. The method ensures that the fusion of new information at different stages will provide statistically consistent estimates of the lumped mass. There is a particular focus on the feasibility and practicality of implementing a solution on a production mine site given the current sensing technology available, and on how that technology can be adapted for use within the developed estimation system (with particular focus on remote sensing and volume estimation).
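The thesis's ASKF formulation is not reproduced in the abstract; as a rough sketch of the augmented-state idea, assuming scalar lumped masses and a single weighbridge measurement (all class names, numbers and noise variances below are illustrative, not taken from the thesis):

```python
import numpy as np

class AugmentedStateKF:
    """Minimal augmented-state Kalman filter: each excavated lumped mass is
    appended to the state vector, so a later aggregate measurement (e.g. a
    truck weighbridge observing the sum of several bucket loads) refines the
    earlier per-bucket estimates and correlates them."""

    def __init__(self):
        self.x = np.zeros(0)          # estimated masses (tonnes)
        self.P = np.zeros((0, 0))     # joint covariance

    def add_mass(self, prior_mass: float, prior_var: float) -> None:
        """Augment the state with a new lumped mass and its prior uncertainty."""
        n = self.x.size
        self.x = np.append(self.x, prior_mass)
        P = np.zeros((n + 1, n + 1))
        P[:n, :n] = self.P
        P[n, n] = prior_var
        self.P = P

    def update(self, H, z: float, r: float) -> None:
        """Fuse a scalar measurement z = H @ x + noise with variance r."""
        H = np.asarray(H, dtype=float)
        S = H @ self.P @ H + r            # innovation variance
        K = (self.P @ H) / S              # Kalman gain (vector)
        self.x = self.x + K * (z - H @ self.x)
        self.P = self.P - np.outer(K, H @ self.P)

kf = AugmentedStateKF()
kf.add_mass(10.0, 4.0)   # bucket 1: shovel payload estimate
kf.add_mass(12.0, 4.0)   # bucket 2
# weighbridge observes the combined truck load: m1 + m2 = 21.5 t (var 0.25)
kf.update([1.0, 1.0], 21.5, 0.25)
print(kf.x)
```

After the update the two bucket estimates shift toward consistency with the weighbridge total, and their covariance becomes negatively correlated — exactly the statistically consistent fusion behavior the thesis requires of the full ASKF.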

    Harnessing Big Data and Machine Learning for Event Detection and Localization

    Anomalous events are rare and deviate significantly from expected patterns and other data instances, making them hard to predict. Detecting anomalous severe events correctly and in a timely manner can help reduce risks and losses. Many anomalous event detection techniques have been studied in the literature. Recently, big data and machine learning based techniques have shown remarkable success in a wide range of fields. It is important to tailor big data and machine learning based techniques to each application; otherwise, the result may be expensive computation, slow prediction, false alarms, and improper prediction granularity. We aim to address these challenges by harnessing big data and machine learning techniques for fast and reliable prediction and localization of severe events. First, to improve storage failure prediction, we develop a new lightweight and high-performing tensor decomposition-based method, named SEFEE, for storage error forecasting in large-scale enterprise storage systems. SEFEE employs tensor decomposition to capture latent spatio-temporal information embedded in storage event logs. By utilizing this latent spatio-temporal information, we can forecast storage errors accurately without the training requirements of typical machine learning techniques. The training-free method allows for live prediction of storage errors and their locations in the storage system based on previous observations that had been used in the tensor decomposition pipeline to extract meaningful latent correlations. Moreover, we propose an extension that includes the severity of the errors as contextual information, which improves the accuracy of the tensor decomposition and, in turn, the prediction accuracy. We further provide a detailed characterization of the NetApp dataset to give the community additional insight into the dynamics of typical large-scale enterprise storage systems. Next, we focus on another application -- AI-driven wildfire prediction.
Wildfires cause billions of dollars in property damage and loss of life, and pose harmful health threats. We aim to detect and localize wildfire events correctly in the early stage, and to classify wildfire smoke based on the perceived pixel density of camera images. Due to the lack of a publicly available dataset for early wildfire smoke detection, we first collect and process images from the AlertWildfire camera network. The images are annotated with bounding boxes and densities for deep learning methods to use. We then adapt a transformer-based end-to-end object detection model for wildfire detection using our dataset. The dataset and detection model together form a benchmark named the Nevada smoke detection benchmark, or Nemo for short. Nemo is the first open-source benchmark for wildfire smoke detection with a focus on the early, incipient stage. We further provide a weakly supervised version of Nemo to enable wider support as a benchmark.
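SEFEE's exact pipeline is not given in the abstract; as a generic illustration of how a CP (CANDECOMP/PARAFAC) decomposition extracts latent factors from a multi-way event tensor (the shapes, rank, and synthetic data below are invented for the sketch), a minimal alternating-least-squares implementation in NumPy:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a 3-way tensor (C-order column indexing)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker product of two factor matrices."""
    return np.einsum('ir,jr->ijr', A, B).reshape(A.shape[0] * B.shape[0], -1)

def cp_als(T, rank, n_iter=300, seed=0):
    """Fit a rank-`rank` CP model T[i,j,k] ~ sum_r A[i,r] B[j,r] C[k,r]."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, rank)) for s in T.shape)
    for _ in range(n_iter):
        A = unfold(T, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(T, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(T, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

def reconstruct(A, B, C):
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# demo: synthetic rank-2 "event log" tensor (storage node x error type x time window)
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((s, 2)) for s in (6, 4, 8))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(T, rank=2)
rel_err = np.linalg.norm(reconstruct(A, B, C) - T) / np.linalg.norm(T)
print(f"relative reconstruction error: {rel_err:.2e}")
```

Because the low-rank model reconstructs (and thus extrapolates) unobserved entries from the learned latent factors, a forecasting scheme along these lines needs no separate supervised training phase — the property the abstract highlights as "training-free".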

    Advances in Image Processing, Analysis and Recognition Technology

    For many decades, researchers have been trying to make computers’ analysis of images as effective as human vision. For this purpose, many algorithms and systems have been created. The whole process covers various stages, including image processing, representation and recognition. The results of this work can be applied to many computer-assisted areas of everyday life. They improve particular activities and provide handy tools, which are sometimes only for entertainment, but quite often they significantly increase our safety. In fact, the range of practical implementations of image processing algorithms is particularly wide. Moreover, the rapid growth of computational power has allowed for the development of more sophisticated and effective algorithms and tools. Although significant progress has been made so far, many issues remain, resulting in the need for the development of novel approaches.

    Intelligent Transportation Related Complex Systems and Sensors

    Building around innovative services related to different modes of transport and traffic management, intelligent transport systems (ITS) are being widely adopted worldwide to improve the efficiency and safety of transportation. They enable users to be better informed and to make safer, more coordinated, and smarter decisions about the use of transport networks. Current ITSs are complex systems made up of several components/sub-systems characterized by time-dependent interactions among themselves. Examples of these transportation-related complex systems include road traffic sensors, autonomous/automated cars, smart cities, smart sensors, virtual sensors, traffic control systems, smart roads, logistics systems, smart mobility systems, and many others emerging from niche areas. The efficient operation of these complex systems requires: i) efficient solutions to the issues of the sensors/actuators used to capture and control the physical parameters of these systems, as well as to the quality of the data collected from them; ii) tackling complexity using simulation and analytical modelling techniques; and iii) applying optimization techniques to improve the performance of these systems. This collection includes twenty-four papers, which cover scientific concepts, frameworks, architectures and various other ideas on analytics, trends and applications of transportation-related data.

    Current Air Quality Issues

    Air pollution remains one of the key environmental issues in urban areas. Comprehensive air quality plans are required to manage air pollution for a particular area. Consequently, air should be continuously sampled, monitored, and modeled to examine different action plans. The reviews and research papers collected here describe air pollution in five main contexts: Monitoring, Modeling, Risk Assessment, Health, and Indoor Air Pollution. The book is recommended to experts interested in health and air pollution issues.

    Application of advanced technology to space automation

    Automated operations in space provide the key to optimized mission design and data acquisition at minimum cost for the future. The results of this study strongly support this statement and should provide further incentive for immediate development of the specific automation technology defined herein. Essential automation technology requirements were identified for future programs. The study was undertaken to address the future role of automation in the space program, the potential benefits to be derived, and the technology efforts that should be directed toward obtaining these benefits.

    Proceedings of the UCLA International Conference on Radiation and Remote Probing of the Atmosphere

    Various articles are presented on multiple scattering problems and radiative transfer in the atmosphere. Particle size distribution and molecular absorption are also discussed.

    COrE (Cosmic Origins Explorer) A White Paper

    COrE (Cosmic Origins Explorer) is a fourth-generation full-sky, microwave-band satellite recently proposed to ESA within Cosmic Vision 2015-2025. COrE will provide maps of the microwave sky in polarization and temperature in 15 frequency bands, ranging from 45 GHz to 795 GHz, with an angular resolution ranging from 23 arcmin (45 GHz) to 1.3 arcmin (795 GHz) and sensitivities roughly 10 to 30 times better than PLANCK (depending on the frequency channel). The COrE mission will lead to breakthrough science in a wide range of areas, ranging from primordial cosmology to galactic and extragalactic science. COrE is designed to detect the primordial gravitational waves generated during the epoch of cosmic inflation at more than 3σ for r = (T/S) >= 10^-3. It will also measure the CMB gravitational lensing deflection power spectrum to the cosmic variance limit on all linear scales, allowing us to probe absolute neutrino masses better than laboratory experiments and down to plausible values suggested by the neutrino oscillation data. COrE will also search for primordial non-Gaussianity with significant improvements over Planck in its ability to constrain the shape (and amplitude) of non-Gaussianity. In the areas of galactic and extragalactic science, in its highest frequency channels COrE will provide maps of the polarized galactic dust emission, allowing us to map the galactic magnetic field in areas of diffuse emission not otherwise accessible, to probe the initial conditions for star formation. COrE will also map the galactic synchrotron emission thirty times better than PLANCK. This White Paper reviews the COrE science program, our simulations on foreground subtraction, and the proposed instrumental configuration. Comment: 90 pages, LaTeX, 15 figures (revised 28 April 2011; references added, minor errors corrected).