Investigating the impact of image content on the energy efficiency of hardware-accelerated digital spatial filters
Battery-operated low-power portable computing devices are becoming an inseparable part of human daily life. One of the major goals is to achieve the longest possible battery life in such devices. Additionally, the need for performance in processing multimedia content is ever increasing, and processing image and video content consumes more power than most other applications. A widely used approach to improving energy efficiency is to implement the computationally intensive functions as digital hardware accelerators. Spatial filtering is one of the most commonly used methods of digital image processing. According to Fourier theory, an image can be considered a two-dimensional signal composed of spatially extended two-dimensional sinusoidal patterns called gratings. Spatial frequency theory states that a sinusoidal grating can be characterised by its spatial frequency, phase, amplitude, and orientation. This article presents results from our investigation into the impact of these image characteristics on the energy efficiency of hardware-accelerated spatial filters used to process the same image. Two greyscale images, each of size 128 × 128 pixels and comprising two-dimensional sinusoidal gratings at a maximum spatial frequency of 64 cycles per image orientated at 0° and 90°, respectively, were processed in a hardware-implemented Gaussian smoothing filter. The energy efficiency of the filter was compared with the baseline energy efficiency of processing a featureless plain black image. The results show that the energy efficiency of the filter drops to 12.5% when the gratings are orientated at 0°, whilst it rises to 72.38% at 90°.
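As a concrete illustration of the stimuli and filter described in this abstract, the sketch below generates a 128 × 128 sinusoidal grating at a chosen spatial frequency and orientation and smooths it with a Gaussian filter. This is a software-only sketch; it is not the authors' hardware-accelerated implementation, and the sigma value and helper name are illustrative assumptions.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def sinusoidal_grating(size=128, cycles=64, orientation_deg=0.0, phase=0.0):
        """Generate a greyscale sinusoidal grating with values in [0, 1]."""
        y, x = np.mgrid[0:size, 0:size] / size          # normalised pixel coordinates
        theta = np.deg2rad(orientation_deg)
        # Project coordinates onto the grating axis; 'cycles' is cycles per image.
        ramp = x * np.cos(theta) + y * np.sin(theta)
        return 0.5 + 0.5 * np.sin(2 * np.pi * cycles * ramp + phase)

    # Gratings orientated at 0 and 90 degrees, plus a featureless black baseline image.
    grating_0 = sinusoidal_grating(orientation_deg=0)
    grating_90 = sinusoidal_grating(orientation_deg=90)
    baseline = np.zeros((128, 128))

    # Software stand-in for the hardware Gaussian smoothing filter (sigma assumed).
    smoothed_0 = gaussian_filter(grating_0, sigma=1.0)
    smoothed_90 = gaussian_filter(grating_90, sigma=1.0)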
Rapid gravity filtration operational performance assessment and diagnosis for preventative maintenance from on-line data
Rapid gravity filters, the final particulate barrier in many water treatment systems, are typically monitored using on-line turbidity, flow and head loss instrumentation. Current metrics for assessing filtration performance from on-line turbidity data were critically assessed and found not to summarise effectively and consistently the important properties of a turbidity distribution and the associated water quality risk. In the absence of a consistent risk function for turbidity in treated water, using on-line turbidity as an indicative rather than a quantitative variable appears to be more practical. Best practice suggests that filtered water turbidity should be maintained below 0.1 NTU; at higher turbidity we can be less confident of an effective particle and pathogen barrier. Based on this simple distinction, filtration performance has been described in terms of reliability and resilience by characterising the likelihood, frequency and duration of turbidity spikes greater than 0.1 NTU. This view of filtration performance is then used to frame operational diagnosis of unsatisfactory performance as a machine learning classification problem. Through calculation of operationally relevant predictor variables and application of the Classification and Regression Tree (CART) algorithm, the conditions associated with the greatest risk of poor filtration performance can be effectively modelled and communicated in operational terms. This provides a method for evidence-based decision support that can be used to efficiently manage individual pathogen barriers in a multi-barrier system.
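The sketch below illustrates the general approach described in this abstract: flag turbidity samples above the 0.1 NTU threshold and model the associated operating conditions with a CART classifier (here scikit-learn's DecisionTreeClassifier). The synthetic data, column names and tree depth are assumptions for illustration, not the study's dataset or predictor set.

    import numpy as np
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical on-line filter data: turbidity (NTU), flow and head loss per sample.
    rng = np.random.default_rng(0)
    data = pd.DataFrame({
        "turbidity_ntu": rng.lognormal(mean=-3.0, sigma=0.5, size=1000),
        "flow_mld": rng.normal(10.0, 1.0, size=1000),
        "head_loss_m": rng.normal(1.5, 0.3, size=1000),
    })

    # Reliability/resilience view: flag samples where turbidity exceeds 0.1 NTU.
    data["spike"] = (data["turbidity_ntu"] > 0.1).astype(int)

    # CART classification of poor-performance conditions from operational predictors.
    predictors = data[["flow_mld", "head_loss_m"]]
    tree = DecisionTreeClassifier(max_depth=3).fit(predictors, data["spike"])
    print(tree.feature_importances_)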
Past, Present, and Future of Simultaneous Localization And Mapping: Towards the Robust-Perception Age
Simultaneous Localization and Mapping (SLAM) consists in the concurrent construction of a model of the environment (the map) and the estimation of the state of the robot moving within it. The SLAM community has made astonishing progress over the last 30 years, enabling large-scale real-world applications and witnessing a steady transition of this technology to industry. We survey the current state of SLAM. We start by presenting what is now the de-facto standard formulation for SLAM. We then review related work, covering a broad set of topics including robustness and scalability in long-term mapping, metric and semantic representations for mapping, theoretical performance guarantees, active SLAM and exploration, and other new frontiers. This paper simultaneously serves as a position paper and tutorial for those who are users of SLAM. By looking at the published research with a critical eye, we delineate open challenges and new research issues that still deserve careful scientific investigation. The paper also contains the authors' take on two questions that often animate discussions during robotics conferences: Do robots need SLAM? And is SLAM solved?
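For reference, the de-facto standard formulation mentioned in this abstract is maximum a posteriori estimation over a factor graph; a compact sketch, as commonly written in the SLAM literature and with generic symbols, is

    X^\star \;=\; \arg\max_{X}\, p(X \mid Z)
            \;=\; \arg\min_{X} \sum_{k} \lVert h_k(X_k) - z_k \rVert^2_{\Omega_k}

where X collects the robot poses and map variables, Z = {z_k} are the measurements, h_k is the measurement model evaluated on the subset X_k of variables involved in measurement k, and Omega_k is the corresponding information matrix (assuming additive Gaussian noise).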
- …