
    Automated Processing of Webcam Images for Phenological Classification

    With global climate change, there is increasing interest in its effect on phenological patterns such as the start and end of the growing season. Scientific digital webcams are used for this purpose, taking one or more images per day of the same natural scene, showing, for example, trees or grassland sites. To derive phenological patterns from the webcam images, regions of interest are manually defined on these images by an expert, and subsequently a time series of percentage greenness is derived and analyzed with respect to structural changes. While this standard approach leads to satisfying results and allows dates of phenological change points to be determined, it involves a considerable amount of manual work and is therefore constrained to a limited number of webcams. In particular, this precludes applying the phenological analysis to a large network of publicly accessible webcams in order to capture spatial phenological variation. To scale the analysis up to several hundreds or thousands of webcams, we propose and evaluate two automated alternatives for the definition of regions of interest, allowing for efficient analyses of webcam images. A semi-supervised approach selects pixels based on the correlation of the pixels' time series of percentage greenness with a few prototype pixels. An unsupervised approach clusters pixels based on scores of a singular value decomposition. We show for a scientific webcam that the resulting regions of interest are at least as informative as those chosen by an expert, with the advantage that no manual action is required. Additionally, we show that the methods can even be applied to publicly available webcams accessed via the internet, yielding interesting partitions of the analyzed images. Finally, we show that the methods are suitable for the intended big data applications by analyzing 13,988 webcams from the AMOS database. All developed methods are implemented in the statistical software R and publicly available in the R package phenofun. Executable example code is provided as supplementary material.
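    As a concrete illustration of the semi-supervised approach described in this abstract, the following minimal Python sketch selects a region of interest by correlating each pixel's percentage-greenness series with a few expert-chosen prototype pixels. The array shapes, the 0.9 correlation threshold, and all function names are illustrative assumptions; the published implementation is the R package phenofun, not this code.

```python
# Minimal sketch (assumed shapes/threshold, not the phenofun implementation):
# keep pixels whose percentage-greenness time series correlates strongly
# with a few hand-picked prototype pixels.
import numpy as np

def percentage_greenness(images):
    """images: (T, H, W, 3) uint8 RGB stack -> (T, H, W) greenness."""
    rgb = images.astype(np.float64)
    total = rgb.sum(axis=-1) + 1e-9           # avoid division by zero
    return rgb[..., 1] / total                # G / (R + G + B)

def select_roi(images, prototypes, threshold=0.9):
    """prototypes: list of (row, col) expert-chosen pixels."""
    g = percentage_greenness(images)          # (T, H, W)
    T, H, W = g.shape
    series = g.reshape(T, -1)                 # one column per pixel
    proto = np.stack([g[:, r, c] for r, c in prototypes], axis=1).mean(axis=1)
    # Pearson correlation of every pixel's series with the prototype mean
    sz = (series - series.mean(0)) / (series.std(0) + 1e-9)
    pz = (proto - proto.mean()) / (proto.std() + 1e-9)
    corr = (sz * pz[:, None]).mean(axis=0)
    return (corr >= threshold).reshape(H, W)  # boolean ROI mask
```

    The correlation threshold trades off ROI size against homogeneity; in practice it would be tuned per camera or replaced by the paper's unsupervised SVD-based clustering.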

    Monitoring the impact of land cover change on surface urban heat island through google earth engine. Proposal of a global methodology, first applications and problems

    All over the world, the rapid urbanization process is challenging the sustainable development of our cities. In 2015, the United Nations highlighted in Goal 11 of the SDGs (Sustainable Development Goals) the importance of making cities "inclusive, safe, resilient and sustainable". In order to monitor progress towards SDG 11, proper indicators are needed, representing different aspects of city conditions, including Land Cover (LC) changes and the urban climate with its most distinct feature, the Urban Heat Island (UHI). One aspect of UHI is the Surface Urban Heat Island (SUHI), which has been investigated through airborne and satellite remote sensing over many years. The purpose of this work is to show the present potential of Google Earth Engine (GEE) to process the huge and continuously increasing volume of free satellite Earth Observation (EO) Big Data for long-term, wide spatio-temporal monitoring of SUHI and its connection with LC changes. A large-scale spatio-temporal procedure was implemented under GEE, also benefiting from the established Climate Engine (CE) tool to extract the Land Surface Temperature (LST) from Landsat imagery, and a simple indicator, the Detrended Rate Matrix, was introduced to globally represent the net effect of LC changes on SUHI. The procedure was successfully applied to six metropolitan areas in the U.S., and a general increase in SUHI due to urban growth was clearly highlighted. GEE allowed us to process more than 6000 Landsat images acquired over the period 1992-2011, performing a long-term, wide spatio-temporal study of SUHI vs. LC change. The feasibility of the proposed procedure and the encouraging results obtained, although preliminary and requiring further investigation (calibration problems related to LST determination from Landsat imagery were evident), pave the way for a possible global service on SUHI monitoring, able to supply valuable guidance for increasingly sustainable urban planning.
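    The LST-extraction step can be sketched with the Earth Engine Python API as below. The Landsat 5 Collection 2 asset ID, the ST_B6 band, and its scale/offset are assumptions to be checked against the current Earth Engine catalog; the study itself relied on the Climate Engine tool rather than this exact code.

```python
# Hedged sketch: mean Landsat 5 surface temperature over an area of
# interest via the Earth Engine Python API. Asset ID, band name and
# scale/offset are assumptions, not the paper's pipeline.
import ee

ee.Initialize()

aoi = ee.Geometry.Point([-87.63, 41.88]).buffer(20000)  # example: Chicago

def to_lst_celsius(image):
    # Collection 2 surface temperature: DN * 0.00341802 + 149.0 (Kelvin)
    lst = image.select('ST_B6').multiply(0.00341802).add(149.0).subtract(273.15)
    return lst.rename('LST').copyProperties(image, ['system:time_start'])

collection = (ee.ImageCollection('LANDSAT/LT05/C02/T1_L2')
              .filterBounds(aoi)
              .filterDate('1992-01-01', '2011-12-31')
              .map(to_lst_celsius))

mean_lst = collection.mean().reduceRegion(
    reducer=ee.Reducer.mean(), geometry=aoi, scale=120)
print(mean_lst.getInfo())  # {'LST': ...} mean surface temperature in deg C
```

    Because all reduction happens server-side, the same pattern scales to the thousands of images per metropolitan area that the study processed.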

    StructMatrix: large-scale visualization of graphs by means of structure detection and dense matrices

    Given a large-scale graph with millions of nodes and edges, how can we reveal macro patterns of interest, like cliques, bipartite cores, stars, and chains? Furthermore, how can we visualize such patterns together, gaining insights from the graph to support wise decision-making? Although there are many algorithmic and visual techniques to analyze graphs, none of the existing approaches is able to present the structural information of graphs at large scale. Hence, this paper describes StructMatrix, a methodology aimed at highly scalable visual inspection of graph structures with the goal of revealing macro patterns of interest. StructMatrix combines algorithmic structure detection and adjacency matrix visualization to present cardinality, distribution, and relationship features of the structures found in a given graph. We performed experiments on real, large-scale graphs with up to one million nodes and millions of edges. StructMatrix revealed that graphs of high relevance (e.g., Web, Wikipedia, and DBLP) have characterizations that reflect the nature of their corresponding domains; our findings have not been reported in the literature so far. We expect that our technique will bring deeper insights into large graph mining, leveraging its use for decision-making.
    Comment: 8 pages; to appear as Hugo Gualdron, Robson Cordeiro, Jose Rodrigues (2015), "StructMatrix: Large-scale visualization of graphs by means of structure detection and dense matrices", in: The Fifth IEEE ICDM Workshop on Data Mining in Networks, pp. 1-8, IEEE.
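    The core idea, grouping detected structures so they appear as dense blocks in an adjacency-matrix view, can be illustrated with a toy Python sketch. The degree-based "star detection" and all thresholds below are stand-in assumptions, not the StructMatrix algorithm.

```python
# Toy illustration of structure detection + dense-matrix visualization:
# treat high-degree nodes as star centers, then reorder the adjacency
# matrix so each star's rows/columns sit together as a visible block.
import networkx as nx
import matplotlib.pyplot as plt

G = nx.barabasi_albert_graph(500, 2, seed=42)   # example scale-free graph

# Crude "structure detection": high-degree nodes act as star centers
degrees = dict(G.degree())
centers = [n for n, d in degrees.items() if d >= 10]

# Order nodes: each star center followed by its neighbors, then the rest
order, seen = [], set()
for c in centers:
    for n in [c] + list(G.neighbors(c)):
        if n not in seen:
            order.append(n)
            seen.add(n)
order += [n for n in G.nodes() if n not in seen]

A = nx.to_numpy_array(G, nodelist=order)
plt.spy(A, markersize=0.5)                      # dense-matrix view
plt.title('Adjacency matrix with star structures grouped first')
plt.show()
```

    The paper's contribution lies in doing this detection and rendering at million-node scale; the sketch only conveys why reordering makes structures visible.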

    Impacts of The Radiation Environment At L2 On Bolometers Onboard The Herschel Space Observatory

    We present the effects of cosmic rays on the detectors onboard the Herschel satellite. In particular, we describe the glitches observed in the two types of cryogenic far-infrared bolometers inside the two instruments PACS and SPIRE. Glitch rates since launch are also reported, together with data from the SREM radiation monitors aboard the Herschel and Planck spacecraft. Both spacecraft were injected into orbits around the Lagrangian point L2 in May 2009, which allows the radiation environment in this orbit to be probed. Finally, the impacts on observations are summarized.
    Comment: 8 pages, 13 figures, 2 images. Author keywords: bolometers, infrared detectors, cryogenics, radiation effects, submillimeter wave technology. Presented at the 12th European Conference on Radiation and Its Effects on Components and Systems (RADECS 2011), Sevilla, 19-23 Sept. 2011, Session H: Radiation Environment: Space, Atmospheric and Terrestrial (PH2
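    For illustration only, flagging cosmic-ray glitches in a one-dimensional bolometer time stream might look like the Python sketch below, using sigma-clipping on point-to-point differences. Every threshold here is an assumption; the actual PACS and SPIRE deglitching pipelines are considerably more sophisticated.

```python
# Hypothetical glitch flagging by sigma-clipping the first differences
# of a time stream; illustrative only, not the Herschel pipeline.
import numpy as np

def flag_glitches(signal, nsigma=5.0):
    """Return a boolean mask of samples suspected to be glitch hits."""
    diff = np.diff(signal)
    # robust scatter estimate via the median absolute deviation
    mad = np.median(np.abs(diff - np.median(diff)))
    sigma = 1.4826 * mad
    jumps = np.abs(diff) > nsigma * sigma
    mask = np.zeros(signal.shape, dtype=bool)
    mask[1:] |= jumps          # flag the sample after each large jump
    return mask

# usage: synthetic time stream with injected cosmic-ray-like spikes
rng = np.random.default_rng(0)
ts = rng.normal(0.0, 1.0, 10000)
ts[[1200, 5300, 8700]] += 50.0
print(flag_glitches(ts).sum(), 'samples flagged')
```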

    Uncertainty quantification for radio interferometric imaging: II. MAP estimation

    Uncertainty quantification is a critical missing component in radio interferometric imaging that will only become increasingly important as the big-data era of radio interferometry emerges. Statistical sampling approaches to Bayesian inference, like Markov chain Monte Carlo (MCMC) sampling, can in principle recover the full posterior distribution of the image, from which uncertainties can then be quantified. However, for massive data sizes, like those anticipated from the Square Kilometre Array (SKA), it will be difficult if not impossible to apply any MCMC technique due to its inherent computational cost. We formulate Bayesian inference problems with sparsity-promoting priors (motivated by compressive sensing), for which we recover maximum a posteriori (MAP) point estimators of radio interferometric images by convex optimisation. Exploiting recent developments in the theory of probability concentration, we quantify uncertainties by post-processing the recovered MAP estimate. Three strategies to quantify uncertainties are developed: (i) highest posterior density credible regions; (ii) local credible intervals (cf. error bars) for individual pixels and superpixels; and (iii) hypothesis testing of image structure. These forms of uncertainty quantification provide rich information for analysing radio interferometric observations in a statistically robust manner. Our MAP-based methods are approximately 10^5 times faster computationally than state-of-the-art MCMC methods and, in addition, support highly distributed and parallelised algorithmic structures. For the first time, our MAP-based techniques provide a means of quantifying uncertainties for radio interferometric imaging for realistic data volumes and practical use, and scale to the emerging big-data era of radio astronomy.
    Comment: 13 pages, 10 figures; see the companion article in this arXiv listing.
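    A minimal sketch of the MAP step, an l1-regularised convex problem solved here with plain ISTA (iterative soft-thresholding), is given below. The measurement operator, regularisation weight, and problem sizes are illustrative assumptions; the paper's solvers target radio interferometric measurement operators at far larger scale, and its uncertainty quantification then post-processes the estimate this step produces.

```python
# Hedged sketch: MAP estimation under a sparsity-promoting (l1) prior
# via ISTA. Operator, lambda and sizes are illustrative assumptions.
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(y, Phi, lam, n_iter=500):
    """Minimise 0.5*||y - Phi x||^2 + lam*||x||_1 (the MAP objective)."""
    L = np.linalg.norm(Phi, 2) ** 2        # Lipschitz constant of gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ x - y)       # gradient of the data fidelity
        x = soft_threshold(x - grad / L, lam / L)
    return x

# usage: sparse signal recovered from noisy compressive measurements
rng = np.random.default_rng(1)
x_true = np.zeros(200)
x_true[rng.choice(200, 10, replace=False)] = 1.0
Phi = rng.normal(size=(80, 200)) / np.sqrt(80)
y = Phi @ x_true + 0.01 * rng.normal(size=80)
x_map = ista(y, Phi, lam=0.05)
print('support recovered:', np.nonzero(x_map > 0.5)[0])
```

    Because the objective is convex, the MAP estimate is unique and can be computed with distributed proximal methods, which is what makes the approach tractable where MCMC is not.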