
    EARLINET: towards an advanced sustainable European aerosol lidar network

    The European Aerosol Research Lidar Network, EARLINET, was founded in 2000 as a research project for establishing a quantitative, comprehensive, and statistically significant database for the horizontal, vertical, and temporal distribution of aerosols on a continental scale. Since then, EARLINET has continued to provide the most extensive collection of ground-based data on the aerosol vertical distribution over Europe. This paper gives an overview of the network's main developments since 2000 and introduces the dedicated EARLINET special issue, which reports on the innovative and comprehensive technical solutions and scientific results related to the use of advanced lidar remote sensing techniques for the study of aerosol properties, as developed within the network over the last 13 years. Since 2000, EARLINET has grown considerably in the number of stations and their spatial distribution: from 17 stations in 10 countries in 2000 to 27 stations in 16 countries in 2013. The network has also advanced technologically, with the spread of advanced multiwavelength Raman lidar stations across Europe. Developments in the quality assurance strategy, the optimization of instruments and data processing, and the dissemination of data have moved the network significantly toward a more sustainable observing system, with increased observing capability and reduced operational costs. Consequently, EARLINET data have already been used extensively for climatological studies, for analyses of long-range transport events, Saharan dust outbreaks, and plumes from volcanic eruptions, and for model evaluation and satellite data validation and integration. Future plans aim at continuous measurements and near-real-time data delivery in close cooperation with other ground-based networks, such as ACTRIS (Aerosols, Clouds, and Trace gases Research InfraStructure Network, www.actris.net), and with the modeling and satellite communities, linking the research community with the operational world, with the goal of establishing the atmospheric part of the European component of the integrated global observing system.

    Neo: A Learned Query Optimizer

    Query optimization is one of the most challenging problems in database systems. Despite the progress made over the past decades, query optimizers remain extremely complex components that require a great deal of hand-tuning for specific workloads and datasets. Motivated by this shortcoming and inspired by recent advances in applying machine learning to data management challenges, we introduce Neo (Neural Optimizer), a novel learning-based query optimizer that relies on deep neural networks to generate query execution plans. Neo bootstraps its query optimization model from existing optimizers and continues to learn from incoming queries, building upon its successes and learning from its failures. Furthermore, Neo naturally adapts to underlying data patterns and is robust to estimation errors. Experimental results demonstrate that Neo, even when bootstrapped from a simple optimizer like PostgreSQL, can learn a model that offers performance similar to state-of-the-art commercial optimizers and, in some cases, even surpasses them.
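    To make the learn-to-rank-plans idea concrete, here is a minimal sketch of value-model-guided join ordering in Python. The value_model below is a toy cost heuristic standing in for Neo's learned latency predictor (a deep network trained first on an existing optimizer's plans and then refined from observed query latencies); the relation names, cardinalities, and selectivity factor are invented for illustration, and this is not Neo's actual search procedure.

```python
# Hedged sketch: greedy join ordering guided by a plan-scoring model,
# in the spirit of a learned optimizer. value_model is a placeholder
# for a trained neural latency predictor.

def value_model(plan, card):
    # Toy stand-in for the learned predictor: sum of estimated
    # intermediate-result sizes (lower score = predicted faster plan).
    cost, size = 0, None
    for rel in plan:
        size = card[rel] if size is None else size * card[rel] * 0.1  # toy selectivity
        cost += size
    return cost

def greedy_join_order(relations, card):
    """Greedily extend the plan with the relation the model scores best."""
    plan, remaining = [], set(relations)
    while remaining:
        best = min(remaining, key=lambda r: value_model(plan + [r], card))
        plan.append(best)
        remaining.remove(best)
    return plan

# Invented cardinalities for three relations.
card = {"orders": 1_500_000, "lineitem": 6_000_000, "customer": 150_000}
print(greedy_join_order(card.keys(), card))  # ['customer', 'orders', 'lineitem']
```

    In the real system the model's feedback loop matters as much as the search: executed plans produce latencies that become new training data, which is what lets the optimizer improve on its bootstrap source.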

    Tupleware: Redefining Modern Analytics

    There is a fundamental discrepancy between the targeted and actual users of current analytics frameworks. Most systems are designed for the data and infrastructure of the Googles and Facebooks of the world: petabytes of data distributed across large cloud deployments consisting of thousands of cheap commodity machines. Yet the vast majority of users operate clusters ranging from a few to a few dozen nodes, analyze relatively small datasets of up to a few terabytes, and perform primarily compute-intensive operations. Targeting these users fundamentally changes the way we should build analytics systems. This paper describes the design of Tupleware, a new system specifically aimed at the challenges faced by the typical user. Tupleware's architecture brings together ideas from the database, compiler, and programming languages communities to create a powerful end-to-end solution for data analysis. We propose novel techniques that consider the data, computations, and hardware together to achieve maximum performance on a case-by-case basis. Our experimental evaluation quantifies the impact of our novel techniques and shows orders-of-magnitude performance improvements over alternative systems.
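    The abstract does not spell out the compilation strategy, but one core compiler-community idea in this space is operator fusion: executing a whole analytics pipeline in a single pass instead of materializing an intermediate collection after every operator. The Python stand-in below only illustrates that concept; Tupleware itself compiles pipelines to native code, and every name here is invented.

```python
# Hedged sketch of operator fusion: map, filter, and aggregate run in
# one tight loop with no intermediate materialization between stages.

def fused_pipeline(data, map_fn, filter_fn, agg_init, agg_fn):
    acc = agg_init
    for x in data:            # single pass over the input
        y = map_fn(x)         # "map" stage, applied inline
        if filter_fn(y):      # "filter" stage, applied inline
            acc = agg_fn(acc, y)  # "aggregate" stage, applied inline
    return acc

# Equivalent to chaining map -> filter -> sum, but fused into one loop.
total = fused_pipeline(range(1_000_000),
                       lambda x: x * 2,
                       lambda y: y % 3 == 0,
                       0,
                       lambda acc, y: acc + y)
print(total)
```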

    Pay One, Get Hundreds for Free: Reducing Cloud Costs through Shared Query Execution

    Cloud-based data analysis is now common practice because of the lower system management overhead as well as the pay-as-you-go pricing model. The pricing model, however, is not always suitable for query processing, as heavy use results in high costs. For example, in query-as-a-service systems, where users are charged per processed byte, collections of queries that frequently access the same data can become expensive. The problem is compounded by the limited options the user has to optimize query execution when using declarative interfaces such as SQL. In this paper, we show how, without modifying existing systems and without the involvement of the cloud provider, it is possible to significantly reduce the overhead, and hence the cost, of query-as-a-service systems. Our approach is based on query rewriting, so that multiple concurrent queries are combined into a single query. Our experiments show that the aggregate amount of work done under shared execution is smaller than under a query-at-a-time approach. Since queries are charged per byte processed, the cost of executing a group of queries is often the same as executing a single one of them. As an example, we demonstrate that shared execution of the TPC-H benchmark is up to 100x cheaper in Amazon Athena and up to 16x cheaper in Google BigQuery than a query-at-a-time approach, while also achieving higher throughput.
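    As a minimal illustration of the rewriting idea (a sketch, not the paper's actual rewriter), the snippet below folds several aggregate queries over the same table into one statement, so a per-byte-billed engine scans the data once instead of once per query. The table, columns, and predicates are invented for illustration.

```python
# Hedged sketch: combine N filtered aggregates over one table into a
# single SQL statement using conditional aggregation, so the engine
# (and the per-byte bill) pays for only one scan.

queries = [
    ("eu_sales",  "region = 'EU'"),
    ("us_sales",  "region = 'US'"),
    ("big_sales", "amount > 1000"),
]

select_list = ",\n  ".join(
    f"SUM(CASE WHEN {pred} THEN amount ELSE 0 END) AS {name}"
    for name, pred in queries
)
combined = f"SELECT\n  {select_list}\nFROM sales"
print(combined)  # one scan of `sales` answers all three queries
```

    Each original query becomes one output column of the combined query, which is why the bytes processed for the group can match the bytes processed for a single member.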

    Long-term monitoring of geodynamic surface deformation using SAR interferometry

    Thesis (Ph.D.), University of Alaska Fairbanks, 2014.
    Synthetic Aperture Radar Interferometry (InSAR) is a powerful tool for measuring surface deformation and is well suited to surveying active volcanoes using historical and existing satellites. However, the value and applicability of InSAR for geodynamic monitoring problems is limited by temporal decorrelation and by electromagnetic path delay variations in the atmosphere, both of which reduce the sensitivity and accuracy of the technique. This PhD thesis investigates how to optimize the quantity and quality of deformation signals extracted from InSAR stacks that contain only a small number of images, in order to facilitate volcano monitoring and the study of their geophysical signatures. In particular, the focus is on mitigating atmospheric artifacts in interferograms by combining time-series InSAR techniques with external atmospheric delay maps derived from Numerical Weather Prediction (NWP) models. The first chapter studies in depth the potential of the NWP Weather Research & Forecasting (WRF) model for InSAR data correction: forecast atmospheric delays derived from operational High Resolution Rapid Refresh for the Alaska region (HRRR-AK) products are compared to radiosonde measurements. The results suggest that the operational HRRR-AK products are a good data source for correcting atmospheric delays in spaceborne geodetic radar observations, provided the geophysical signal to be observed is larger than 20 mm. The second chapter develops an advanced method for integrating NWP products into the time-series InSAR workflow; its efficiency is tested in simulated-data experiments, which demonstrate that the method outperforms more conventional approaches. In Chapter 3, a geophysical case study applies the developed algorithm to the active volcanoes of Unimak Island, Alaska (Westdahl, Fisher, and Shishaldin) for long-term volcano deformation monitoring. The volcanic source at Westdahl is located approximately 7 km below sea level and approximately 3.5 km north of Westdahl peak. The study shows that Fisher caldera has subsided continuously for more than 10 years and that there is no evident deformation signal around Shishaldin peak.
    Chapter 1. Performance of the High Resolution Atmospheric Model HRRR-AK for Correcting Geodetic Observations from Spaceborne Radars
    Chapter 2. Robust atmospheric filtering of InSAR data based on numerical weather prediction models
    Chapter 3. Subtle motion long-term monitoring of Unimak Island from 2003 to 2010 by advanced time series SAR interferometry
    Chapter 4. Conclusion and future work
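    To illustrate the kind of correction involved (a sketch under stated assumptions, not the thesis's actual algorithm): given path delay maps predicted by a weather model for the two acquisition dates, the differential delay is converted to phase and subtracted from the interferogram. The wavelength, array names, and toy data below are all assumptions for illustration.

```python
# Hedged sketch: removing a model-predicted atmospheric phase screen
# from an unwrapped interferogram.
import numpy as np

WAVELENGTH = 0.056  # m; assumed C-band sensor (illustrative value)

def correct_interferogram(ifg_phase, delay_ref, delay_sec):
    """Subtract the predicted differential atmospheric phase screen.

    ifg_phase            : unwrapped interferometric phase (radians)
    delay_ref, delay_sec : one-way path delay maps (m) from the weather
        model for the two acquisitions, assumed already projected into
        the radar line of sight (projection not shown here).
    """
    # Two-way propagation: phase = (4*pi / wavelength) * delay difference.
    atmo_phase = (4.0 * np.pi / WAVELENGTH) * (delay_ref - delay_sec)
    return ifg_phase - atmo_phase

# Toy example: a 100x100 scene whose only signal is a synthetic delay ramp.
delay_ref = np.tile(np.linspace(0.0, 0.02, 100), (100, 1))  # 0-2 cm delay
delay_sec = np.zeros((100, 100))
ifg = (4.0 * np.pi / WAVELENGTH) * (delay_ref - delay_sec)  # pure atmosphere
residual = correct_interferogram(ifg, delay_ref, delay_sec)
print(np.abs(residual).max())  # ~0: the synthetic ramp is fully removed
```

    In the thesis workflow the delay maps come from WRF/HRRR-AK forecasts rather than synthetic ramps, and the correction is embedded in a time-series InSAR estimation rather than applied to single interferograms in isolation.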