
    Benefits of spatio-temporal modelling for short term wind power forecasting at both individual and aggregated levels

    The share of wind energy in total installed power capacity has grown rapidly in recent years around the world. Producing accurate and reliable forecasts of wind power production, together with a quantification of the uncertainty, is essential for optimally integrating wind energy into power systems. We build spatio-temporal models for wind power generation and obtain full probabilistic forecasts from 15 minutes to 5 hours ahead. A detailed analysis of forecast performance for the individual wind farms and for aggregated wind power is provided. We show that it is possible to improve forecasts of aggregated wind power by exploiting spatio-temporal correlations among individual wind farms. Furthermore, spatio-temporal models have the advantage of being able to produce spatially out-of-sample forecasts. We evaluate the predictions on a data set from wind farms in western Denmark and compare the spatio-temporal model with an autoregressive model containing a common autoregressive parameter for all wind farms, identifying the specific cases in which a spatio-temporal model matters compared with a purely temporal one. This case study demonstrates that fast and accurate forecasts of wind power generation can be obtained not only at wind farms where data are available but also for a larger portfolio that includes wind farms at new locations. The results and methodologies are relevant for wind power forecasting across the globe as well as for spatio-temporal modelling in general.
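    To make the comparison in the abstract concrete, here is a minimal sketch (not the paper's model) contrasting a common-parameter AR(1) baseline with a spatio-temporal VAR(1) whose coefficient matrix lets each farm borrow strength from its neighbours. All data, dimensions, and noise levels are invented for illustration.

```python
# A minimal sketch (not the paper's model): a shared-coefficient AR(1) baseline
# versus a spatio-temporal VAR(1) whose coefficient matrix couples neighbouring
# farms. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_farms, T = 5, 500

# Synthetic "wind power" series with spatial correlation between adjacent farms.
A_true = 0.6 * np.eye(n_farms) + 0.1 * np.eye(n_farms, k=1) + 0.1 * np.eye(n_farms, k=-1)
X = np.zeros((T, n_farms))
for t in range(1, T):
    X[t] = X[t - 1] @ A_true.T + rng.normal(scale=0.1, size=n_farms)

train, test = X[: T // 2], X[T // 2:]

# Baseline: one AR(1) coefficient shared by all farms (pooled least squares).
x_prev, x_next = train[:-1].ravel(), train[1:].ravel()
phi = (x_prev @ x_next) / (x_prev @ x_prev)

# Spatio-temporal alternative: full VAR(1) coefficient matrix via least squares.
A_hat, *_ = np.linalg.lstsq(train[:-1], train[1:], rcond=None)

# One-step-ahead forecasts, compared at the aggregated (portfolio) level.
pred_ar = phi * test[:-1]
pred_var = test[:-1] @ A_hat

def agg_rmse(pred):
    return np.sqrt(np.mean((pred.sum(axis=1) - test[1:].sum(axis=1)) ** 2))

print(f"shared-AR aggregate RMSE: {agg_rmse(pred_ar):.4f}")
print(f"VAR(1)    aggregate RMSE: {agg_rmse(pred_var):.4f}")
```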

    A simple recipe for making accurate parametric inference in finite sample

    Constructing tests or confidence regions that control the error rates in the long run is probably one of the most important problems in statistics. Yet the theoretical justification for most statistical methods is asymptotic. The bootstrap, for example, despite its simplicity and widespread usage, is an asymptotic method: in general, no claim can be made about the exactness of inferential procedures in finite samples. In this paper, we propose an alternative to the parametric bootstrap. We set up general conditions under which accurate inference can be demonstrated theoretically in finite samples.
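    For context, the following sketch shows the standard parametric bootstrap that the paper positions itself against: a confidence interval for the rate of an exponential model, whose coverage is only asymptotically guaranteed. The model, sample size, and replication count are illustrative assumptions, not the paper's proposed method.

```python
# The standard parametric bootstrap (the asymptotic method the paper contrasts
# itself with): a 95% confidence interval for the rate of an exponential model.
# Model, sample size, and replication count are illustrative.
import numpy as np

rng = np.random.default_rng(1)
data = rng.exponential(scale=1 / 2.0, size=30)   # small sample, true rate 2.0
rate_hat = 1 / data.mean()                       # maximum-likelihood estimate

# Resample from the *fitted* model and re-estimate on each replicate.
B = 2000
boot = np.array([1 / rng.exponential(scale=1 / rate_hat, size=data.size).mean()
                 for _ in range(B)])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"MLE = {rate_hat:.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
# In finite samples the interval's coverage is not exactly 95%; closing that
# gap is what the paper's finite-sample conditions are about.
```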

    Incremental Lifecycle Validation of Knowledge-Based Systems Through CommonKADS

    This dissertation introduces a novel validation method for knowledge-based systems (KBS). Validation is an essential phase in the development lifecycle of knowledge-based systems: it ensures that the system is reliable, reflects the knowledge of the expert, and meets its specifications. Although many validation methods have been introduced for knowledge-based systems, there is still a need for an incremental validation method based on a lifecycle model. Lifecycle models provide a general framework for the developer and a mapping technique from the system into the validation process. They support reusability and modularity and offer guidelines for knowledge engineers to achieve high-quality systems. CommonKADS is a set of models that helps to represent and analyze knowledge-based systems. It offers a de facto standard for building knowledge-based systems, is independent of the knowledge representation, and provides powerful models that can represent many domains. Defining an incremental validation method based on a conceptual lifecycle model such as CommonKADS has a number of advantages: it reduces time and effort, eases implementation by providing a template to follow, yields a well-structured design, and allows better tracking of errors when they occur. Moreover, the validation method introduced in this dissertation is based on case testing and the selection of an appropriate set of test cases to validate the system. The method makes use of the results of prior test cases in an incremental validation procedure, which facilitates defining a minimal set of test cases that provides complete and effective system coverage. CommonKADS does not define validation, verification, or testing in any of its models. This research seeks to establish a direct relation between validation and lifecycle models and introduces a validation method for KBS embedded into CommonKADS.
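    As a hedged illustration of the test-case selection idea, the sketch below approximates a minimal covering test set with a greedy set-cover heuristic over the knowledge-base rules each case exercises. The case-to-rule mapping and the greedy strategy are assumptions for illustration, not the dissertation's actual CommonKADS-based procedure.

```python
# Greedy set cover as a stand-in for minimal test-case selection: pick test
# cases until every knowledge-base rule exercised by some case is covered.
# The case-to-rule coverage map below is a hypothetical example.

def select_test_cases(coverage):
    """Greedily choose test cases until all rules in `coverage` are covered."""
    uncovered = set().union(*coverage.values())
    chosen = []
    while uncovered:
        # Take the case that covers the most still-uncovered rules.
        best = max(coverage, key=lambda c: len(coverage[c] & uncovered))
        chosen.append(best)
        uncovered -= coverage[best]
    return chosen

cases = {
    "tc1": {"rule_a", "rule_b"},
    "tc2": {"rule_b", "rule_c", "rule_d"},
    "tc3": {"rule_d"},
}
print(select_test_cases(cases))   # -> ['tc2', 'tc1']
```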

    Helium: lifting high-performance stencil kernels from stripped x86 binaries to Halide DSL code

    Highly optimized programs are prone to bit rot, where performance quickly becomes suboptimal in the face of new hardware and compiler techniques. In this paper we show how to automatically lift performance-critical stencil kernels from a stripped x86 binary and generate the corresponding code in the high-level domain-specific language Halide. Using Halide’s state-of-the-art optimizations targeting current hardware, we show that new optimized versions of these kernels can replace the originals to rejuvenate the application for newer hardware. The original optimized code for kernels in stripped binaries is nearly impossible to analyze statically. Instead, we rely on dynamic traces to regenerate the kernels. We perform buffer structure reconstruction to identify input, intermediate, and output buffer shapes. We abstract from a forest of concrete dependency trees, which contain absolute memory addresses, to symbolic trees suitable for high-level code generation. This is done by canonicalizing trees, clustering them based on structure, inferring higher-dimensional buffer accesses, and finally solving a set of linear equations based on buffer accesses to lift them up to simple, high-level expressions. Helium can handle highly optimized, complex stencil kernels with input-dependent conditionals. We lift seven kernels from Adobe Photoshop, giving a 75% performance improvement; four kernels from IrfanView, leading to a 4.97× performance improvement; and one stencil from the miniGMG multigrid benchmark, netting a 4.25× improvement. We manually rejuvenated Photoshop by replacing eleven of its filters with our lifted implementations, giving a 1.12× speedup without affecting the user experience.
    United States. Dept. of Energy (Award DE-SC0005288); United States. Dept. of Energy (Award DE-SC0008923); United States. Defense Advanced Research Projects Agency (Agreement FA8759-14-2-0009); MIT Energy Initiative (Fellowship)
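    The final lifting step described above, solving linear equations over traced buffer accesses, can be sketched as follows: given concrete (x, y) loop coordinates and the absolute addresses touched at them, least squares recovers a symbolic affine access of the form base + x*stride_x + y*stride_y. The numbers are synthetic, and Helium's actual reconstruction is considerably more involved.

```python
# Recovering a symbolic affine buffer access from a dynamic trace by solving
# linear equations, in the spirit of the abstract's final lifting step.
# Coordinates, strides, and the base address are synthetic.
import numpy as np

# Observed in a trace: (x, y) loop coordinates and the address accessed there.
coords = np.array([[0, 0], [1, 0], [0, 1], [3, 2], [5, 7]], dtype=np.int64)
base_true, sx_true, sy_true = 0x7F0000, 4, 4096           # hidden ground truth
addrs = base_true + coords[:, 0] * sx_true + coords[:, 1] * sy_true

# Solve [x, y, 1] @ [stride_x, stride_y, base]^T = addr by least squares.
A = np.column_stack([coords, np.ones(len(coords))])
(sx, sy, base), *_ = np.linalg.lstsq(A, addrs, rcond=None)
print(f"addr = {base:.0f} + x*{sx:.0f} + y*{sy:.0f}")     # lifted expression
```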

    Moment tensor inversions of icequakes on Gornergletscher, Switzerland

    We have determined seismic source mechanisms for shallow and intermediate-depth icequake clusters recorded on the glacier Gornergletscher, Switzerland, during the summers of 2004 and 2006. The selected seismic events are part of a large data set of over 80,000 seismic events acquired with a dense seismic network deployed to study the yearly rapid drainage of Gornersee lake, a nearby ice-marginal lake. Using simple frequency and distance scaling and Green’s functions for a homogeneous half-space, we calculated moment tensor solutions for icequakes with M_w ≈ −1.5 using a full-waveform inversion method usually applied to moderate seismic events (M_w > 4) recorded at local to regional distances (≈50–700 km). Inversions of typical shallow events are shown to represent tensile crack openings. This explains well the dominant Rayleigh waves and compressive first motions observed in all recorded seismograms. As these characteristics can be observed in most icequake signals, we believe that the vast majority of icequakes recorded in the two years are due to tensile faulting, most likely caused by the opening of surface crevasses. We also identified a shallow cluster with somewhat atypical waveforms, in that they show less dominant Rayleigh waves and quadrantal radiation patterns of first motions. Their moment tensors are dominated by a large double-couple component, which is strong evidence for shear faulting. Although fewer than a dozen such icequakes have been identified, this is a substantial result, as it shows that shear faulting in glacier ice is possible even in the absence of extreme flow changes such as those during glacier surges. A third source of icequakes was located at 100 m depth. These sources can also be represented by tensile crack openings. Because of the high hydrostatic pressure within the ice at these depths, these events are most likely related to the presence of water lenses that reduce the effective stress enough to allow tensile faulting.
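    The distinction drawn in the abstract between tensile-crack and double-couple (shear) sources can be illustrated with the textbook decomposition of a moment tensor into isotropic, double-couple, and CLVD parts (e.g. following Jost and Herrmann, 1989). The sketch below uses synthetic tensors and a crude normalisation; it is not the authors' inversion code.

```python
# Standard moment tensor decomposition used to tell tensile cracks from shear
# faulting: split M into isotropic plus deviatoric parts, then measure how
# double-couple (DC) the deviatoric part is. Example tensors are synthetic.
import numpy as np

def decompose(M):
    """Return (isotropic %, DC %, CLVD %) of a symmetric 3x3 moment tensor."""
    iso = np.trace(M) / 3.0
    dev = M - iso * np.eye(3)
    e = np.linalg.eigvalsh(dev)                    # deviatoric eigenvalues
    e = e[np.argsort(np.abs(e))]                   # sort by |eigenvalue|
    eps = -e[0] / abs(e[2]) if e[2] != 0 else 0.0  # 0 = pure DC, 0.5 = pure CLVD
    m0 = abs(iso) + abs(e[2])                      # crude size normalisation
    p_iso = 100 * abs(iso) / m0
    p_clvd = (100 - p_iso) * 2 * abs(eps)
    return p_iso, (100 - p_iso) - p_clvd, p_clvd

shear = np.diag([1.0, 0.0, -1.0])                  # pure double-couple source
crack = np.diag([1.0, 1.0, 3.0])                   # tensile crack (lambda = mu)
for name, M in [("shear", shear), ("tensile crack", crack)]:
    i, dc, clvd = decompose(M)
    print(f"{name}: iso {i:.0f}%, DC {dc:.0f}%, CLVD {clvd:.0f}%")
```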

    MODIS Information, Data, and Control System (MIDACS) system specifications and conceptual design

    The MODIS Information, Data, and Control System (MIDACS) Specifications and Conceptual Design Document discusses system-level requirements, the overall operating environment in which those requirements must be met, and a breakdown of MIDACS into component subsystems, which include the Instrument Support Terminal, the Instrument Control Center, the Team Member Computing Facility, the Central Data Handling Facility, and the Data Archive and Distribution System. The specifications include sizing estimates for the processing and storage capacities of each data system element, as well as traffic analyses of data flows between the elements internally and across the external data system interfaces. The specifications for the data system as a whole, and for the individual planning and scheduling, control and monitoring, data acquisition and processing, calibration and validation, and data archive and distribution components, do not yet specify the data system in the complete manner needed to achieve the scientific objectives of the MODIS instruments and science teams. The teams have not yet been formed; however, it was possible to develop the specifications and conceptual design based on the present concept of EosDIS, the Level-1 and Level-2 Functional Requirements Documents, the Operations Concept, and interviews and meetings with key members of the scientific community.