Using conditional kernel density estimation for wind power density forecasting
Of the various renewable energy resources, wind power is widely recognized as one of the most promising. The management of wind farms and electricity systems can benefit greatly from the availability of estimates of the probability distribution of wind power generation. However, most research has focused on point forecasting of wind power. In this paper, we develop an approach to producing density forecasts for the wind power generated at individual wind farms. Our interest is in intraday data and prediction from 1 to 72 hours ahead. We model wind power in terms of wind speed and wind direction. In this framework, there are two key uncertainties. First, there is the inherent uncertainty in wind speed and direction, and we model this using a bivariate VARMA-GARCH (vector autoregressive moving average-generalized autoregressive conditional heteroscedastic) model, with a Student t distribution, in the Cartesian space of wind speed and direction. Second, there is the stochastic nature of the relationship of wind power to wind speed (described by the power curve), and to wind direction. We model this using conditional kernel density (CKD) estimation, which enables a nonparametric modeling of the conditional density of wind power. Using Monte Carlo simulation of the VARMA-GARCH model and CKD estimation, density forecasts of wind speed and direction are converted to wind power density forecasts. Our work is novel in several respects: previous wind power studies have not modeled a stochastic power curve; to accommodate time evolution in the power curve, we incorporate a time decay factor within the CKD method; and the CKD method is conditional on a density, rather than a single value. The new approach is evaluated using datasets from four Greek wind farms.
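The core CKD idea with a time decay factor can be illustrated in a few lines. The sketch below, under assumed Gaussian kernels and exponential decay (the bandwidths `h_w`, `h_p` and the decay rate `decay` are illustrative choices, not the paper's values), weights historical power observations by their wind-speed similarity to the query and by their age, then forms a weighted kernel density over power:

```python
import numpy as np

def ckd_density(power_grid, w_obs, p_obs, w_query, h_w=0.5, h_p=0.05, decay=0.99):
    """Sketch of conditional kernel density estimation of f(power | wind speed),
    with an exponential time-decay factor down-weighting older observations
    (observations are assumed ordered oldest first)."""
    n = len(w_obs)
    age = np.arange(n)[::-1]                 # 0 for the newest observation
    time_w = decay ** age                    # time-decay factor
    # kernel weights in the conditioning variable (wind speed)
    kw = np.exp(-0.5 * ((w_obs - w_query) / h_w) ** 2) * time_w
    kw /= kw.sum()
    # weighted Gaussian kernel density over the power grid
    dens = np.zeros_like(power_grid, dtype=float)
    for weight, p in zip(kw, p_obs):
        dens += weight * np.exp(-0.5 * ((power_grid - p) / h_p) ** 2)
    dens /= h_p * np.sqrt(2 * np.pi)
    return dens
```

In the paper's full method the conditioning input is itself a density (the VARMA-GARCH forecast of speed and direction), handled by Monte Carlo simulation; the sketch above conditions on a single speed value for clarity.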
On machine learning-based techniques for future sustainable and resilient energy systems
The permanently increasing penetration of converter-interfaced generation and renewable energy sources (RESs) makes modern electrical power systems more vulnerable to low-probability, high-impact events, such as extreme weather, which can lead to severe contingencies, even blackouts. These contingencies can propagate to neighboring energy systems through coupling components and technologies, and consequently degrade the operation and resilience of the entire multi-energy system (MES), spanning gas, heating, and electricity. In recent years, machine learning-based techniques (MLBTs) have been intensively applied to various power system problems, including system planning and security and reliability assessment. This paper reviews MES resilience quantification methods and the application of MLBTs to assessing the resilience level of future sustainable energy systems. Open research questions are identified and discussed, and future research directions are outlined.
Transistors
Contains reports on eight research projects. Lincoln Laboratory under Contract AF19(122)-45.
Universality of finite-size corrections to the number of critical percolation clusters
Monte Carlo simulations on a variety of 2D percolating systems at criticality suggest that the excess number of clusters in finite systems over the bulk value of n_c is a universal quantity, dependent upon the system shape but independent of the lattice and percolation type. Values of n_c are found to high accuracy, and for bond percolation confirm the theoretical predictions of Temperley and Lieb, and of Baxter, Temperley, and Ashley, which we have evaluated explicitly in terms of simple algebraic numbers. Predictions for the fluctuations are also verified for the first time.
Comment: 13 pages, 2 figs., LaTeX, submitted to Phys. Rev. Lett.
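The quantity studied here, the number of clusters per site n_c, is straightforward to estimate numerically. A minimal sketch (a plain depth-first-search cluster count on an open-boundary square lattice, not the authors' production code) is:

```python
import numpy as np

def clusters_per_site(L, p, rng):
    """Count clusters per site for one realization of site percolation
    on an L x L square lattice (open boundaries) via depth-first search."""
    occ = rng.random((L, L)) < p             # occupied sites
    seen = np.zeros((L, L), bool)
    n_clusters = 0
    for sx in range(L):
        for sy in range(L):
            if occ[sx, sy] and not seen[sx, sy]:
                n_clusters += 1              # found a new cluster; flood it
                stack = [(sx, sy)]
                seen[sx, sy] = True
                while stack:
                    x, y = stack.pop()
                    for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                        if 0 <= nx < L and 0 <= ny < L and occ[nx, ny] and not seen[nx, ny]:
                            seen[nx, ny] = True
                            stack.append((nx, ny))
    return n_clusters / (L * L)
```

Averaging this over many realizations at the critical p, and over several system sizes L, gives the finite-size estimates of n_c whose excess over the bulk value is the universal quantity discussed in the abstract.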
Efficient Monte Carlo algorithm and high-precision results for percolation
We present a new Monte Carlo algorithm for studying site or bond percolation on any lattice. The algorithm allows us to calculate quantities such as the cluster size distribution or spanning probability over the entire range of site or bond occupation probabilities from zero to one in a single run, which takes an amount of time scaling linearly with the number of sites on the lattice. We use our algorithm to determine that the percolation transition occurs at occupation probability 0.59274621(13) for site percolation on the square lattice, and to provide clear numerical confirmation of the conjectured 4/3-power stretched-exponential tails in the spanning probability functions.
Comment: 8 pages, including 3 postscript figures, minor corrections in this version, plus updated figures for the position of the percolation transition.
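The linear-time single-run behavior comes from adding sites one at a time in random order and merging clusters with a union-find structure, so that the state after n additions reuses the state after n-1. A minimal sketch of this sweep for site percolation on the square lattice (tracking the largest cluster; open boundaries, simplified from the published algorithm) is:

```python
import numpy as np

def percolation_sweep(L, rng=None):
    """Add the L*L sites of a square lattice in random order, merging
    clusters with union-find (path halving + union by size), and record
    the largest cluster size after each addition."""
    rng = rng if rng is not None else np.random.default_rng()
    n = L * L
    parent = np.full(n, -1)                  # -1 marks an unoccupied site
    size = np.ones(n, dtype=int)

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]    # path halving
            i = parent[i]
        return i

    largest = np.zeros(n, dtype=int)
    big = 0
    for step, s in enumerate(rng.permutation(n)):
        parent[s] = s                        # occupy site s as its own cluster
        x, y = s % L, s // L
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if 0 <= nx < L and 0 <= ny < L:
                t = ny * L + nx
                if parent[t] != -1:          # occupied neighbour: union
                    r1, r2 = find(s), find(t)
                    if r1 != r2:
                        if size[r1] < size[r2]:
                            r1, r2 = r2, r1
                        parent[r2] = r1      # attach smaller root to larger
                        size[r1] += size[r2]
        big = max(big, size[find(s)])
        largest[step] = big
    return largest
```

One such sweep yields the observable at every occupied-site count n simultaneously, which is what makes measuring over the whole range of occupation probabilities cost only linear time.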
Comparing apples and oranges: assessment of the relative video quality in the presence of different types of distortions
Video quality assessment is essential for the performance analysis of visual communication applications. Objective metrics can be used for estimating the relative quality differences, but they typically give reliable results only if the compared videos contain similar types of quality distortion. However, video compression typically produces different kinds of visual artifacts than transmission errors. In this article, we focus on a novel subjective quality assessment method that is suitable for comparing different types of quality distortions. The proposed method has been used to evaluate how well different objective quality metrics estimate the relative subjective quality levels for content with different types of quality distortions. Our conclusion is that none of the studied objective metrics works reliably for assessing the co-impact of compression artifacts and transmission errors on the subjective quality. Nevertheless, we have observed that the objective metrics' tendency to either over- or underestimate the perceived impact of transmission errors has a high correlation with the spatial and temporal activity levels of the content. Therefore, our results can be useful for improving the performance of objective metrics in the presence of both source and channel distortions.
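Evaluating "how well an objective metric estimates relative subjective quality" typically reduces to a rank correlation between objective scores and subjective ratings across test clips. A minimal Spearman correlation sketch (Pearson correlation of ranks; no tie handling, which a real evaluation would need) is:

```python
import numpy as np

def spearman(objective_scores, subjective_scores):
    """Spearman rank correlation, computed as the Pearson correlation
    of the two rank vectors.  Assumes no ties among the scores."""
    rx = np.argsort(np.argsort(objective_scores)).astype(float)
    ry = np.argsort(np.argsort(subjective_scores)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx * ry).sum() / np.sqrt((rx ** 2).sum() * (ry ** 2).sum()))
```

A correlation near 1 across clips with mixed distortion types would indicate a metric that ranks compression artifacts and transmission errors consistently with viewers; the abstract reports that none of the studied metrics achieves this reliably.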
A fast Monte Carlo algorithm for site or bond percolation
We describe in detail a new and highly efficient algorithm for studying site or bond percolation on any lattice. The algorithm can measure an observable quantity in a percolation system for all values of the site or bond occupation probability from zero to one in an amount of time which scales linearly with the size of the system. We demonstrate our algorithm by using it to investigate a number of issues in percolation theory, including the position of the percolation transition for site percolation on the square lattice, the stretched-exponential behavior of spanning probabilities away from the critical point, and the size of the giant component for site percolation on random graphs.
Comment: 17 pages, 13 figures. Corrections and some additional material in this version. Accompanying material can be found on the web at http://www.santafe.edu/~mark/percolation
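Getting results "for all values of the occupation probability" from a single run rests on a convolution step: an observable measured at each fixed number n of occupied sites (the microcanonical ensemble) is combined with binomial weights to give its average at any probability p. A sketch of that step, using log-gamma for numerically stable binomial coefficients (valid for 0 < p < 1), is:

```python
import math

def canonical_average(Q_micro, p):
    """Convolve microcanonical results with the binomial distribution.

    Q_micro[n] is the observable measured with exactly n of N sites
    occupied (n = 0..N); the return value is the canonical average
    Q(p) = sum_n C(N, n) p**n (1-p)**(N-n) * Q_micro[n], for 0 < p < 1.
    Binomial weights are evaluated in log space to avoid overflow."""
    N = len(Q_micro) - 1
    total = 0.0
    for n, q in enumerate(Q_micro):
        log_w = (math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)
                 + n * math.log(p) + (N - n) * math.log(1 - p))
        total += math.exp(log_w) * q
    return total
```

Because the microcanonical values are collected once, evaluating Q(p) at any p afterward costs only this O(N) sum, which is what lets a single linear-time run cover the whole range of p.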
A reduced-reference perceptual image and video quality metric based on edge preservation
In image and video compression and transmission, it is important to rely on an objective image/video quality metric which accurately represents the subjective quality of processed images and video sequences. In some scenarios, it is also important to evaluate the quality of the received video sequence with minimal reference to the transmitted one. For instance, for quality improvement of video transmission through closed-loop optimisation, the video quality measure can be evaluated at the receiver and provided as feedback information to the system controller. The original image/video sequence (prior to compression and transmission) is not usually available at the receiver side, so it is important to rely at the receiver on an objective video quality metric that requires no reference, or only minimal reference, to the original video sequence. The observation that the human eye is very sensitive to edge and contour information of an image underpins the proposal of our reduced-reference (RR) quality metric, which compares edge information between the distorted and the original image. Results highlight that the metric correlates well with subjective observations, also in comparison with commonly used full-reference metrics and with a state-of-the-art RR metric. © 2012 Martini et al.
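The edge-comparison idea can be sketched compactly: extract edge maps from the reference and distorted frames and score how well edges are preserved. The toy version below uses Sobel gradient magnitudes and a simple preserved-edge ratio; it is illustrative of the edge-preservation principle only, not the metric proposed in the paper (thresholds and the score definition are assumptions):

```python
import numpy as np

def sobel_magnitude(img):
    """Gradient magnitude via 3x3 Sobel kernels (valid region only)."""
    kx = np.array([[1, 0, -1], [2, 0, -2], [1, 0, -1]], float)
    ky = kx.T
    H, W = img.shape
    gx = np.zeros((H - 2, W - 2))
    gy = np.zeros((H - 2, W - 2))
    for i in range(3):                       # accumulate the 3x3 correlation
        for j in range(3):
            patch = img[i:i + H - 2, j:j + W - 2]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy)

def edge_preservation_score(reference, distorted, thresh=0.5):
    """Toy edge-preservation score in [0, 1]: fraction of the reference's
    strong edge pixels that remain strong edges in the distorted frame."""
    e_ref = sobel_magnitude(reference)
    e_dis = sobel_magnitude(distorted)
    strong_ref = e_ref > thresh * e_ref.max()
    strong_dis = e_dis > thresh * e_dis.max()
    preserved = np.logical_and(strong_ref, strong_dis).sum()
    return preserved / max(strong_ref.sum(), 1)
```

In a reduced-reference setting, only a compact edge descriptor of the reference (not the full frame) would be transmitted as side information and compared at the receiver, which is what keeps the reference overhead minimal.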
Exact results at the 2-D percolation point
We derive exact expressions for the excess number of clusters b and the excess cumulants b_n of a related quantity at the 2D percolation point. High-accuracy computer simulations are in accord with our predictions. b is a finite-size correction to the Temperley-Lieb or Baxter-Temperley-Ashley formula for the number of clusters per site n_c in the infinite-system limit; the b_n correct bulk cumulants. b and b_n are universal, and thus depend only on the system's shape. Higher-order corrections show no apparent dependence on fractional powers of the system size.
Comment: 12 pages, 2 figures, LaTeX, submitted to Physical Review Letters.