
    Wind Energy Forecasting: A Collaboration of the National Center for Atmospheric Research (NCAR) and Xcel Energy

    The focus of this report is the wind forecasting system developed during this contract period, with performance results through the end of 2010. The report is intentionally high-level; technical details have been disseminated at various conferences and in academic papers. At the end of 2010, Xcel Energy managed the output of 3,372 megawatts of installed wind energy. The wind plants span three operating companies, serve customers in eight states, and participate in three market structures. The great majority of the wind energy is contracted through power purchase agreements (PPAs). The remainder is utility-owned, Qualifying Facilities (QF), distributed resources (i.e., 'behind the meter'), or merchant entities within Xcel Energy's Balancing Authority footprints. Regardless of the contractual or ownership arrangements, the output of the wind energy is balanced by Xcel Energy's generation resources, which include fossil, nuclear, and hydro-based facilities that are owned or contracted via PPAs. These facilities are committed and dispatched, or bid into day-ahead and real-time markets, by Xcel Energy's Commercial Operations department.
    Wind energy complicates the short- and long-term planning goals of least-cost, reliable operations. Because wind energy production is uncertain, the suboptimal commitment and dispatch that follow from imperfect wind forecasts drive up costs. For example, a gas combined-cycle unit may be turned on, or committed, in anticipation of low winds. If winds instead stay high, this unit and others are forced to run, or be dispatched, at suboptimal loading positions. In addition, commitment decisions are frequently irreversible due to minimum up- and down-time constraints: a dispatcher lives with inefficient decisions made in prior periods. In general, uncertainty contributes to conservative operations, committing more units and keeping them on longer than may have been necessary to maintain reliability. The downside is higher costs. In organized electricity markets, units that are committed for reliability reasons are paid their offer price even when prevailing market prices are lower. Often, these uplift charges are allocated to the market participants that caused the inefficient dispatch in the first place. Thus, wind energy facilities are burdened with a share of costs proportional to their forecast errors.
    For Xcel Energy, wind energy uncertainty costs manifest differently under each market structure. In the Public Service Company of Colorado (PSCo), inefficient commitment and dispatch caused by wind uncertainty increases fuel costs. Wind resources participating in the Midwest Independent System Operator (MISO) footprint make substantial payments in the real-time markets to true up their day-ahead positions and are additionally burdened with deviation charges, called a Revenue Sufficiency Guarantee (RSG), to cover out-of-market costs associated with operations. Southwestern Public Service (SPS) wind plants both cause commitment inefficiencies and incur Southwest Power Pool (SPP) imbalance charges due to wind uncertainty and variability.
    Wind energy forecasting helps mitigate these costs. Wind integration studies for the PSCo and Northern States Power (NSP) operating companies have projected that forecast-error costs will increase as more wind is installed on the system; it follows that reducing forecast error would reduce these costs. This is echoed by large-scale studies in neighboring regions and states that have recommended adopting state-of-the-art wind forecasting tools in day-ahead and real-time planning and operations. Further, Xcel Energy concluded that a one-percent reduction in normalized mean absolute error would have reduced 2008 costs by over $1 million annually in PSCo alone. The value of reducing forecast error prompted Xcel Energy to make substantial investments in wind energy forecasting research and development.
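
    As a concrete illustration of the metric behind that dollar estimate, here is a minimal sketch of computing normalized mean absolute error (NMAE) for a wind fleet. Normalizing by installed capacity is one common convention but an assumption here, as the report does not state which normalization it used; all numbers other than the 3,372 MW fleet size are illustrative.

```python
import numpy as np

def normalized_mae(forecast_mw, actual_mw, capacity_mw):
    """Mean absolute forecast error, normalized by installed capacity.

    Normalizing by capacity is one common convention for wind NMAE;
    the report does not specify its convention (assumption).
    """
    errors = np.abs(np.asarray(forecast_mw) - np.asarray(actual_mw))
    return errors.mean() / capacity_mw

# Hypothetical hour-by-hour data for a 3,372 MW fleet (illustrative only).
rng = np.random.default_rng(0)
actual = rng.uniform(0, 3372, size=24)
forecast = actual + rng.normal(0, 200, size=24)  # 200 MW forecast noise
print(f"NMAE: {normalized_mae(forecast, actual, 3372.0):.2%}")
```

    Per the report's estimate above, each percentage-point reduction in this figure was worth over $1 million annually in PSCo.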

    Redundant variables and Granger causality

    We discuss the use of multivariate Granger causality in the presence of redundant variables: applying the standard analysis in this case leads to under-estimation of causalities. Using the un-normalized version of the causality index, we quantitatively develop the notions of redundancy and synergy in the frame of causality and propose two approaches to group redundant variables: (i) for a given target, the remaining variables are grouped so as to maximize the total causality, and (ii) the whole set of variables is partitioned to maximize the sum of the causalities between subsets. We show an application to a real neurological experiment, aiming at a deeper understanding of the physiological basis of abnormal neuronal oscillations in the migraine brain. Our approach reveals the change in the informational pattern due to repetitive transcranial magnetic stimulation. Comment: 4 pages, 5 figures. Accepted for publication in Physical Review.
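
    To make the quantities concrete, the sketch below computes a lag-1 linear Granger causality index between two scalar series; with normalized=False it returns the plain difference of residual variances, one un-normalized variant of the index. The exact definition and estimator used in the paper are assumptions here.

```python
import numpy as np

def residual_variance(y, X):
    """Variance of least-squares residuals of y on X (with intercept)."""
    X = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

def granger_index(x, y, normalized=True):
    """Granger causality y -> x at lag 1: does y's past improve the
    prediction of x beyond x's own past?  normalized=True gives the
    usual log-ratio of residual variances; normalized=False gives the
    plain variance difference (an un-normalized variant)."""
    x_now, x_past, y_past = x[1:], x[:-1], y[:-1]
    v_restricted = residual_variance(x_now, x_past[:, None])
    v_full = residual_variance(x_now, np.column_stack([x_past, y_past]))
    return np.log(v_restricted / v_full) if normalized else v_restricted - v_full

# Toy system in which y drives x; illustrative only.
rng = np.random.default_rng(1)
y = rng.normal(size=2001)
x = np.zeros(2001)
x[1:] = 0.5 * y[:-1] + 0.1 * rng.normal(size=2000)
print(granger_index(x, y))  # clearly positive
print(granger_index(y, x))  # approximately zero
```

    In the redundancy setting of the abstract, such indices computed over grouped driver sets are what the two proposed grouping strategies maximize.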

    Comment on “Surface Plasmons and Nonlocality: A Simple Model”

    In the Comment [1], Schaich calculated the mode dispersion of surface plasmons supported by a planar metal-insulator-metal (MIM) structure and concluded that our model [2] fails to mimic the effect of nonlocality at high frequencies. Here, we clarify the differences between our calculations and those in Schaich's Comment, and highlight the validity of our model for a general class of plasmonic structures. Published version.

    Statistical Communication Theory

    Contains reports on four research projects.

    Multivariate Granger Causality and Generalized Variance

    Granger causality analysis is a popular method for inference on directed interactions in complex systems of many variables. A shortcoming of the standard framework for Granger causality is that it only allows for examination of interactions between single (univariate) variables within a system, perhaps conditioned on other variables. However, interactions do not necessarily take place between single variables, but may occur among groups, or "ensembles", of variables. In this study we establish a principled framework for Granger causality in the context of causal interactions among two or more multivariate sets of variables. Building on Geweke's seminal 1982 work, we offer new justifications for one particular form of multivariate Granger causality based on the generalized variances of residual errors. Taken together, our results support a comprehensive and theoretically consistent extension of Granger causality to the multivariate case. Treated individually, they highlight several specific advantages of the generalized variance measure, which we illustrate using applications in neuroscience as an example. We further show how the measure can be used to define "partial" Granger causality in the multivariate context, and we motivate reformulations of "causal density" and "Granger autonomy". Our results are directly applicable to experimental data and promise to reveal new types of functional relations in complex systems, neural and otherwise. Comment: added 1 reference, minor change to discussion, typos corrected. 28 pages, 3 figures, 1 table, LaTeX.
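
    A minimal sketch of the generalized-variance form follows: the causality from a source ensemble to a target ensemble is the log-ratio of the determinants of the residual covariance matrices of the restricted and full regressions. Restricting to a single lag and the toy data are assumptions; the measure is usually defined over a full VAR model order.

```python
import numpy as np

def residual_cov(Y, X):
    """Residual covariance of a multivariate least-squares fit of Y on X."""
    X = np.column_stack([np.ones(len(X)), X])
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    R = Y - X @ B
    return np.cov(R, rowvar=False)

def mv_granger_gv(target, source):
    """Granger causality from a source ensemble to a target ensemble
    (lag 1), in generalized-variance form: ln det(Sigma_restricted)
    minus ln det(Sigma_full).  Arrays are shaped (time, variables)."""
    t_now = target[1:]
    restricted = residual_cov(t_now, target[:-1])
    full = residual_cov(t_now, np.column_stack([target[:-1], source[:-1]]))
    _, logdet_r = np.linalg.slogdet(restricted)
    _, logdet_f = np.linalg.slogdet(full)
    return logdet_r - logdet_f

# Toy ensembles: a 2-variable source driving a 2-variable target.
rng = np.random.default_rng(2)
S = rng.normal(size=(3001, 2))
T = np.zeros((3001, 2))
T[1:] = 0.4 * S[:-1] + 0.2 * rng.normal(size=(3000, 2))
print(mv_granger_gv(T, S))  # clearly positive
print(mv_granger_gv(S, T))  # approximately zero
```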

    Farm service agency employee intentions to use weather and climate data in professional services

    Agricultural service providers often work closely with producers and are well positioned to include weather and climate change information in the services they provide. By doing so, they can help producers reduce risks due to climate variability and change. A national survey of United States Department of Agriculture Farm Service Agency (FSA) field staff (n = 4621) was conducted in 2016. The survey was designed to assess FSA employees' use of climate and weather-related data and to explore their perspectives on climate change, attitudes toward adaptation, and concerns regarding climate- and weather-driven risks. Two structural equation models were developed to explore relationships between these factors and to predict respondents' willingness to integrate climate and weather data into their professional services in the future. The two models were compared to assess the relative influence of respondents' current use of weather and climate information. Findings suggest that respondents' perceptions of weather-related risk, in combination with their personal observations of weather variability, help predict whether an individual intends to use weather and climate information in the future. Importantly, climate change belief is not a significant predictor of this intention; however, the belief that producers will have to adapt to climate change in order to remain viable is. Surprisingly, whether or not an individual currently uses weather and climate information is not a good predictor of whether they intend to in the future. This suggests that there are opportunities to increase employee exposure to, and proficiency with, weather and climate information to meet the needs of American farmers by helping them to reduce risk.
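
    For readers unfamiliar with the modeling approach, the sketch below fits a toy path model of the kind a structural equation model formalizes, using plain least squares on simulated data. All variable names, effect sizes, and the simplification to observed (non-latent) variables are assumptions; the study's actual models were estimated with SEM machinery and include constructs not reproduced here.

```python
import numpy as np

# Simulated survey-like data; all effect sizes are hypothetical.
rng = np.random.default_rng(6)
n = 4621  # matches the survey sample size

weather_observations = rng.normal(size=n)
risk_perception = 0.5 * weather_observations + rng.normal(size=n)
adaptation_belief = rng.normal(size=n)
intention = (0.4 * risk_perception + 0.3 * adaptation_belief
             + 0.2 * weather_observations + rng.normal(size=n))

def path_coefs(y, X):
    """Standardized OLS path coefficients of y on the columns of X."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), Xs]), ys,
                               rcond=None)
    return beta[1:]

# Structural part: regress each endogenous variable on its predictors.
print("risk <- observations:",
      path_coefs(risk_perception, weather_observations[:, None]))
print("intention <- risk, belief, observations:",
      path_coefs(intention, np.column_stack([risk_perception,
                                             adaptation_belief,
                                             weather_observations])))
```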

    Noise-induced dynamics in bistable systems with delay

    Noise-induced dynamics of a prototypical bistable system with delayed feedback is studied theoretically and numerically. For small noise and feedback magnitude, the problem is reduced to the analysis of a two-state model with transition rates depending on the earlier state of the system. In this two-state approximation, we find analytical formulae for the autocorrelation function, the power spectrum, and the linear response to a periodic perturbation. They show very good agreement with direct numerical simulations of the original Langevin equation. The power spectrum has a pronounced peak at the frequency corresponding to the inverse delay time, whose amplitude has a maximum at a certain noise level, thus demonstrating coherence resonance. The linear response to the external periodic force also has maxima at the frequencies corresponding to the inverse delay time and its harmonics. Comment: 4 pages, 4 figures, submitted to Physical Review Letters.
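
    The sketch below integrates a prototypical delayed bistable Langevin equation with the Euler-Maruyama method and reports the spectral peak, which should sit near the inverse delay time. The specific drift x - x**3 + eps*x(t - tau), the parameter values, and the zero initial history are assumptions, not taken from the paper.

```python
import numpy as np

# Euler-Maruyama integration of a prototypical bistable Langevin
# equation with delayed feedback (assumed form):
#     dx/dt = x - x**3 + eps * x(t - tau) + sqrt(2 * D) * xi(t)
dt, tau, eps, D = 0.01, 10.0, 0.1, 0.15
n_delay = int(tau / dt)
n_steps = 200_000

rng = np.random.default_rng(3)
x = np.zeros(n_steps)  # zero initial history (assumption)
kicks = np.sqrt(2 * D * dt) * rng.normal(size=n_steps)
for i in range(1, n_steps):
    x_delayed = x[i - 1 - n_delay] if i - 1 >= n_delay else 0.0
    drift = x[i - 1] - x[i - 1] ** 3 + eps * x_delayed
    x[i] = x[i - 1] + drift * dt + kicks[i]

# The power spectrum should show a peak near the inverse delay time;
# sweeping D would reveal the noise level that maximizes it
# (coherence resonance).
spectrum = np.abs(np.fft.rfft(x - x.mean())) ** 2
freqs = np.fft.rfftfreq(n_steps, d=dt)
print("spectral peak at f =", freqs[1:][np.argmax(spectrum[1:])])
```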

    Increasing the Power to Detect Causal Associations by Combining Genotypic and Expression Data in Segregating Populations

    To dissect common human diseases such as obesity and diabetes, a systematic approach is needed to study how genes interact with one another, and with genetic and environmental factors, to determine clinical end points or disease phenotypes. Bayesian networks provide a convenient framework for extracting relationships from noisy data and are frequently applied to large-scale data to derive causal relationships among variables of interest. Given the complexity of the molecular networks underlying common human disease traits, and the fact that biological networks can change depending on environmental conditions and genetic factors, large datasets, generally involving multiple perturbations (experiments), are required to reconstruct these networks and reliably extract information from them. With limited resources, experimental designs must balance coverage of multiple perturbations against the number of subjects within a single perturbation. Increasing the number of experiments, or the number of subjects in an experiment, is an expensive and time-consuming way to improve network reconstruction. Integrating multiple types of data from existing subjects might be more efficient. For example, it has recently been demonstrated that combining genotypic and gene expression data in a segregating population leads to improved network reconstruction, which in turn may lead to better predictions of the effects of experimental perturbations on any given gene. Here we simulate data based on networks reconstructed from biological data collected in a segregating mouse population and quantify the improvement in network reconstruction achieved by using genotypic and gene expression data together, compared with reconstruction using gene expression data alone. We demonstrate that networks reconstructed from the combined genotypic and gene expression data achieve a level of reconstruction accuracy that exceeds that of networks reconstructed from expression data alone, and that fewer subjects may be required to achieve this superior accuracy. We conclude that this integrative genomics approach to reconstructing networks not only leads to more predictive network models, but may also save time and money by decreasing the amount of data that must be generated under any given condition of interest to construct predictive network models.
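
    The key statistical idea, using genotype as a causal anchor to orient edges between expression traits, can be illustrated with partial correlations. The sketch below is a deliberately simplified stand-in for the Bayesian-network reconstruction the paper performs; the effect sizes, sample size, and single-marker causal chain are assumptions.

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y after regressing out z from each."""
    Z = np.column_stack([np.ones(len(z)), z])
    resid = lambda v: v - Z @ np.linalg.lstsq(Z, v, rcond=None)[0]
    return np.corrcoef(resid(x), resid(y))[0, 1]

# Simulated segregating population: marker Q perturbs gene A, which in
# turn drives gene B (Q -> A -> B).  Effect sizes are illustrative.
rng = np.random.default_rng(4)
n = 500
Q = rng.integers(0, 2, size=n).astype(float)  # biallelic genotype
A = 1.0 * Q + rng.normal(size=n)
B = 0.8 * A + rng.normal(size=n)

# Genotype as causal anchor: under Q -> A -> B, the Q-B association
# vanishes once A is conditioned on, while conditioning on B does not
# remove the Q-A association; this asymmetry orients the A-B edge.
print("corr(Q, B | A), expect ~0:      ", partial_corr(Q, B, A))
print("corr(Q, A | B), expect nonzero: ", partial_corr(Q, A, B))
```

    Expression data alone cannot distinguish A -> B from B -> A here; the genotype supplies the orientation, which is the source of the reconstruction gains the abstract reports.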

    The efficacy of the Self-Administered Interview: A systematic review

    Obtaining accurate information from eyewitnesses is a crucial element of criminal investigations. Interview strategies such as the Cognitive Interview (CI) and the Self-Administered Interview (SAI) have been developed and implemented to minimise inaccuracies and to enhance the recall and reliability of eyewitness evidence. The SAI is a recent development within forensic psychology. However, a question remains as to the effectiveness of the SAI as an investigative interview tool. A systematic review of published studies employing the SAI was conducted (n = 22), and all were considered in relation to three categories of variables (estimator, system, and methodological). As the number of studies within the evidence base was relatively small, we could not ascertain whether the SAI is an effective investigative interviewing tool. However, the results demonstrate a number of benefits of the SAI. We discuss the implications of these findings for policy and directions for future research.

    Multiscale Computations on Neural Networks: From the Individual Neuron Interactions to the Macroscopic-Level Analysis

    We show how the Equation-Free approach for multi-scale computations can be exploited to systematically study the dynamics of neural interactions on a random regular connected graph under a pairwise representation perspective. Using an individual-based microscopic simulator as a black-box coarse-grained timestepper, and with the aid of simulated annealing, we compute the coarse-grained equilibrium bifurcation diagram and analyze the stability of the stationary states, sidestepping the need to obtain explicit closures at the macroscopic level. We also exploit the scheme to perform a rare-events analysis by estimating an effective Fokker-Planck equation describing the evolving probability density function of the corresponding coarse-grained observables.
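
    The coarse-timestepper idea is compact enough to sketch. Below, a toy stochastic microscopic simulator stands in for the paper's neural-interaction model, and a lift-evolve-restrict loop wrapped in a finite-difference Newton iteration locates coarse steady states without any explicit macroscopic equation. The microscopic dynamics, the choice of the ensemble mean as the coarse observable, and all parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def micro_step(states, p, dt=0.01):
    """Black-box microscopic simulator (a stand-in for the paper's
    individual-based neural model): noisy agents in a double-well drift."""
    drift = p * states - states ** 3
    return states + drift * dt + 0.05 * np.sqrt(dt) * rng.normal(size=states.size)

def lift(u, n_agents=20_000):
    """Coarse -> fine: build a microscopic ensemble whose mean is u."""
    return u + 0.05 * rng.normal(size=n_agents)

def restrict(states):
    """Fine -> coarse: the ensemble mean as the coarse observable."""
    return states.mean()

def coarse_timestepper(u, p, n_steps=50):
    """Lift, evolve the microscopic simulator, restrict: Phi_T(u)."""
    states = lift(u)
    for _ in range(n_steps):
        states = micro_step(states, p)
    return restrict(states)

def coarse_fixed_point(u, p, iters=10, h=0.05):
    """Newton iteration on g(u) = Phi_T(u) - u with a finite-difference
    derivative; no closed-form macroscopic equation is ever written down."""
    for _ in range(iters):
        g = coarse_timestepper(u, p) - u
        dg = (coarse_timestepper(u + h, p)
              - coarse_timestepper(u - h, p)) / (2 * h) - 1.0
        u -= g / dg
    return u

# Trace one branch of the coarse bifurcation diagram in the parameter p
# (the stable branch should track u* = sqrt(p) for this toy drift).
for p in (0.2, 0.4, 0.6):
    print(p, coarse_fixed_point(1.0, p))
```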