Towards Ex Vivo Testing of MapReduce Applications
2017 IEEE International Conference on Software Quality, Reliability and Security (QRS), 25-29 July 2017, Prague (Czech Republic)

Big Data programs are those that process data large enough to exceed the capabilities of traditional technologies. Among newly proposed processing models, MapReduce stands out because it allows the analysis of schema-less data in large distributed environments with frequent infrastructure failures. Functional faults in MapReduce programs are hard to detect in a testing/preproduction environment because of their distributed characteristics. We propose an automatic test framework implementing a novel testing approach called Ex Vivo. The framework employs data from production but executes the tests in a laboratory to avoid side-effects on the application. Faults are detected automatically, without human intervention, by checking whether the same data would generate different outputs under different infrastructure configurations. The framework (MrExist) is validated with a real-world program. MrExist can identify a fault in a few seconds, after which the program can be stopped, not only avoiding an incorrect output but also saving the money, time, and energy of production resources.
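The abstract does not detail MrExist's internals, but the core Ex Vivo check it describes, replaying production data through the same job under different infrastructure configurations and comparing outputs, can be sketched roughly. Everything in the following Python sketch (the toy in-memory runner, the num_reducers parameter standing in for an infrastructure setting, the word-count job) is an illustrative assumption, not the framework's actual implementation.

# Illustrative sketch only; MrExist's actual design is not described in the
# abstract. It mimics the Ex Vivo idea: replay a sample of production data
# through the same MapReduce job under two infrastructure configurations and
# flag any divergence in output. All names and parameters are hypothetical.
from collections import defaultdict

def run_mapreduce(records, mapper, reducer, num_reducers):
    """Toy in-memory MapReduce run; num_reducers stands in for an
    infrastructure setting that should not affect a correct job's output."""
    partitions = [defaultdict(list) for _ in range(num_reducers)]
    for record in records:
        for key, value in mapper(record):
            partitions[hash(key) % num_reducers][key].append(value)
    output = {}
    for groups in partitions:               # each partition is reduced separately
        for key, values in groups.items():
            output[key] = reducer(key, values)
    return output

def ex_vivo_check(production_sample, mapper, reducer):
    """Return True if two configurations agree; a mismatch signals a
    functional fault before it can corrupt production output."""
    out_one = run_mapreduce(production_sample, mapper, reducer, num_reducers=1)
    out_many = run_mapreduce(production_sample, mapper, reducer, num_reducers=8)
    return out_one == out_many

# Example: a word-count job replayed on a tiny sample of production lines.
if __name__ == "__main__":
    sample = ["error disk full", "error timeout", "ok"]
    mapper = lambda line: [(word, 1) for word in line.split()]
    reducer = lambda key, values: sum(values)
    print("configurations agree:", ex_vivo_check(sample, mapper, reducer))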
An Assessment of PIER Electric Grid Research 2003-2014 White Paper
This white paper describes the circumstances in California around the turn of the 21st century that led the California Energy Commission (CEC) to direct additional Public Interest Energy Research funds to address critical electric grid issues, especially those arising from integrating high penetrations of variable renewable generation with the electric grid. It contains an assessment of the beneficial science and technology advances of the resultant portfolio of electric grid research projects administered under the direction of the CEC by a competitively selected contractor, the University of California's California Institute for Energy and the Environment, from 2003 to 2014.
Challenges to the Integration of Renewable Resources at High System Penetration
Successfully integrating renewable resources into the electric grid at penetration levels sufficient to meet a 33 percent Renewables Portfolio Standard for California presents diverse technical and organizational challenges. This report characterizes these challenges as problems of coordination in time and space: balancing electric power on a range of scales from microseconds to decades and from individual homes to hundreds of miles. Crucial research needs were identified related to grid operation, standards and procedures, system design and analysis, incentives, and public engagement at each scale of analysis. Performing this coordination on more refined scales of time and space, independent of any particular technology, is what defines a "smart grid." "Smart" coordination of the grid should mitigate technical difficulties associated with intermittent and distributed generation, support grid stability and reliability, and maximize benefits to California ratepayers by using the most economic technologies, designs, and operating approaches.
Chance-Constrained Outage Scheduling using a Machine Learning Proxy
Outage scheduling aims at defining, over a horizon of several months to years, when different components needing maintenance should be taken out of operation. Its objective is to minimize operation-cost expectation while satisfying reliability-related constraints. We propose a distributed scenario-based chance-constrained optimization formulation for this problem. To tackle tractability issues arising in large networks, we use machine learning to build a proxy for predicting outcomes of power system operation processes in this context. On the IEEE-RTS79 and IEEE-RTS96 networks, our solution obtains cheaper and more reliable plans than other candidates.
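The paper's formulation is not reproduced here, but the role of the machine-learning proxy can be illustrated roughly: a classifier trained on past detailed simulations predicts, per sampled scenario, whether a candidate outage plan leads to a reliability violation, and the plan is accepted only if the empirical violation frequency stays below a tolerance epsilon. The features, synthetic training data, and threshold in the following Python sketch are all invented for illustration.

# Hedged sketch, not the paper's formulation: it only illustrates how a learned
# proxy can stand in for expensive power-system simulations when checking a
# scenario-based chance constraint on a candidate outage plan. The features,
# the violation model, and the threshold epsilon below are all invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic training set: each row encodes (plan features, scenario features);
# the label says whether a detailed simulation reported a reliability violation.
X_train = rng.normal(size=(2000, 6))
y_train = X_train[:, 0] + 0.5 * X_train[:, 3] + rng.normal(scale=0.3, size=2000) > 1.0

proxy = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

def plan_is_chance_feasible(plan_features, scenarios, epsilon=0.05):
    """Accept the plan if the proxy predicts a violation in at most an
    epsilon fraction of sampled scenarios (empirical chance constraint)."""
    rows = np.hstack([np.tile(plan_features, (len(scenarios), 1)), scenarios])
    violation_prob = proxy.predict_proba(rows)[:, 1]
    return np.mean(violation_prob > 0.5) <= epsilon

# Example: screen one candidate plan against 500 sampled operating scenarios.
plan = np.array([0.2, -0.1, 0.4])
scenarios = rng.normal(size=(500, 3))
print("chance-feasible:", plan_is_chance_feasible(plan, scenarios))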
Optimal discrete stopping times for reliability growth tests
Often, the duration of a reliability growth development test is specified in advance and the decision to terminate or continue testing is made at discrete time intervals. These features are normally not captured by reliability growth models. This paper adapts a standard reliability growth model to determine the optimal planned time to terminate testing. The underlying stochastic process is developed from an Order Statistic argument, with Bayesian inference used to estimate the number of faults within the design and classical inference procedures used to assess the rate of fault detection. Inference procedures within this framework are explored, and it is shown that the Maximum Likelihood Estimators possess a small bias and converge to the Minimum Variance Unbiased Estimator after a few tests for designs with a moderate number of faults. It is also shown that the likelihood function can be bimodal when there is conflict between the observed rate of fault detection and the prior distribution describing the number of faults in the design. An illustrative example is provided.
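The paper's Order Statistic process is not specified in the abstract, so the following Python sketch adopts a simple stand-in assumption: each of N design faults is detected in any one test with probability p, and a Poisson prior on N is updated from the number of faults found after T tests. It only illustrates how the Bayesian estimate of the fault count interacts with the observed detection data; the actual model may differ.

# Hedged sketch under assumptions not taken from the paper: each of N design
# faults, if still present, is assumed to be detected in any one test with
# probability p, so after T tests a given fault has been found with
# q = 1 - (1 - p)**T. With a Poisson(mu) prior on N, the posterior over N
# given k detected faults follows directly.
from math import comb, exp, factorial

def posterior_over_N(k, T, p, mu, n_max=60):
    """Posterior P(N = n | k faults found in T tests) on a truncated grid."""
    q = 1.0 - (1.0 - p) ** T                       # detection prob. per fault
    weights = []
    for n in range(n_max + 1):
        prior = exp(-mu) * mu ** n / factorial(n)  # Poisson(mu) prior on N
        like = comb(n, k) * q**k * (1 - q) ** (n - k) if n >= k else 0.0
        weights.append(prior * like)
    total = sum(weights)
    return [w / total for w in weights]

# Example: 7 faults found after 10 tests, assumed p = 0.1, prior mean 12.
post = posterior_over_N(k=7, T=10, p=0.1, mu=12.0)
print("posterior mean of N:", sum(n * w for n, w in enumerate(post)))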
A review of microgrid development in the United States – A decade of progress on policies, demonstrations, controls, and software tools
Microgrids have become increasingly popular in the United States. Supported by favorable federal and local policies, microgrid projects can provide greater energy stability and resilience within a project site or community. This paper reviews major federal, state, and utility-level policies driving microgrid development in the United States. Representative U.S. demonstration projects are selected and their technical characteristics and non-technical features are introduced. The paper discusses trends in the technology development of microgrid systems as well as microgrid control methods and interactions within the electricity market. Software tools for microgrid design, planning, and performance analysis are illustrated with each tool's core capability. Finally, the paper summarizes the successes and lessons learned during the recent expansion of the U.S. microgrid industry that may serve as a reference for other countries developing their own microgrid industries
Determining Utility System Value of Demand Flexibility From Grid-interactive Efficient Buildings
This report focuses on how current methods and practices for establishing the value of distributed energy resource (DER) investments to electric utility systems can be enhanced to determine the value of demand flexibility in grid-interactive efficient buildings that can provide grid services. The report introduces key valuation concepts that are applicable to the demand flexibility these buildings can provide and links to other documents that describe these concepts and their implementation in more detail. The scope of this report is limited to the valuation of economic benefits to the utility system. These are the foundational values on which other benefits (and costs) can be built. Establishing the economic value to the grid of demand flexibility provides the information needed to design programs, market rules, and rates that align the economic interests of utility customers with those of building owners and occupants. By nature, DERs directly impact customers and provide societal benefits external to the utility system. Jurisdictions can use utility system benefits and costs as the foundation of their economic analysis but align their primary cost-effectiveness metric with all applicable policy objectives, which may include customer and societal (non-utility system) impacts. This report suggests enhancements to current methods and practices that state and local policymakers, public utility commissions, state energy offices, utilities, state utility consumer representatives, and other stakeholders might support. These enhancements can improve the consistency and robustness of the economic valuation of demand flexibility for grid services. The report concludes with a discussion of considerations for prioritizing the implementation of these improvements.