
    Attributes of Big Data Analytics for Data-Driven Decision Making in Cyber-Physical Power Systems

    Big data analytics is a relatively new term in power system terminology. The concept concerns how a massive volume of data is acquired, processed, and analyzed to extract insight. In particular, big data analytics refers to applications of artificial intelligence, machine learning, data mining, and time-series forecasting methods. Decision-makers in power systems have long been hampered by the weakness of classical methods in large-scale practical cases: thousands or millions of variables, long run times, a high computational burden, divergence of results, unjustifiable errors, and poor model accuracy. Big data analytics is an ongoing topic that addresses how to extract insight from such large data sets. This article enumerates the applications of big data analytics in future power systems across several layers, from grid scale to local scale. Big data analytics has many applications in smart grid implementation, electricity markets, collaborative operation schemes, enhancement of microgrid operating autonomy, management of electric vehicle operations in smart grids, active distribution network control, district hub system management, multi-agent energy systems, electricity theft detection, stability and security assessment by PMUs, and better exploitation of renewable energy sources. Employing big data analytics entails some prerequisites, such as the proliferation of IoT-enabled devices, easily accessible cloud space, and blockchain. This paper conducts an extensive review of the applications of big data analytics along with the prevailing challenges and solutions.
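
    As a toy illustration of the kind of machine-learning and time-series forecasting techniques the abstract alludes to (a hypothetical sketch, not taken from the reviewed paper), the following Python snippet trains a gradient-boosted regressor on lagged hourly load values to produce a short-term demand forecast; the data, column names, and model choice are assumptions for illustration only.

    # Hypothetical short-term load forecasting with lagged features (not from the paper).
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor

    def make_lagged_features(load: pd.Series, n_lags: int = 24) -> pd.DataFrame:
        """Turn an hourly load series into a supervised dataset of lagged values."""
        frame = pd.DataFrame({f"lag_{k}": load.shift(k) for k in range(1, n_lags + 1)})
        frame["target"] = load.values
        return frame.dropna()

    # Synthetic hourly demand series standing in for real smart-meter data.
    hours = pd.date_range("2023-01-01", periods=24 * 365, freq="h")
    load = pd.Series(100 + 20 * np.sin(np.arange(len(hours)) * 2 * np.pi / 24), index=hours)

    data = make_lagged_features(load)
    X, y = data.drop(columns="target"), data["target"]

    model = GradientBoostingRegressor().fit(X[:-24], y[:-24])  # train on all but the last day
    next_day_forecast = model.predict(X[-24:])                  # predict the held-out day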

    A weather forecast model accuracy analysis and ECMWF enhancement proposal by neural network

    This paper presents a neural network approach for improving weather forecasts. Predicted parameters, such as air temperature or precipitation, play a crucial role not only in the transportation sector but also in people's everyday activities. Numerical weather models require real measured data for a correct forecast run; these data are obtained from automatic weather stations equipped with intelligent sensors. Collecting and processing sensor data is a necessity for estimating weather conditions optimally. The European Centre for Medium-Range Weather Forecasts (ECMWF) model serves as the main basis for medium-range predictions among European countries. It can provide forecasts up to 10 days ahead with a horizontal resolution of 9 km. Although ECMWF is currently the global weather system with the highest horizontal resolution, that resolution is still twice as coarse as the one offered by limited-area (regional) numerical models (e.g., ALADIN, used in many European and North African countries). Regional models take a global forecasting model and a sensor-based weather monitoring network as inputs (the global atmospheric situation at the regional model's geographic boundaries and a numerical description of atmospheric conditions), and because the analysed area is much smaller (typically one country), the available computing power allows an even higher resolution for predicting key meteorological parameters. However, forecast data from regional models are available only for a specific country, end-users cannot find them all in one place, and not all members provide open access to these data. Although the ECMWF model is commercial, several web services offer it free of charge, and because the model delivers forecasts for the whole of Europe (and indeed the whole world), this approach is more user-friendly and attractive for potential customers. The proposed novel hybrid method based on machine learning increases the accuracy of ECMWF forecast outputs to the level provided by limited-area models and can deliver a more accurate forecast in real time.
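
    One way such a hybrid correction could be set up is sketched below (our own assumption, not the authors' implementation): a small neural network learns the residual between raw ECMWF 2 m temperature forecasts and station measurements, and the predicted residual is added back to new forecasts. All feature names and arrays are hypothetical placeholders.

    # Hypothetical post-processing of ECMWF forecasts with a neural network.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Each row is one forecast/observation pair; columns (placeholders):
    # ECMWF temperature, ECMWF precipitation, forecast lead time (h), hour of day.
    X_train = np.random.rand(5000, 4)
    t_ecmwf = X_train[:, 0]                              # raw ECMWF temperature forecast
    t_obs = t_ecmwf + np.random.normal(0, 0.5, 5000)     # matching station measurements

    # Learn the forecast error rather than the temperature itself.
    corrector = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500))
    corrector.fit(X_train, t_obs - t_ecmwf)

    # Corrected forecast = raw ECMWF output + predicted error.
    X_new = np.random.rand(10, 4)
    t_corrected = X_new[:, 0] + corrector.predict(X_new)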

    The Semantic Grid: A future e-Science infrastructure

    e-Science offers a promising vision of how computer and communication technology can support and enhance the scientific process. It does this by enabling scientists to generate, analyse, share and discuss their insights, experiments and results in an effective manner. The underlying computer infrastructure that provides these facilities is commonly referred to as the Grid. At this time, a number of grid applications are being developed, and there is a whole raft of computer technologies that provide fragments of the necessary functionality. However, there is currently a major gap between these endeavours and the vision of e-Science, in which there is a high degree of easy-to-use and seamless automation and in which there are flexible collaborations and computations on a global scale. To bridge this practice–aspiration divide, this paper presents a research agenda whose aim is to move from the current state of the art in e-Science infrastructure to the future infrastructure that is needed to support the full richness of the e-Science vision. Here the future e-Science research infrastructure is termed the Semantic Grid (the Semantic Grid is to the Grid as the Semantic Web is to the Web). In particular, we present a conceptual architecture for the Semantic Grid. This architecture adopts a service-oriented perspective in which distinct stakeholders in the scientific process, represented as software agents, provide services to one another, under various service level agreements, in various forms of marketplace. We then focus predominantly on the issues concerned with the way that knowledge is acquired and used in such environments, since we believe this is the key differentiator between current grid endeavours and those envisioned for the Semantic Grid.
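
    To make the service-oriented perspective concrete, the sketch below (an illustration under our own assumptions, not the paper's architecture) models stakeholders as software agents that offer services under simple service-level agreements within a marketplace; every class and field name is hypothetical.

    # Hypothetical agents, SLAs, and a marketplace for the Semantic Grid picture.
    from dataclasses import dataclass, field

    @dataclass
    class ServiceLevelAgreement:
        max_latency_s: float      # promised response time
        cost_per_call: float      # price agreed in the marketplace

    @dataclass
    class ServiceAgent:
        name: str
        offered: dict = field(default_factory=dict)  # service name -> SLA

        def offer(self, service: str, sla: ServiceLevelAgreement) -> None:
            self.offered[service] = sla

    @dataclass
    class Marketplace:
        agents: list = field(default_factory=list)

        def cheapest_provider(self, service: str) -> ServiceAgent:
            candidates = [a for a in self.agents if service in a.offered]
            return min(candidates, key=lambda a: a.offered[service].cost_per_call)

    # Usage: an analysis agent discovers the cheapest provider of a simulation service.
    market = Marketplace()
    simulator = ServiceAgent("simulation-agent")
    simulator.offer("run_experiment", ServiceLevelAgreement(max_latency_s=60.0, cost_per_call=0.10))
    market.agents.append(simulator)
    provider = market.cheapest_provider("run_experiment")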

    Open-architecture Implementation of Fragment Molecular Orbital Method for Peta-scale Computing

    We present our perspective and goals on high-performance computing for nanoscience in accordance with the global trend toward "peta-scale computing." After reviewing our results obtained with the grid-enabled version of the fragment molecular orbital method (FMO) on the grid testbed of the Japanese Grid Project, the National Research Grid Initiative (NAREGI), we show that FMO is one of the best candidates for peta-scale applications by predicting its effective performance on peta-scale computers. Finally, we introduce our new project constructing a peta-scale application as an open-architecture implementation of FMO, in order to realize both high performance on peta-scale computers and extensibility to multiphysics simulations.
    Comment: 6 pages, 9 figures, proceedings of the 2nd IEEE/ACM international workshop on high performance computing for nano-science and technology (HPCNano06)
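
    The fragment-parallel pattern that makes FMO attractive for peta-scale machines can be sketched as follows (a hypothetical stand-in, not the project's code): the molecule is partitioned into fragments, per-fragment energies are evaluated independently in parallel, and the results are combined; the energy function below is a placeholder for a real quantum-chemistry kernel.

    # Hypothetical illustration of fragment-level parallelism (monomer term only).
    from multiprocessing import Pool

    def fragment_energy(fragment: list) -> float:
        """Placeholder for a per-fragment electronic-structure calculation."""
        return sum(abs(hash(atom)) % 100 for atom in fragment) * 1e-3

    def fmo_total_energy(fragments: list) -> float:
        """Evaluate fragment energies in parallel and sum them."""
        with Pool() as pool:
            energies = pool.map(fragment_energy, fragments)
        return sum(energies)

    if __name__ == "__main__":
        # Hypothetical fragmentation of a molecule into small atom groups.
        fragments = [["C", "N", "O"], ["C", "C", "H"], ["N", "H", "O"]]
        print(fmo_total_energy(fragments))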