
    Data Mining in Smart Grids

    Effective smart grid operation requires rapid decisions in a data-rich, but information-limited, environment. In this context, grid sensor data streaming cannot provide system operators with the information they need to act within the time frames required to minimize the impact of disturbances. Even if fast models exist that can convert the data into information, the smart grid operator must deal with the challenge of not having a full understanding of the context of that information, and therefore the information cannot be used with a high degree of confidence. To address this issue, data mining has been recognized as the most promising enabling technology for improving decision-making processes, providing the right information at the right moment to the right decision-maker. This Special Issue is focused on emerging methodologies for data mining in smart grids. In this area, it addresses many relevant topics, ranging from methods for uncertainty management to advanced dispatching. This Special Issue focuses not only on methodological breakthroughs and roadmaps for implementing the methodology, but also presents much-needed sharing of best practices. Topics include, but are not limited to, the following: fuzziness in smart grid computing; emerging techniques for renewable energy forecasting; robust and proactive solutions for optimal smart grid operation; fuzzy-based smart grid monitoring and control frameworks; granular computing for uncertainty management in smart grids; and self-organizing and decentralized paradigms for information processing.

    Line-of-Sight Detection for 5G Wireless Channels

    With the rapid deployment of 5G wireless networks across the globe, precise positioning has become essential for many vertical industries reliant on 5G. The predominantly non-line-of-sight (NLOS) propagation caused by obstacles in the surrounding environment, especially in metropolitan areas, has made it particularly difficult to achieve high estimation accuracy for positioning algorithms that require direct line-of-sight (LOS) transmission. In this scenario, correctly identifying the line-of-sight condition has become crucial for precise 5G-based positioning algorithms. Even though numerous scientific studies on LOS identification exist in the literature, most of these works are based on either ultra-wideband or Wi-Fi networks. Therefore, this thesis focuses on the hitherto less investigated area of line-of-sight detection for 5G wireless channels. The thesis examines the feasibility of LOS detection using three widely used channel models: the Tapped Delay Line (TDL), the Clustered Delay Line (CDL), and the Winner II channel models. The 5G-based simulation environment was constructed with standard parameters based on 3GPP specifications using the MATLAB computational platform. LOS and NLOS channels were defined to transmit random signal samples for each channel model, and the received signal was subjected to Additive White Gaussian Noise (AWGN) to imitate an authentic propagation environment. Variable channel conditions were simulated by randomly varying the signal-to-noise ratio (SNR) of the received signal. The research mainly focuses on machine learning (ML) based LOS classification; a threshold-based hypothesis test was also applied to the same scenarios as a benchmark. The main objectives of the thesis were to find the statistical feature, or combination of statistical features, of the channel impulse response (CIR) of the received signal that provides the best results, and to identify the most effective machine learning method for LOS/NLOS classification. Furthermore, the results were verified against actual measurement samples obtained during the NewSense project. The results indicate that the time-correlation feature of the channel impulse response, used in isolation, is effective for LOS identification in 5G wireless channels; additional derived features of the CIR do not significantly increase the classification accuracy. Positioning Reference Signals (PRS) were found to be more appropriate than Sounding Reference Signals (SRS) for LOS/NLOS classification. The study reinforced the importance of selecting the most suitable machine learning algorithm and kernel function for the task in order to obtain the best results. The medium Gaussian support vector machine (SVM) algorithm provided the highest overall precision in LOS classification for simulated data, with up to 98% accuracy for the Winner II channel model with PRS. The machine learning algorithms proved considerably more effective than conventional threshold-based detection for both simulated and real measurement data. Additionally, the Winner II model, with the richest feature set, presented the best results compared with the CDL and TDL channel models.
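
    As a rough illustration of the classification approach described above, the sketch below trains an RBF-kernel support vector machine (broadly analogous to MATLAB's "medium Gaussian" SVM preset) on statistical features of simulated channel impulse responses. The feature set, the synthetic LOS/NLOS channel generator, and all parameter values are illustrative assumptions, not the thesis code or the 3GPP channel models.

```python
# Minimal sketch: LOS/NLOS classification from channel-impulse-response (CIR)
# statistics with an RBF-kernel SVM. Features and data are illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def cir_features(cir: np.ndarray) -> np.ndarray:
    """Simple statistical features of one CIR power profile (assumed set)."""
    p = np.abs(cir) ** 2
    p = p / p.sum()
    delays = np.arange(len(p))
    mean_delay = (delays * p).sum()
    rms_spread = np.sqrt(((delays - mean_delay) ** 2 * p).sum())
    kurtosis = ((p - p.mean()) ** 4).mean() / (p.std() ** 4 + 1e-12)
    return np.array([p.max(), mean_delay, rms_spread, kurtosis])

def synth_cir(los: bool, taps: int = 64) -> np.ndarray:
    """Toy stand-in for TDL/CDL/Winner II output: LOS has a dominant first tap."""
    cir = rng.normal(0, 0.1, taps) + 1j * rng.normal(0, 0.1, taps)
    if los:
        cir[0] += 1.0                                   # strong direct path
    else:
        cir[: taps // 4] += 0.3 * rng.normal(size=taps // 4)  # spread energy
    return cir

X = np.array([cir_features(synth_cir(los=i % 2 == 0)) for i in range(2000)])
y = np.array([i % 2 == 0 for i in range(2000)], dtype=int)  # 1 = LOS

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
clf.fit(X_tr, y_tr)
print("LOS/NLOS accuracy:", clf.score(X_te, y_te))
```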

    Voltage and Frequency Recovery in Power System and MicroGrids Using Artificial Intelligent Algorithms

    This thesis developed advanced assessment tools to recover the power system voltage margin to acceptable values during disturbances. First, the effect of disturbances in islanded microgrids is analyzed using power-factor-based power-voltage curves, and a comprehensive under voltage-frequency load shedding (UVFLS) method is proposed as a last resort to restore the system voltage and frequency. The effect of disturbances in conventional power systems is investigated by introducing a phenomenon called fault-induced delayed voltage recovery (FIDVR), and comprehensive real-time FIDVR assessments are proposed to employ appropriate emergency control approaches as quickly as possible to maintain the system voltage margins within the desired range. Then, polynomial regression techniques are used to predict the FIDVR duration. Finally, an advanced FIDVR assessment is implemented that simultaneously predicts whether an event can be classified as FIDVR and predicts the duration of the FIDVR with high accuracy.
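
    A minimal sketch of the polynomial-regression step mentioned above, assuming hypothetical event features (fault clearing time, voltage dip depth, induction-motor load share) and synthetic training data; it is not the thesis implementation.

```python
# Minimal sketch: polynomial regression for predicting FIDVR duration.
# Feature names and synthetic data are assumptions for illustration only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)

# Hypothetical features: [fault clearing time (s), voltage dip depth (pu),
# induction-motor load share (fraction)]
X = rng.uniform([0.05, 0.2, 0.2], [0.30, 0.7, 0.8], size=(500, 3))
# Hypothetical ground-truth duration (s) with a nonlinear dependence + noise
duration = (2.0 * X[:, 0] + 4.0 * X[:, 1] ** 2 + 3.0 * X[:, 1] * X[:, 2]
            + rng.normal(0, 0.05, 500))

model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                      LinearRegression())
model.fit(X, duration)

new_event = np.array([[0.12, 0.5, 0.6]])
print("Predicted FIDVR duration (s):", model.predict(new_event)[0])
```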

    Text Similarity Between Concepts Extracted from Source Code and Documentation

    Context: Constant evolution in software systems often results in their documentation losing sync with the content of the source code. The traceability research field has long aimed to recover links between code and documentation when the two fall out of sync. Objective: The aim of this paper is to compare the concepts contained within the source code of a system with those extracted from its documentation, in order to detect how similar these two sets are. If vastly different, the difference between the two sets might indicate considerable ageing of the documentation and a need to update it. Methods: In this paper we reduce the source code of 50 software systems to a set of key terms, each containing the concepts of one of the sampled systems. At the same time, we reduce the documentation of each system to another set of key terms. We then use four different approaches for set comparison to detect how similar the sets are. Results: Using the well-known Jaccard index as the benchmark for the comparisons, we discovered that the cosine distance has excellent comparative power, depending on the pre-training of the machine learning model. In particular, the SpaCy and FastText embeddings offer up to 80% and 90% similarity scores, respectively. Conclusion: For most of the sampled systems, the source code and the documentation tend to contain very similar concepts. Given the accuracy of one pre-trained model (e.g., FastText), it also becomes evident that a few systems show a measurable drift between the concepts contained in the documentation and in the source code.
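
    The two set-comparison measures named above can be sketched as follows; the toy term sets and the use of spaCy's medium English model as the embedding source are illustrative assumptions rather than the paper's actual pipeline.

```python
# Minimal sketch: Jaccard index over key-term sets and cosine similarity over
# mean word embeddings. Requires: python -m spacy download en_core_web_md
import numpy as np
import spacy

def jaccard(a: set[str], b: set[str]) -> float:
    """|A ∩ B| / |A ∪ B| over two key-term sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Toy stand-ins for the extracted code/documentation concept sets
code_terms = {"parser", "token", "buffer", "stream", "encode"}
doc_terms = {"parser", "token", "stream", "configuration", "usage"}
print("Jaccard:", jaccard(code_terms, doc_terms))

# Cosine similarity between the mean embedding vectors of the two term sets
nlp = spacy.load("en_core_web_md")

def mean_vector(terms: set[str]) -> np.ndarray:
    return np.mean([nlp(t).vector for t in sorted(terms)], axis=0)

u, v = mean_vector(code_terms), mean_vector(doc_terms)
cosine = float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))
print("Cosine similarity:", cosine)
```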

    The blessings of explainable AI in operations & maintenance of wind turbines

    Wind turbines play an integral role in generating clean energy, but regularly suffer from operational inconsistencies and failures leading to unexpected downtimes and significant Operations & Maintenance (O&M) costs. Condition-Based Monitoring (CBM) has been utilised in the past to monitor operational inconsistencies in turbines by applying signal processing techniques to vibration data. The last decade has witnessed growing interest in leveraging Supervisory Control and Data Acquisition (SCADA) data from turbine sensors for CBM. Machine Learning (ML) techniques have been utilised to predict incipient faults in turbines and to forecast vital operational parameters with high accuracy by leveraging SCADA data and alarm logs. More recently, Deep Learning (DL) methods have outperformed conventional ML techniques, particularly for anomaly prediction. Despite demonstrating immense promise in the transition to Artificial Intelligence (AI), such models are generally black boxes that cannot provide rationales behind their predictions, hampering the ability of turbine operators to rely on automated decision making. We aim to help combat this challenge by providing a novel perspective on Explainable AI (XAI) for trustworthy decision support. This thesis revolves around three key strands of XAI: DL, Natural Language Generation (NLG) and Knowledge Graphs (KGs), which are investigated using data from an operational turbine. We leverage DL and NLG to predict incipient faults and alarm events in the turbine in natural language, as well as to generate human-intelligible O&M strategies that assist engineers in fixing or averting the faults. We also propose specialised DL models which can predict causal relationships in SCADA features as well as quantify the importance of vital parameters leading to failures. The thesis culminates in an interactive Question-Answering (QA) system for automated reasoning that leverages multimodal domain-specific information from a KG, enabling engineers to retrieve O&M strategies with natural language questions. By helping make turbines more reliable, we envisage wider adoption of wind energy sources towards tackling climate change.
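
    As a loose illustration of quantifying the importance of vital SCADA parameters, the sketch below ranks hypothetical turbine features with a random-forest classifier on synthetic data; the feature names, labels, and model choice are assumptions, not the specialised DL models proposed in the thesis.

```python
# Minimal sketch: ranking SCADA parameters by importance for a fault label.
# Feature names and synthetic labels are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
features = ["gearbox_oil_temp", "generator_rpm", "wind_speed",
            "pitch_angle", "nacelle_vibration"]

X = rng.normal(size=(5000, len(features)))
# Hypothetical failure mode driven mainly by oil temperature and vibration
y = ((X[:, 0] + 0.8 * X[:, 4] + rng.normal(0, 0.5, 5000)) > 1.5).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, score in sorted(zip(features, clf.feature_importances_),
                          key=lambda t: -t[1]):
    print(f"{name:20s} {score:.3f}")
```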