
    Predicting Electrical Faults in Power Distribution Network

    Electricity is becoming increasingly important in modern civilization, and as a result, the emphasis on and use of power infrastructure is steadily expanding. Simultaneously, investment and distribution modes are shifting from large-scale centralized generation and passive consumption toward decentralized generators and highly sophisticated consumers. This transformation puts further strain on ageing infrastructure, necessitating significant expenditure in future years to ensure a consistent supply. Emerging monitoring and prediction technologies can help maximize the use of the current grid while lowering the probability of faults. This study discusses some of the difficulties facing local grids and presents a probabilistic model for prospective maintenance and failure. To provide an effective and convenient power supply to consumers, the high-voltage network must be protected and maintained under fault conditions. Most fault identification and localization approaches rely on electronic measurements of real and reactive power at converters, together with metrics and ground evaluations derived from network traffic. This paper provides a thorough examination of the mechanisms for fault detection, diagnosis, and localization in overhead lines, and then makes suggestions about methods that can be incorporated to predict foreseeable faults in the electrical network. Of the classifiers evaluated, Random Forest, XGBoost, and Decision Tree produce high accuracies, while Logistic Regression and SVM produce realistic accuracy results.
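The tree-based classifiers named above split on measured quantities to separate faulty from healthy conditions. A minimal sketch of that idea, using a hand-built one-node decision stump on invented per-unit current and voltage data (the feature names, ranges, and threshold are all assumptions, not the paper's actual features):

```python
# Illustrative sketch (not the paper's code): classifying simulated line
# measurements as faulty / healthy with a hand-built decision stump,
# the one-node special case of a decision tree.
import random

random.seed(0)

def make_sample(faulty):
    # Assumed behaviour: healthy lines carry current near 1.0 p.u.;
    # faults push current up and voltage down (simplified).
    if faulty:
        return {"current": random.uniform(2.0, 5.0),
                "voltage": random.uniform(0.1, 0.6), "label": 1}
    return {"current": random.uniform(0.8, 1.2),
            "voltage": random.uniform(0.9, 1.05), "label": 0}

data = [make_sample(i % 2 == 0) for i in range(200)]

def stump_predict(sample, threshold=1.5):
    # One-node "decision tree": flag a fault when current exceeds threshold.
    return 1 if sample["current"] > threshold else 0

accuracy = sum(stump_predict(s) == s["label"] for s in data) / len(data)
print(f"stump accuracy: {accuracy:.2f}")
```

A real Random Forest or XGBoost model learns many such splits automatically; the stump only shows why threshold rules on electrical measurements can separate the classes at all.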

    Diffusion of Lexical Change in Social Media

    Computer-mediated communication is driving fundamental changes in the nature of written language. We investigate these changes by statistical analysis of a dataset comprising 107 million Twitter messages (authored by 2.7 million unique user accounts). Using a latent vector autoregressive model to aggregate across thousands of words, we identify high-level patterns in the diffusion of linguistic change over the United States. Our model is robust to unpredictable changes in Twitter's sampling rate, and provides a probabilistic characterization of the relationship of macro-scale linguistic influence to a set of demographic and geographic predictors. The results of this analysis offer support for prior arguments that focus on geographical proximity and population size. However, demographic similarity -- especially with regard to race -- plays an even more central role, as cities with similar racial demographics are far more likely to share linguistic influence. Rather than moving towards a single unified "netspeak" dialect, language evolution in computer-mediated communication reproduces existing fault lines in spoken American English.
    Comment: preprint of PLoS ONE paper from November 2014; PLoS ONE 9(11) e11311
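The core of a vector autoregressive model is that each city's word-frequency change is a linear function of every city's change at the previous time step. A toy sketch of that dynamic for two hypothetical cities (the coefficient matrix is made up for illustration, not estimated from the Twitter data):

```python
# Minimal VAR(1) sketch: this week's frequency changes are a linear
# function of last week's changes in all cities. A is an invented
# influence matrix, not one fitted by the paper's latent model.

def var_step(A, x):
    # x[i]: frequency change in city i; A[i][j]: influence of city j on city i.
    n = len(x)
    return [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]

A = [[0.5, 0.4],   # city 0 is strongly influenced by city 1
     [0.1, 0.8]]   # city 1 mostly follows its own trend
x = [1.0, 0.0]     # a lexical innovation appears in city 0 only

for week in range(3):
    x = var_step(A, x)
print([round(v, 3) for v in x])
```

Running the recurrence shows the innovation spreading from city 0 into city 1 while decaying overall; the paper's latent model additionally shares one such influence structure across thousands of words and handles sampling noise probabilistically.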

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions pertaining to the proper functioning of the networks. Among these mathematical tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches to perform network-data analysis and enable automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth of network complexity faced by optical networks in the last few years. Such complexity increase is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) that are enabled by the usage of coherent transmission/reception technologies, advanced digital signal processing and compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey relevant literature dealing with the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy: to stimulate further work in this area, we conclude the paper proposing new possible research directions.
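One of the simplest ML approaches to fault management from signal quality indicators is nearest-neighbour classification against labelled past observations. A sketch under assumed data (the OSNR and BER values below are invented, and the survey covers far richer techniques):

```python
# Sketch of a 1-nearest-neighbour classifier over optical-link quality
# indicators (OSNR in dB, log10 of bit error rate). The historical
# observations and labels are invented for illustration.
import math

# (osnr_db, log10_ber, label) -- label 1 means a degraded link
history = [(25.0, -9.0, 0), (24.0, -8.5, 0), (14.0, -3.0, 1), (12.0, -2.5, 1)]

def classify(osnr, log_ber):
    # Predict the label of the closest past observation (Euclidean distance).
    nearest = min(history, key=lambda h: math.dist((osnr, log_ber), (h[0], h[1])))
    return nearest[2]

print(classify(13.0, -2.8))  # falls near the degraded examples -> 1
```

In practice the indicator scales would be normalized and many more features and samples used, but the pattern (learn from labelled monitoring data, predict on new telemetry) is the one the survey organizes.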

    Artificial neural networks and their applications to intelligent fault diagnosis of power transmission lines

    Over the past thirty years, the idea of computing based on models inspired by human brains and biological neural networks has emerged. Artificial neural networks play an important role in the field of machine learning and hold the key to the success of performing many intelligent tasks by machines. They are used in various applications such as pattern recognition, data classification, stock market prediction, aerospace, weather forecasting, control systems, intelligent automation, robotics, and healthcare. Their architectures generally consist of an input layer, multiple hidden layers, and one output layer. They can be implemented in software or hardware. Nowadays, various structures with various names exist for artificial neural networks, each of which has its own particular applications. The types used in this study include feedforward neural networks, convolutional neural networks, and general regression neural networks. Increasing the number of layers in artificial neural networks, as needed for large datasets, implies increased computational expense. Therefore, besides these basic structures, some advanced deep learning techniques such as transfer learning, federated learning, and reinforcement learning have been proposed to overcome the drawbacks of the original structures. Furthermore, implementing artificial neural networks in hardware gives scientists and engineers the chance to perform high-dimensional and big-data-related tasks because it removes the constraint of memory access time known as the von Neumann bottleneck. Accordingly, analog and digital circuits are used for artificial neural network implementations without using general-purpose CPUs. In this study, the problem of fault detection, identification, and location estimation for transmission lines is studied, and various deep learning approaches are designed and implemented as solutions.
    This research work focuses on transmission line datasets, their faults, and the importance of identification, detection, and location estimation of those faults. It also includes a comprehensive review of previous studies on these three tasks. The application of various artificial neural networks such as feedforward neural networks, convolutional neural networks, and general regression neural networks to the identification, detection, and location estimation of transmission line faults is also discussed. Some advanced methods based on artificial neural networks, such as the transfer learning technique, are taken into account in this thesis. These methodologies are designed and applied to transmission line datasets to enable scientists and engineers to use fewer data points and spend less time on the training step. This work also proposes a transfer learning-based technique for distinguishing faulty and non-faulty insulators in transmission line images. In addition, an effective design for an activation function of artificial neural networks is proposed in this thesis. Using the hyperbolic tangent as an activation function in artificial neural networks has several benefits, including inclusiveness and high accuracy.
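The feedforward structure described above (input layer, hidden layers, output layer) with a hyperbolic-tangent activation can be sketched as a single forward pass. The weights and inputs below are arbitrary placeholders, not the thesis's trained parameters:

```python
# Toy forward pass of a feedforward network with the tanh activation
# discussed above. All weights are arbitrary; this only illustrates the
# layered structure, not a trained fault-detection model.
import math

def forward(x, w_hidden, w_out):
    # One hidden layer: h_j = tanh(sum_i W1[j][i] * x[i]),
    # output y = tanh(sum_j w2[j] * h_j).
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in w_hidden]
    return math.tanh(sum(w * h for w, h in zip(w_out, hidden)))

x = [0.5, -1.0]                       # e.g. normalized current and voltage
w_hidden = [[1.0, -0.5], [0.3, 0.8]]  # input -> hidden weights
w_out = [0.7, -0.2]                   # hidden -> output weights
y = forward(x, w_hidden, w_out)
print(round(y, 4))
```

One property the thesis attributes to tanh is visible here: every activation, including the output, is squashed into (-1, 1), which keeps signals bounded as they propagate through layers.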

    Landslide Mapping and Susceptibility Assessment of Chittagong Hilly Areas, Bangladesh

    Landslides are natural phenomena in mountainous areas that cause damage to properties and death to people around the world. In Bangladesh, landslides have caused enormous economic loss and casualties in the Chittagong Hilly Areas (CHA). In this dissertation, a landslide inventory of the CHA was prepared using Google Earth and field mapping. Google Earth-based mapping helped in recording landslides in inaccessible areas like forests; in contrast, field mapping helped in mapping landslides in accessible areas like those near road networks. For absence data sampling in landslide susceptibility mapping, this research proposed Mahalanobis distance (MD) based absence data sampling and compared it with slope-based absence data sampling. Three Upazilas (subdistricts) of Rangamati district, Bangladesh were used as the study area. Fifteen landslide causal factors, including slope aspect, plan curvature, and geology, were used in the random forest model for landslide susceptibility mapping. The area under the success and prediction rate curves and statistical indices, including the Kappa index, showed that both absence data sampling methods provided similar accuracy. However, based on the Seed Cell Area Index (SCAI), the MD-based landslide susceptibility map was more consistent and did not overestimate landslide susceptibility like the slope-based model. Finally, this study assessed the impact of three land use/land cover (LULC) scenarios: (a) existing (2018) LULC; (b) proposed (planned) LULC; and (c) simulated (2028) LULC on the landslide susceptibility of Rangamati municipality of Rangamati district. The random forest model was used, and it showed that high susceptibility zones would increase in both the proposed and simulated LULC scenarios. This indicated that LULC change would increase landslide susceptibility in the study area. The increase in landslide susceptibility is comparatively low in the proposed LULC, indicating the importance of implementing planned LULC in the study area.
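Mahalanobis distance measures how far a point lies from the mean of a distribution in units of its covariance, which is what lets MD-based sampling pick absence points that are statistically unlike known landslide cells. A sketch of the computation for two hypothetical causal factors (the mean, covariance, and 2-D restriction are illustrative assumptions; the dissertation uses fifteen factors):

```python
# Sketch of Mahalanobis distance for two causal factors (e.g. slope and
# plan curvature). Values are invented; the real analysis uses the mean
# and covariance of all fifteen factors over known landslide cells.
import math

def mahalanobis_2d(x, mean, cov):
    # d^2 = (x - mu)^T Sigma^{-1} (x - mu), written out for 2 dimensions.
    dx, dy = x[0] - mean[0], x[1] - mean[1]
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]  # 2x2 matrix inverse
    d2 = dx * (inv[0][0] * dx + inv[0][1] * dy) + dy * (inv[1][0] * dx + inv[1][1] * dy)
    return math.sqrt(d2)

mean = [30.0, 0.1]                 # assumed mean slope (deg) and curvature of slide cells
cov = [[25.0, 0.0], [0.0, 0.04]]   # diagonal covariance for simplicity

print(mahalanobis_2d([30.0, 0.1], mean, cov))             # the mean itself: distance 0
print(round(mahalanobis_2d([40.0, 0.1], mean, cov), 2))   # two standard deviations away
```

Cells with a large MD from the landslide-cell mean are safe candidates for absence samples, whereas slope-based sampling relies on a single factor.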

    Predictive geohazard mapping using LiDAR and satellite imagery in Missouri and Oklahoma, USA

    ”Light Detection and Ranging (LiDAR) and satellite imagery have become the most utilized remote sensing technologies for compiling inventories of surficial geologic conditions. Point cloud data obtained from multi-spectral remote sensing methods provide a detailed characterization of surface features, in particular the detailed surface manifestations of underlying geologic structures. When combined, point clouds eliminate bias from visual inconsistencies and/or statistical values. This research explores the competence of point clouds derived from LiDAR and Unmanned Aerial Systems (UAS) as a predictive tool in evaluating various geohazards. It combines these data sets with other remote sensing techniques to evaluate the sensitivity of the respective datasets to temporal changes in the earth’s surface (potentially detectable at a centimeter scale). A two-phase research approach was employed to test several hazard mapping scenarios in three geographic areas in the U.S. Midcontinent as follows: 1) UAS-derived surficial deformations near the epicenter of the 2016 Mw 5.8 Pawnee, Oklahoma earthquake (Paper I); 2) UAS mapping of recent earthquake epicenters in Noble, Payne, and Pawnee counties of Oklahoma (Paper II); and 3) evaluation of geohazards in Greater Cape Girardeau, Southeast Missouri (Paper III). These analyses detected geomorphic changes in the study locations, such as ground subsidence, soil heave and expansion, liquefaction-induced structures, dynamically-induced consolidation, and surface fault rupture. The studies underscore the importance of early hazard identification and providing information to relevant data users to make informed decisions”--Abstract, page iv
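The temporal change detection described above typically reduces to differencing gridded elevation surfaces derived from the point clouds of two surveys. A sketch with an invented 2x2 grid and threshold (the real workflows operate on dense centimeter-scale DEMs):

```python
# Sketch of elevation change detection: difference two gridded surveys
# and flag cells that dropped by more than a threshold (possible
# subsidence). The grids and threshold are invented for illustration.

before = [[100.00, 100.10], [100.05, 100.20]]   # elevations (m), survey 1
after  = [[100.00,  99.95], [100.04, 100.19]]   # elevations (m), survey 2

def subsidence_cells(before, after, threshold=0.05):
    # Return (row, col) indices of cells where the surface dropped > threshold.
    flagged = []
    for i, (row_b, row_a) in enumerate(zip(before, after)):
        for j, (b, a) in enumerate(zip(row_b, row_a)):
            if b - a > threshold:
                flagged.append((i, j))
    return flagged

print(subsidence_cells(before, after))  # only cell (0, 1) dropped 0.15 m
```

Uplift (heave) can be flagged the same way with the sign reversed; in practice both surveys must first be co-registered so that apparent change is not an artifact of georeferencing error.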

    The characterisation and automatic classification of transmission line faults

    Includes bibliographical references.
    A country's ability to sustain and grow its industrial and commercial activities is highly dependent on a reliable electricity supply. Electrical faults on transmission lines are a cause of both interruptions to supply and voltage dips. These are the most common events impacting electricity users and also have the largest financial impact on them. This research focuses on understanding the causes of transmission line faults and developing methods to automatically identify these causes. Records of faults occurring on the South African power transmission system over a 16-year period have been collected and analysed to find statistical relationships between local climate, key design parameters of the overhead lines and the main causes of power system faults. The results characterize the performance of the South African transmission system on a probabilistic basis and illustrate differences in fault cause statistics for the summer and winter rainfall areas of South Africa and for different times of the year and day. This analysis lays a foundation for reliability analysis and fault pattern recognition taking environmental features such as local geography, climate and power system parameters into account. A key aspect of using pattern recognition techniques is selecting appropriate classifying features. Transmission line fault waveforms are characterised by instantaneous symmetrical component analysis to describe the transient and steady state fault conditions. The waveform and environmental features are used to develop single nearest neighbour classifiers to identify the underlying cause of transmission line faults. A classification accuracy of 86% is achieved using a single nearest neighbour classifier. This classification performance is found to be superior to that of decision tree, artificial neural network and naïve Bayes classifiers.
    The results achieved demonstrate that transmission line faults can be automatically classified according to cause.
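The symmetrical-component analysis mentioned above decomposes three-phase quantities into zero-, positive-, and negative-sequence components via the Fortescue operator. A minimal sketch of the transform (the balanced test phasors are an illustrative example, not the paper's fault records):

```python
# Sketch of the symmetrical-component transform used to characterise
# fault waveforms: phase quantities map to zero, positive and negative
# sequence components via the 120-degree Fortescue operator.
import cmath

a = cmath.exp(2j * cmath.pi / 3)  # 120-degree rotation operator

def symmetrical_components(va, vb, vc):
    zero = (va + vb + vc) / 3
    pos  = (va + a * vb + a**2 * vc) / 3
    neg  = (va + a**2 * vb + a * vc) / 3
    return zero, pos, neg

# A balanced three-phase set has only a positive-sequence component;
# unbalanced faults introduce negative and/or zero sequence.
va, vb, vc = 1 + 0j, a**2, a      # unit phasors 120 degrees apart
zero, pos, neg = symmetrical_components(va, vb, vc)
print(abs(zero) < 1e-9, abs(pos - 1) < 1e-9, abs(neg) < 1e-9)  # True True True
```

The magnitudes of the negative- and zero-sequence components during a fault are exactly the kind of classifying features a nearest-neighbour classifier can compare across fault records.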