
    Logging Statements Analysis and Automation in Software Systems with Data Mining and Machine Learning Techniques

    Log files are widely used to record runtime information of software systems, such as the timestamp of an event, the name or ID of the component that generated the log, and parts of the state of a task execution. The rich information of logs enables system developers (and operators) to monitor the runtime behavior of their systems and further track down system problems in development and production settings. With the ever-increasing scale and complexity of modern computing systems, the volume of logs is rapidly growing. For example, eBay reported that the rate of log generation on their servers was in the order of several petabytes per day in 2018 [17]. Therefore, the traditional way of log analysis that largely relies on manual inspection (e.g., searching for error/warning keywords or grep) has become inefficient, labor-intensive, error-prone, and outdated. The growth of logs has driven the emergence of automated tools and approaches for log mining and analysis. In parallel, the embedding of logging statements in the source code is a manual and error-prone task, and developers might forget to add a logging statement in the software's source code. To address the logging challenge, many efforts have aimed to automate logging statements in the source code, and in addition, many tools have been proposed to perform large-scale log file analysis by use of machine learning and data mining techniques. However, the current logging process is still mostly manual, and thus, proper placement and content of logging statements remain challenges. To overcome these challenges, methods that aim to automate log placement and content prediction, i.e., 'where and what to log', are of high interest. In addition, approaches that can automatically mine and extract insight from large-scale logs are also well sought after. Thus, in this research, we focus on predicting log statements, and for this purpose, we perform an experimental study on open-source Java projects. We introduce a log-aware code-clone detection method to predict the location and description of logging statements. Additionally, we incorporate natural language processing (NLP) and deep learning methods to further enhance the performance of the log statements' description prediction. We also introduce deep learning based approaches for automated analysis of software logs. In particular, we analyze execution logs and extract natural language characteristics of logs to enable the application of natural language models for automated log file analysis. Then, we propose automated tools for analyzing log files and measuring the information gain from logs for different log analysis tasks such as anomaly detection. We then continue our NLP-enabled approach by leveraging the state-of-the-art language models, i.e., Transformers, to perform automated log parsing.
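
    The abstract does not include code, but as a hedged illustration of the kind of automated log analysis it describes, the sketch below masks variable fields in raw log lines to obtain message templates and flags rare templates as anomaly candidates. The masking rules, function names, and the rarity threshold are assumptions chosen for the example; this is not the thesis's actual parser or anomaly detector.

```python
# Minimal sketch of template-style log parsing plus a frequency-based anomaly flag.
# Illustration only: masking rules, names, and the rarity threshold are assumptions.
import re
from collections import Counter

def to_template(line: str) -> str:
    """Mask variable parts (IPs, hex IDs, numbers) so similar messages share a template."""
    line = re.sub(r"\b\d{1,3}(\.\d{1,3}){3}\b", "<IP>", line)   # IPv4 addresses
    line = re.sub(r"\b0x[0-9a-fA-F]+\b", "<HEX>", line)         # hex identifiers
    line = re.sub(r"\b\d+\b", "<NUM>", line)                    # plain numbers
    return line.strip()

def rare_templates(lines, max_count=1):
    """Flag templates occurring at most `max_count` times as candidate anomalies."""
    counts = Counter(to_template(l) for l in lines)
    return {t for t, c in counts.items() if c <= max_count}

if __name__ == "__main__":
    logs = [
        "Connection from 10.0.0.5 established in 12 ms",
        "Connection from 10.0.0.9 established in 48 ms",
        "Worker 7 finished task 3141",
        "Unrecoverable checksum mismatch at block 0x1f3a",  # appears once -> flagged
    ]
    for template in rare_templates(logs):
        print("candidate anomaly:", template)
```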

    Ferroelectrics

    Ferroelectric materials exhibit a wide spectrum of functional properties, including switchable polarization, piezoelectricity, high non-linear optical activity, pyroelectricity, and non-linear dielectric behaviour. These properties are crucial for applications in electronic devices such as sensors, microactuators, infrared detectors, microwave phase filters, and non-volatile memories. This unique combination of properties of ferroelectric materials has attracted researchers and engineers for a long time. This book reviews a wide range of diverse topics related to the phenomenon of ferroelectricity (in the bulk as well as thin film form) and provides a forum for scientists, engineers, and students working in this field. The present book, containing 24 chapters, is the result of contributions from experts of the international scientific community working on different aspects of ferroelectricity, covering experimental and theoretical work aimed at understanding ferroelectricity and its utilization in devices. It provides up-to-date, insightful coverage of recent advances in synthesis, characterization, functional properties, and potential device applications in specialized areas.

    Intelligent Biosignal Processing in Wearable and Implantable Sensors

    This reprint provides a collection of papers illustrating the state of the art of smart processing of data coming from wearable, implantable, or portable sensors. Each paper presents the design, databases used, methodological background, obtained results, and their interpretation for biomedical applications. Revealing examples are brain–machine interfaces for medical rehabilitation, the evaluation of sympathetic nerve activity, a novel automated diagnostic tool based on ECG data to diagnose COVID-19, machine learning-based hypertension risk assessment by means of photoplethysmography and electrocardiography signals, Parkinsonian gait assessment using machine learning tools, thorough analysis of compressive sensing of ECG signals, development of a nanotechnology application for decoding vagus-nerve activity, detection of liver dysfunction using a wearable electronic nose system, prosthetic hand control using surface electromyography, epileptic seizure detection using a CNN, and premature ventricular contraction detection using deep metric learning. Thus, this reprint presents significant clinical applications as well as valuable new research issues, providing current illustrations of this new field of research by addressing the promises, challenges, and hurdles associated with the synergy of biosignal processing and AI through 16 different pertinent studies. Covering a wide range of research and application areas, this book is an excellent resource for researchers, physicians, academics, and PhD or master's students working on (bio)signal and image processing, AI, biomaterials, biomechanics, and biotechnology with applications in medicine.
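
    As a hedged sketch of the CNN-based biosignal classification mentioned above (e.g., seizure versus non-seizure windows), the small 1D convolutional network below shows the general shape of such a model. The architecture, window length, and class count are assumptions for illustration and are not taken from any paper in the reprint.

```python
# Illustrative-only 1D CNN for binary biosignal classification.
# Layer sizes, window length, and class count are assumptions for the example.
import torch
import torch.nn as nn

class BiosignalCNN(nn.Module):
    def __init__(self, n_channels: int = 1, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.head = nn.Sequential(nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(32, n_classes))

    def forward(self, x):  # x: (batch, channels, samples)
        return self.head(self.features(x))

# Example forward pass on a batch of 8 one-second windows sampled at 256 Hz.
model = BiosignalCNN()
logits = model(torch.randn(8, 1, 256))
print(logits.shape)  # torch.Size([8, 2])
```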

    30th International Conference on Condition Monitoring and Diagnostic Engineering Management (COMADEM 2017)

    Proceedings of COMADEM 2017

    Advances in Learning and Understanding with Graphs through Machine Learning

    Graphs have increasingly become a crucial way of representing large, complex, and disparate datasets from a range of domains, including many scientific disciplines. Graphs are particularly useful at capturing complex relationships or interdependencies within or even between datasets, and enable unique insights which are not possible with other data formats. Over recent years, significant improvements have been made in the ability of machine learning approaches to automatically learn from and identify patterns in datasets. However, due to the unique nature of graphs, and the data they are used to represent, employing machine learning with graphs has thus far proved challenging. A review of relevant literature has revealed that key challenges include issues arising with macro-scale graph learning, interpretability of machine-learned representations, and a failure to incorporate the temporal dimension present in many datasets. Thus, the work and contributions presented in this thesis primarily investigate how modern machine learning techniques can be adapted to tackle key graph mining tasks, with a particular focus on optimal macro-level representation, interpretability, and incorporating temporal dynamics into the learning process. The majority of methods employed are novel approaches centered on using artificial neural networks to learn from graph datasets. Firstly, by devising a novel graph fingerprint technique, it is demonstrated that it can successfully be applied to two different tasks, graph comparison and classification, while out-performing established baselines. Secondly, it is shown that a mapping can be found between certain topological features and graph embeddings. This suggests, perhaps for the first time, that machines may be learning something analogous to human knowledge acquisition, thus bringing interpretability to the graph embedding process. Thirdly, in exploring two new models for incorporating temporal information into the graph learning process, it is found that including such information is crucial to predictive performance in certain key tasks, such as link prediction, where state-of-the-art baselines are out-performed. The overall contribution of this work is to provide greater insight into and explanation of the ways in which machine learning with respect to graphs is emerging as a crucial set of techniques for understanding complex datasets. This is important as these techniques can potentially be applied to a broad range of scientific disciplines. The thesis concludes with an assessment of limitations and recommendations for future research.
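
    To make the idea of a graph "fingerprint" for whole-graph comparison concrete, the sketch below summarizes a graph as a short vector of global topological features and compares two graphs by cosine similarity. The chosen feature set and the similarity measure are assumptions for illustration only; they are not the thesis's actual fingerprint technique.

```python
# Minimal sketch of a topological fingerprint for whole-graph comparison.
# Feature set and cosine-similarity comparison are illustrative assumptions.
import numpy as np
import networkx as nx

def fingerprint(g: nx.Graph) -> np.ndarray:
    """Summarize a graph as a small vector of global topological features."""
    degrees = [d for _, d in g.degree()]
    return np.array([
        g.number_of_nodes(),
        g.number_of_edges(),
        nx.density(g),
        nx.average_clustering(g),
        float(np.mean(degrees)) if degrees else 0.0,
    ])

def similarity(g1: nx.Graph, g2: nx.Graph) -> float:
    """Cosine similarity between two graph fingerprints."""
    a, b = fingerprint(g1), fingerprint(g2)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

print(similarity(nx.erdos_renyi_graph(50, 0.1, seed=0),
                 nx.barabasi_albert_graph(50, 3, seed=0)))
```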

    Renewable Energy Integration in Distribution System with Artificial Intelligence

    With the increasing attention to renewable energy development in distribution power systems, artificial intelligence (AI) can play an indispensable role. In this thesis, a series of AI-based methods are studied and implemented to further enhance the performance of power system operation and control. Due to the large volume of heterogeneous data provided by both the customer and the grid side, a big data visualization platform is built to surface hidden, useful knowledge for smart grid (SG) operation, control, and situational awareness. An open-source cluster computing framework, Apache Spark, is used to discover information hidden in the big data. The data are transmitted to the visualization platform over a high-speed communication architecture following the Open Systems Interconnection (OSI) model. Google Earth and a Geographic Information System (GIS) are used to design the visualization platform and present the results. The visualization platform above addresses the external manifestation of the data; the following work aims to understand the information hidden within it. A short-term load forecasting approach based on support vector regression (SVR) is designed to provide higher-accuracy forecasts for network reconfiguration. The nonconvex three-phase balanced optimal power flow (OPF) problem is relaxed to a second-order cone program (SOCP). The alternating direction method of multipliers (ADMM) is used to compute the optimal power flow in a distributed manner. To reflect real distribution systems, a three-phase unbalanced distribution system model is built, which combines hourly operation scheduling at the substation level with minute-scale power flow operation at the feeder level. The operation cost of the system with renewable generation is minimized at the substation level. The stochastic distribution of renewable generation is modeled with a chance constraint, and the derived deterministic form is represented with a Gaussian mixture model (GMM) fitted by genetic algorithm-based expectation-maximization (GAEM). The system cost is further reduced with OPF in real-time (RT) scheduling. Semidefinite programming (SDP) is used to relax the nonconvex three-phase unbalanced distribution system into a convex problem, which helps to achieve the globally optimal result. Applied in a parallel manner, ADMM obtains the results in a short time. Clouds have a large impact on solar energy forecasting. First, a convolutional neural network (CNN) based method is used to estimate the solar irradiance; second, the regression results are collected to predict the renewable generation. After that, a novel approach is proposed to capture the global horizontal irradiance (GHI) conveniently and accurately. Considering the nonstationary property of the GHI on cloudy days, GHI capturing is cast as an image regression problem. In traditional approaches, the image regression problem is split into two parts, feature extraction and regression, which are optimized separately without interconnection. Leveraging the nonlinear regression capability of CNNs, a CNN-based image regression approach is proposed to provide an end-to-end solution for the cloudy-day GHI capturing problem. For data cleaning, a Gaussian mixture model with Bayesian inference is employed to detect and eliminate anomalous data in a nonparametric manner. The purified data are then used as input to the proposed image regression approach. Numerical results demonstrate the feasibility and effectiveness of the proposed approach.
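
    As a hedged sketch of the GMM-based data cleaning step described above, the example below fits a Bayesian Gaussian mixture to measurement samples and discards the least likely ones. The component count, the 1% likelihood cutoff, and the synthetic data are assumptions for illustration, not the thesis's actual settings.

```python
# Illustrative GMM-based data cleaning: low-likelihood samples are flagged as anomalies.
# Component count, cutoff percentile, and synthetic data are assumptions.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
clean = rng.normal(loc=[500.0, 25.0], scale=[60.0, 3.0], size=(500, 2))    # e.g., (GHI, temperature)
outliers = rng.uniform(low=[0.0, -10.0], high=[1200.0, 60.0], size=(10, 2))
data = np.vstack([clean, outliers])

gmm = BayesianGaussianMixture(n_components=5, random_state=0).fit(data)
log_lik = gmm.score_samples(data)        # per-sample log-likelihood under the fitted mixture
threshold = np.percentile(log_lik, 1)    # flag the least likely 1% of samples
purified = data[log_lik > threshold]     # candidate input for the downstream regression
print(f"kept {len(purified)} of {len(data)} samples")
```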

    Wearable and BAN Sensors for Physical Rehabilitation and eHealth Architectures

    The demographic shift of the population towards a larger number of elderly citizens, together with the increasingly sedentary lifestyle we are adopting, is reflected in the increasingly debilitated physical health of the population. The resulting physical impairments require rehabilitation therapies, which may be assisted by the use of wearable sensors or body area network (BAN) sensors. The use of novel technology for medical therapies can also contribute to reducing costs in healthcare systems and decreasing patient overflow in medical centers. Sensors are the primary enablers of any wearable medical device, with a central role in eHealth architectures. The accuracy of the acquired data depends on the sensors; hence, when considering wearable and BAN sensing integration, they must be proven to be accurate and reliable solutions. This book is a collection of works focusing on the current state of the art of BANs and wearable sensing devices for the physical rehabilitation of impaired or debilitated citizens. The manuscripts that compose this book report on advances in research related to different sensing technologies (optical or electronic) and BAN sensors, their design and implementation, advanced signal processing techniques, and the application of these technologies in areas such as physical rehabilitation, robotics, medical diagnostics, and therapy.

    1992 NASA/ASEE Summer Faculty Fellowship Program

    For the 28th consecutive year, a NASA/ASEE Summer Faculty Fellowship Program was conducted at the Marshall Space Flight Center (MSFC). The program was conducted by the University of Alabama and MSFC during the period June 1, 1992 through August 7, 1992. Operated under the auspices of the American Society for Engineering Education, the MSFC program, as well as those at other centers, was sponsored by the Office of Educational Affairs, NASA Headquarters, Washington, DC. The basic objectives of the programs, which are in their 29th year of operation nationally, are (1) to further the professional knowledge of qualified engineering and science faculty members; (2) to stimulate an exchange of ideas between participants and NASA; (3) to enrich and refresh the research and teaching activities of the participants' institutions; and (4) to contribute to the research objectives of the NASA centers.

    Smart Sensor Technologies for IoT

    Recent developments in wireless networks and devices have led to novel services that will utilize wireless communication on a new level. Much effort and many resources have been dedicated to establishing new communication networks that will support machine-to-machine communication and the Internet of Things (IoT). In these systems, various smart and sensory devices are deployed and connected, enabling large amounts of data to be streamed. Smart services represent new trends in mobile services, i.e., a completely new spectrum of context-aware, personalized, and intelligent services and applications. A variety of existing services utilize information about the position of the user or mobile device. The position of mobile devices is often obtained using the Global Navigation Satellite System (GNSS) chips that are integrated into all modern mobile devices (smartphones). However, GNSS is not always a reliable source of position estimates due to multipath propagation and signal blockage. Moreover, integrating GNSS chips into all devices might have a negative impact on the battery life of future IoT applications. Therefore, alternative solutions to position estimation should be investigated and implemented in IoT applications. This Special Issue, "Smart Sensor Technologies for IoT", aims to report on some of the recent research efforts on this increasingly important topic. The twelve accepted papers in this issue cover various aspects of Smart Sensor Technologies for IoT.
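
    As one hedged illustration of what such an alternative to GNSS positioning could look like (the Special Issue does not prescribe this method), the sketch below estimates ranges from received signal strength with a log-distance path-loss model and then trilaterates a 2-D position by least squares. The path-loss exponent, reference power, anchor coordinates, and RSSI readings are all assumptions chosen for the example.

```python
# Illustrative RSSI-based positioning: range estimates from a log-distance path-loss
# model, then 2-D position via linear least squares over three anchors. All parameter
# values (path-loss exponent, reference power, anchors, readings) are assumptions.
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Invert the log-distance path-loss model: RSSI = P0 - 10*n*log10(d)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def trilaterate(anchors, distances):
    """Linearize the circle equations against the last anchor and solve least squares."""
    anchors, d = np.asarray(anchors, float), np.asarray(distances, float)
    xN, dN = anchors[-1], d[-1]
    A = 2.0 * (anchors[:-1] - xN)
    b = dN ** 2 - d[:-1] ** 2 + np.sum(anchors[:-1] ** 2, axis=1) - np.sum(xN ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
rssi = [-58.6, -62.1, -60.3]                      # hypothetical readings in dBm
print(trilaterate(anchors, [rssi_to_distance(r) for r in rssi]))
```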

    Innovative Technologies and Services for Smart Cities

    A smart city is a modern technology-driven urban area which uses sensing devices, information, and communication technology connected to the Internet of Things (IoT) for the optimum and efficient utilization of infrastructures and services, with the goal of improving the living conditions of citizens. Increasing populations, lower budgets, limited resources, and compatibility of upgraded technologies are some of the problems affecting the implementation of smart cities. Hence, there is continuous advancement of technologies for the implementation of smart cities. The aim of this Special Issue is to report on the design and development of integrated/smart sensors and a universal interfacing platform, along with the IoT framework, extending it to next-generation communication networks for monitoring parameters of interest with the goal of achieving smart cities. The proposed universal interfacing platform with the IoT framework will solve many challenging issues and significantly boost the growth of IoT-related applications, not just in the environmental monitoring domain but in other key areas, such as smart homes, assistive technology for elderly care, smart cities with smart waste management, smart e-metering, smart water supply, intelligent traffic control, smart grids, remote healthcare applications, etc., signifying benefits for all countries.