
    Towards machine learning applied to time series based network traffic forecasting

    This TFG explores specific use cases of applying Machine Learning techniques to Software-Defined Networks, in particular to overlay protocols such as LISP and VXLAN. The aim of the project is to implement a network traffic forecasting model using time series and to improve its performance with machine learning techniques, offering a better prediction based on outlier correction. The project was developed in the Computer Architecture Department (DAC) at the Universitat Politècnica de Catalunya (UPC). Time series modeling methodology can shape a trend and account for any existing outlier; however, it does not cover the impact of outliers on forecasting. To achieve more precision and better confidence intervals, the model combines an outlier detection methodology with Artificial Neural Networks to quantify and predict outliers. A study on external data examines whether there is an improvement and what its effect on the predictions is. Machine learning techniques such as Artificial Neural Networks have proven to improve on the current methodology for forecasting with time series models. Future work will be oriented towards an improved, more generalized version of this system.
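    As a rough illustration of the approach this abstract describes, the sketch below fits a baseline time-series model, flags residual outliers, and trains a small neural network to predict the outlier component so the forecast can be corrected for it. The traffic series, lag window, and 3-sigma rule are assumptions for illustration, not the project's actual data or configuration.

```python
# A minimal sketch of the described pipeline, assuming hourly traffic in a
# 1-D NumPy array `traffic` (hypothetical data); not the authors' exact model.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
traffic = 100 + 10 * np.sin(np.arange(500) * 2 * np.pi / 24) + rng.normal(0, 2, 500)
traffic[rng.choice(500, 10, replace=False)] += 40          # injected outliers

# 1) Baseline time-series model.
fit = ARIMA(traffic, order=(2, 0, 2)).fit()
resid = fit.resid

# 2) Flag outliers where residuals exceed 3 sigma.
sigma = resid.std()
is_outlier = np.abs(resid) > 3 * sigma

# 3) Train a small neural network on lagged values to predict the outlier
#    component, so forecasts can be corrected for outlier impact.
lags = 24
X = np.stack([traffic[i - lags:i] for i in range(lags, len(traffic))])
y = resid[lags:] * is_outlier[lags:]                       # outlier component only
nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)

# 4) Corrected one-step-ahead forecast = baseline forecast + predicted outlier effect.
forecast = fit.forecast(steps=1)[0] + nn.predict(traffic[-lags:].reshape(1, -1))[0]
print(f"corrected forecast: {forecast:.1f}")
```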

    Detection of advanced persistent threat using machine-learning correlation analysis

    As one of the most serious types of cyber attack, Advanced Persistent Threats (APTs) have caused major concern on a global scale. An APT is a persistent, multi-stage attack intended to compromise a system and extract information from it, with the potential to cause significant damage and substantial financial loss. Accurate detection and prediction of APTs is an ongoing challenge. This work proposes a novel machine learning-based system, MLAPT, which can accurately and rapidly detect and predict APT attacks in a systematic way. MLAPT runs through three main phases: (1) threat detection, in which eight methods have been developed to detect the different techniques used during the various APT steps; the implementation and validation of these methods with real traffic is a significant contribution to the current body of research; (2) alert correlation, in which a correlation framework links the outputs of the detection methods and identifies alerts that could be related and belong to a single APT scenario; and (3) attack prediction, in which a machine learning-based prediction module built on the correlation framework's output lets the network security team estimate the probability that early alerts will develop into a complete APT attack. MLAPT is experimentally evaluated.
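    A hedged sketch of phase (3) only: estimating, from features of a correlated alert chain, the probability that it develops into a full APT. The chain features, labels, and random-forest choice below are hypothetical stand-ins; the paper's actual prediction module may differ.

```python
# Sketch of an attack-prediction module over correlated alert chains.
# Feature names and data are hypothetical, not MLAPT's actual features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1000
# Per-chain features: APT steps triggered so far (of 8 detection methods),
# chain duration in hours, number of distinct internal hosts involved.
X = np.column_stack([
    rng.integers(1, 9, n),        # detection methods fired so far
    rng.exponential(24, n),       # elapsed time of the chain
    rng.integers(1, 20, n),       # hosts touched
])
y = (X[:, 0] >= 4).astype(int)    # toy label: chains with many steps are APTs

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# The security team would act on this probability for an in-progress chain.
p_apt = clf.predict_proba(X_te[:1])[0, 1]
print(f"P(full APT | early alerts) = {p_apt:.2f}")
```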

    The Challenge of Machine Learning in Space Weather Nowcasting and Forecasting

    The numerous recent breakthroughs in machine learning (ML) make it imperative to carefully ponder how the scientific community can benefit from a technology that, although not necessarily new, is today living its golden age. This Grand Challenge review paper is focused on the present and future role of machine learning in space weather. The purpose is twofold. On the one hand, we discuss previous works that use ML for space weather forecasting, focusing in particular on the few areas that have seen the most activity: the forecasting of geomagnetic indices, of relativistic electrons at geosynchronous orbit, of solar flare occurrence, of coronal mass ejection propagation time, and of solar wind speed. On the other hand, this paper serves as a gentle introduction to the field of machine learning tailored to the space weather community, and as a pointer to a number of open challenges that we believe the community should undertake in the next decade. The recurring themes throughout the review are the need to shift our forecasting paradigm to a probabilistic approach focused on the reliable assessment of uncertainties, and the combination of physics-based and machine learning approaches, known as gray-box modeling.
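    To make the review's probabilistic theme concrete, the sketch below scores a hypothetical probabilistic storm forecast with the Brier score and a climatological skill score, rather than a deterministic yes/no hit rate. The forecast probabilities and outcomes are synthetic; nothing here is taken from the paper.

```python
# A minimal illustration of probabilistic-forecast evaluation: score event
# probabilities with the Brier score instead of a deterministic hit rate.
import numpy as np
from sklearn.metrics import brier_score_loss

rng = np.random.default_rng(2)
# Hypothetical daily probabilities that a geomagnetic storm occurs,
# and the observed outcomes (1 = storm); toy, well-calibrated by construction.
p_forecast = rng.uniform(0, 1, 365)
observed = (rng.uniform(0, 1, 365) < p_forecast).astype(int)

bs = brier_score_loss(observed, p_forecast)
# Reference forecast that always issues the climatological base rate.
bs_ref = brier_score_loss(observed, np.full(365, observed.mean()))
bss = 1 - bs / bs_ref   # Brier skill score: > 0 means skill over climatology
print(f"Brier score {bs:.3f}, skill score {bss:.3f}")
```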

    Machine-learning-based calving prediction from activity, lying, and ruminating behaviors in dairy cattle

    The objective of this study was to use automated activity, lying, and rumination monitors to characterize prepartum behavior and predict calving in dairy cattle. Data were collected from 20 primiparous and 33 multiparous Holstein dairy cattle from September 2011 to May 2013 at the University of Kentucky Coldstream Dairy. The HR Tag (SCR Engineers Ltd., Netanya, Israel) automatically collected neck activity and rumination data in 2-h increments. The IceQube (IceRobotics Ltd., South Queensferry, United Kingdom) automatically collected number of steps, lying time, standing time, number of transitions from standing to lying (lying bouts), and total motion, summed in 15-min increments. IceQube data were summed in 2-h increments to match HR Tag data. All behavioral data were collected for 14 d before the predicted calving date. Retrospective data analysis was performed using mixed linear models to examine behavioral changes by day in the 14 d before calving. Bihourly behavioral differences from baseline values over the 14 d before calving were also evaluated using mixed linear models. Changes in daily rumination time, total motion, lying time, and lying bouts occurred in the 14 d before calving. In the bihourly analysis, extreme values for all behaviors occurred in the final 24 h, indicating that the monitored behaviors may be useful in calving prediction. To determine whether the technologies were useful for predicting calving, random forest, linear discriminant analysis, and neural network machine-learning techniques were constructed and implemented using R version 3.1.0 (R Foundation for Statistical Computing, Vienna, Austria). These methods were used on variables from each technology and on the combined variables from both technologies. A neural network analysis that combined variables from both technologies at the daily level yielded 100.0% sensitivity and 86.8% specificity. A neural network analysis that combined variables from both technologies in bihourly increments was used to identify 2-h periods in the 8 h before calving with 82.8% sensitivity and 80.4% specificity. Changes in behavior and machine-learning alerts indicate that commercially marketed behavioral monitors may have calving prediction potential.
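    The sketch below illustrates the combined-sensor classification step under stated assumptions: both devices' variables are already aligned in 2-h increments, and the synthetic features, thresholds, and labels stand in for the study's herd data (the study itself used R; Python is used here for consistency with the other sketches).

```python
# Sketch of a combined-sensor calving classifier with sensitivity/specificity
# reporting. All data are synthetic stand-ins for the study's herd data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(3)
n = 2000
X = np.column_stack([
    rng.normal(300, 60, n),   # rumination min/2 h (HR Tag)
    rng.normal(120, 30, n),   # neck activity (HR Tag)
    rng.normal(50, 15, n),    # lying min/2 h (IceQube)
    rng.poisson(2, n),        # lying bouts/2 h (IceQube)
])
# Toy label: the hours before calving show low rumination and frequent bouts.
y = ((X[:, 0] < 250) & (X[:, 3] >= 3)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
nn = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                   random_state=0).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, nn.predict(X_te)).ravel()
print(f"sensitivity {tp / (tp + fn):.3f}, specificity {tn / (tn + fp):.3f}")
```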

    Predictive Maintenance on the Machining Process and Machine Tool

    This paper presents the process required to implement data-driven Predictive Maintenance (PdM), covering not only the machine's decision making but also data acquisition and processing. A short review of the different approaches and techniques in maintenance is given. The main contribution of this paper is a solution to the predictive maintenance problem in a real machining process; the several steps needed to reach that solution are carefully explained. The results show that the Preventive Maintenance (PM) carried out in a real machining process could be replaced by a PdM approach. A decision-making application was developed to provide a visual analysis of the Remaining Useful Life (RUL) of the machining tool. This work is a proof of concept of the presented methodology in one process, but it is replicable for most serial-production machining processes.
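    A minimal sketch of the RUL estimation behind such a decision-making application: regress a health indicator against machining cycles and extrapolate the fitted trend to a failure threshold. The indicator, threshold, and data below are hypothetical, not the paper's actual pipeline.

```python
# RUL via degradation-trend extrapolation: fit a linear wear model and find
# where it crosses an assumed failure threshold. All values are hypothetical.
import numpy as np

rng = np.random.default_rng(4)
cycles = np.arange(200)
wear = 0.002 * cycles + rng.normal(0, 0.01, 200)   # monotone degradation + noise
FAILURE_THRESHOLD = 0.5                            # assumed wear limit

# Linear degradation model fitted by least squares.
slope, intercept = np.polyfit(cycles, wear, 1)

# RUL = cycles remaining until the fitted trend crosses the threshold.
cycles_at_failure = (FAILURE_THRESHOLD - intercept) / slope
rul = max(0.0, cycles_at_failure - cycles[-1])
print(f"estimated RUL: {rul:.0f} cycles")
```

A visual RUL application like the one described would plot the fitted trend, the threshold, and the crossing point rather than printing a single number; the extrapolation logic is the same.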

    Prognosis of Bearing Acoustic Emission Signals Using Supervised Machine Learning

    The acoustic emission (AE) technique can be successfully utilized for condition monitoring of various machining and industrial processes. To keep machines functioning at optimal levels, a fault prognosis model that predicts the remaining useful life (RUL) of machine components is required. Such a model analyzes the output signals of a machine while in operation and accordingly helps to set an early-alarm tool that reduces the untimely replacement of components and wasteful machine downtime. Recent improvements indicate a drive towards the incorporation of prognosis and diagnosis machine learning techniques in future machine health management systems. With this in mind, this work employs three supervised machine learning techniques, support vector machine regression, a multilayer artificial neural network model, and Gaussian process regression, to correlate AE features with the corresponding natural wear of slow-speed bearings throughout a series of laboratory experiments. Analysis of signal parameters such as the signal intensity estimator and root mean square was undertaken to discriminate individual types of early damage. It was concluded that the neural network model with the backpropagation learning algorithm has an advantage over the other models in estimating the RUL for slow-speed bearings, provided the proper network structure is chosen and sufficient data are provided.
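    As a sketch of the feature-to-RUL regression described above, the code below extracts root mean square and a crude intensity measure from windowed AE signals and fits a multilayer neural network regressor. The signals and wear labels are synthetic, and the intensity proxy is an assumption, not the paper's signal intensity estimator.

```python
# Feature-to-RUL regression sketch: RMS features from windowed AE signals
# feed an MLP regressor. Signals, window size, and labels are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
n_windows, win = 300, 1024
# Synthetic AE windows whose energy grows with bearing wear.
wear = np.linspace(0, 1, n_windows)                     # 0 = new, 1 = worn out
signals = rng.normal(0, 0.1 + 0.5 * wear[:, None], (n_windows, win))

rms = np.sqrt((signals ** 2).mean(axis=1))              # root mean square
sie = np.abs(signals).mean(axis=1)                      # crude intensity proxy
X = np.column_stack([rms, sie])
rul = 1.0 - wear                                        # normalized remaining life

mlp = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=3000,
                   random_state=0).fit(X[:250], rul[:250])
print("predicted RUL (normalized):", mlp.predict(X[250:255]).round(2))
```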