
    Quadri-dimensional approach for data analytics in mobile networks

    The telecommunication market is growing at a very fast pace with the evolution of new technologies that support high-speed throughput and the availability of a wide range of services and applications in mobile networks. This has led communication service providers (CSPs) to shift their focus from monitoring network elements towards monitoring services and subscriber satisfaction, introducing service quality management (SQM) and customer experience management (CEM). Both require fast responses to find and solve network problems, to ensure efficient and proactive maintenance, and to improve the quality of service (QoS) and quality of experience (QoE) of subscribers. Because SQM and CEM both draw on information from multiple interfaces, managing multiple data sources adds an extra layer of complexity to data collection. Although several studies have addressed data analytics in mobile networks, most did not consider analytics based on the four dimensions involved in the mobile network environment, namely the subscriber, the handset, the service, and the network element, with correlation across multiple interfaces. The main objective of this research was to develop mobile network analytics models for the 3G packet-switched domain by analysing data from the radio network (the Iub interface) and the core network (the Gn interface) to provide a fast root cause analysis (RCA) approach that covers all four dimensions. This was achieved using recent computer engineering advancements, namely Big Data platforms and data mining techniques based on machine learning algorithms.
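    As a toy illustration of the four-dimensional correlation idea, the sketch below joins hypothetical radio-side (Iub) and core-side (Gn) records on the subscriber identity and ranks failure rates along each dimension; all column names and values are invented for illustration and are not taken from the thesis.

```python
import pandas as pd

# Hypothetical per-subscriber records from the two interfaces.
iub = pd.DataFrame({                       # radio network (Iub) side
    "imsi": ["001", "001", "002", "003"],
    "cell_id": ["C1", "C2", "C1", "C3"],
    "rab_setup_failed": [1, 0, 1, 0],
})
gn = pd.DataFrame({                        # core network (Gn) side
    "imsi": ["001", "002", "003"],
    "handset": ["H-A", "H-B", "H-A"],
    "service": ["web", "video", "web"],
})

# Correlate the two interfaces on the subscriber dimension.
merged = iub.merge(gn, on="imsi")

# Rank failure rates along each of the four dimensions to localise the fault.
for dim in ["imsi", "handset", "service", "cell_id"]:
    rate = merged.groupby(dim)["rab_setup_failed"].mean()
    print(rate.sort_values(ascending=False), "\n")
```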

    Multi-level Feature Fusion-based CNN for Local Climate Zone Classification from Sentinel-2 Images: Benchmark Results on the So2Sat LCZ42 Dataset

    As a unique classification scheme for urban forms and functions, the local climate zone (LCZ) system provides essential general information for any studies related to urban environments, especially on a large scale. Remote sensing data-based classification approaches are the key to large-scale mapping and monitoring of LCZs. The potential of deep learning-based approaches is not yet fully explored, even though advanced convolutional neural networks (CNNs) continue to push the frontiers for various computer vision tasks. One reason is that published studies are based on different datasets, usually at a regional scale, which makes it impossible to fairly and consistently compare the potential of different CNNs for real-world scenarios. This study is based on the large So2Sat LCZ42 benchmark dataset dedicated to LCZ classification. Using this dataset, we study a range of CNNs of varying sizes. In addition, we propose Sen2LCZ-Net, a CNN for classifying LCZs from Sentinel-2 images, and extend this base network into Sen2LCZ-Net-MF, which fuses multi-level features. With this proposed simple network architecture and the highly competitive benchmark dataset, we obtain results that are better than those obtained by state-of-the-art CNNs, while requiring less computation with fewer layers and parameters. Large-scale LCZ classification examples of completely unseen areas are presented, demonstrating the potential of our proposed Sen2LCZ-Net-MF as well as the So2Sat LCZ42 dataset. We also intensively investigate the influence of network depth and width and the effectiveness of the design choices made for Sen2LCZ-Net-MF. Our work will provide important baselines for future CNN-based algorithm developments for both LCZ classification and other urban land cover/land use classification tasks.
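    The following sketch shows the general idea of multi-level feature fusion in a small PyTorch CNN, assuming 32x32 Sentinel-2 patches with 10 bands and 17 LCZ classes as in So2Sat LCZ42; it is an illustrative toy architecture, not the published Sen2LCZ-Net-MF.

```python
import torch
import torch.nn as nn

class MultiLevelFusionCNN(nn.Module):
    """Toy CNN that pools features from several depths and fuses them."""
    def __init__(self, in_bands=10, n_classes=17):
        super().__init__()
        self.block1 = nn.Sequential(nn.Conv2d(in_bands, 32, 3, padding=1),
                                    nn.ReLU(), nn.MaxPool2d(2))
        self.block2 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1),
                                    nn.ReLU(), nn.MaxPool2d(2))
        self.block3 = nn.Sequential(nn.Conv2d(64, 128, 3, padding=1),
                                    nn.ReLU(), nn.MaxPool2d(2))
        self.pool = nn.AdaptiveAvgPool2d(1)   # global average pool per level
        # Features from all three depths are concatenated for the classifier.
        self.head = nn.Linear(32 + 64 + 128, n_classes)

    def forward(self, x):
        f1 = self.block1(x)
        f2 = self.block2(f1)
        f3 = self.block3(f2)
        fused = torch.cat([self.pool(f).flatten(1) for f in (f1, f2, f3)], dim=1)
        return self.head(fused)

logits = MultiLevelFusionCNN()(torch.randn(4, 10, 32, 32))
print(logits.shape)  # torch.Size([4, 17])
```

    Fusing shallow and deep features lets the classifier see both fine texture and broader context, which is the intuition behind the multi-level design described in the abstract.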

    A Study on Tools And Techniques Used For Network Forensic In A Cloud Environment: An Investigation Perspective

    The modern computing environment has moved past the local data center with a single entry and exit point to a global network of many data centers and hundreds of entry and exit points, commonly referred to as cloud computing. It is used by every kind of device, with transactions, online processing, and requests and responses travelling across the network. This makes already complex networks even more complex, and it makes traversing, monitoring, and detecting threats in such an environment a major challenge for network forensics and cybercrime investigation. In-depth analysis using network tools and techniques is needed to determine how best to extract information pertinent to an investigation. Data mining techniques provide great aid here: they can cope with huge amounts of data while finding relevant clusters for predicting unusual activities, pattern matching, and fraud detection. Network forensics in cloud computing requires a new mindset in which some data will not be available, some data will be suspect, and some data will be court-ready and will fit the traditional network forensics model. From a network security viewpoint, all data traversing the cloud network backplane is visible to and accessible by the cloud service provider. It can no longer be assumed that one physical device hosts only one operating system that can be taken down for investigation. Without an understanding of the architecture of cloud environment systems, a network forensics investigator will overlook or miss possible compromises. In this paper, we focus on the role of network forensics in a cloud environment, map a few of the available tools, discuss the contribution of data mining to the analysis, and bring out the challenges in this field.
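    As a minimal sketch of the data mining idea mentioned above, the example below clusters toy flow features with k-means and flags members of unusually small clusters for forensic review; the features, data, and threshold are illustrative assumptions rather than the paper's method.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Toy flow features: [bytes sent, duration in s, distinct destination ports].
normal = rng.normal([5e4, 30, 3], [1e4, 10, 1], size=(500, 3))
scans = rng.normal([2e3, 5, 200], [5e2, 2, 30], size=(5, 3))  # port-scan-like
flows = StandardScaler().fit_transform(np.vstack([normal, scans]))

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(flows)

# Flows falling into rare clusters are candidates for investigation.
sizes = np.bincount(km.labels_)
rare = np.where(sizes < 0.02 * len(flows))[0]
suspicious = np.where(np.isin(km.labels_, rare))[0]
print("flows to review:", suspicious)  # should flag the port-scan-like flows
```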

    Big Data for Traffic Monitoring and Management

    The last two decades witnessed tremendous advances in Information and Communications Technologies. Beside improvements in computational power and storage capacity, communication networks nowadays carry an amount of data that was not envisaged only a few years ago. Together with their pervasiveness, network complexity increased at the same pace, leaving operators and researchers with few instruments to understand what happens in the networks and, on the global scale, on the Internet. Fortunately, recent advances in data science and machine learning come to the rescue of network analysts, and allow analyses with a level of complexity and spatial/temporal scope not possible only 10 years ago. In my thesis, I take the perspective of an Internet Service Provider (ISP), and illustrate the challenges and possibilities of analyzing the traffic coming from modern operational networks. I make use of big data and machine learning algorithms, and apply them to datasets coming from passive measurements of ISP and university campus networks. The marriage between data science and network measurements is complicated by the complexity of machine learning algorithms, and by the intrinsic multi-dimensionality and variability of this kind of data. As such, my work proposes and evaluates novel techniques, inspired by popular machine learning approaches, but carefully tailored to operate with network traffic. In this thesis, I first provide a thorough characterization of Internet traffic from 2013 to 2018. I show the most important trends in the composition of traffic and users' habits across those five years, and describe how the network infrastructure of the big Internet players changed in order to support faster and larger traffic. Then, I show the challenges in classifying network traffic, with particular attention to encryption and to the convergence of the Internet around a few big players. To overcome the limitations of classical approaches, I propose novel algorithms for traffic classification and management leveraging machine learning techniques and, in particular, big data approaches. Exploiting temporal correlation among network events, and benefiting from large datasets of operational traffic, my algorithms learn common traffic patterns of web services, and use them for (i) traffic classification and (ii) fine-grained traffic management. My proposals are always validated in experimental environments and then deployed in real operational networks, from which I report the most interesting findings. I also focus on the Quality of Experience (QoE) of web users, as their satisfaction represents the final objective of computer networks. Again, I show that using big data approaches, the network can achieve visibility into the quality of users' web browsing. In general, the algorithms I propose help ISPs gain a detailed view of the traffic that flows in their networks, allowing fine-grained traffic classification and management, and real-time monitoring of users' QoE.
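    A minimal sketch of the temporal-correlation idea: events observed within a short window of a flow with a known label accumulate evidence for that label. The event log, window size, and voting rule below are illustrative assumptions, not the thesis's algorithms.

```python
from collections import Counter

# Toy per-client event log: (seconds, domain). "video.example" is the one
# domain we can already label; the unlabeled hosts are what we want to learn.
events = [
    (0.0, "video.example"), (0.4, "cdn-a.example"), (0.9, "cdn-b.example"),
    (30.0, "mail.example"),
    (60.0, "video.example"), (60.3, "cdn-a.example"),
]

WINDOW = 2.0  # seconds around each labeled event
anchors = [t for t, d in events if d == "video.example"]

# Each co-occurrence within the window is a vote for the service's pattern.
votes = Counter()
for t, d in events:
    if d != "video.example" and any(abs(t - a) <= WINDOW for a in anchors):
        votes[d] += 1

# Domains that consistently co-occur join the service's traffic pattern.
print(votes.most_common())  # [('cdn-a.example', 2), ('cdn-b.example', 1)]
```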

    Feature selection in detection of adverse drug reactions from the Health Improvement Network (THIN) database

    Adverse drug reactions (ADRs) are a widely recognised public health issue and one of the most common reasons drugs are withdrawn from the market. Prescription event monitoring (PEM) is an important approach to detecting ADRs. The main problem with this method is how to automatically extract the medical events or side effects from the high-throughput medical events collected in day-to-day clinical practice. In this study we propose a novel concept of a feature matrix to detect ADRs. The feature matrix, extracted from the large volume of medical data in The Health Improvement Network (THIN) database, characterizes the medical events for the patients who take drugs, and builds the foundation for handling this irregular, large-scale medical data. Feature selection methods are then performed on the feature matrix to detect the significant features, and finally the ADRs are located based on those features. Experiments are carried out on three drugs: atorvastatin, alendronate, and metoclopramide. Major side effects for each drug are detected, and better performance is achieved compared with other computerized methods. The detected ADRs are based on computerized methods; further investigation is needed.

    Comment: International Journal of Information Technology and Computer Science (IJITCS), in print, 201
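    A minimal sketch of the feature-matrix idea: rows are patients, columns are binary medical-event indicators, and univariate feature selection ranks events that differ between exposed and control groups. The event names, prevalences, and chi-squared test are illustrative assumptions; the paper's matrix construction and selection methods are richer.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2

rng = np.random.default_rng(1)
events = ["myalgia", "nausea", "headache", "rash"]

# Simulated feature matrix: 1 = event recorded after the index date.
# Only "myalgia" is made more frequent in the exposed group.
exposed = rng.binomial(1, [0.40, 0.10, 0.12, 0.05], size=(200, 4))
control = rng.binomial(1, [0.08, 0.10, 0.12, 0.05], size=(200, 4))

X = np.vstack([exposed, control])          # the feature matrix
y = np.r_[np.ones(200), np.zeros(200)]     # 1 = took the drug

# Rank events by chi-squared association with drug exposure.
selector = SelectKBest(chi2, k=2).fit(X, y)
for name, score in sorted(zip(events, selector.scores_), key=lambda p: -p[1]):
    print(f"{name}: {score:.1f}")          # "myalgia" should rank first
```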

    Computer Numerical Control (CNC) Machine Health Prediction using Multi-domain Feature Extraction and Deep Neural Network Regression

    Tool wear monitoring has become vital in intelligent production for assessing the health state of Computer Numerical Control (CNC) machines, and multi-domain features can effectively characterize tool wear status and support tool wear prediction. Prognostics and health management (PHM) plays a central role in condition-based maintenance (CBM), preventing rather than merely detecting malfunctions in machinery. This has the great advantage of saving the costs of fault repair, including human effort and financial costs as well as power and energy consumption. The rapid evolution of the Industrial Internet of Things (IIoT) and industrial big data analytics has made deep learning a growing field of research. The PHM Society has held many competitions, including PHM10, which provided CNC milling machine cutter data for tool wear prediction. The purpose of this paper is to predict the tool wear of CNC cutters. We adopt a multi-domain feature extraction method to describe the health state of the cutters, and a deep neural network (DNN) regression method for tool wear prediction.
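    A minimal sketch of the pipeline's shape: simple time- and frequency-domain features are extracted from a synthetic vibration signal and fed to a small neural network regressor. The signal model, feature set, and network size are illustrative assumptions, not the paper's actual method.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def features(sig):
    """Multi-domain features from one cut's vibration signal."""
    spec = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(sig.size)
    return [
        sig.mean(), sig.std(),              # time domain
        np.sqrt(np.mean(sig**2)),           # time domain: RMS
        (spec * freqs).sum() / spec.sum(),  # frequency domain: spectral centroid
    ]

rng = np.random.default_rng(2)
# Toy cuts: vibration amplitude grows as flank wear increases.
wear = np.linspace(0.0, 0.3, 120)           # mm of flank wear
X = np.array([features(rng.normal(0, 0.1 + w, 1024)) for w in wear])

# Train on the first 100 cuts, predict wear for the remaining ones.
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                     random_state=0).fit(X[:100], wear[:100])
print(model.predict(X[100:])[:3])           # predicted wear, in mm
```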